2026-03-09T17:19:27.391 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-09T17:19:27.399 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T17:19:27.425 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '585'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.0 '
name: kyr-2026-03-09_11:23:05-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 3443
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm06.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3isagDJR55odFw2aEtyih3bLW5eYTdDFMRv7FGculFhukcRA7syCeekZOAUvOfLuNSbr9f2DZeWPMd4t5C7EY=
  vm09.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCgBKDef7XvlS4YOMGXDWYv6Z8KNMzF6JQTMjICKhONH1rQjxAPoMy7q7f3Bd4hVkp3MSZCvOS6K/+/QUpVJzY8=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.0
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.0
    roleless: true
- print: '**** done end installing v18.2.0 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 1
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay true
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data false
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped:
  - /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/suites/orch/cephadm/mds_upgrade_sequence/tasks/3-upgrade-mgr-staggered.yaml
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local\ \ branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find\ \ \"-all$\" then\n log.debug(\"removing default kernel specification: %s\"\ , kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb',\ \ nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb',\ \ nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task',\ \ nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1',\ \ nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n"
  variables:
    fail_fs: false
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-09_11:23:05
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs false || true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-09T17:19:27.425 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-09T17:19:27.426 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-09T17:19:27.426 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-09T17:19:27.426 INFO:teuthology.task.internal:Checking packages...
2026-03-09T17:19:27.426 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-09T17:19:27.426 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-09T17:19:27.426 INFO:teuthology.packaging:ref: None
2026-03-09T17:19:27.426 INFO:teuthology.packaging:tag: None
2026-03-09T17:19:27.426 INFO:teuthology.packaging:branch: squid
2026-03-09T17:19:27.426 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:19:27.426 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-09T17:19:28.123 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-09T17:19:28.124 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-09T17:19:28.124 INFO:teuthology.task.internal:no buildpackages task found
2026-03-09T17:19:28.124 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-09T17:19:28.125 INFO:teuthology.task.internal:Saving configuration
2026-03-09T17:19:28.134 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-09T17:19:28.135 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-09T17:19:28.142 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm06.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 17:16:37.753052', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:06', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3isagDJR55odFw2aEtyih3bLW5eYTdDFMRv7FGculFhukcRA7syCeekZOAUvOfLuNSbr9f2DZeWPMd4t5C7EY='}
2026-03-09T17:19:28.148 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm09.local', 'description': '/archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-09 17:16:37.752605', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:09', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCgBKDef7XvlS4YOMGXDWYv6Z8KNMzF6JQTMjICKhONH1rQjxAPoMy7q7f3Bd4hVkp3MSZCvOS6K/+/QUpVJzY8='}
2026-03-09T17:19:28.148 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-09T17:19:28.149 INFO:teuthology.task.internal:roles: ubuntu@vm06.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-09T17:19:28.149 INFO:teuthology.task.internal:roles: ubuntu@vm09.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-09T17:19:28.149 INFO:teuthology.run_tasks:Running task console_log...
2026-03-09T17:19:28.155 DEBUG:teuthology.task.console_log:vm06 does not support IPMI; excluding
2026-03-09T17:19:28.159 DEBUG:teuthology.task.console_log:vm09 does not support IPMI; excluding
2026-03-09T17:19:28.159 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7fbc4de76170>, signals=[15])
2026-03-09T17:19:28.159 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-09T17:19:28.160 INFO:teuthology.task.internal:Opening connections...
2026-03-09T17:19:28.160 DEBUG:teuthology.task.internal:connecting to ubuntu@vm06.local
2026-03-09T17:19:28.160 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T17:19:28.217 DEBUG:teuthology.task.internal:connecting to ubuntu@vm09.local
2026-03-09T17:19:28.218 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm09.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T17:19:28.278 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-09T17:19:28.279 DEBUG:teuthology.orchestra.run.vm06:> uname -m
2026-03-09T17:19:28.314 INFO:teuthology.orchestra.run.vm06.stdout:x86_64
2026-03-09T17:19:28.314 DEBUG:teuthology.orchestra.run.vm06:> cat /etc/os-release
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:NAME="CentOS Stream"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:VERSION="9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:ID="centos"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:ID_LIKE="rhel fedora"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:VERSION_ID="9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:PLATFORM_ID="platform:el9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:ANSI_COLOR="0;31"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:LOGO="fedora-logo-icon"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:HOME_URL="https://centos.org/"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-09T17:19:28.368 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-09T17:19:28.368 INFO:teuthology.lock.ops:Updating vm06.local on lock server
2026-03-09T17:19:28.373 DEBUG:teuthology.orchestra.run.vm09:> uname -m
2026-03-09T17:19:28.391 INFO:teuthology.orchestra.run.vm09.stdout:x86_64
2026-03-09T17:19:28.391 DEBUG:teuthology.orchestra.run.vm09:> cat /etc/os-release
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:NAME="CentOS Stream"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:VERSION="9"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:ID="centos"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:ID_LIKE="rhel fedora"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:VERSION_ID="9"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:PLATFORM_ID="platform:el9"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:ANSI_COLOR="0;31"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:LOGO="fedora-logo-icon"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-09T17:19:28.445 INFO:teuthology.orchestra.run.vm09.stdout:HOME_URL="https://centos.org/"
2026-03-09T17:19:28.446 INFO:teuthology.orchestra.run.vm09.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-09T17:19:28.446 INFO:teuthology.orchestra.run.vm09.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-09T17:19:28.446 INFO:teuthology.orchestra.run.vm09.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-09T17:19:28.446 INFO:teuthology.lock.ops:Updating vm09.local on lock server
2026-03-09T17:19:28.450 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-09T17:19:28.452 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-09T17:19:28.453 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-09T17:19:28.453 DEBUG:teuthology.orchestra.run.vm06:> test '!' -e /home/ubuntu/cephtest
2026-03-09T17:19:28.455 DEBUG:teuthology.orchestra.run.vm09:> test '!' -e /home/ubuntu/cephtest
2026-03-09T17:19:28.502 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-09T17:19:28.503 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-09T17:19:28.503 DEBUG:teuthology.orchestra.run.vm06:> test -z $(ls -A /var/lib/ceph)
2026-03-09T17:19:28.509 DEBUG:teuthology.orchestra.run.vm09:> test -z $(ls -A /var/lib/ceph)
2026-03-09T17:19:28.524 INFO:teuthology.orchestra.run.vm06.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-09T17:19:28.559 INFO:teuthology.orchestra.run.vm09.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-09T17:19:28.560 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-09T17:19:28.571 DEBUG:teuthology.orchestra.run.vm06:> test -e /ceph-qa-ready
2026-03-09T17:19:28.586 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:19:28.779 DEBUG:teuthology.orchestra.run.vm09:> test -e /ceph-qa-ready
2026-03-09T17:19:28.793 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:19:28.984 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-09T17:19:28.985 INFO:teuthology.task.internal:Creating test directory...
2026-03-09T17:19:28.985 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-09T17:19:28.987 DEBUG:teuthology.orchestra.run.vm09:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-09T17:19:29.001 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-09T17:19:29.003 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-09T17:19:29.004 INFO:teuthology.task.internal:Creating archive directory...
2026-03-09T17:19:29.004 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-09T17:19:29.042 DEBUG:teuthology.orchestra.run.vm09:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-09T17:19:29.061 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-09T17:19:29.062 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-09T17:19:29.063 DEBUG:teuthology.orchestra.run.vm06:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-09T17:19:29.111 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:19:29.111 DEBUG:teuthology.orchestra.run.vm09:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-09T17:19:29.125 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:19:29.125 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-09T17:19:29.153 DEBUG:teuthology.orchestra.run.vm09:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-09T17:19:29.174 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T17:19:29.183 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T17:19:29.189 INFO:teuthology.orchestra.run.vm09.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T17:19:29.198 INFO:teuthology.orchestra.run.vm09.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-09T17:19:29.199 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-09T17:19:29.200 INFO:teuthology.task.internal:Configuring sudo...
2026-03-09T17:19:29.200 DEBUG:teuthology.orchestra.run.vm06:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-09T17:19:29.227 DEBUG:teuthology.orchestra.run.vm09:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-09T17:19:29.266 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-09T17:19:29.268 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-09T17:19:29.268 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-09T17:19:29.291 DEBUG:teuthology.orchestra.run.vm09:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-09T17:19:29.321 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T17:19:29.365 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T17:19:29.421 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:19:29.421 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-09T17:19:29.479 DEBUG:teuthology.orchestra.run.vm09:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T17:19:29.500 DEBUG:teuthology.orchestra.run.vm09:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T17:19:29.555 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:19:29.555 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-09T17:19:29.615 DEBUG:teuthology.orchestra.run.vm06:> sudo service rsyslog restart
2026-03-09T17:19:29.617 DEBUG:teuthology.orchestra.run.vm09:> sudo service rsyslog restart
2026-03-09T17:19:29.643 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T17:19:29.683 INFO:teuthology.orchestra.run.vm09.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T17:19:29.923 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-09T17:19:29.924 INFO:teuthology.task.internal:Starting timer...
2026-03-09T17:19:29.924 INFO:teuthology.run_tasks:Running task pcp...
2026-03-09T17:19:29.927 INFO:teuthology.run_tasks:Running task selinux...
2026-03-09T17:19:29.929 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-09T17:19:29.929 INFO:teuthology.task.selinux:Excluding vm06: VMs are not yet supported
2026-03-09T17:19:29.929 INFO:teuthology.task.selinux:Excluding vm09: VMs are not yet supported
2026-03-09T17:19:29.929 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-09T17:19:29.929 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-09T17:19:29.929 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-09T17:19:29.929 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-09T17:19:29.931 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-09T17:19:29.931 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-09T17:19:29.933 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-09T17:19:35.719 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-09T17:19:35.724 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-09T17:19:35.725 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryil2rqhig --limit vm06.local,vm09.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-09T17:22:16.208 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm06.local'), Remote(name='ubuntu@vm09.local')]
2026-03-09T17:22:16.208 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm06.local'
2026-03-09T17:22:16.209 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T17:22:16.274 DEBUG:teuthology.orchestra.run.vm06:> true
2026-03-09T17:22:16.345 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm06.local'
2026-03-09T17:22:16.345 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm09.local'
2026-03-09T17:22:16.345 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm09.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-09T17:22:16.409 DEBUG:teuthology.orchestra.run.vm09:> true
2026-03-09T17:22:16.487 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm09.local'
2026-03-09T17:22:16.487 INFO:teuthology.run_tasks:Running task clock...
2026-03-09T17:22:16.489 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-09T17:22:16.489 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-09T17:22:16.490 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T17:22:16.491 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-09T17:22:16.491 DEBUG:teuthology.orchestra.run.vm09:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T17:22:16.529 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-09T17:22:16.550 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-09T17:22:16.564 INFO:teuthology.orchestra.run.vm09.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-09T17:22:16.576 INFO:teuthology.orchestra.run.vm06.stderr:sudo: ntpd: command not found
2026-03-09T17:22:16.579 INFO:teuthology.orchestra.run.vm09.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-09T17:22:16.593 INFO:teuthology.orchestra.run.vm06.stdout:506 Cannot talk to daemon
2026-03-09T17:22:16.606 INFO:teuthology.orchestra.run.vm09.stderr:sudo: ntpd: command not found
2026-03-09T17:22:16.612 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-09T17:22:16.616 INFO:teuthology.orchestra.run.vm09.stdout:506 Cannot talk to daemon
2026-03-09T17:22:16.631 INFO:teuthology.orchestra.run.vm09.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-09T17:22:16.632 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-09T17:22:16.646 INFO:teuthology.orchestra.run.vm09.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-09T17:22:16.684 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found
2026-03-09T17:22:16.690 INFO:teuthology.orchestra.run.vm09.stderr:bash: line 1: ntpq: command not found
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:===============================================================================
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:^? stratum2-4.NTP.TechFak.U> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:^? t2.ipfu.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:^? time.ndless.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.742 INFO:teuthology.orchestra.run.vm09.stdout:^? 172-104-154-182.ip.linod> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:===============================================================================
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:^? 172-104-154-182.ip.linod> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:^? stratum2-4.NTP.TechFak.U> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:^? t2.ipfu.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.743 INFO:teuthology.orchestra.run.vm06.stdout:^? time.ndless.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-09T17:22:16.743 INFO:teuthology.run_tasks:Running task install...
2026-03-09T17:22:16.745 DEBUG:teuthology.task.install:project ceph
2026-03-09T17:22:16.745 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T17:22:16.745 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.0', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-09T17:22:16.745 INFO:teuthology.task.install:Using flavor: default
2026-03-09T17:22:16.747 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-09T17:22:16.747 INFO:teuthology.task.install:extra packages: []
2026-03-09T17:22:16.748 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-09T17:22:16.748 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-09T17:22:16.748 INFO:teuthology.packaging:ref: None
2026-03-09T17:22:16.748 INFO:teuthology.packaging:tag: v18.2.0
2026-03-09T17:22:16.748 INFO:teuthology.packaging:branch: None
2026-03-09T17:22:16.748 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:22:18.013 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.0^{} -> 5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-09T17:22:18.014 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-09T17:22:18.015 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-09T17:22:18.015 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-09T17:22:18.015 INFO:teuthology.packaging:ref: None
2026-03-09T17:22:18.015 INFO:teuthology.packaging:tag: v18.2.0
2026-03-09T17:22:18.015 INFO:teuthology.packaging:branch: None
2026-03-09T17:22:18.015 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:22:18.015 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-09T17:22:18.573 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-09T17:22:18.573 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-09T17:22:18.672 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-09T17:22:18.672 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-09T17:22:18.986 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-09T17:22:18.986 DEBUG:teuthology.orchestra.run.vm09:> set -ex 2026-03-09T17:22:18.986 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-09T17:22:19.019 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-09T17:22:19.019 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-09T17:22:19.019 INFO:teuthology.packaging:ref: None 2026-03-09T17:22:19.019 INFO:teuthology.packaging:tag: v18.2.0 2026-03-09T17:22:19.019 INFO:teuthology.packaging:branch: None 2026-03-09T17:22:19.019 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:22:19.019 DEBUG:teuthology.orchestra.run.vm09:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-09T17:22:19.046 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch 
packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-09T17:22:19.046 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:22:19.046 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-09T17:22:19.091 DEBUG:teuthology.orchestra.run.vm09:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-09T17:22:19.092 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-09T17:22:19.092 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-09T17:22:19.092 INFO:teuthology.packaging:ref: None 2026-03-09T17:22:19.092 INFO:teuthology.packaging:tag: v18.2.0 2026-03-09T17:22:19.092 INFO:teuthology.packaging:branch: None 2026-03-09T17:22:19.092 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:22:19.092 DEBUG:teuthology.orchestra.run.vm06:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-09T17:22:19.164 DEBUG:teuthology.orchestra.run.vm06:> sudo 
touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-09T17:22:19.172 DEBUG:teuthology.orchestra.run.vm09:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-09T17:22:19.245 INFO:teuthology.orchestra.run.vm09.stdout:check_obsoletes = 1 2026-03-09T17:22:19.247 DEBUG:teuthology.orchestra.run.vm09:> sudo yum clean all 2026-03-09T17:22:19.253 DEBUG:teuthology.orchestra.run.vm06:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-09T17:22:19.289 INFO:teuthology.orchestra.run.vm06.stdout:check_obsoletes = 1 2026-03-09T17:22:19.290 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all 2026-03-09T17:22:19.436 INFO:teuthology.orchestra.run.vm09.stdout:41 files removed 2026-03-09T17:22:19.462 DEBUG:teuthology.orchestra.run.vm09:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-09T17:22:19.509 INFO:teuthology.orchestra.run.vm06.stdout:41 files removed 2026-03-09T17:22:19.547 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm 
ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-09T17:22:20.465 INFO:teuthology.orchestra.run.vm09.stdout:ceph packages for x86_64 91 kB/s | 76 kB 00:00 2026-03-09T17:22:20.587 INFO:teuthology.orchestra.run.vm06.stdout:ceph packages for x86_64 94 kB/s | 76 kB 00:00 2026-03-09T17:22:21.118 INFO:teuthology.orchestra.run.vm09.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-09T17:22:21.237 INFO:teuthology.orchestra.run.vm06.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-09T17:22:21.743 INFO:teuthology.orchestra.run.vm09.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00 2026-03-09T17:22:21.858 INFO:teuthology.orchestra.run.vm06.stdout:ceph source packages 3.6 kB/s | 2.2 kB 00:00 2026-03-09T17:22:30.221 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - BaseOS 1.1 MB/s | 8.9 MB 00:08 2026-03-09T17:22:34.554 INFO:teuthology.orchestra.run.vm09.stdout:CentOS Stream 9 - BaseOS 713 kB/s | 8.9 MB 00:12 2026-03-09T17:22:40.338 INFO:teuthology.orchestra.run.vm09.stdout:CentOS Stream 9 - AppStream 5.3 MB/s | 27 MB 00:05 2026-03-09T17:22:48.014 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - AppStream 1.6 MB/s | 27 MB 00:16 2026-03-09T17:22:58.778 INFO:teuthology.orchestra.run.vm09.stdout:CentOS Stream 9 - CRB 516 kB/s | 8.0 MB 00:15 2026-03-09T17:22:59.830 INFO:teuthology.orchestra.run.vm09.stdout:CentOS Stream 9 - Extras packages 97 kB/s | 20 kB 00:00 2026-03-09T17:23:00.337 INFO:teuthology.orchestra.run.vm09.stdout:Extra Packages for Enterprise Linux 48 MB/s | 20 MB 00:00 2026-03-09T17:23:04.227 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - CRB 611 kB/s | 8.0 MB 00:13 2026-03-09T17:23:05.162 INFO:teuthology.orchestra.run.vm09.stdout:lab-extras 55 kB/s | 50 kB 00:00 2026-03-09T17:23:05.718 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - 
Extras packages 33 kB/s | 20 kB 00:00 2026-03-09T17:23:06.563 INFO:teuthology.orchestra.run.vm06.stdout:Extra Packages for Enterprise Linux 27 MB/s | 20 MB 00:00 2026-03-09T17:23:06.614 INFO:teuthology.orchestra.run.vm09.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T17:23:06.614 INFO:teuthology.orchestra.run.vm09.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T17:23:06.618 INFO:teuthology.orchestra.run.vm09.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-09T17:23:06.619 INFO:teuthology.orchestra.run.vm09.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-09T17:23:06.646 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved. 2026-03-09T17:23:06.650 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:23:06.650 INFO:teuthology.orchestra.run.vm09.stdout: Package Arch Version Repository Size 2026-03-09T17:23:06.650 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout:Installing: 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: 
ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 
ceph 169 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout:Upgrading: 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout:Installing dependencies: 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: 
gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-09T17:23:06.651 INFO:teuthology.orchestra.run.vm09.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-09T17:23:06.652 
INFO:teuthology.orchestra.run.vm09.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-09T17:23:06.652 
INFO:teuthology.orchestra.run.vm09.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-09T17:23:06.652 INFO:teuthology.orchestra.run.vm09.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-09T17:23:06.653 
INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-09T17:23:06.653 
INFO:teuthology.orchestra.run.vm09.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout:Installing weak dependencies: 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout:Install 117 Packages 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout:Upgrade 2 Packages 2026-03-09T17:23:06.653 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:06.654 INFO:teuthology.orchestra.run.vm09.stdout:Total download size: 182 M 2026-03-09T17:23:06.654 
INFO:teuthology.orchestra.run.vm09.stdout:Downloading Packages: 2026-03-09T17:23:07.713 INFO:teuthology.orchestra.run.vm09.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00 2026-03-09T17:23:08.310 INFO:teuthology.orchestra.run.vm09.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 1.4 MB/s | 835 kB 00:00 2026-03-09T17:23:08.412 INFO:teuthology.orchestra.run.vm09.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.4 MB/s | 142 kB 00:00 2026-03-09T17:23:08.718 INFO:teuthology.orchestra.run.vm09.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 6.9 MB/s | 2.1 MB 00:00 2026-03-09T17:23:08.928 INFO:teuthology.orchestra.run.vm09.stdout:(5/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 6.9 MB/s | 1.4 MB 00:00 2026-03-09T17:23:09.369 INFO:teuthology.orchestra.run.vm09.stdout:(6/119): ceph-base-18.2.0-0.el9.x86_64.rpm 2.6 MB/s | 5.2 MB 00:01 2026-03-09T17:23:09.730 INFO:teuthology.orchestra.run.vm09.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 5.5 MB/s | 4.4 MB 00:00 2026-03-09T17:23:11.040 INFO:teuthology.orchestra.run.vm09.stdout:(8/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 5.8 MB/s | 7.6 MB 00:01 2026-03-09T17:23:11.140 INFO:teuthology.orchestra.run.vm09.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 242 kB/s | 24 kB 00:00 2026-03-09T17:23:11.252 INFO:teuthology.orchestra.run.vm06.stdout:lab-extras 65 kB/s | 50 kB 00:00 2026-03-09T17:23:12.615 INFO:teuthology.orchestra.run.vm06.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T17:23:12.615 INFO:teuthology.orchestra.run.vm06.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-09T17:23:12.619 INFO:teuthology.orchestra.run.vm06.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-09T17:23:12.620 INFO:teuthology.orchestra.run.vm06.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-09T17:23:12.645 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout:Installing: 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 
2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout:Upgrading: 2026-03-09T17:23:12.649 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout:Installing dependencies: 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 
2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: 
liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-09T17:23:12.650 
INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-09T17:23:12.650 
INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-09T17:23:12.650 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-09T17:23:12.651 
INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-09T17:23:12.651 
INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Installing weak dependencies: 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Install 117 Packages 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Upgrade 2 Packages 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Total download size: 182 M 2026-03-09T17:23:12.651 INFO:teuthology.orchestra.run.vm06.stdout:Downloading Packages: 2026-03-09T17:23:12.685 INFO:teuthology.orchestra.run.vm09.stdout:(10/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 5.3 MB/s | 18 MB 00:03 2026-03-09T17:23:12.785 INFO:teuthology.orchestra.run.vm09.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 304 kB/s | 30 kB 00:00 2026-03-09T17:23:12.987 INFO:teuthology.orchestra.run.vm09.stdout:(12/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 653 kB 00:00 2026-03-09T17:23:13.132 INFO:teuthology.orchestra.run.vm09.stdout:(13/119): ceph-common-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 18 MB 00:05 2026-03-09T17:23:13.134 INFO:teuthology.orchestra.run.vm09.stdout:(14/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.1 MB/s | 161 kB 00:00 2026-03-09T17:23:13.238 INFO:teuthology.orchestra.run.vm09.stdout:(15/119): 
librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00 2026-03-09T17:23:13.784 INFO:teuthology.orchestra.run.vm06.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00 2026-03-09T17:23:14.268 INFO:teuthology.orchestra.run.vm09.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 418 kB/s | 474 kB 00:01 2026-03-09T17:23:14.369 INFO:teuthology.orchestra.run.vm09.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 449 kB/s | 45 kB 00:00 2026-03-09T17:23:14.381 INFO:teuthology.orchestra.run.vm06.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 1.4 MB/s | 835 kB 00:00 2026-03-09T17:23:14.470 INFO:teuthology.orchestra.run.vm09.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00 2026-03-09T17:23:14.481 INFO:teuthology.orchestra.run.vm06.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.4 MB/s | 142 kB 00:00 2026-03-09T17:23:14.531 INFO:teuthology.orchestra.run.vm09.stdout:(19/119): librgw2-18.2.0-0.el9.x86_64.rpm 3.4 MB/s | 4.4 MB 00:01 2026-03-09T17:23:14.571 INFO:teuthology.orchestra.run.vm09.stdout:(20/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00 2026-03-09T17:23:14.635 INFO:teuthology.orchestra.run.vm09.stdout:(21/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 321 kB 00:00 2026-03-09T17:23:14.675 INFO:teuthology.orchestra.run.vm09.stdout:(22/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00 2026-03-09T17:23:14.735 INFO:teuthology.orchestra.run.vm09.stdout:(23/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 992 kB/s | 99 kB 00:00 2026-03-09T17:23:14.776 INFO:teuthology.orchestra.run.vm09.stdout:(24/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 855 kB/s | 86 kB 00:00 2026-03-09T17:23:14.878 INFO:teuthology.orchestra.run.vm09.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00 2026-03-09T17:23:14.885 INFO:teuthology.orchestra.run.vm06.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 5.2 MB/s | 2.1 MB 00:00 2026-03-09T17:23:14.978 
INFO:teuthology.orchestra.run.vm09.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 232 kB/s | 23 kB 00:00 2026-03-09T17:23:15.080 INFO:teuthology.orchestra.run.vm09.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 1.2 MB/s | 127 kB 00:00 2026-03-09T17:23:15.188 INFO:teuthology.orchestra.run.vm06.stdout:(5/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 4.8 MB/s | 1.4 MB 00:00 2026-03-09T17:23:15.353 INFO:teuthology.orchestra.run.vm06.stdout:(6/119): ceph-base-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 5.2 MB 00:01 2026-03-09T17:23:15.619 INFO:teuthology.orchestra.run.vm09.stdout:(28/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 3.1 MB/s | 1.7 MB 00:00 2026-03-09T17:23:15.632 INFO:teuthology.orchestra.run.vm09.stdout:(29/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 3.3 MB/s | 3.0 MB 00:00 2026-03-09T17:23:15.735 INFO:teuthology.orchestra.run.vm09.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00 2026-03-09T17:23:15.836 INFO:teuthology.orchestra.run.vm09.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 475 kB/s | 47 kB 00:00 2026-03-09T17:23:15.935 INFO:teuthology.orchestra.run.vm09.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 147 kB/s | 15 kB 00:00 2026-03-09T17:23:16.037 INFO:teuthology.orchestra.run.vm09.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00 2026-03-09T17:23:16.089 INFO:teuthology.orchestra.run.vm06.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 4.9 MB/s | 4.4 MB 00:00 2026-03-09T17:23:16.236 INFO:teuthology.orchestra.run.vm09.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 204 kB/s | 40 kB 00:00 2026-03-09T17:23:16.412 INFO:teuthology.orchestra.run.vm09.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 412 kB/s | 72 kB 00:00 2026-03-09T17:23:16.588 INFO:teuthology.orchestra.run.vm09.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 4.4 MB/s | 794 kB 00:00 2026-03-09T17:23:16.589 INFO:teuthology.orchestra.run.vm06.stdout:(8/119): ceph-common-18.2.0-0.el9.x86_64.rpm 5.9 MB/s | 18 MB 
00:03 2026-03-09T17:23:16.655 INFO:teuthology.orchestra.run.vm09.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 2.7 MB/s | 184 kB 00:00 2026-03-09T17:23:16.693 INFO:teuthology.orchestra.run.vm06.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 232 kB/s | 24 kB 00:00 2026-03-09T17:23:16.694 INFO:teuthology.orchestra.run.vm09.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 862 kB/s | 33 kB 00:00 2026-03-09T17:23:16.733 INFO:teuthology.orchestra.run.vm09.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 6.5 MB/s | 253 kB 00:00 2026-03-09T17:23:16.840 INFO:teuthology.orchestra.run.vm09.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 12 MB/s | 1.2 MB 00:00 2026-03-09T17:23:16.876 INFO:teuthology.orchestra.run.vm09.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 2.9 MB/s | 106 kB 00:00 2026-03-09T17:23:16.912 INFO:teuthology.orchestra.run.vm09.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 3.6 MB/s | 135 kB 00:00 2026-03-09T17:23:16.969 INFO:teuthology.orchestra.run.vm09.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 2.2 MB/s | 126 kB 00:00 2026-03-09T17:23:17.103 INFO:teuthology.orchestra.run.vm09.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 1.6 MB/s | 218 kB 00:00 2026-03-09T17:23:17.313 INFO:teuthology.orchestra.run.vm09.stdout:(45/119): ceph-test-18.2.0-0.el9.x86_64.rpm 6.4 MB/s | 40 MB 00:06 2026-03-09T17:23:17.314 INFO:teuthology.orchestra.run.vm09.stdout:(46/119): boost-program-options-1.75.0-13.el9.x 493 kB/s | 104 kB 00:00 2026-03-09T17:23:17.364 INFO:teuthology.orchestra.run.vm06.stdout:(10/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 8.7 MB/s | 18 MB 00:02 2026-03-09T17:23:17.463 INFO:teuthology.orchestra.run.vm06.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 308 kB/s | 30 kB 00:00 2026-03-09T17:23:17.578 INFO:teuthology.orchestra.run.vm06.stdout:(12/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 5.5 MB/s | 653 kB 00:00 2026-03-09T17:23:17.608 INFO:teuthology.orchestra.run.vm06.stdout:(13/119): 
ceph-radosgw-18.2.0-0.el9.x86_64.rpm 5.0 MB/s | 7.6 MB 00:01 2026-03-09T17:23:17.614 INFO:teuthology.orchestra.run.vm09.stdout:(47/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 10 MB/s | 3.0 MB 00:00 2026-03-09T17:23:17.681 INFO:teuthology.orchestra.run.vm06.stdout:(14/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.5 MB/s | 161 kB 00:00 2026-03-09T17:23:17.708 INFO:teuthology.orchestra.run.vm06.stdout:(15/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00 2026-03-09T17:23:17.772 INFO:teuthology.orchestra.run.vm09.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 94 kB/s | 15 kB 00:00 2026-03-09T17:23:17.789 INFO:teuthology.orchestra.run.vm09.stdout:(49/119): flexiblas-3.0.4-9.el9.x86_64.rpm 62 kB/s | 30 kB 00:00 2026-03-09T17:23:17.789 INFO:teuthology.orchestra.run.vm06.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 4.3 MB/s | 474 kB 00:00 2026-03-09T17:23:17.844 INFO:teuthology.orchestra.run.vm09.stdout:(50/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 2.2 MB/s | 160 kB 00:00 2026-03-09T17:23:17.888 INFO:teuthology.orchestra.run.vm06.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 455 kB/s | 45 kB 00:00 2026-03-09T17:23:17.921 INFO:teuthology.orchestra.run.vm09.stdout:(51/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 344 kB/s | 45 kB 00:00 2026-03-09T17:23:17.960 INFO:teuthology.orchestra.run.vm09.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 5.6 MB/s | 662 kB 00:00 2026-03-09T17:23:17.989 INFO:teuthology.orchestra.run.vm06.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00 2026-03-09T17:23:17.997 INFO:teuthology.orchestra.run.vm09.stdout:(53/119): libxslt-1.1.34-12.el9.x86_64.rpm 6.1 MB/s | 233 kB 00:00 2026-03-09T17:23:18.060 INFO:teuthology.orchestra.run.vm09.stdout:(54/119): ceph-mgr-diskprediction-local-18.2.0- 3.0 MB/s | 7.4 MB 00:02 2026-03-09T17:23:18.061 INFO:teuthology.orchestra.run.vm09.stdout:(55/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 1.7 MB/s | 246 kB 00:00 2026-03-09T17:23:18.089 
INFO:teuthology.orchestra.run.vm06.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00 2026-03-09T17:23:18.113 INFO:teuthology.orchestra.run.vm09.stdout:(56/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.5 MB/s | 292 kB 00:00 2026-03-09T17:23:18.191 INFO:teuthology.orchestra.run.vm06.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00 2026-03-09T17:23:18.376 INFO:teuthology.orchestra.run.vm09.stdout:(57/119): openblas-0.3.29-1.el9.x86_64.rpm 133 kB/s | 42 kB 00:00 2026-03-09T17:23:18.542 INFO:teuthology.orchestra.run.vm09.stdout:(58/119): openblas-openmp-0.3.29-1.el9.x86_64.r 11 MB/s | 5.3 MB 00:00 2026-03-09T17:23:18.568 INFO:teuthology.orchestra.run.vm09.stdout:(59/119): python3-babel-2.9.1-2.el9.noarch.rpm 13 MB/s | 6.0 MB 00:00 2026-03-09T17:23:18.570 INFO:teuthology.orchestra.run.vm09.stdout:(60/119): python3-devel-3.9.25-3.el9.x86_64.rpm 1.2 MB/s | 244 kB 00:00 2026-03-09T17:23:18.711 INFO:teuthology.orchestra.run.vm06.stdout:(21/119): librgw2-18.2.0-0.el9.x86_64.rpm 4.4 MB/s | 4.4 MB 00:01 2026-03-09T17:23:18.812 INFO:teuthology.orchestra.run.vm06.stdout:(22/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 992 kB/s | 99 kB 00:00 2026-03-09T17:23:18.912 INFO:teuthology.orchestra.run.vm06.stdout:(23/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 858 kB/s | 86 kB 00:00 2026-03-09T17:23:18.962 INFO:teuthology.orchestra.run.vm09.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 121 kB/s | 48 kB 00:00 2026-03-09T17:23:18.966 INFO:teuthology.orchestra.run.vm09.stdout:(62/119): python3-jinja2-2.11.3-8.el9.noarch.rp 587 kB/s | 249 kB 00:00 2026-03-09T17:23:19.134 INFO:teuthology.orchestra.run.vm09.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 1.0 MB/s | 172 kB 00:00 2026-03-09T17:23:19.146 INFO:teuthology.orchestra.run.vm09.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 193 kB/s | 35 kB 00:00 2026-03-09T17:23:19.195 INFO:teuthology.orchestra.run.vm09.stdout:(65/119): python3-libstoragemgmt-1.10.1-1.el9.x 283 kB/s 
| 177 kB 00:00 2026-03-09T17:23:19.215 INFO:teuthology.orchestra.run.vm06.stdout:(24/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 290 kB/s | 297 kB 00:01 2026-03-09T17:23:19.270 INFO:teuthology.orchestra.run.vm09.stdout:(66/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 3.5 MB/s | 442 kB 00:00 2026-03-09T17:23:19.280 INFO:teuthology.orchestra.run.vm09.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.8 MB/s | 157 kB 00:00 2026-03-09T17:23:19.316 INFO:teuthology.orchestra.run.vm06.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00 2026-03-09T17:23:19.416 INFO:teuthology.orchestra.run.vm06.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 230 kB/s | 23 kB 00:00 2026-03-09T17:23:19.484 INFO:teuthology.orchestra.run.vm09.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 1.3 MB/s | 277 kB 00:00 2026-03-09T17:23:19.514 INFO:teuthology.orchestra.run.vm09.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 230 kB/s | 54 kB 00:00 2026-03-09T17:23:19.517 INFO:teuthology.orchestra.run.vm06.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 
1.2 MB/s | 127 kB 00:00 2026-03-09T17:23:19.606 INFO:teuthology.orchestra.run.vm09.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 455 kB/s | 42 kB 00:00 2026-03-09T17:23:19.613 INFO:teuthology.orchestra.run.vm06.stdout:(28/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 4.3 MB/s | 3.0 MB 00:00 2026-03-09T17:23:19.676 INFO:teuthology.orchestra.run.vm09.stdout:(71/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 11 MB/s | 6.1 MB 00:00 2026-03-09T17:23:19.695 INFO:teuthology.orchestra.run.vm09.stdout:(72/119): socat-1.7.4.1-8.el9.x86_64.rpm 3.3 MB/s | 303 kB 00:00 2026-03-09T17:23:19.702 INFO:teuthology.orchestra.run.vm09.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 16 MB/s | 111 kB 00:00 2026-03-09T17:23:19.710 INFO:teuthology.orchestra.run.vm09.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 43 MB/s | 308 kB 00:00 2026-03-09T17:23:19.740 INFO:teuthology.orchestra.run.vm09.stdout:(75/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 999 kB/s | 64 kB 00:00 2026-03-09T17:23:19.749 INFO:teuthology.orchestra.run.vm09.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 2.6 MB/s | 25 kB 00:00 2026-03-09T17:23:19.758 INFO:teuthology.orchestra.run.vm09.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 6.0 MB/s | 49 kB 00:00 2026-03-09T17:23:19.762 INFO:teuthology.orchestra.run.vm09.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 18 MB/s | 67 kB 00:00 2026-03-09T17:23:19.775 INFO:teuthology.orchestra.run.vm09.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 61 MB/s | 838 kB 00:00 2026-03-09T17:23:19.785 INFO:teuthology.orchestra.run.vm09.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 
56 MB/s | 548 kB 00:00 2026-03-09T17:23:19.789 INFO:teuthology.orchestra.run.vm09.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 7.4 MB/s | 29 kB 00:00 2026-03-09T17:23:19.792 INFO:teuthology.orchestra.run.vm09.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 22 MB/s | 60 kB 00:00 2026-03-09T17:23:19.795 INFO:teuthology.orchestra.run.vm09.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 17 MB/s | 43 kB 00:00 2026-03-09T17:23:19.798 INFO:teuthology.orchestra.run.vm09.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 12 MB/s | 32 kB 00:00 2026-03-09T17:23:19.800 INFO:teuthology.orchestra.run.vm09.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 6.2 MB/s | 14 kB 00:00 2026-03-09T17:23:19.804 INFO:teuthology.orchestra.run.vm09.stdout:(86/119): python3-cheroot-10.0.1-4.el9.noarch.r 43 MB/s | 173 kB 00:00 2026-03-09T17:23:19.811 INFO:teuthology.orchestra.run.vm09.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 57 MB/s | 358 kB 00:00 2026-03-09T17:23:19.817 INFO:teuthology.orchestra.run.vm09.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 44 MB/s | 254 kB 00:00 2026-03-09T17:23:19.822 INFO:teuthology.orchestra.run.vm09.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 2.4 MB/s | 11 kB 00:00 2026-03-09T17:23:19.826 INFO:teuthology.orchestra.run.vm09.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 3.9 MB/s | 18 kB 00:00 2026-03-09T17:23:19.829 INFO:teuthology.orchestra.run.vm09.stdout:(91/119): python3-jaraco-collections-3.0.0-8.el 9.5 MB/s | 23 kB 00:00 2026-03-09T17:23:19.832 INFO:teuthology.orchestra.run.vm09.stdout:(92/119): python3-jaraco-context-6.0.1-3.el9.no 8.4 MB/s | 20 kB 00:00 2026-03-09T17:23:19.834 INFO:teuthology.orchestra.run.vm09.stdout:(93/119): python3-jaraco-functools-3.5.0-2.el9. 
8.9 MB/s | 19 kB 00:00 2026-03-09T17:23:19.837 INFO:teuthology.orchestra.run.vm09.stdout:(94/119): python3-jaraco-text-4.0.0-2.el9.noarc 9.7 MB/s | 26 kB 00:00 2026-03-09T17:23:19.849 INFO:teuthology.orchestra.run.vm09.stdout:(95/119): libarrow-9.0.0-15.el9.x86_64.rpm 32 MB/s | 4.4 MB 00:00 2026-03-09T17:23:19.849 INFO:teuthology.orchestra.run.vm09.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 731 kB/s | 9.0 kB 00:00 2026-03-09T17:23:19.852 INFO:teuthology.orchestra.run.vm09.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 15 MB/s | 41 kB 00:00 2026-03-09T17:23:19.859 INFO:teuthology.orchestra.run.vm09.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 6.6 MB/s | 46 kB 00:00 2026-03-09T17:23:19.868 INFO:teuthology.orchestra.run.vm09.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 9.2 MB/s | 79 kB 00:00 2026-03-09T17:23:19.873 INFO:teuthology.orchestra.run.vm09.stdout:(100/119): python3-kubernetes-26.1.0-3.el9.noar 45 MB/s | 1.0 MB 00:00 2026-03-09T17:23:19.873 INFO:teuthology.orchestra.run.vm06.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 4.8 MB/s | 1.7 MB 00:00 2026-03-09T17:23:19.873 INFO:teuthology.orchestra.run.vm09.stdout:(101/119): python3-natsort-7.1.1-5.el9.noarch.r 10 MB/s | 58 kB 00:00 2026-03-09T17:23:19.878 INFO:teuthology.orchestra.run.vm09.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 52 MB/s | 272 kB 00:00 2026-03-09T17:23:19.879 INFO:teuthology.orchestra.run.vm09.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 3.3 MB/s | 16 kB 00:00 2026-03-09T17:23:19.881 INFO:teuthology.orchestra.run.vm09.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 28 MB/s | 90 kB 00:00 2026-03-09T17:23:19.882 INFO:teuthology.orchestra.run.vm09.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 9.8 MB/s | 31 kB 00:00 2026-03-09T17:23:19.885 INFO:teuthology.orchestra.run.vm09.stdout:(106/119): python3-rsa-4.9-2.el9.noarch.rpm 23 MB/s | 59 kB 00:00 2026-03-09T17:23:19.887 
INFO:teuthology.orchestra.run.vm09.stdout:(107/119): python3-routes-2.5.1-5.el9.noarch.rp 36 MB/s | 188 kB 00:00 2026-03-09T17:23:19.888 INFO:teuthology.orchestra.run.vm09.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 12 MB/s | 36 kB 00:00 2026-03-09T17:23:19.891 INFO:teuthology.orchestra.run.vm09.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 24 MB/s | 86 kB 00:00 2026-03-09T17:23:19.893 INFO:teuthology.orchestra.run.vm09.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 46 MB/s | 230 kB 00:00 2026-03-09T17:23:19.894 INFO:teuthology.orchestra.run.vm09.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 29 MB/s | 90 kB 00:00 2026-03-09T17:23:19.899 INFO:teuthology.orchestra.run.vm09.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 4.9 MB/s | 22 kB 00:00 2026-03-09T17:23:19.900 INFO:teuthology.orchestra.run.vm09.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 55 MB/s | 427 kB 00:00 2026-03-09T17:23:19.901 INFO:teuthology.orchestra.run.vm09.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 8.6 MB/s | 20 kB 00:00 2026-03-09T17:23:19.905 INFO:teuthology.orchestra.run.vm09.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 41 MB/s | 191 kB 00:00 2026-03-09T17:23:19.932 INFO:teuthology.orchestra.run.vm09.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 52 MB/s | 1.6 MB 00:00 2026-03-09T17:23:19.974 INFO:teuthology.orchestra.run.vm06.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00 2026-03-09T17:23:20.074 INFO:teuthology.orchestra.run.vm06.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 478 kB/s | 47 kB 00:00 2026-03-09T17:23:20.173 INFO:teuthology.orchestra.run.vm06.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 147 kB/s | 15 kB 00:00 2026-03-09T17:23:20.275 INFO:teuthology.orchestra.run.vm06.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00 2026-03-09T17:23:20.633 INFO:teuthology.orchestra.run.vm06.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 113 kB/s | 40 
kB 00:00 2026-03-09T17:23:20.904 INFO:teuthology.orchestra.run.vm09.stdout:(117/119): librados2-18.2.0-0.el9.x86_64.rpm 3.3 MB/s | 3.3 MB 00:00 2026-03-09T17:23:21.091 INFO:teuthology.orchestra.run.vm09.stdout:(118/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 12 MB/s | 19 MB 00:01 2026-03-09T17:23:21.225 INFO:teuthology.orchestra.run.vm09.stdout:(119/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.3 MB/s | 3.0 MB 00:01 2026-03-09T17:23:21.231 INFO:teuthology.orchestra.run.vm09.stdout:-------------------------------------------------------------------------------- 2026-03-09T17:23:21.231 INFO:teuthology.orchestra.run.vm09.stdout:Total 12 MB/s | 182 MB 00:14 2026-03-09T17:23:21.284 INFO:teuthology.orchestra.run.vm06.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 111 kB/s | 72 kB 00:00 2026-03-09T17:23:21.688 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check 2026-03-09T17:23:21.711 INFO:teuthology.orchestra.run.vm06.stdout:(36/119): ceph-mgr-diskprediction-local-18.2.0- 3.5 MB/s | 7.4 MB 00:02 2026-03-09T17:23:21.732 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded. 2026-03-09T17:23:21.733 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test 2026-03-09T17:23:22.246 INFO:teuthology.orchestra.run.vm06.stdout:(37/119): libgfortran-11.5.0-14.el9.x86_64.rpm 827 kB/s | 794 kB 00:00 2026-03-09T17:23:22.369 INFO:teuthology.orchestra.run.vm06.stdout:(38/119): libquadmath-11.5.0-14.el9.x86_64.rpm 281 kB/s | 184 kB 00:00 2026-03-09T17:23:22.376 INFO:teuthology.orchestra.run.vm06.stdout:(39/119): mailcap-2.1.49-5.el9.noarch.rpm 255 kB/s | 33 kB 00:00 2026-03-09T17:23:22.474 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded. 
2026-03-09T17:23:22.475 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:23:22.513 INFO:teuthology.orchestra.run.vm06.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 1.7 MB/s | 253 kB 00:00
2026-03-09T17:23:22.611 INFO:teuthology.orchestra.run.vm06.stdout:(41/119): python3-cryptography-36.0.1-5.el9.x86 5.3 MB/s | 1.2 MB 00:00
2026-03-09T17:23:22.645 INFO:teuthology.orchestra.run.vm06.stdout:(42/119): python3-ply-3.11-14.el9.noarch.rpm 808 kB/s | 106 kB 00:00
2026-03-09T17:23:22.740 INFO:teuthology.orchestra.run.vm06.stdout:(43/119): python3-pycparser-2.20-6.el9.noarch.r 1.0 MB/s | 135 kB 00:00
2026-03-09T17:23:22.774 INFO:teuthology.orchestra.run.vm06.stdout:(44/119): python3-requests-2.25.1-10.el9.noarch 993 kB/s | 126 kB 00:00
2026-03-09T17:23:22.868 INFO:teuthology.orchestra.run.vm06.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 1.7 MB/s | 218 kB 00:00
2026-03-09T17:23:23.008 INFO:teuthology.orchestra.run.vm06.stdout:(46/119): boost-program-options-1.75.0-13.el9.x 443 kB/s | 104 kB 00:00
2026-03-09T17:23:23.010 INFO:teuthology.orchestra.run.vm06.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 209 kB/s | 30 kB 00:00
2026-03-09T17:23:23.065 INFO:teuthology.orchestra.run.vm06.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 274 kB/s | 15 kB 00:00
2026-03-09T17:23:23.210 INFO:teuthology.orchestra.run.vm06.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.1 MB/s | 160 kB 00:00
2026-03-09T17:23:23.261 INFO:teuthology.orchestra.run.vm06.stdout:(50/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 12 MB/s | 3.0 MB 00:00
2026-03-09T17:23:23.262 INFO:teuthology.orchestra.run.vm06.stdout:(51/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 882 kB/s | 45 kB 00:00
2026-03-09T17:23:23.308 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:23:23.319 INFO:teuthology.orchestra.run.vm06.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 11 MB/s | 662 kB 00:00
2026-03-09T17:23:23.324 INFO:teuthology.orchestra.run.vm09.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-09T17:23:23.339 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-09T17:23:23.359 INFO:teuthology.orchestra.run.vm06.stdout:(53/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 2.5 MB/s | 246 kB 00:00
2026-03-09T17:23:23.376 INFO:teuthology.orchestra.run.vm06.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 4.0 MB/s | 233 kB 00:00
2026-03-09T17:23:23.495 INFO:teuthology.orchestra.run.vm06.stdout:(55/119): ceph-test-18.2.0-0.el9.x86_64.rpm 5.8 MB/s | 40 MB 00:06
2026-03-09T17:23:23.500 INFO:teuthology.orchestra.run.vm06.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 340 kB/s | 42 kB 00:00
2026-03-09T17:23:23.503 INFO:teuthology.orchestra.run.vm06.stdout:(57/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.0 MB/s | 292 kB 00:00
2026-03-09T17:23:23.508 INFO:teuthology.orchestra.run.vm09.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-09T17:23:23.511 INFO:teuthology.orchestra.run.vm09.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-09T17:23:23.559 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-09T17:23:23.560 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-09T17:23:23.594 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-09T17:23:23.607 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121
2026-03-09T17:23:23.616 INFO:teuthology.orchestra.run.vm09.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-09T17:23:23.622 INFO:teuthology.orchestra.run.vm09.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-09T17:23:23.631 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-09T17:23:23.633 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-09T17:23:23.671 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-09T17:23:23.672 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-09T17:23:23.726 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-09T17:23:23.731 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-09T17:23:23.763 INFO:teuthology.orchestra.run.vm09.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-09T17:23:23.775 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-09T17:23:23.779 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-09T17:23:23.812 INFO:teuthology.orchestra.run.vm09.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-09T17:23:23.833 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-09T17:23:23.845 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-09T17:23:23.857 INFO:teuthology.orchestra.run.vm06.stdout:(58/119): python3-babel-2.9.1-2.el9.noarch.rpm 17 MB/s | 6.0 MB 00:00
2026-03-09T17:23:23.865 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-09T17:23:23.869 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-09T17:23:23.879 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-09T17:23:23.892 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121
2026-03-09T17:23:23.907 INFO:teuthology.orchestra.run.vm06.stdout:(59/119): python3-devel-3.9.25-3.el9.x86_64.rpm 606 kB/s | 244 kB 00:00
2026-03-09T17:23:23.911 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121
2026-03-09T17:23:23.915 INFO:teuthology.orchestra.run.vm06.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 4.2 MB/s | 249 kB 00:00
2026-03-09T17:23:23.945 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-09T17:23:23.957 INFO:teuthology.orchestra.run.vm06.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 950 kB/s | 48 kB 00:00
2026-03-09T17:23:23.973 INFO:teuthology.orchestra.run.vm06.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 3.0 MB/s | 177 kB 00:00
2026-03-09T17:23:24.009 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-09T17:23:24.014 INFO:teuthology.orchestra.run.vm06.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 3.0 MB/s | 172 kB 00:00
2026-03-09T17:23:24.023 INFO:teuthology.orchestra.run.vm06.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 694 kB/s | 35 kB 00:00
2026-03-09T17:23:24.028 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-09T17:23:24.036 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-09T17:23:24.046 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-09T17:23:24.051 INFO:teuthology.orchestra.run.vm09.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121
2026-03-09T17:23:24.075 INFO:teuthology.orchestra.run.vm06.stdout:(65/119): openblas-openmp-0.3.29-1.el9.x86_64.r 9.1 MB/s | 5.3 MB 00:00
2026-03-09T17:23:24.078 INFO:teuthology.orchestra.run.vm06.stdout:(66/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 7.9 MB/s | 442 kB 00:00
2026-03-09T17:23:24.089 INFO:teuthology.orchestra.run.vm09.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-09T17:23:24.095 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-09T17:23:24.114 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-09T17:23:24.128 INFO:teuthology.orchestra.run.vm06.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 3.0 MB/s | 157 kB 00:00
2026-03-09T17:23:24.130 INFO:teuthology.orchestra.run.vm06.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 5.3 MB/s | 277 kB 00:00
2026-03-09T17:23:24.142 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-09T17:23:24.149 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-09T17:23:24.156 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-09T17:23:24.171 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-09T17:23:24.178 INFO:teuthology.orchestra.run.vm06.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 1.1 MB/s | 54 kB 00:00
2026-03-09T17:23:24.184 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-09T17:23:24.195 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-09T17:23:24.228 INFO:teuthology.orchestra.run.vm06.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 829 kB/s | 42 kB 00:00
2026-03-09T17:23:24.261 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-09T17:23:24.269 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-09T17:23:24.280 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-09T17:23:24.282 INFO:teuthology.orchestra.run.vm06.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 5.5 MB/s | 303 kB 00:00
2026-03-09T17:23:24.328 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-09T17:23:24.334 INFO:teuthology.orchestra.run.vm06.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.2 MB/s | 64 kB 00:00
2026-03-09T17:23:24.343 INFO:teuthology.orchestra.run.vm06.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 12 MB/s | 111 kB 00:00
2026-03-09T17:23:24.354 INFO:teuthology.orchestra.run.vm06.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 28 MB/s | 308 kB 00:00
2026-03-09T17:23:24.437 INFO:teuthology.orchestra.run.vm06.stdout:(75/119): libarrow-9.0.0-15.el9.x86_64.rpm 53 MB/s | 4.4 MB 00:00
2026-03-09T17:23:24.440 INFO:teuthology.orchestra.run.vm06.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 9.7 MB/s | 25 kB 00:00
2026-03-09T17:23:24.443 INFO:teuthology.orchestra.run.vm06.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 18 MB/s | 49 kB 00:00
2026-03-09T17:23:24.446 INFO:teuthology.orchestra.run.vm06.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 26 MB/s | 67 kB 00:00
2026-03-09T17:23:24.460 INFO:teuthology.orchestra.run.vm06.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 58 MB/s | 838 kB 00:00
2026-03-09T17:23:24.468 INFO:teuthology.orchestra.run.vm06.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 66 MB/s | 548 kB 00:00
2026-03-09T17:23:24.471 INFO:teuthology.orchestra.run.vm06.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 11 MB/s | 29 kB 00:00
2026-03-09T17:23:24.477 INFO:teuthology.orchestra.run.vm06.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 12 MB/s | 60 kB 00:00
2026-03-09T17:23:24.479 INFO:teuthology.orchestra.run.vm06.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 17 MB/s | 43 kB 00:00
2026-03-09T17:23:24.482 INFO:teuthology.orchestra.run.vm06.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 12 MB/s | 32 kB 00:00
2026-03-09T17:23:24.489 INFO:teuthology.orchestra.run.vm06.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 2.2 MB/s | 14 kB 00:00
2026-03-09T17:23:24.493 INFO:teuthology.orchestra.run.vm06.stdout:(86/119): python3-cheroot-10.0.1-4.el9.noarch.r 43 MB/s | 173 kB 00:00
2026-03-09T17:23:24.499 INFO:teuthology.orchestra.run.vm06.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 57 MB/s | 358 kB 00:00
2026-03-09T17:23:24.504 INFO:teuthology.orchestra.run.vm06.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 49 MB/s | 254 kB 00:00
2026-03-09T17:23:24.506 INFO:teuthology.orchestra.run.vm06.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 5.1 MB/s | 11 kB 00:00
2026-03-09T17:23:24.510 INFO:teuthology.orchestra.run.vm06.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 5.7 MB/s | 18 kB 00:00
2026-03-09T17:23:24.512 INFO:teuthology.orchestra.run.vm06.stdout:(91/119): python3-jaraco-collections-3.0.0-8.el 11 MB/s | 23 kB 00:00
2026-03-09T17:23:24.514 INFO:teuthology.orchestra.run.vm06.stdout:(92/119): python3-jaraco-context-6.0.1-3.el9.no 9.5 MB/s | 20 kB 00:00
2026-03-09T17:23:24.516 INFO:teuthology.orchestra.run.vm06.stdout:(93/119): python3-jaraco-functools-3.5.0-2.el9. 10 MB/s | 19 kB 00:00
2026-03-09T17:23:24.519 INFO:teuthology.orchestra.run.vm06.stdout:(94/119): python3-jaraco-text-4.0.0-2.el9.noarc 9.2 MB/s | 26 kB 00:00
2026-03-09T17:23:24.522 INFO:teuthology.orchestra.run.vm06.stdout:(95/119): python3-jwt+crypto-2.4.0-1.el9.noarch 2.9 MB/s | 9.0 kB 00:00
2026-03-09T17:23:24.525 INFO:teuthology.orchestra.run.vm06.stdout:(96/119): python3-jwt-2.4.0-1.el9.noarch.rpm 18 MB/s | 41 kB 00:00
2026-03-09T17:23:24.543 INFO:teuthology.orchestra.run.vm06.stdout:(97/119): python3-kubernetes-26.1.0-3.el9.noarc 58 MB/s | 1.0 MB 00:00
2026-03-09T17:23:24.545 INFO:teuthology.orchestra.run.vm06.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 18 MB/s | 46 kB 00:00
2026-03-09T17:23:24.549 INFO:teuthology.orchestra.run.vm06.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 23 MB/s | 79 kB 00:00
2026-03-09T17:23:24.553 INFO:teuthology.orchestra.run.vm06.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 16 MB/s | 58 kB 00:00
2026-03-09T17:23:24.558 INFO:teuthology.orchestra.run.vm06.stdout:(101/119): python3-pecan-1.4.2-3.el9.noarch.rpm 55 MB/s | 272 kB 00:00
2026-03-09T17:23:24.560 INFO:teuthology.orchestra.run.vm06.stdout:(102/119): python3-portend-3.1.0-2.el9.noarch.r 7.4 MB/s | 16 kB 00:00
2026-03-09T17:23:24.563 INFO:teuthology.orchestra.run.vm06.stdout:(103/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 31 MB/s | 90 kB 00:00
2026-03-09T17:23:24.569 INFO:teuthology.orchestra.run.vm06.stdout:(104/119): python3-repoze-lru-0.7-16.el9.noarch 5.4 MB/s | 31 kB 00:00
2026-03-09T17:23:24.577 INFO:teuthology.orchestra.run.vm06.stdout:(105/119): python3-routes-2.5.1-5.el9.noarch.rp 24 MB/s | 188 kB 00:00
2026-03-09T17:23:24.582 INFO:teuthology.orchestra.run.vm06.stdout:(106/119): python3-rsa-4.9-2.el9.noarch.rpm 13 MB/s | 59 kB 00:00
2026-03-09T17:23:24.585 INFO:teuthology.orchestra.run.vm06.stdout:(107/119): python3-tempora-5.0.0-2.el9.noarch.r 12 MB/s | 36 kB 00:00
2026-03-09T17:23:24.590 INFO:teuthology.orchestra.run.vm06.stdout:(108/119): python3-typing-extensions-4.15.0-1.e 17 MB/s | 86 kB 00:00
2026-03-09T17:23:24.599 INFO:teuthology.orchestra.run.vm06.stdout:(109/119): python3-webob-1.8.8-2.el9.noarch.rpm 28 MB/s | 230 kB 00:00
2026-03-09T17:23:24.603 INFO:teuthology.orchestra.run.vm06.stdout:(110/119): python3-websocket-client-1.2.3-2.el9 24 MB/s | 90 kB 00:00
2026-03-09T17:23:24.615 INFO:teuthology.orchestra.run.vm06.stdout:(111/119): python3-werkzeug-2.0.3-3.el9.1.noarc 37 MB/s | 427 kB 00:00
2026-03-09T17:23:24.617 INFO:teuthology.orchestra.run.vm06.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 7.2 MB/s | 22 kB 00:00
2026-03-09T17:23:24.620 INFO:teuthology.orchestra.run.vm06.stdout:(113/119): python3-zc-lockfile-2.0-10.el9.noarc 9.2 MB/s | 20 kB 00:00
2026-03-09T17:23:24.625 INFO:teuthology.orchestra.run.vm06.stdout:(114/119): re2-20211101-20.el9.x86_64.rpm 39 MB/s | 191 kB 00:00
2026-03-09T17:23:24.654 INFO:teuthology.orchestra.run.vm06.stdout:(115/119): thrift-0.15.0-4.el9.x86_64.rpm 54 MB/s | 1.6 MB 00:00
2026-03-09T17:23:24.749 INFO:teuthology.orchestra.run.vm06.stdout:(116/119): python3-numpy-1.23.5-2.el9.x86_64.rp 8.3 MB/s | 6.1 MB 00:00
2026-03-09T17:23:24.752 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-09T17:23:24.772 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-09T17:23:24.779 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-09T17:23:24.789 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-09T17:23:24.795 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-09T17:23:24.804 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-09T17:23:24.808 INFO:teuthology.orchestra.run.vm09.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-09T17:23:24.813 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-09T17:23:24.824 INFO:teuthology.orchestra.run.vm09.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-09T17:23:24.834 INFO:teuthology.orchestra.run.vm09.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-09T17:23:24.838 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-09T17:23:24.847 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-09T17:23:24.852 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-09T17:23:24.863 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-09T17:23:24.869 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-09T17:23:24.914 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-09T17:23:25.130 INFO:teuthology.orchestra.run.vm06.stdout:(117/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 19 MB/s | 19 MB 00:00
2026-03-09T17:23:25.213 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-09T17:23:25.251 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-09T17:23:25.259 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-09T17:23:25.333 INFO:teuthology.orchestra.run.vm09.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-09T17:23:25.336 INFO:teuthology.orchestra.run.vm09.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-09T17:23:25.360 INFO:teuthology.orchestra.run.vm09.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-09T17:23:25.644 INFO:teuthology.orchestra.run.vm06.stdout:(118/119): librbd1-18.2.0-0.el9.x86_64.rpm 3.4 MB/s | 3.0 MB 00:00
2026-03-09T17:23:25.796 INFO:teuthology.orchestra.run.vm09.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-09T17:23:25.856 INFO:teuthology.orchestra.run.vm06.stdout:(119/119): librados2-18.2.0-0.el9.x86_64.rpm 2.7 MB/s | 3.3 MB 00:01
2026-03-09T17:23:25.859 INFO:teuthology.orchestra.run.vm06.stdout:--------------------------------------------------------------------------------
2026-03-09T17:23:25.859 INFO:teuthology.orchestra.run.vm06.stdout:Total 14 MB/s | 182 MB 00:13
2026-03-09T17:23:25.887 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-09T17:23:26.399 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:23:26.448 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:23:26.448 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:23:26.717 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-09T17:23:26.745 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-09T17:23:26.752 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-09T17:23:26.757 INFO:teuthology.orchestra.run.vm09.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-09T17:23:26.908 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-09T17:23:26.911 INFO:teuthology.orchestra.run.vm09.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-09T17:23:26.942 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-09T17:23:26.947 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121
2026-03-09T17:23:26.955 INFO:teuthology.orchestra.run.vm09.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-09T17:23:27.176 INFO:teuthology.orchestra.run.vm09.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-09T17:23:27.180 INFO:teuthology.orchestra.run.vm09.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-09T17:23:27.202 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-09T17:23:27.212 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121
2026-03-09T17:23:27.230 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-09T17:23:27.237 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:23:27.237 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:23:27.251 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-09T17:23:27.345 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-09T17:23:27.359 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-09T17:23:27.387 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-09T17:23:27.423 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-09T17:23:27.482 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-09T17:23:27.494 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-09T17:23:27.497 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-09T17:23:27.503 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-09T17:23:27.507 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-09T17:23:27.512 INFO:teuthology.orchestra.run.vm09.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-09T17:23:27.514 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-09T17:23:27.536 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:27.536 INFO:teuthology.orchestra.run.vm09.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-09T17:23:27.536 INFO:teuthology.orchestra.run.vm09.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-09T17:23:27.536 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:23:27.549 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:27.579 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:27.579 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-09T17:23:27.579 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:23:27.597 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-09T17:23:27.652 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-09T17:23:27.655 INFO:teuthology.orchestra.run.vm09.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-09T17:23:27.662 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121
2026-03-09T17:23:27.697 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121
2026-03-09T17:23:27.701 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121
2026-03-09T17:23:28.110 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:23:28.120 INFO:teuthology.orchestra.run.vm06.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-09T17:23:28.135 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-09T17:23:28.392 INFO:teuthology.orchestra.run.vm06.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-09T17:23:28.394 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-09T17:23:28.446 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-09T17:23:28.448 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-09T17:23:28.487 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-09T17:23:28.501 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121
2026-03-09T17:23:28.504 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-09T17:23:28.507 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-09T17:23:28.516 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-09T17:23:28.518 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-09T17:23:28.555 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-09T17:23:28.556 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-09T17:23:28.611 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-09T17:23:28.616 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-09T17:23:28.644 INFO:teuthology.orchestra.run.vm06.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-09T17:23:28.655 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-09T17:23:28.659 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-09T17:23:28.689 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-09T17:23:28.709 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:28.711 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-09T17:23:28.715 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:28.717 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-09T17:23:28.728 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-09T17:23:28.731 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-09T17:23:28.737 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-09T17:23:28.748 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121
2026-03-09T17:23:28.764 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121
2026-03-09T17:23:28.797 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-09T17:23:28.867 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-09T17:23:28.887 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-09T17:23:28.896 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-09T17:23:28.906 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-09T17:23:28.911 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121
2026-03-09T17:23:28.951 INFO:teuthology.orchestra.run.vm06.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-09T17:23:28.959 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-09T17:23:28.983 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-09T17:23:29.012 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-09T17:23:29.022 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-09T17:23:29.030 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-09T17:23:29.045 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:29.046 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-09T17:23:29.053 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-09T17:23:29.059 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-09T17:23:29.073 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-09T17:23:29.096 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-09T17:23:29.097 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-09T17:23:29.097 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-09T17:23:29.097 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:23:29.102 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121
2026-03-09T17:23:29.149 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-09T17:23:29.159 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-09T17:23:29.169 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-09T17:23:29.228 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-09T17:23:29.691 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-09T17:23:29.708 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-09T17:23:29.713 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-09T17:23:29.721 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-09T17:23:29.726 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-09T17:23:29.733 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-09T17:23:29.737 INFO:teuthology.orchestra.run.vm06.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-09T17:23:29.741 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-09T17:23:29.753 INFO:teuthology.orchestra.run.vm06.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-09T17:23:29.761 INFO:teuthology.orchestra.run.vm06.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-09T17:23:29.769 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-09T17:23:29.779 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-09T17:23:29.785 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-09T17:23:29.794 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-09T17:23:29.799 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-09T17:23:29.848 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-09T17:23:30.170 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-09T17:23:30.216 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-09T17:23:30.223 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-09T17:23:30.290 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-09T17:23:30.294 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-09T17:23:30.320 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-09T17:23:30.743 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-09T17:23:30.850 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-09T17:23:31.849 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-09T17:23:31.879 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-09T17:23:31.887 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-09T17:23:31.894 INFO:teuthology.orchestra.run.vm06.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-09T17:23:32.060 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-09T17:23:32.063 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-09T17:23:32.104 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-09T17:23:32.109 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121
2026-03-09T17:23:32.118 INFO:teuthology.orchestra.run.vm06.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-09T17:23:32.340 INFO:teuthology.orchestra.run.vm06.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-09T17:23:32.343 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-09T17:23:32.364 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-09T17:23:32.372 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121
2026-03-09T17:23:32.390 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-09T17:23:32.411 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-09T17:23:32.505 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-09T17:23:32.520 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-09T17:23:32.548 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-09T17:23:32.586 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-09T17:23:32.655 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-09T17:23:32.669 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-09T17:23:32.672 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-09T17:23:32.679 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-09T17:23:32.684 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-09T17:23:32.689 INFO:teuthology.orchestra.run.vm06.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-09T17:23:32.692 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-09T17:23:32.714 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:32.714 INFO:teuthology.orchestra.run.vm06.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-09T17:23:32.714 INFO:teuthology.orchestra.run.vm06.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-09T17:23:32.714 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:32.726 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:32.758 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-09T17:23:32.759 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-09T17:23:32.759 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:32.777 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-09T17:23:32.837 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-09T17:23:32.841 INFO:teuthology.orchestra.run.vm06.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-09T17:23:32.846 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121
2026-03-09T17:23:32.877 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121
2026-03-09T17:23:32.881 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121
2026-03-09T17:23:33.920 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:33.926 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:34.258 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-09T17:23:34.266 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-09T17:23:34.307 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-09T17:23:34.307 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-09T17:23:34.307 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-09T17:23:34.307 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:23:34.312 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /sys 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /proc 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /mnt 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /var/tmp 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /home 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /root 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /tmp 2026-03-09T17:23:36.110 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:36.145 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-09T17:23:36.277 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-09T17:23:36.282 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-09T17:23:36.838 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-09T17:23:36.842 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-09T17:23:36.906 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121 2026-03-09T17:23:36.984 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121 
2026-03-09T17:23:36.987 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-09T17:23:37.012 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:37.025 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-09T17:23:37.137 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121 2026-03-09T17:23:37.140 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121 2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-09T17:23:37.163 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:37.402 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-09T17:23:37.429 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:38.311 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-09T17:23:38.343 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-09T17:23:38.343 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:38.343 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T17:23:38.343 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-09T17:23:38.344 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-09T17:23:38.344 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:38.720 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121 2026-03-09T17:23:38.729 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-09T17:23:38.753 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:38.764 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-09T17:23:38.781 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-09T17:23:38.781 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:38.781 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-09T17:23:38.781 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:38.930 INFO:teuthology.orchestra.run.vm09.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-09T17:23:38.949 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:23:41.096 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121 2026-03-09T17:23:41.108 INFO:teuthology.orchestra.run.vm09.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121 2026-03-09T17:23:41.114 INFO:teuthology.orchestra.run.vm09.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121 2026-03-09T17:23:41.156 INFO:teuthology.orchestra.run.vm09.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121 2026-03-09T17:23:41.164 INFO:teuthology.orchestra.run.vm09.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121 2026-03-09T17:23:41.176 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-09T17:23:41.180 INFO:teuthology.orchestra.run.vm09.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-09T17:23:41.180 INFO:teuthology.orchestra.run.vm09.stdout: Cleanup : 
librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-09T17:23:41.197 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-09T17:23:41.197 INFO:teuthology.orchestra.run.vm09.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp 2026-03-09T17:23:41.513 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:23:41.550 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-09T17:23:41.705 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121 2026-03-09T17:23:41.816 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121 2026-03-09T17:23:42.306 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-09T17:23:42.306 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121 2026-03-09T17:23:42.306 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121 2026-03-09T17:23:42.306 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 
3/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121 2026-03-09T17:23:42.307 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121 2026-03-09T17:23:42.308 
INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121 2026-03-09T17:23:42.308 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121 2026-03-09T17:23:42.309 
INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-09T17:23:42.309 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-09T17:23:42.310 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-09T17:23:42.310 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-09T17:23:42.310 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-09T17:23:42.311 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-09T17:23:42.315 
INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
python3-bcrypt-3.2.2-1.el9.x86_64 85/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121 2026-03-09T17:23:42.315 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
thrift-0.15.0-4.el9.x86_64 117/121
2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121
2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-09T17:23:42.316 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121
2026-03-09T17:23:42.405 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121
2026-03-09T17:23:42.408 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout:Upgraded:
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout:Installed:
2026-03-09T17:23:42.447 INFO:teuthology.orchestra.run.vm09.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-base-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-common-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mon-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ceph-test-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: librados-devel-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-09T17:23:42.448 INFO:teuthology.orchestra.run.vm09.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-ply-3.11-14.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.449 INFO:teuthology.orchestra.run.vm09.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout: re2-1:20211101-20.el9.x86_64
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:23:42.450 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:23:42.474 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-09T17:23:42.538 DEBUG:teuthology.parallel:result is None
2026-03-09T17:23:42.557 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121
2026-03-09T17:23:42.559 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-09T17:23:42.584 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:42.598 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-09T17:23:42.711 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-09T17:23:42.714 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-09T17:23:42.739 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:42.982 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-09T17:23:43.007 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:43.904 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-09T17:23:43.934 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:44.323 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121
2026-03-09T17:23:44.326 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-09T17:23:44.350 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:44.363 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121
2026-03-09T17:23:44.386 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121
2026-03-09T17:23:44.386 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:44.386 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T17:23:44.386 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:44.543 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-09T17:23:44.571 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:46.693 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121
2026-03-09T17:23:46.705 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121
2026-03-09T17:23:46.711 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121
2026-03-09T17:23:46.752 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121
2026-03-09T17:23:46.760 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121
2026-03-09T17:23:46.769 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121
2026-03-09T17:23:46.774 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121
2026-03-09T17:23:46.774 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-09T17:23:46.793 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-09T17:23:46.793 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121
2026-03-09T17:23:47.838 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-09T17:23:47.840 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-09T17:23:47.841 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout:Upgraded:
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout:Installed:
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch
2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.956 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: 
libconfig-1.7.2-9.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: 
python3-babel-2.9.1-2.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T17:23:47.957 
INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-09T17:23:47.957 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T17:23:47.958 
INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-09T17:23:47.958 
INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:23:47.958 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T17:23:48.045 DEBUG:teuthology.parallel:result is None 2026-03-09T17:23:48.045 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-09T17:23:48.045 INFO:teuthology.packaging:ref: None 2026-03-09T17:23:48.045 INFO:teuthology.packaging:tag: v18.2.0 2026-03-09T17:23:48.045 INFO:teuthology.packaging:branch: None 2026-03-09T17:23:48.045 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:23:48.045 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d 2026-03-09T17:23:48.638 DEBUG:teuthology.orchestra.run.vm06:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-09T17:23:48.660 INFO:teuthology.orchestra.run.vm06.stdout:18.2.0-0.el9 2026-03-09T17:23:48.660 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9 2026-03-09T17:23:48.660 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed. 
2026-03-09T17:23:48.661 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-09T17:23:48.661 INFO:teuthology.packaging:ref: None
2026-03-09T17:23:48.661 INFO:teuthology.packaging:tag: v18.2.0
2026-03-09T17:23:48.661 INFO:teuthology.packaging:branch: None
2026-03-09T17:23:48.661 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:23:48.662 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-09T17:23:49.259 DEBUG:teuthology.orchestra.run.vm09:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-09T17:23:49.282 INFO:teuthology.orchestra.run.vm09.stdout:18.2.0-0.el9
2026-03-09T17:23:49.282 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9
2026-03-09T17:23:49.282 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed.
2026-03-09T17:23:49.283 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-09T17:23:49.283 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:23:49.283 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-09T17:23:49.315 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:23:49.315 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-09T17:23:49.352 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-09T17:23:49.352 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:23:49.352 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/daemon-helper
2026-03-09T17:23:49.383 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-09T17:23:49.454 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:23:49.454 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/usr/bin/daemon-helper
2026-03-09T17:23:49.480 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-09T17:23:49.550 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-09T17:23:49.550 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:23:49.550 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-09T17:23:49.579 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-09T17:23:49.644 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:23:49.644 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-09T17:23:49.672 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-09T17:23:49.743 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-09T17:23:49.743 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:23:49.743 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/stdin-killer
2026-03-09T17:23:49.770 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-09T17:23:49.839 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:23:49.839 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/usr/bin/stdin-killer
2026-03-09T17:23:49.873 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-09T17:23:49.941 INFO:teuthology.run_tasks:Running task print...
2026-03-09T17:23:49.944 INFO:teuthology.task.print:**** done install task...
2026-03-09T17:23:49.944 INFO:teuthology.run_tasks:Running task cephadm...
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.0', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.0
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Cluster fsid is bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Choosing monitor IPs and ports...
2026-03-09T17:23:49.994 INFO:tasks.cephadm:No mon roles; fabricating mons
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Monitor IPs: {'mon.vm06': '192.168.123.106', 'mon.vm09': '192.168.123.109'}
2026-03-09T17:23:49.994 INFO:tasks.cephadm:Normalizing hostnames...
2026-03-09T17:23:49.994 DEBUG:teuthology.orchestra.run.vm06:> sudo hostname $(hostname -s)
2026-03-09T17:23:50.020 DEBUG:teuthology.orchestra.run.vm09:> sudo hostname $(hostname -s)
2026-03-09T17:23:50.047 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-09T17:23:50.047 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:23:50.644 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-09T17:23:51.494 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-09T17:23:51.495 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-09T17:23:51.495 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-09T17:23:51.495 DEBUG:teuthology.orchestra.run.vm06:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:52.923 INFO:teuthology.orchestra.run.vm06.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 17:23 /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:52.923 DEBUG:teuthology.orchestra.run.vm09:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:54.167 INFO:teuthology.orchestra.run.vm09.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 9 17:23 /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:54.167 DEBUG:teuthology.orchestra.run.vm06:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:54.183 DEBUG:teuthology.orchestra.run.vm09:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-09T17:23:54.203 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.0 on all hosts...
2026-03-09T17:23:54.204 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-09T17:23:54.225 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-09T17:23:54.376 INFO:teuthology.orchestra.run.vm06.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-09T17:23:54.388 INFO:teuthology.orchestra.run.vm09.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout:{
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946",
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: "repo_digests": [
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007",
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27"
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout: ]
2026-03-09T17:24:14.080 INFO:teuthology.orchestra.run.vm09.stdout:}
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946",
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: "repo_digests": [
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007",
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27"
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout: ]
2026-03-09T17:24:14.554 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:24:14.566 DEBUG:teuthology.orchestra.run.vm06:> sudo mkdir -p /etc/ceph
2026-03-09T17:24:14.595 DEBUG:teuthology.orchestra.run.vm09:> sudo mkdir -p /etc/ceph
2026-03-09T17:24:14.624 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 777 /etc/ceph
2026-03-09T17:24:14.661 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod 777 /etc/ceph
2026-03-09T17:24:14.689 INFO:tasks.cephadm:Writing seed config...
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-09T17:24:14.690 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-09T17:24:14.691 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-09T17:24:14.691 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T17:24:14.691 DEBUG:teuthology.orchestra.run.vm06:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
2026-03-09T17:24:14.716 DEBUG:tasks.cephadm:Final config: [global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000 # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660 # 11m
auth service ticket ttl = 240 # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-09T17:24:14.716 DEBUG:teuthology.orchestra.run.vm06:mon.vm06> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service
2026-03-09T17:24:14.758 INFO:tasks.cephadm:Bootstrapping...
2026-03-09T17:24:14.758 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 -v bootstrap --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.106 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-09T17:24:14.881 INFO:teuthology.orchestra.run.vm06.stdout:--------------------------------------------------------------------------------
2026-03-09T17:24:14.881 INFO:teuthology.orchestra.run.vm06.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.0', '-v', 'bootstrap', '--fsid', 'bcd3bcc2-1bdc-11f1-97b3-3f61613e7048', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.106', '--skip-admin-label']
2026-03-09T17:24:14.901 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stdout 5.8.0
2026-03-09T17:24:14.901 INFO:teuthology.orchestra.run.vm06.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-09T17:24:14.901 INFO:teuthology.orchestra.run.vm06.stdout:Verifying podman|docker is present...
2026-03-09T17:24:14.920 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stdout 5.8.0
2026-03-09T17:24:14.921 INFO:teuthology.orchestra.run.vm06.stdout:Verifying lvm2 is present...
2026-03-09T17:24:14.921 INFO:teuthology.orchestra.run.vm06.stdout:Verifying time synchronization is in place...
2026-03-09T17:24:14.929 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-09T17:24:14.929 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-09T17:24:14.936 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-09T17:24:14.936 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout inactive
2026-03-09T17:24:14.943 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout enabled
2026-03-09T17:24:14.949 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout active
2026-03-09T17:24:14.949 INFO:teuthology.orchestra.run.vm06.stdout:Unit chronyd.service is enabled and running
2026-03-09T17:24:14.949 INFO:teuthology.orchestra.run.vm06.stdout:Repeating the final host check...
2026-03-09T17:24:14.970 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stdout 5.8.0
2026-03-09T17:24:14.970 INFO:teuthology.orchestra.run.vm06.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-09T17:24:14.970 INFO:teuthology.orchestra.run.vm06.stdout:systemctl is present
2026-03-09T17:24:14.970 INFO:teuthology.orchestra.run.vm06.stdout:lvcreate is present
2026-03-09T17:24:14.976 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-09T17:24:14.976 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-09T17:24:14.982 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-09T17:24:14.982 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout inactive
2026-03-09T17:24:14.988 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout enabled
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stdout active
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Unit chronyd.service is enabled and running
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Host looks OK
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Cluster fsid: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Acquiring lock 139809531933792 on /run/cephadm/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.lock
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Lock 139809531933792 acquired on /run/cephadm/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.lock
2026-03-09T17:24:14.995 INFO:teuthology.orchestra.run.vm06.stdout:Verifying IP 192.168.123.106 port 3300 ...
2026-03-09T17:24:14.996 INFO:teuthology.orchestra.run.vm06.stdout:Verifying IP 192.168.123.106 port 6789 ...
2026-03-09T17:24:14.996 INFO:teuthology.orchestra.run.vm06.stdout:Base mon IP(s) is [192.168.123.106:3300, 192.168.123.106:6789], mon addrv is [v2:192.168.123.106:3300,v1:192.168.123.106:6789]
2026-03-09T17:24:15.000 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.106 metric 100
2026-03-09T17:24:15.000 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.106 metric 100
2026-03-09T17:24:15.002 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-09T17:24:15.002 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:6/64 scope link noprefixroute
2026-03-09T17:24:15.005 INFO:teuthology.orchestra.run.vm06.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-09T17:24:15.006 INFO:teuthology.orchestra.run.vm06.stdout:Mon IP `192.168.123.106` is in CIDR network `192.168.123.0/24`
2026-03-09T17:24:15.006 INFO:teuthology.orchestra.run.vm06.stdout:Mon IP `192.168.123.106` is in CIDR network `192.168.123.0/24`
2026-03-09T17:24:15.006 INFO:teuthology.orchestra.run.vm06.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-09T17:24:15.006 INFO:teuthology.orchestra.run.vm06.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-09T17:24:15.007 INFO:teuthology.orchestra.run.vm06.stdout:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-09T17:24:16.379 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stdout dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.0...
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Getting image source signatures
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Copying blob sha256:3bd20aeff60302f668275dc2005d10679ae56492967a3a5a54fd3dde85333aec
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Copying blob sha256:46af8f5390d4e94fc57efb422ccb97bb53dfe5b948546bfc191b46557eb2dbd9
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Copying config sha256:dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-09T17:24:16.380 INFO:teuthology.orchestra.run.vm06.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-09T17:24:16.563 INFO:teuthology.orchestra.run.vm06.stdout:ceph: stdout ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-09T17:24:16.563 INFO:teuthology.orchestra.run.vm06.stdout:Ceph version: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-09T17:24:16.564 INFO:teuthology.orchestra.run.vm06.stdout:Extracting ceph user uid/gid from container image...
2026-03-09T17:24:16.673 INFO:teuthology.orchestra.run.vm06.stdout:stat: stdout 167 167
2026-03-09T17:24:16.673 INFO:teuthology.orchestra.run.vm06.stdout:Creating initial keys...
2026-03-09T17:24:16.773 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph-authtool: stdout AQBAAq9pgDReLBAAKyYiAbkDzW+faptZGNm+uQ==
2026-03-09T17:24:16.888 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph-authtool: stdout AQBAAq9pUgtUMhAAhpORozX/mRN4IL8auZf4nQ==
2026-03-09T17:24:17.016 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph-authtool: stdout AQBAAq9pGWPyORAAmIag3Upj5nNU6hEFuxhUww==
2026-03-09T17:24:17.016 INFO:teuthology.orchestra.run.vm06.stdout:Creating initial monmap...
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:monmaptool for vm06 [v2:192.168.123.106:3300,v1:192.168.123.106:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:setting min_mon_release = pacific
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: set fsid to bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:24:17.120 INFO:teuthology.orchestra.run.vm06.stdout:Creating mon...
2026-03-09T17:24:17.253 INFO:teuthology.orchestra.run.vm06.stdout:create mon.vm06 on
2026-03-09T17:24:17.430 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-09T17:24:17.562 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-09T17:24:17.693 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target → /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target.
2026-03-09T17:24:17.693 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target → /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target.
2026-03-09T17:24:17.899 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06
2026-03-09T17:24:17.899 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Failed to reset failed state of unit ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service: Unit ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service not loaded.
2026-03-09T17:24:18.038 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target.wants/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service → /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@.service.
2026-03-09T17:24:18.238 INFO:teuthology.orchestra.run.vm06.stdout:firewalld does not appear to be present
2026-03-09T17:24:18.238 INFO:teuthology.orchestra.run.vm06.stdout:Not possible to enable service . firewalld.service is not available
2026-03-09T17:24:18.238 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mon to start...
2026-03-09T17:24:18.238 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mon...
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout cluster:
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout id: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout services:
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm06 (age 0.18299s)
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout data:
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout pgs:
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.398+0000 7efeb60ef700 1 Processor -- start
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.399+0000 7efeb60ef700 1 -- start start
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.399+0000 7efeb60ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.399+0000 7efeb60ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efeb007c020 con 0x7efeb007b6d0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeaf7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeaf7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36082/0 (socket says 192.168.123.106:36082)
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeaf7fe700 1 -- 192.168.123.106:0/680821519 learned_addr learned my addr 192.168.123.106:0/680821519 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeaf7fe700 1 -- 192.168.123.106:0/680821519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efeb007c160 con 0x7efeb007b6d0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeaf7fe700 1 --2- 192.168.123.106:0/680821519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7efe98009a90 tx=0x7efe98009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=636ae2b46ee95240 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.400+0000 7efeae7fc700 1 -- 192.168.123.106:0/680821519 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe98004030 con 0x7efeb007b6d0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeae7fc700 1 -- 192.168.123.106:0/680821519 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7efe98004190 con 0x7efeb007b6d0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeae7fc700 1 -- 192.168.123.106:0/680821519 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe98004320 con 0x7efeb007b6d0
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 -- 192.168.123.106:0/680821519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 msgr2=0x7efeb007bae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 --2- 192.168.123.106:0/680821519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7efe98009a90 tx=0x7efe98009da0 comp rx=0 tx=0).stop
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 -- 192.168.123.106:0/680821519 shutdown_connections
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 --2- 192.168.123.106:0/680821519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb007bae0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 -- 192.168.123.106:0/680821519 >> 192.168.123.106:0/680821519 conn(0x7efeb0103770 msgr2=0x7efeb0105b50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 -- 192.168.123.106:0/680821519 shutdown_connections
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.401+0000 7efeb60ef700 1 -- 192.168.123.106:0/680821519 wait complete.
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeb60ef700 1 Processor -- start
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeb60ef700 1 -- start start
2026-03-09T17:24:18.499 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeb60ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeb60ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efeb019fc90 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeaf7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeaf7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36094/0 (socket says 192.168.123.106:36094)
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.402+0000 7efeaf7fe700 1 -- 192.168.123.106:0/677787588 learned_addr learned my addr 192.168.123.106:0/677787588 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeaf7fe700 1 -- 192.168.123.106:0/677787588 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe98009740 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeaf7fe700 1 --2- 192.168.123.106:0/677787588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7efe98004000 tx=0x7efe98004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe9800be40 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7efe980036a0 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efeb019fe90 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.403+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efeb01a0330 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.404+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe98003990 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.404+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7efe9801c460 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.404+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7efe9801a790 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.404+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efeb0199700 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.405+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7efe9801a9a0 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.443+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7efeb0062380 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.444+0000 7efeacff9700 1 -- 192.168.123.106:0/677787588 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+319 (secure 0 0 0) 0x7efe9801ab40 con 0x7efeb007b6d0
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 msgr2=0x7efeb019f750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 --2- 192.168.123.106:0/677787588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7efe98004000 tx=0x7efe98004750 comp rx=0 tx=0).stop
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 shutdown_connections
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 --2- 192.168.123.106:0/677787588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efeb007b6d0 0x7efeb019f750 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 >> 192.168.123.106:0/677787588 conn(0x7efeb0103770 msgr2=0x7efeb0106b40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 shutdown_connections
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.445+0000 7efeb60ef700 1 -- 192.168.123.106:0/677787588 wait complete.
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:mon is available
2026-03-09T17:24:18.500 INFO:teuthology.orchestra.run.vm06.stdout:Assimilating anything we can from ceph.conf...
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [global]
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout fsid = bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.106:3300,v1:192.168.123.106:6789]
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-09T17:24:18.764 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [osd]
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.636+0000 7f6b2fb3f700 1 Processor -- start
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2fb3f700 1 -- start start
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2fb3f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2fb3f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b28105570 con 0x7f6b28104c20
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2d8db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2d8db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36104/0 (socket says 192.168.123.106:36104)
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.637+0000 7f6b2d8db700 1 -- 192.168.123.106:0/2370178126 learned_addr learned my addr 192.168.123.106:0/2370178126 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.638+0000 7f6b2d8db700 1 -- 192.168.123.106:0/2370178126 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b281056b0 con 0x7f6b28104c20
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.638+0000 7f6b2d8db700 1 --2- 192.168.123.106:0/2370178126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f6b18009cf0 tx=0x7f6b1800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=695886b140853677 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.638+0000 7f6b2c8d9700 1 -- 192.168.123.106:0/2370178126 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6b18004030 con 0x7f6b28104c20
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.638+0000 7f6b2c8d9700 1 -- 192.168.123.106:0/2370178126 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f6b18004190 con 0x7f6b28104c20
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2370178126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 msgr2=0x7f6b28105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 --2- 192.168.123.106:0/2370178126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f6b18009cf0 tx=0x7f6b1800b0e0 comp rx=0 tx=0).stop
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2370178126 shutdown_connections 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 --2- 192.168.123.106:0/2370178126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b28105030 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2370178126 >> 192.168.123.106:0/2370178126 conn(0x7f6b28100270 msgr2=0x7f6b281026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2370178126 shutdown_connections 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.639+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2370178126 wait complete. 
2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2fb3f700 1 Processor -- start 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2fb3f700 1 -- start start 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2fb3f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2fb3f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b28105570 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2d8db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2d8db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36114/0 (socket says 192.168.123.106:36114) 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.640+0000 7f6b2d8db700 1 -- 192.168.123.106:0/2195196723 learned_addr learned my addr 192.168.123.106:0/2195196723 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:18.765 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b2d8db700 1 -- 192.168.123.106:0/2195196723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b18009740 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b2d8db700 1 --2- 192.168.123.106:0/2195196723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f6b1800b4e0 tx=0x7f6b18004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6b180036a0 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f6b18003800 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6b18003970 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b2819ba30 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.641+0000 7f6b2fb3f700 1 -- 
192.168.123.106:0/2195196723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b2819bed0 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.642+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f6b18022020 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.642+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f6b1801ba50 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.642+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b2804f9e0 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.644+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f6b1802a050 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.681+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f6b28062380 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.687+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f6b1801bc60 con 
0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.687+0000 7f6b1effd700 1 -- 192.168.123.106:0/2195196723 <== mon.0 v2:192.168.123.106:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7f6b1802fb30 con 0x7f6b28104c20 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.689+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 msgr2=0x7f6b2819b4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.689+0000 7f6b2fb3f700 1 --2- 192.168.123.106:0/2195196723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f6b1800b4e0 tx=0x7f6b18004750 comp rx=0 tx=0).stop 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.690+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 shutdown_connections 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.690+0000 7f6b2fb3f700 1 --2- 192.168.123.106:0/2195196723 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b28104c20 0x7f6b2819b4f0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.690+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 >> 192.168.123.106:0/2195196723 conn(0x7f6b28100270 msgr2=0x7f6b2810aa20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.690+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 
shutdown_connections 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.690+0000 7f6b2fb3f700 1 -- 192.168.123.106:0/2195196723 wait complete. 2026-03-09T17:24:18.765 INFO:teuthology.orchestra.run.vm06.stdout:Generating new minimal ceph.conf... 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.906+0000 7fb81abee700 1 Processor -- start 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.907+0000 7fb81abee700 1 -- start start 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.907+0000 7fb81abee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.907+0000 7fb81abee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8140745b0 con 0x7fb814106830 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb81898a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:19.012 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb81898a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36116/0 (socket says 192.168.123.106:36116) 2026-03-09T17:24:19.013 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb81898a700 1 -- 192.168.123.106:0/983873884 learned_addr learned my addr 192.168.123.106:0/983873884 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb81898a700 1 -- 192.168.123.106:0/983873884 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb8140746f0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb81898a700 1 --2- 192.168.123.106:0/983873884 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb804009a90 tx=0x7fb804009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=1bfa6a29bab68c7e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb8137fe700 1 -- 192.168.123.106:0/983873884 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb804004030 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb8137fe700 1 -- 192.168.123.106:0/983873884 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fb80400b7e0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.908+0000 7fb8137fe700 1 -- 192.168.123.106:0/983873884 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb804003970 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 -- 
192.168.123.106:0/983873884 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 msgr2=0x7fb814108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 --2- 192.168.123.106:0/983873884 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fb804009a90 tx=0x7fb804009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 -- 192.168.123.106:0/983873884 shutdown_connections 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 --2- 192.168.123.106:0/983873884 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814108c10 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 -- 192.168.123.106:0/983873884 >> 192.168.123.106:0/983873884 conn(0x7fb814100270 msgr2=0x7fb8141026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 -- 192.168.123.106:0/983873884 shutdown_connections 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.909+0000 7fb81abee700 1 -- 192.168.123.106:0/983873884 wait complete. 
2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81abee700 1 Processor -- start 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81abee700 1 -- start start 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81abee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81abee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb8141066d0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81898a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81898a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36128/0 (socket says 192.168.123.106:36128) 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.910+0000 7fb81898a700 1 -- 192.168.123.106:0/3611895094 learned_addr learned my addr 192.168.123.106:0/3611895094 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:19.013 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb81898a700 1 -- 192.168.123.106:0/3611895094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb804009740 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb81898a700 1 --2- 192.168.123.106:0/3611895094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb80400bfb0 tx=0x7fb804003be0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb8040040c0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fb804004220 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb804011420 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb8141047e0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.911+0000 7fb81abee700 1 -- 
192.168.123.106:0/3611895094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb814104c80 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.912+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fb804011880 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.912+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb8040193c0 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.913+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb81404fa50 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.914+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fb804028050 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.952+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fb814104f30 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.952+0000 7fb811ffb700 1 -- 192.168.123.106:0/3611895094 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 
==== 76+0+181 (secure 0 0 0) 0x7fb80401e030 con 0x7fb814106830 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 msgr2=0x7fb814106190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 --2- 192.168.123.106:0/3611895094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb80400bfb0 tx=0x7fb804003be0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 shutdown_connections 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 --2- 192.168.123.106:0/3611895094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb814106830 0x7fb814106190 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 >> 192.168.123.106:0/3611895094 conn(0x7fb814100270 msgr2=0x7fb814100f50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 shutdown_connections 2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:18.954+0000 7fb81abee700 1 -- 192.168.123.106:0/3611895094 wait complete. 
2026-03-09T17:24:19.013 INFO:teuthology.orchestra.run.vm06.stdout:Restarting the monitor... 2026-03-09T17:24:19.103 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 systemd[1]: Stopping Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[57018]: 2026-03-09T17:24:19.100+0000 7fcdf7804700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[57018]: 2026-03-09T17:24:19.100+0000 7fcdf7804700 -1 mon.vm06@0(leader) e1 *** Got Signal Terminated *** 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57227]: 2026-03-09 17:24:19.143390171 +0000 UTC m=+0.056407931 container died c36b00c006eb17f1e2ad2415b30387c9772e42b783ce806e1a42b998e637b3d7 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308) 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57227]: 2026-03-09 17:24:19.157746574 +0000 UTC m=+0.070764334 container remove 
c36b00c006eb17f1e2ad2415b30387c9772e42b783ce806e1a42b998e637b3d7 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0) 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 bash[57227]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service: Deactivated successfully. 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 systemd[1]: Stopped Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 systemd[1]: Starting Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:24:19.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57293]: 2026-03-09 17:24:19.328794396 +0000 UTC m=+0.018602451 container create e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, RELEASE=HEAD, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.name=CentOS Stream 8 Base Image, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-09T17:24:19.378 INFO:teuthology.orchestra.run.vm06.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-09T17:24:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57293]: 2026-03-09 17:24:19.36376585 +0000 UTC m=+0.053573915 container init e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.vendor=CentOS, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, org.label-schema.license=GPLv2) 2026-03-09T17:24:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57293]: 2026-03-09 17:24:19.367496242 +0000 UTC m=+0.057304297 container start e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, 
org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.29.1, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD) 2026-03-09T17:24:19.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 bash[57293]: e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d 2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 podman[57293]: 2026-03-09 17:24:19.320853272 +0000 UTC m=+0.010661327 image pull dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946 quay.io/ceph/ceph:v18.2.0 2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 systemd[1]: Started Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: set uid:gid to 167:167 (ceph:ceph)
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable), process ceph-mon, pid 2
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: pidfile_write: ignore empty --pid-file
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: load: jerasure load: lrc
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: RocksDB version: 7.9.2
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Git sha 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Compile date 2023-08-03 19:21:13
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: DB SUMMARY
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: DB Session ID: 5V6F8NLU62FRJXETTWHO
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: CURRENT file: CURRENT
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: IDENTITY file: IDENTITY
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm06/store.db dir, Total Num: 1, files: 000008.sst
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm06/store.db: 000009.log size: 89048 ;
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.error_if_exists: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.create_if_missing: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.paranoid_checks: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.env: 0x56316471e720
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.fs: PosixFileSystem
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.info_log: 0x5631669a3340
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_file_opening_threads: 16
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.statistics: (nil)
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.use_fsync: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_log_file_size: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.keep_log_file_num: 1000
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.recycle_log_file_num: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_fallocate: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_mmap_reads: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_mmap_writes: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.use_direct_reads: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.create_missing_column_families: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.db_log_dir:
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.wal_dir:
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.advise_random_on_open: 1
2026-03-09T17:24:19.618 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.db_write_buffer_size: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.write_buffer_manager: 0x563165c325a0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.rate_limiter: (nil)
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.wal_recovery_mode: 2
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enable_thread_tracking: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enable_pipelined_write: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.unordered_write: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.row_cache: None
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.wal_filter: None
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_ingest_behind: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.two_write_queues: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.manual_wal_flush: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.wal_compression: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.atomic_flush: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.log_readahead_size: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.best_efforts_recovery: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.allow_data_in_errors: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.db_host_id: __hostname__
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_background_jobs: 2
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_background_compactions: -1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_subcompactions: 1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_total_wal_size: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_open_files: -1
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bytes_per_sync: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-09T17:24:19.619 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_readahead_size: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_background_flushes: -1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Compression algorithms supported:
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kZSTDNotFinalCompression supported: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kZSTD supported: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kXpressCompression supported: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kLZ4HCCompression supported: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kZlibCompression supported: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kSnappyCompression supported: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kLZ4Compression supported: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: kBZip2Compression supported: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.merge_operator:
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_filter: None
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_filter_factory: None
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.sst_partitioner_factory: None
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5631669a3460)
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  cache_index_and_filter_blocks: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  cache_index_and_filter_blocks_with_high_priority: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  pin_top_level_index_and_filter: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  index_type: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  data_block_index_type: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  index_shortening: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  data_block_hash_table_util_ratio: 0.750000
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  checksum: 4
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  no_block_cache: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_cache: 0x563165cb5350
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_cache_name: BinnedLRUCache
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_cache_options:
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  capacity : 536870912
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  num_shard_bits : 4
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  strict_capacity_limit : 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  high_pri_pool_ratio: 0.000
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_cache_compressed: (nil)
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  persistent_cache: (nil)
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_size: 4096
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_size_deviation: 10
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_restart_interval: 16
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  index_block_restart_interval: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  metadata_block_size: 4096
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  partition_filters: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  use_delta_encoding: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  filter_policy: bloomfilter
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  whole_key_filtering: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  verify_compression: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  read_amp_bytes_per_bit: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  format_version: 5
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  enable_index_compression: 1
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  block_align: 0
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  max_auto_readahead_size: 262144
2026-03-09T17:24:19.620 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  prepopulate_block_cache: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  initial_auto_readahead_size: 8192
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:  num_file_reads_for_auto_readahead: 2
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.write_buffer_size: 33554432
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_write_buffer_number: 2
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression: NoCompression
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression: Disabled
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.prefix_extractor: nullptr
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.num_levels: 7
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.level: 32767
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.strategy: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.enabled: false
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.target_file_size_base: 67108864
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-09T17:24:19.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.arena_block_size: 1048576
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.disable_auto_compactions: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.table_properties_collectors:
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.inplace_update_support: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.bloom_locality: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.max_successive_merges: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.paranoid_file_checks: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.force_consistency_checks: 1
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.report_bg_io_stats: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.ttl: 2592000
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enable_blob_files: false
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.min_blob_size: 0
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_file_size: 268435456
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-09T17:24:19.622 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.blob_file_starting_level: 0
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 486416ba-d76b-4929-a454-49d49e04ccf4
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077059400991, "job": 1, "event": "recovery_started", "wal_files": [9]}
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077059402793, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773077059, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "486416ba-d76b-4929-a454-49d49e04ccf4", "db_session_id": "5V6F8NLU62FRJXETTWHO", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}}
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077059402910, "job": 1, "event": "recovery_finished"}
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/version_set.cc:5047] Creating manifest 15
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm06/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563165d52000
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: DB pointer 0x563165d3e000
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: starting mon.vm06 rank 0 at public addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] at bind addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon_data /var/lib/ceph/mon/ceph-vm06 fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048
2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???)
e1 preinit fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).mds e1 new map 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).mds e1 print_map 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: e1 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: legacy client fscid: -1 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: No filesystems configured 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: 
[db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** DB Stats ** 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:24:19.623 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T17:24:19.623 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: L0 2/0 84.57 KB 0.5 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 55.1 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Sum 2/0 84.57 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 55.1 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 55.1 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 55.1 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Total Files): cumulative 0, interval 0 
2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative compaction: 0.00 GB write, 9.96 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval compaction: 0.00 GB write, 9.96 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache BinnedLRUCache@0x563165cb5350#2 capacity: 512.00 MB usage: 33.28 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache entry stats(count,size,portion): DataBlock(6,31.98 KB,0.00610054%) FilterBlock(2,0.89 KB,0.000169873%) IndexBlock(2,0.41 KB,7.7486e-05%) Misc(1,0.00 KB,0%) 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mon.vm06 is new leader, mons vm06 in quorum (ranks 0) 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: 
monmap e1: 1 mons at {vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0]} removed_ranks: {} 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: fsmap 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: osdmap e1: 0 total, 0 up, 0 in 2026-03-09T17:24:19.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:19 vm06 ceph-mon[57307]: mgrmap e1: no daemons active 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.545+0000 7f2a09a91700 1 Processor -- start 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.545+0000 7f2a09a91700 1 -- start start 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.545+0000 7f2a09a91700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.545+0000 7f2a09a91700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a0407c020 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.546+0000 7f2a037fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.546+0000 7f2a037fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36138/0 (socket says 192.168.123.106:36138) 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.546+0000 7f2a037fe700 1 -- 192.168.123.106:0/843374037 learned_addr learned my addr 192.168.123.106:0/843374037 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.546+0000 7f2a037fe700 1 -- 192.168.123.106:0/843374037 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a0407c160 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.547+0000 7f2a037fe700 1 --2- 192.168.123.106:0/843374037 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f29ec01ab30 tx=0x7f29ec01ae40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c66ac7ceff200ada server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.547+0000 7f2a027fc700 1 -- 192.168.123.106:0/843374037 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29ec004030 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.547+0000 7f2a027fc700 1 -- 192.168.123.106:0/843374037 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f29ec01c8b0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.547+0000 7f2a027fc700 1 -- 192.168.123.106:0/843374037 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29ec003ab0 con 
0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 -- 192.168.123.106:0/843374037 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 msgr2=0x7f2a0407bae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 --2- 192.168.123.106:0/843374037 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f29ec01ab30 tx=0x7f29ec01ae40 comp rx=0 tx=0).stop 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 -- 192.168.123.106:0/843374037 shutdown_connections 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 --2- 192.168.123.106:0/843374037 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a0407bae0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 -- 192.168.123.106:0/843374037 >> 192.168.123.106:0/843374037 conn(0x7f2a04103770 msgr2=0x7f2a04105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 -- 192.168.123.106:0/843374037 shutdown_connections 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.548+0000 7f2a09a91700 1 -- 192.168.123.106:0/843374037 wait complete. 
2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.549+0000 7f2a09a91700 1 Processor -- start 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.549+0000 7f2a09a91700 1 -- start start 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.549+0000 7f2a09a91700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.549+0000 7f2a09a91700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a04197cb0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a037fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a037fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36154/0 (socket says 192.168.123.106:36154) 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a037fe700 1 -- 192.168.123.106:0/815692381 learned_addr learned my addr 192.168.123.106:0/815692381 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:19.661 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a037fe700 1 -- 192.168.123.106:0/815692381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29ec01a7e0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a037fe700 1 --2- 192.168.123.106:0/815692381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f29ec000c00 tx=0x7f29ec003fc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29ec004370 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f29ec0044d0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.550+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a04197eb0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.551+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a04198350 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.551+0000 7f2a00ff9700 1 -- 
192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29ec022650 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.551+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f29ec022ab0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.551+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f29ec035d90 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.552+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a04190fd0 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.554+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f29ec054b00 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.594+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f2a0402d050 con 0x7f2a0407b6d0 2026-03-09T17:24:19.661 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.600+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f29ec022e50 con 
0x7f2a0407b6d0 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.601+0000 7f2a00ff9700 1 -- 192.168.123.106:0/815692381 <== mon.0 v2:192.168.123.106:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f29ec043030 con 0x7f2a0407b6d0 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 msgr2=0x7f2a04197770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 --2- 192.168.123.106:0/815692381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f29ec000c00 tx=0x7f29ec003fc0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 shutdown_connections 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 --2- 192.168.123.106:0/815692381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a0407b6d0 0x7f2a04197770 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 >> 192.168.123.106:0/815692381 conn(0x7f2a04103770 msgr2=0x7f2a041067f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 
shutdown_connections 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:19.603+0000 7f2a09a91700 1 -- 192.168.123.106:0/815692381 wait complete. 2026-03-09T17:24:19.662 INFO:teuthology.orchestra.run.vm06.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-09T17:24:19.663 INFO:teuthology.orchestra.run.vm06.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-09T17:24:19.663 INFO:teuthology.orchestra.run.vm06.stdout:Creating mgr... 2026-03-09T17:24:19.664 INFO:teuthology.orchestra.run.vm06.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-09T17:24:19.664 INFO:teuthology.orchestra.run.vm06.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-09T17:24:19.664 INFO:teuthology.orchestra.run.vm06.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-09T17:24:19.825 INFO:teuthology.orchestra.run.vm06.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mgr.vm06.pbgzei 2026-03-09T17:24:19.825 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Failed to reset failed state of unit ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mgr.vm06.pbgzei.service: Unit ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mgr.vm06.pbgzei.service not loaded. 2026-03-09T17:24:19.953 INFO:teuthology.orchestra.run.vm06.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.target.wants/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mgr.vm06.pbgzei.service → /etc/systemd/system/ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@.service. 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:firewalld does not appear to be present 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:Not possible to enable service . 
firewalld.service is not available 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:firewalld does not appear to be present 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mgr to start... 2026-03-09T17:24:20.168 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mgr... 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsid": "bcd3bcc2-1bdc-11f1-97b3-3f61613e7048", 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 0 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "vm06" 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:20.416 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T17:24:20.416 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T17:24:20.418 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T17:24:20.418 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T17:24:20.418 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T17:24:20.419 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 
2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T17:24:18.262584+0000", 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.333+0000 7f458ac8b700 1 Processor -- start 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.333+0000 7f458ac8b700 1 -- start start 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.333+0000 7f458ac8b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.333+0000 7f458ac8b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4584071590 con 0x7f4584072b50 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f4588a27700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f4588a27700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36182/0 (socket says 192.168.123.106:36182) 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f4588a27700 1 -- 192.168.123.106:0/2682613452 learned_addr learned my addr 192.168.123.106:0/2682613452 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f4588a27700 1 -- 192.168.123.106:0/2682613452 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45840716d0 con 0x7f4584072b50 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f4588a27700 1 --2- 192.168.123.106:0/2682613452 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f4574009a90 tx=0x7f4574009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=adff9e7f9a01cf06 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f45837fe700 1 -- 192.168.123.106:0/2682613452 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4574004030 con 0x7f4584072b50 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f45837fe700 1 -- 192.168.123.106:0/2682613452 <== mon.0 v2:192.168.123.106:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f457400b7e0 con 0x7f4584072b50 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.337+0000 7f45837fe700 1 -- 192.168.123.106:0/2682613452 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f45740039f0 con 0x7f4584072b50 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.338+0000 7f458ac8b700 1 -- 192.168.123.106:0/2682613452 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 msgr2=0x7f4584071050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:20.419 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.338+0000 7f458ac8b700 1 --2- 192.168.123.106:0/2682613452 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f4574009a90 tx=0x7f4574009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.338+0000 7f458ac8b700 1 -- 192.168.123.106:0/2682613452 shutdown_connections 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.338+0000 7f458ac8b700 1 --2- 192.168.123.106:0/2682613452 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f4584071050 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.338+0000 7f458ac8b700 1 -- 192.168.123.106:0/2682613452 >> 192.168.123.106:0/2682613452 conn(0x7f458406c970 msgr2=0x7f458406eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 -- 
192.168.123.106:0/2682613452 shutdown_connections 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 -- 192.168.123.106:0/2682613452 wait complete. 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 Processor -- start 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 -- start start 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.339+0000 7f458ac8b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f458411c8a0 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f4588a27700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f4588a27700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36186/0 (socket says 192.168.123.106:36186) 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:20.340+0000 7f4588a27700 1 -- 192.168.123.106:0/2602706503 learned_addr learned my addr 192.168.123.106:0/2602706503 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f4588a27700 1 -- 192.168.123.106:0/2602706503 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4574009740 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f4588a27700 1 --2- 192.168.123.106:0/2602706503 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f457400be40 tx=0x7f457400bf20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f45740040d0 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f458ac8b700 1 -- 192.168.123.106:0/2602706503 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f458411b470 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.340+0000 7f458ac8b700 1 -- 192.168.123.106:0/2602706503 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f458411b910 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.341+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 
keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f457402b430 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.341+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f457401a430 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.341+0000 7f458ac8b700 1 -- 192.168.123.106:0/2602706503 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4570005320 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.341+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f457402b5a0 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.342+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f457401ad30 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.343+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f457401a590 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.381+0000 7f458ac8b700 1 -- 192.168.123.106:0/2602706503 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f45700059f0 con 0x7f4584072b50 2026-03-09T17:24:20.420 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.382+0000 7f4581ffb700 1 -- 192.168.123.106:0/2602706503 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f4574004230 con 0x7f4584072b50 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.384+0000 7f456b7fe700 1 -- 192.168.123.106:0/2602706503 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 msgr2=0x7f458411af30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.384+0000 7f456b7fe700 1 --2- 192.168.123.106:0/2602706503 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f457400be40 tx=0x7f457400bf20 comp rx=0 tx=0).stop 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.384+0000 7f456b7fe700 1 -- 192.168.123.106:0/2602706503 shutdown_connections 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.384+0000 7f456b7fe700 1 --2- 192.168.123.106:0/2602706503 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4584072b50 0x7f458411af30 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.384+0000 7f456b7fe700 1 -- 192.168.123.106:0/2602706503 >> 192.168.123.106:0/2602706503 conn(0x7f458406c970 msgr2=0x7f458406d5e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.385+0000 7f456b7fe700 1 -- 192.168.123.106:0/2602706503 shutdown_connections 
2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:20.385+0000 7f456b7fe700 1 -- 192.168.123.106:0/2602706503 wait complete. 2026-03-09T17:24:20.420 INFO:teuthology.orchestra.run.vm06.stdout:mgr not available, waiting (1/15)... 2026-03-09T17:24:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:20 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/815692381' entity='client.admin' 2026-03-09T17:24:20.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:20 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2602706503' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsid": "bcd3bcc2-1bdc-11f1-97b3-3f61613e7048", 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 0 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_names": [ 
2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "vm06" 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T17:24:22.732 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T17:24:22.733 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T17:24:22.733 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 
"restful" 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T17:24:18.262584+0000", 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.641+0000 7fdba8994700 1 Processor -- start 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.641+0000 7fdba8994700 1 -- start start 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.641+0000 7fdba8994700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.641+0000 7fdba8994700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdba4071500 con 0x7fdba4072ac0 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:22.642+0000 7fdba2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.642+0000 7fdba2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:36202/0 (socket says 192.168.123.106:36202) 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.642+0000 7fdba2d9d700 1 -- 192.168.123.106:0/2070726311 learned_addr learned my addr 192.168.123.106:0/2070726311 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.642+0000 7fdba2d9d700 1 -- 192.168.123.106:0/2070726311 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdba4071640 con 0x7fdba4072ac0 2026-03-09T17:24:22.734 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.642+0000 7fdba2d9d700 1 --2- 192.168.123.106:0/2070726311 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fdb9400d0d0 tx=0x7fdb9400d3e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a41d4cda6ecf813 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba1d9b700 1 -- 192.168.123.106:0/2070726311 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb94010070 con 
0x7fdba4072ac0 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba1d9b700 1 -- 192.168.123.106:0/2070726311 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fdb94004030 con 0x7fdba4072ac0 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 -- 192.168.123.106:0/2070726311 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 msgr2=0x7fdba4070fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 --2- 192.168.123.106:0/2070726311 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fdb9400d0d0 tx=0x7fdb9400d3e0 comp rx=0 tx=0).stop 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 -- 192.168.123.106:0/2070726311 shutdown_connections 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 --2- 192.168.123.106:0/2070726311 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4072ac0 0x7fdba4070fc0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 -- 192.168.123.106:0/2070726311 >> 192.168.123.106:0/2070726311 conn(0x7fdba406c9d0 msgr2=0x7fdba406ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 -- 192.168.123.106:0/2070726311 shutdown_connections 2026-03-09T17:24:22.736 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.643+0000 7fdba8994700 1 -- 192.168.123.106:0/2070726311 wait complete. 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba8994700 1 Processor -- start 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba8994700 1 -- start start 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba8994700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba8994700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb94003b60 con 0x7fdba4086420 2026-03-09T17:24:22.736 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58824/0 (socket says 192.168.123.106:58824) 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba2d9d700 1 -- 192.168.123.106:0/3046831974 learned_addr 
learned my addr 192.168.123.106:0/3046831974 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.644+0000 7fdba2d9d700 1 -- 192.168.123.106:0/3046831974 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb940088c0 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.645+0000 7fdba2d9d700 1 --2- 192.168.123.106:0/3046831974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fdb94004150 tx=0x7fdb9400dfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.645+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb94010050 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.645+0000 7fdba8994700 1 -- 192.168.123.106:0/3046831974 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdba4086830 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.645+0000 7fdba8994700 1 -- 192.168.123.106:0/3046831974 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdba4086cd0 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.646+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fdb940045a0 con 0x7fdba4086420 
2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.646+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb9401f690 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.646+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fdb9401d040 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.646+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fdb94028d80 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.646+0000 7fdba8994700 1 -- 192.168.123.106:0/3046831974 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb90005320 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.648+0000 7fdb8bfff700 1 -- 192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fdb9402d050 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.689+0000 7fdba8994700 1 -- 192.168.123.106:0/3046831974 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fdb90005190 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.689+0000 7fdb8bfff700 1 -- 
192.168.123.106:0/3046831974 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fdb94023070 con 0x7fdba4086420 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 -- 192.168.123.106:0/3046831974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 msgr2=0x7fdba4089a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 --2- 192.168.123.106:0/3046831974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fdb94004150 tx=0x7fdb9400dfd0 comp rx=0 tx=0).stop 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 -- 192.168.123.106:0/3046831974 shutdown_connections 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 --2- 192.168.123.106:0/3046831974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdba4086420 0x7fdba4089a00 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 -- 192.168.123.106:0/3046831974 >> 192.168.123.106:0/3046831974 conn(0x7fdba406c9d0 msgr2=0x7fdba406d520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 7fdb89ffb700 1 -- 192.168.123.106:0/3046831974 shutdown_connections 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:22.691+0000 
7fdb89ffb700 1 -- 192.168.123.106:0/3046831974 wait complete. 2026-03-09T17:24:22.737 INFO:teuthology.orchestra.run.vm06.stdout:mgr not available, waiting (2/15)... 2026-03-09T17:24:23.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:22 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3046831974' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T17:24:24.976 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:24.977 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:24.977 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsid": "bcd3bcc2-1bdc-11f1-97b3-3f61613e7048", 2026-03-09T17:24:24.977 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T17:24:24.977 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T17:24:24.977 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 0 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:24.979 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "vm06" 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_age": 
5, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T17:24:24.980 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 
2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T17:24:18.262584+0000", 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.888+0000 7fb0c2c57700 1 Processor -- start 2026-03-09T17:24:24.980 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c2c57700 1 -- start start 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c2c57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c2c57700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0bc108890 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c09f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:24.981 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c09f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58830/0 (socket says 192.168.123.106:58830) 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.889+0000 7fb0c09f3700 1 -- 192.168.123.106:0/1353524927 learned_addr learned my addr 192.168.123.106:0/1353524927 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.890+0000 7fb0c09f3700 1 -- 192.168.123.106:0/1353524927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0bc1089d0 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.890+0000 7fb0c09f3700 1 --2- 192.168.123.106:0/1353524927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb0b0009a90 tx=0x7fb0b0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e11bd4074f8e9810 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.890+0000 7fb0bb7fe700 1 -- 192.168.123.106:0/1353524927 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0b0004030 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.891+0000 7fb0bb7fe700 1 -- 192.168.123.106:0/1353524927 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb0b000b7e0 con 0x7fb0bc107f40 
2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.891+0000 7fb0c2c57700 1 -- 192.168.123.106:0/1353524927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 msgr2=0x7fb0bc108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.891+0000 7fb0c2c57700 1 --2- 192.168.123.106:0/1353524927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb0b0009a90 tx=0x7fb0b0009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 -- 192.168.123.106:0/1353524927 shutdown_connections 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 --2- 192.168.123.106:0/1353524927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc108350 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 -- 192.168.123.106:0/1353524927 >> 192.168.123.106:0/1353524927 conn(0x7fb0bc103770 msgr2=0x7fb0bc105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 -- 192.168.123.106:0/1353524927 shutdown_connections 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 -- 192.168.123.106:0/1353524927 wait complete. 
2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.895+0000 7fb0c2c57700 1 Processor -- start 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c2c57700 1 -- start start 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c2c57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c2c57700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0bc108890 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c09f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c09f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58836/0 (socket says 192.168.123.106:58836) 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c09f3700 1 -- 192.168.123.106:0/3281347072 learned_addr learned my addr 192.168.123.106:0/3281347072 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:24.981 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.896+0000 7fb0c09f3700 1 -- 192.168.123.106:0/3281347072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0b0009740 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.897+0000 7fb0c09f3700 1 --2- 192.168.123.106:0/3281347072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb0b000bdd0 tx=0x7fb0b000beb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.898+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0b0003f30 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.898+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb0bc07f0b0 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.898+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb0bc07b5d0 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.898+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb0bc04f9e0 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:24.899+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb0b0004530 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.899+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0b0024d00 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.899+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fb0b001b440 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.900+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb0b002e430 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.901+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fb0b001f070 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.943+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fb0bc062380 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.944+0000 7fb0b9ffb700 1 -- 192.168.123.106:0/3281347072 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": 
"json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fb0b0024460 con 0x7fb0bc107f40 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 msgr2=0x7fb0bc07eb70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 --2- 192.168.123.106:0/3281347072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb0b000bdd0 tx=0x7fb0b000beb0 comp rx=0 tx=0).stop 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 shutdown_connections 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 --2- 192.168.123.106:0/3281347072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb0bc107f40 0x7fb0bc07eb70 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 >> 192.168.123.106:0/3281347072 conn(0x7fb0bc103770 msgr2=0x7fb0bc105f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.946+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 shutdown_connections 2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:24.947+0000 7fb0c2c57700 1 -- 192.168.123.106:0/3281347072 wait complete. 
2026-03-09T17:24:24.981 INFO:teuthology.orchestra.run.vm06.stdout:mgr not available, waiting (3/15)... 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3281347072' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: Activating manager daemon vm06.pbgzei 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: mgrmap e2: vm06.pbgzei(active, starting, since 0.00414092s) 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: 
from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:25 vm06 ceph-mon[57307]: from='mgr.14100 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:27.276 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:27 vm06 ceph-mon[57307]: mgrmap e3: vm06.pbgzei(active, since 1.00849s) 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsid": "bcd3bcc2-1bdc-11f1-97b3-3f61613e7048", 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "health": { 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.326 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: 
stdout "election_epoch": 5, 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 0 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "vm06" 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "quorum_age": 7, 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-09T17:24:27.327 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-09T17:24:27.328 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-09T17:24:27.328 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-09T17:24:27.329 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 
"num_standbys": 0, 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "restful" 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ], 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "modified": "2026-03-09T17:24:18.262584+0000", 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }, 2026-03-09T17:24:27.330 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.113+0000 7f96e5b97700 1 Processor -- start 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.113+0000 7f96e5b97700 1 -- start start 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.114+0000 7f96e5b97700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.114+0000 7f96e5b97700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e01066b0 con 0x7f96e0074530 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.114+0000 7f96df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.114+0000 7f96df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58896/0 (socket says 192.168.123.106:58896) 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.114+0000 7f96df7fe700 1 -- 192.168.123.106:0/1056158415 learned_addr learned my addr 192.168.123.106:0/1056158415 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.115+0000 7f96df7fe700 1 -- 192.168.123.106:0/1056158415 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96e010b200 con 0x7f96e0074530 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.115+0000 7f96df7fe700 1 --2- 192.168.123.106:0/1056158415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f96d0009a90 
tx=0x7f96d0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=70ce19221273316 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.115+0000 7f96deffd700 1 -- 192.168.123.106:0/1056158415 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96d0004030 con 0x7f96e0074530 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.115+0000 7f96deffd700 1 -- 192.168.123.106:0/1056158415 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f96d000b7e0 con 0x7f96e0074530 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.115+0000 7f96deffd700 1 -- 192.168.123.106:0/1056158415 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96d0003a40 con 0x7f96e0074530 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 -- 192.168.123.106:0/1056158415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 msgr2=0x7f96e010acc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 --2- 192.168.123.106:0/1056158415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f96d0009a90 tx=0x7f96d0009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 -- 192.168.123.106:0/1056158415 shutdown_connections 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 --2- 
192.168.123.106:0/1056158415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e0074530 0x7f96e010acc0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.331 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 -- 192.168.123.106:0/1056158415 >> 192.168.123.106:0/1056158415 conn(0x7f96e0100270 msgr2=0x7f96e01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 -- 192.168.123.106:0/1056158415 shutdown_connections 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.116+0000 7f96e5b97700 1 -- 192.168.123.106:0/1056158415 wait complete. 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.118+0000 7f96e5b97700 1 Processor -- start 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.118+0000 7f96e5b97700 1 -- start start 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.120+0000 7f96e5b97700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.120+0000 7f96e5b97700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e019c480 con 0x7f96e019bb30 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.120+0000 7f96df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.121+0000 7f96df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58898/0 (socket says 192.168.123.106:58898) 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.121+0000 7f96df7fe700 1 -- 192.168.123.106:0/3601424575 learned_addr learned my addr 192.168.123.106:0/3601424575 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.121+0000 7f96df7fe700 1 -- 192.168.123.106:0/3601424575 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96d0009740 con 0x7f96e019bb30 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.121+0000 7f96df7fe700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f96d000bef0 tx=0x7f96d0003ba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.124+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96d0004240 con 0x7f96e019bb30 2026-03-09T17:24:27.332 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.124+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f96d00043a0 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.124+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96d0004240 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.125+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96e019c680 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.125+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96e019f2e0 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.126+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f96e0062380 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.126+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f96d0028030 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.126+0000 7f96dd7fa700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 0x7f96cc03a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:27.127+0000 7f96d7fff700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 0x7f96cc03a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.127+0000 7f96d7fff700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 0x7f96cc03a860 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f96c8006fd0 tx=0x7f96c8006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.127+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f96d0025530 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.130+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f96d0011ae0 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.274+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f96e0074530 con 0x7f96e019bb30 2026-03-09T17:24:27.333 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.275+0000 7f96dd7fa700 1 -- 192.168.123.106:0/3601424575 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 
79+0+1240 (secure 0 0 0) 0x7f96d0011ae0 con 0x7f96e019bb30 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.278+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 msgr2=0x7f96cc03a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.278+0000 7f96e5b97700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 0x7f96cc03a860 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f96c8006fd0 tx=0x7f96c8006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.278+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 msgr2=0x7f96e019bf40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.278+0000 7f96e5b97700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f96d000bef0 tx=0x7f96d0003ba0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.278+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 shutdown_connections 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.279+0000 7f96e5b97700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f96cc0383b0 0x7f96cc03a860 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.334 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.279+0000 7f96e5b97700 1 --2- 192.168.123.106:0/3601424575 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e019bb30 0x7f96e019bf40 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.279+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 >> 192.168.123.106:0/3601424575 conn(0x7f96e0100270 msgr2=0x7f96e0100ef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.279+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 shutdown_connections 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.279+0000 7f96e5b97700 1 -- 192.168.123.106:0/3601424575 wait complete. 2026-03-09T17:24:27.334 INFO:teuthology.orchestra.run.vm06.stdout:mgr is available 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [global] 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout fsid = bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-09T17:24:27.616 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:27.616 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout [osd] 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.470+0000 7f576dba2700 1 Processor -- start 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.470+0000 7f576dba2700 1 -- start start 2026-03-09T17:24:27.617 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f576dba2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f5768079250 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f576dba2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5768079790 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f57677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f576807ad50 0x7f5768079250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f57677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f5768079250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58902/0 (socket says 192.168.123.106:58902) 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f57677fe700 1 -- 192.168.123.106:0/1003175283 learned_addr learned my addr 192.168.123.106:0/1003175283 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.471+0000 7f57677fe700 1 -- 192.168.123.106:0/1003175283 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57680798d0 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.472+0000 7f57677fe700 1 --2- 192.168.123.106:0/1003175283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f5768079250 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5750009a90 tx=0x7f5750009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d78e21eb0a747c86 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.472+0000 7f57667fc700 1 -- 192.168.123.106:0/1003175283 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5750004030 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:27.472+0000 7f57667fc700 1 -- 192.168.123.106:0/1003175283 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f575000b7e0 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.472+0000 7f576dba2700 1 -- 192.168.123.106:0/1003175283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 msgr2=0x7f5768079250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.472+0000 7f576dba2700 1 --2- 192.168.123.106:0/1003175283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f5768079250 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5750009a90 tx=0x7f5750009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 7f576dba2700 1 -- 192.168.123.106:0/1003175283 shutdown_connections 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 7f576dba2700 1 --2- 192.168.123.106:0/1003175283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f5768079250 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 7f576dba2700 1 -- 192.168.123.106:0/1003175283 >> 192.168.123.106:0/1003175283 conn(0x7f57681013a0 msgr2=0x7f57681037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 7f576dba2700 1 -- 192.168.123.106:0/1003175283 shutdown_connections 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 
7f576dba2700 1 -- 192.168.123.106:0/1003175283 wait complete. 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.473+0000 7f576dba2700 1 Processor -- start 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f576dba2700 1 -- start start 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f576dba2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f576dba2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57681a0ae0 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f57677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f57677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58906/0 (socket says 192.168.123.106:58906) 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f57677fe700 1 -- 192.168.123.106:0/3425727786 learned_addr learned my addr 192.168.123.106:0/3425727786 (peer_addr_for_me v2:192.168.123.106:0/0) 
2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f57677fe700 1 -- 192.168.123.106:0/3425727786 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5750009740 con 0x7f576807ad50 2026-03-09T17:24:27.618 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.474+0000 7f57677fe700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f575000bd60 tx=0x7f575000be40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.619 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.475+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f575001b650 con 0x7f576807ad50 2026-03-09T17:24:27.619 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.475+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57681a0ce0 con 0x7f576807ad50 2026-03-09T17:24:27.619 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.475+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57681a1180 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.475+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f575001bc50 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:27.475+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f575002db30 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.476+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f575002dc90 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.476+0000 7f5764ff9700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 0x7f575403a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.476+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f575004d000 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.476+0000 7f5766ffd700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 0x7f575403a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.477+0000 7f5766ffd700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 0x7f575403a860 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f5758006fd0 tx=0x7f5758006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:27.621 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.477+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5748005320 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.480+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5750023400 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.580+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f5748005f70 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.585+0000 7f5764ff9700 1 -- 192.168.123.106:0/3425727786 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7f575001a3d0 con 0x7f576807ad50 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.587+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 msgr2=0x7f575403a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.587+0000 7f576dba2700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 0x7f575403a860 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f5758006fd0 tx=0x7f5758006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:27.621 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.587+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 msgr2=0x7f57681a05a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.587+0000 7f576dba2700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f575000bd60 tx=0x7f575000be40 comp rx=0 tx=0).stop 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 shutdown_connections 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f57540383b0 0x7f575403a860 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 --2- 192.168.123.106:0/3425727786 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f576807ad50 0x7f57681a05a0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 >> 192.168.123.106:0/3425727786 conn(0x7f57681013a0 msgr2=0x7f5768102000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 shutdown_connections 
2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.588+0000 7f576dba2700 1 -- 192.168.123.106:0/3425727786 wait complete. 2026-03-09T17:24:27.621 INFO:teuthology.orchestra.run.vm06.stdout:Enabling cephadm module... 2026-03-09T17:24:28.131 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:28 vm06 ceph-mon[57307]: mgrmap e4: vm06.pbgzei(active, since 2s) 2026-03-09T17:24:28.131 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:28 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3601424575' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-09T17:24:28.131 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:28 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3425727786' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-09T17:24:28.131 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:28 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/809786989' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.740+0000 7f17d73d2700 1 Processor -- start 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d73d2700 1 -- start start 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d73d2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d73d2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17d00745b0 con 0x7f17d0104620 
2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d516e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d516e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58920/0 (socket says 192.168.123.106:58920) 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.741+0000 7f17d516e700 1 -- 192.168.123.106:0/2346670617 learned_addr learned my addr 192.168.123.106:0/2346670617 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.742+0000 7f17d516e700 1 -- 192.168.123.106:0/2346670617 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17d00746f0 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.742+0000 7f17d516e700 1 --2- 192.168.123.106:0/2346670617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f17c0009a90 tx=0x7f17c0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7801e579a8fb869c server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.742+0000 7f17c7fff700 1 -- 192.168.123.106:0/2346670617 <== mon.0 
v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f17c0004030 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.742+0000 7f17c7fff700 1 -- 192.168.123.106:0/2346670617 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f17c000b7e0 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 -- 192.168.123.106:0/2346670617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 msgr2=0x7f17d0106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 --2- 192.168.123.106:0/2346670617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f17c0009a90 tx=0x7f17c0009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 -- 192.168.123.106:0/2346670617 shutdown_connections 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 --2- 192.168.123.106:0/2346670617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d0106a40 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 -- 192.168.123.106:0/2346670617 >> 192.168.123.106:0/2346670617 conn(0x7f17d0100270 msgr2=0x7f17d01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:27.743+0000 7f17d73d2700 1 -- 192.168.123.106:0/2346670617 shutdown_connections 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 -- 192.168.123.106:0/2346670617 wait complete. 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.743+0000 7f17d73d2700 1 Processor -- start 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d73d2700 1 -- start start 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d73d2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d73d2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17d00745b0 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d516e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d516e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58926/0 (socket says 192.168.123.106:58926) 2026-03-09T17:24:28.177 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d516e700 1 -- 192.168.123.106:0/809786989 learned_addr learned my addr 192.168.123.106:0/809786989 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d516e700 1 -- 192.168.123.106:0/809786989 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17c0009740 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.744+0000 7f17d516e700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f17c000be00 tx=0x7f17c000bee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.745+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f17c0003f60 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.745+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f17c0004560 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.745+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f17c0024d30 con 0x7f17d0104620 2026-03-09T17:24:28.177 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.745+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17d01a0970 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.745+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17d01a0e10 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.746+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f17c00040c0 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.746+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f17d019a910 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.746+0000 7f17c67fc700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 0x7f17bc03a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.746+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f17c004be90 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.747+0000 7f17d496d700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 0x7f17bc03a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.749+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f17c004f030 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.749+0000 7f17d496d700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 0x7f17bc03a8a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f17cc006fd0 tx=0x7f17cc006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:27.878+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7f17d004f9e0 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.129+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f17c002e7b0 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.130+0000 7f17c67fc700 1 -- 192.168.123.106:0/809786989 <== mon.0 v2:192.168.123.106:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7f17c004bbf0 con 0x7f17d0104620 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.133+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 msgr2=0x7f17bc03a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 0x7f17bc03a8a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f17cc006fd0 tx=0x7f17cc006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 msgr2=0x7f17d01a0430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f17c000be00 tx=0x7f17c000bee0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 shutdown_connections 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f17bc0383f0 0x7f17bc03a8a0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 --2- 192.168.123.106:0/809786989 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f17d0104620 0x7f17d01a0430 unknown :-1 s=CLOSED 
pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 >> 192.168.123.106:0/809786989 conn(0x7f17d0100270 msgr2=0x7f17d01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 shutdown_connections 2026-03-09T17:24:28.178 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.134+0000 7f17d73d2700 1 -- 192.168.123.106:0/809786989 wait complete. 2026-03-09T17:24:28.520 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "active_name": "vm06.pbgzei", 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.335+0000 7f0b98a09700 1 Processor -- start 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.335+0000 7f0b98a09700 1 -- start start 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.335+0000 7f0b98a09700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: 
stderr 2026-03-09T17:24:28.335+0000 7f0b98a09700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b941047c0 con 0x7f0b94103e70 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58954/0 (socket says 192.168.123.106:58954) 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9259c700 1 -- 192.168.123.106:0/746732719 learned_addr learned my addr 192.168.123.106:0/746732719 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9259c700 1 -- 192.168.123.106:0/746732719 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b94104900 con 0x7f0b94103e70 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9259c700 1 --2- 192.168.123.106:0/746732719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f0b7c009a90 tx=0x7f0b7c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dd2134a3698c5513 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9159a700 1 -- 192.168.123.106:0/746732719 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b7c004030 con 0x7f0b94103e70 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.339+0000 7f0b9159a700 1 -- 192.168.123.106:0/746732719 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0b7c00b7e0 con 0x7f0b94103e70 2026-03-09T17:24:28.521 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.340+0000 7f0b98a09700 1 -- 192.168.123.106:0/746732719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 msgr2=0x7f0b94104280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.340+0000 7f0b98a09700 1 --2- 192.168.123.106:0/746732719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f0b7c009a90 tx=0x7f0b7c009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 -- 192.168.123.106:0/746732719 shutdown_connections 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 --2- 192.168.123.106:0/746732719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b94104280 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 -- 192.168.123.106:0/746732719 >> 192.168.123.106:0/746732719 conn(0x7f0b940ff870 
msgr2=0x7f0b94101c80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 -- 192.168.123.106:0/746732719 shutdown_connections 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 -- 192.168.123.106:0/746732719 wait complete. 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.342+0000 7f0b98a09700 1 Processor -- start 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b98a09700 1 -- start start 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b98a09700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b98a09700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b94072a70 con 0x7f0b94103e70 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b9259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b9259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58962/0 (socket says 192.168.123.106:58962) 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b9259c700 1 -- 192.168.123.106:0/906044119 learned_addr learned my addr 192.168.123.106:0/906044119 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.343+0000 7f0b9259c700 1 -- 192.168.123.106:0/906044119 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b7c009740 con 0x7f0b94103e70 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.344+0000 7f0b9259c700 1 --2- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f0b7c0047d0 tx=0x7f0b7c00b890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.345+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b7c00bd40 con 0x7f0b94103e70 2026-03-09T17:24:28.522 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.345+0000 7f0b98a09700 1 -- 192.168.123.106:0/906044119 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0b94105ab0 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.345+0000 7f0b98a09700 1 -- 192.168.123.106:0/906044119 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0b941abd30 con 0x7f0b94103e70 2026-03-09T17:24:28.523 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.345+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0b7c00bea0 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.345+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b7c01bad0 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.346+0000 7f0b98a09700 1 -- 192.168.123.106:0/906044119 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0b74005320 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.346+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f0b7c003810 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.346+0000 7f0b8b7fe700 1 --2- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 0x7f0b8003a660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.347+0000 7f0b91d9b700 1 -- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 msgr2=0x7f0b8003a660 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.347+0000 7f0b91d9b700 1 --2- 
192.168.123.106:0/906044119 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 0x7f0b8003a660 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.347+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0b7c0289e0 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.350+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0b7c02d430 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.485+0000 7f0b98a09700 1 -- 192.168.123.106:0/906044119 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f0b74006200 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.486+0000 7f0b8b7fe700 1 -- 192.168.123.106:0/906044119 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f0b7c01cd20 con 0x7f0b94103e70 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.488+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 msgr2=0x7f0b8003a660 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.488+0000 7f0b897fa700 1 --2- 192.168.123.106:0/906044119 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 0x7f0b8003a660 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 msgr2=0x7f0b941ab7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 --2- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f0b7c0047d0 tx=0x7f0b7c00b890 comp rx=0 tx=0).stop 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 shutdown_connections 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 --2- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b800381b0 0x7f0b8003a660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 --2- 192.168.123.106:0/906044119 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b94103e70 0x7f0b941ab7f0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 >> 192.168.123.106:0/906044119 conn(0x7f0b940ff870 msgr2=0x7f0b94101140 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:24:28.523 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 shutdown_connections 2026-03-09T17:24:28.524 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.489+0000 7f0b897fa700 1 -- 192.168.123.106:0/906044119 wait complete. 2026-03-09T17:24:28.524 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for the mgr to restart... 2026-03-09T17:24:28.524 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mgr epoch 5... 2026-03-09T17:24:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:29 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/809786989' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-09T17:24:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:29 vm06 ceph-mon[57307]: mgrmap e5: vm06.pbgzei(active, since 3s) 2026-03-09T17:24:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:29 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/906044119' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: Activating manager daemon vm06.pbgzei 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: osdmap e2: 0 total, 0 up, 0 in 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: mgrmap e6: vm06.pbgzei(active, starting, since 0.00598074s) 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:33.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:32 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d728f700 1 Processor -- start 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d728f700 1 -- start start 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d728f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:33.939 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d728f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4d00715a0 con 0x7fc4d0072a40 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d628d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d628d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58970/0 (socket says 192.168.123.106:58970) 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.667+0000 7fc4d628d700 1 -- 192.168.123.106:0/2430588285 learned_addr learned my addr 192.168.123.106:0/2430588285 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.669+0000 7fc4d628d700 1 -- 192.168.123.106:0/2430588285 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4d00716e0 con 0x7fc4d0072a40 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.669+0000 7fc4d628d700 1 --2- 192.168.123.106:0/2430588285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fc4cc009a90 tx=0x7fc4cc009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=95d879e616f653bb server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d528b700 1 -- 192.168.123.106:0/2430588285 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4cc004030 con 0x7fc4d0072a40 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d528b700 1 -- 192.168.123.106:0/2430588285 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc4cc00b7e0 con 0x7fc4d0072a40 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d528b700 1 -- 192.168.123.106:0/2430588285 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4cc0039f0 con 0x7fc4d0072a40 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d728f700 1 -- 192.168.123.106:0/2430588285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 msgr2=0x7fc4d0071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d728f700 1 --2- 192.168.123.106:0/2430588285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fc4cc009a90 tx=0x7fc4cc009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d728f700 1 -- 192.168.123.106:0/2430588285 shutdown_connections 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d728f700 1 --2- 192.168.123.106:0/2430588285 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d0072a40 0x7fc4d0071060 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.675+0000 7fc4d728f700 1 -- 192.168.123.106:0/2430588285 >> 192.168.123.106:0/2430588285 conn(0x7fc4d006c9d0 msgr2=0x7fc4d006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 -- 192.168.123.106:0/2430588285 shutdown_connections 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 -- 192.168.123.106:0/2430588285 wait complete. 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 Processor -- start 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 -- start start 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.677+0000 7fc4d728f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4d01a9170 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.678+0000 7fc4d628d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.678+0000 7fc4d628d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58986/0 (socket says 192.168.123.106:58986) 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.678+0000 7fc4d628d700 1 -- 192.168.123.106:0/3553752644 learned_addr learned my addr 192.168.123.106:0/3553752644 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.678+0000 7fc4d628d700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4cc009740 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.679+0000 7fc4d628d700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fc4cc0037e0 tx=0x7fc4cc003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.679+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4cc003fd0 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.679+0000 7fc4d728f700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4d01a9370 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.679+0000 7fc4d728f700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4d007b250 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.680+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc4cc024460 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.680+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc4cc01b440 con 0x7fc4d01a8820 2026-03-09T17:24:33.939 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.680+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fc4cc01b620 con 0x7fc4d01a8820 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.681+0000 7fc4c77fe700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.681+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc4cc04d280 con 0x7fc4d01a8820 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.681+0000 7fc4d5a8c700 1 -- 
192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.681+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.682+0000 7fc4d728f700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fc4b4000d40 con 0x7fc4bc038510 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.881+0000 7fc4d5a8c700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:28.881+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:29.282+0000 7fc4d5a8c700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 
2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:29.282+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:30.083+0000 7fc4d5a8c700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:30.083+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:31.683+0000 7fc4d5a8c700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:31.683+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:32.875+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 
v2:192.168.123.106:3300/0 6 ==== mgrmap(e 6) v1 ==== 44846+0+0 (secure 0 0 0) 0x7fc4cc02dd00 con 0x7fc4d01a8820 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:32.875+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:32.875+0000 7fc4c77fe700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.877+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fc4cc02e430 con 0x7fc4d01a8820 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.877+0000 7fc4c77fe700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.877+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fc4b4000d40 con 0x7fc4bc038510 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.878+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.878+0000 7fc4d5a8c700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fc4c0003a60 tx=0x7fc4c00092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.880+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7fc4b4000d40 con 0x7fc4bc038510 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.884+0000 7fc4d728f700 1 -- 192.168.123.106:0/3553752644 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7fc4b4002800 con 0x7fc4bc038510 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.884+0000 7fc4c77fe700 1 -- 192.168.123.106:0/3553752644 <== mgr.14120 v2:192.168.123.106:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7fc4b4002800 con 0x7fc4bc038510 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 msgr2=0x7fc4bc03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 --2- 192.168.123.106:0/3553752644 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fc4c0003a60 tx=0x7fc4c00092b0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 msgr2=0x7fc4d01a8c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fc4cc0037e0 tx=0x7fc4cc003b40 comp rx=0 tx=0).stop 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 shutdown_connections 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4bc038510 0x7fc4bc03a9c0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 --2- 192.168.123.106:0/3553752644 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4d01a8820 0x7fc4d01a8c30 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 >> 192.168.123.106:0/3553752644 conn(0x7fc4d006c9d0 msgr2=0x7fc4d006dfd0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 shutdown_connections 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:33.885+0000 7fc4c57fa700 1 -- 192.168.123.106:0/3553752644 wait complete. 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:mgr epoch 5 is available 2026-03-09T17:24:33.940 INFO:teuthology.orchestra.run.vm06.stdout:Setting orchestrator backend to cephadm... 2026-03-09T17:24:34.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: Found migration_current of "None". Setting to last migration. 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:34.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:33 vm06 ceph-mon[57307]: mgrmap e7: 
vm06.pbgzei(active, since 1.00972s) 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.077+0000 7fa1d25b9700 1 Processor -- start 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.077+0000 7fa1d25b9700 1 -- start start 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.078+0000 7fa1d25b9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.078+0000 7fa1d25b9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1cc0745b0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.078+0000 7fa1cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.078+0000 7fa1cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35082/0 (socket says 192.168.123.106:35082) 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.078+0000 7fa1cbfff700 1 -- 192.168.123.106:0/2071265679 learned_addr learned my addr 192.168.123.106:0/2071265679 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.238 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1cbfff700 1 -- 192.168.123.106:0/2071265679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa1cc0746f0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1cbfff700 1 --2- 192.168.123.106:0/2071265679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa1b4009a90 tx=0x7fa1b4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=72421481d79a3106 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1caffd700 1 -- 192.168.123.106:0/2071265679 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa1b4004030 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1caffd700 1 -- 192.168.123.106:0/2071265679 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fa1b400b7e0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1d25b9700 1 -- 192.168.123.106:0/2071265679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 msgr2=0x7fa1cc106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.079+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/2071265679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa1b4009a90 tx=0x7fa1b4009da0 comp rx=0 tx=0).stop 
2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 -- 192.168.123.106:0/2071265679 shutdown_connections 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/2071265679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc106a40 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 -- 192.168.123.106:0/2071265679 >> 192.168.123.106:0/2071265679 conn(0x7fa1cc100270 msgr2=0x7fa1cc1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 -- 192.168.123.106:0/2071265679 shutdown_connections 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 -- 192.168.123.106:0/2071265679 wait complete. 
2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.080+0000 7fa1d25b9700 1 Processor -- start 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1d25b9700 1 -- start start 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1d25b9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1d25b9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1cc0745b0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35094/0 (socket says 192.168.123.106:35094) 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1cbfff700 1 -- 192.168.123.106:0/903340555 learned_addr learned my addr 192.168.123.106:0/903340555 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.238 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1cbfff700 1 -- 192.168.123.106:0/903340555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa1b4009740 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1cbfff700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fa1b4004050 tx=0x7fa1b4004130 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa1b4004410 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fa1b4004570 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.081+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa1b4011600 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.082+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa1cc1a05d0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.082+0000 7fa1d25b9700 1 -- 
192.168.123.106:0/903340555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1cc1a09f0 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.083+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa1cc199900 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.084+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fa1b4028020 con 0x7fa1cc104620 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.084+0000 7fa1c97fa700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 0x7fa1b803a840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.238 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.084+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fa1b404bcd0 con 0x7fa1cc104620 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.086+0000 7fa1cb7fe700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 0x7fa1b803a840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.086+0000 7fa1cb7fe700 1 --2- 192.168.123.106:0/903340555 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 0x7fa1b803a840 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fa1bc006fd0 tx=0x7fa1bc006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.086+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa1b4051020 con 0x7fa1cc104620 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.196+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7fa1cc02d050 con 0x7fa1b8038390 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.205+0000 7fa1c97fa700 1 -- 192.168.123.106:0/903340555 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fa1cc02d050 con 0x7fa1b8038390 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.207+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 msgr2=0x7fa1b803a840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 0x7fa1b803a840 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fa1bc006fd0 tx=0x7fa1bc006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:34.239 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 msgr2=0x7fa1cc1a0090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fa1b4004050 tx=0x7fa1b4004130 comp rx=0 tx=0).stop 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 shutdown_connections 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa1b8038390 0x7fa1b803a840 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 --2- 192.168.123.106:0/903340555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1cc104620 0x7fa1cc1a0090 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 >> 192.168.123.106:0/903340555 conn(0x7fa1cc100270 msgr2=0x7fa1cc1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 shutdown_connections 
2026-03-09T17:24:34.239 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.208+0000 7fa1d25b9700 1 -- 192.168.123.106:0/903340555 wait complete. 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.355+0000 7ff71512c700 1 Processor -- start 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.356+0000 7ff71512c700 1 -- start start 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.356+0000 7ff71512c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.356+0000 7ff71512c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7100745b0 con 0x7ff710106830 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.356+0000 7ff70ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.357+0000 7ff70ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35102/0 (socket says 192.168.123.106:35102) 2026-03-09T17:24:34.532 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.357+0000 7ff70ed9d700 1 -- 192.168.123.106:0/3652927479 learned_addr learned my addr 192.168.123.106:0/3652927479 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.357+0000 7ff70ed9d700 1 -- 192.168.123.106:0/3652927479 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7100746f0 con 0x7ff710106830 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.357+0000 7ff70ed9d700 1 --2- 192.168.123.106:0/3652927479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff6f8009cf0 tx=0x7ff6f800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5b7d91e533f60149 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff70dd9b700 1 -- 192.168.123.106:0/3652927479 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6f8004030 con 0x7ff710106830 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff70dd9b700 1 -- 192.168.123.106:0/3652927479 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7ff6f800b810 con 0x7ff710106830 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff70dd9b700 1 -- 192.168.123.106:0/3652927479 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6f8003a90 con 0x7ff710106830 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff71512c700 1 -- 
192.168.123.106:0/3652927479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 msgr2=0x7ff710108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff71512c700 1 --2- 192.168.123.106:0/3652927479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff6f8009cf0 tx=0x7ff6f800b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff71512c700 1 -- 192.168.123.106:0/3652927479 shutdown_connections 2026-03-09T17:24:34.532 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff71512c700 1 --2- 192.168.123.106:0/3652927479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff710108c10 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.358+0000 7ff71512c700 1 -- 192.168.123.106:0/3652927479 >> 192.168.123.106:0/3652927479 conn(0x7ff710100270 msgr2=0x7ff7101026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.359+0000 7ff71512c700 1 -- 192.168.123.106:0/3652927479 shutdown_connections 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.359+0000 7ff71512c700 1 -- 192.168.123.106:0/3652927479 wait complete. 
2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.359+0000 7ff71512c700 1 Processor -- start 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.359+0000 7ff71512c700 1 -- start start 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff71512c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff71512c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff71019c180 con 0x7ff710106830 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff70ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff70ed9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35108/0 (socket says 192.168.123.106:35108) 2026-03-09T17:24:34.533 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff70ed9d700 1 -- 192.168.123.106:0/2461898796 learned_addr learned my addr 192.168.123.106:0/2461898796 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.533 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.360+0000 7ff70ed9d700 1 -- 192.168.123.106:0/2461898796 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6f8009740 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff70ed9d700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7ff6f8009710 tx=0x7ff6f8003e00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6f8004120 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7ff6f8004280 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff71019c380 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff6f8011510 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.361+0000 7ff71512c700 1 
-- 192.168.123.106:0/2461898796 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff71019c820 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.362+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7ff6f801a460 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.362+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff710062380 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.363+0000 7ff707fff700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 0x7ff6fc03a7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.363+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7ff6f804bbc0 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.363+0000 7ff70e59c700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 0x7ff6fc03a7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.364+0000 7ff70e59c700 1 --2- 192.168.123.106:0/2461898796 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 0x7ff6fc03a7a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff700006fd0 tx=0x7ff700006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.366+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff6f801aa00 con 0x7ff710106830 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.475+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7ff71019f1a0 con 0x7ff6fc0382f0 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.476+0000 7ff707fff700 1 -- 192.168.123.106:0/2461898796 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7ff71019f1a0 con 0x7ff6fc0382f0 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.480+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 msgr2=0x7ff6fc03a7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.480+0000 7ff71512c700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 0x7ff6fc03a7a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff700006fd0 tx=0x7ff700006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:34.534 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.480+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 msgr2=0x7ff71019bc40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.480+0000 7ff71512c700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7ff6f8009710 tx=0x7ff6f8003e00 comp rx=0 tx=0).stop 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 shutdown_connections 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff6fc0382f0 0x7ff6fc03a7a0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 --2- 192.168.123.106:0/2461898796 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff710106830 0x7ff71019bc40 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 >> 192.168.123.106:0/2461898796 conn(0x7ff710100270 msgr2=0x7ff710100f00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 shutdown_connections 
2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.482+0000 7ff71512c700 1 -- 192.168.123.106:0/2461898796 wait complete. 2026-03-09T17:24:34.534 INFO:teuthology.orchestra.run.vm06.stdout:Generating ssh key... 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6ecaef700 1 Processor -- start 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6ecaef700 1 -- start start 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6ecaef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6ecaef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6e8108890 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35122/0 (socket says 192.168.123.106:35122) 2026-03-09T17:24:34.858 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.653+0000 7fb6e659c700 1 -- 192.168.123.106:0/1673920776 learned_addr learned my addr 192.168.123.106:0/1673920776 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.654+0000 7fb6e659c700 1 -- 192.168.123.106:0/1673920776 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6e81089d0 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.654+0000 7fb6e659c700 1 --2- 192.168.123.106:0/1673920776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb6d8009cf0 tx=0x7fb6d800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3c93a928123f60a5 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.654+0000 7fb6e5d9b700 1 -- 192.168.123.106:0/1673920776 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb6d8004030 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.654+0000 7fb6e5d9b700 1 -- 192.168.123.106:0/1673920776 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb6d800b810 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1673920776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 msgr2=0x7fb6e8108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 
7fb6ecaef700 1 --2- 192.168.123.106:0/1673920776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb6d8009cf0 tx=0x7fb6d800b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1673920776 shutdown_connections 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 --2- 192.168.123.106:0/1673920776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e8108350 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1673920776 >> 192.168.123.106:0/1673920776 conn(0x7fb6e8103770 msgr2=0x7fb6e8105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1673920776 shutdown_connections 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.655+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1673920776 wait complete. 
2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.656+0000 7fb6ecaef700 1 Processor -- start 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.656+0000 7fb6ecaef700 1 -- start start 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.656+0000 7fb6ecaef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.656+0000 7fb6e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35136/0 (socket says 192.168.123.106:35136) 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6e659c700 1 -- 192.168.123.106:0/1503164549 learned_addr learned my addr 192.168.123.106:0/1503164549 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6e8108890 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6e659c700 1 -- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6d8009740 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6e659c700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fb6d8003f30 tx=0x7fb6d8004010 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb6d800bed0 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb6d8003710 con 0x7fb6e8107f40 2026-03-09T17:24:34.858 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb6d801ae20 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb6e819c540 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.657+0000 7fb6ecaef700 1 
-- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb6e819c9e0 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.658+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fb6d8004260 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.658+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb6e8195ea0 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.658+0000 7fb6df7fe700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 0x7fb6c803a780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.659+0000 7fb6dffff700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 0x7fb6c803a780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.659+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb6d804b990 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.660+0000 7fb6dffff700 1 --2- 192.168.123.106:0/1503164549 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 0x7fb6c803a780 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb6d0006fd0 tx=0x7fb6d0006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.662+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb6d8018b40 con 0x7fb6e8107f40 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.766+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7fb6e80008d0 con 0x7fb6c80382d0 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.820+0000 7fb6df7fe700 1 -- 192.168.123.106:0/1503164549 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fb6e80008d0 con 0x7fb6c80382d0 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.822+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 msgr2=0x7fb6c803a780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.822+0000 7fb6ecaef700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 0x7fb6c803a780 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb6d0006fd0 tx=0x7fb6d0006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:34.859 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.822+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 msgr2=0x7fb6e819c000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.822+0000 7fb6ecaef700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fb6d8003f30 tx=0x7fb6d8004010 comp rx=0 tx=0).stop 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 shutdown_connections 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb6c80382d0 0x7fb6c803a780 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 --2- 192.168.123.106:0/1503164549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6e8107f40 0x7fb6e819c000 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 >> 192.168.123.106:0/1503164549 conn(0x7fb6e8103770 msgr2=0x7fb6e8105f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 shutdown_connections 
2026-03-09T17:24:34.859 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.823+0000 7fb6ecaef700 1 -- 192.168.123.106:0/1503164549 wait complete. 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:33] ENGINE Bus STARTING 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:33] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:33] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:33] ENGINE Bus STARTED 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:35.101 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:34 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:35.156 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDODie6fu/M7FlzARj/8VPaLI20flGNbea4ppO4avQFGEGi0UoE07yY6gGjj3JAtobLEKkp0qbtJAkcmx8OcUTvAhKrU0SQhlqLu2yLmiEs4rPw3NAsNkqe27oW2EgbEENgk9lEFMiAmEo4cT0elziaRlMoxOhbhC0/LRq9gmIqNknxIXpJuy8owPK2VKLPI+6mfwUJKJ3Oq6fITi0u3BCOb0+GpQKHWZUNWk1q1Rp6OXCar7eya9PLtMRmQ3lLQWLbCapmSUU1hI43XyD8XqOcW1hADGTNpV1rktu4Bz3rqsE4lnXXfdlhrlL168suh9mTuUUWx/g8aF5cYJCjj5ou/+xghtBiGRF87eoXdR8UUnWQ5vmKsaHuCsuBXI9OzKVmc7S/nk7DDLU+vGxJAmSWB//fp+eu8XcKqDOEHxPGWmhY9Ze0HPt+Da33pZIZw30qZgYds30KcTDfv3sFnP7NIckMpPeN2X01uRUHZ0zWf8kZ69hjTRJh370z1zCoJT0= ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.984+0000 7f558df90700 1 Processor -- start 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.985+0000 7f558df90700 1 -- start start 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.985+0000 7f558df90700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.985+0000 7f558df90700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5588108870 con 0x7f5588107f20 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.985+0000 7f55877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:34.985+0000 7f55877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35148/0 (socket says 192.168.123.106:35148) 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.985+0000 7f55877fe700 1 -- 192.168.123.106:0/2531718361 learned_addr learned my addr 192.168.123.106:0/2531718361 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.986+0000 7f55877fe700 1 -- 192.168.123.106:0/2531718361 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55881089b0 con 0x7f5588107f20 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.986+0000 7f55877fe700 1 --2- 192.168.123.106:0/2531718361 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f5578009a90 tx=0x7f5578009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=150910843812eb08 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.986+0000 7f5586ffd700 1 -- 192.168.123.106:0/2531718361 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5578004030 con 0x7f5588107f20 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.986+0000 7f5586ffd700 1 -- 192.168.123.106:0/2531718361 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f557800b7e0 con 0x7f5588107f20 2026-03-09T17:24:35.156 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.986+0000 7f5586ffd700 1 -- 192.168.123.106:0/2531718361 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f55780039f0 con 0x7f5588107f20 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 -- 192.168.123.106:0/2531718361 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 msgr2=0x7f5588108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 --2- 192.168.123.106:0/2531718361 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f5578009a90 tx=0x7f5578009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:35.156 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 -- 192.168.123.106:0/2531718361 shutdown_connections 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 --2- 192.168.123.106:0/2531718361 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5588107f20 0x7f5588108330 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 -- 192.168.123.106:0/2531718361 >> 192.168.123.106:0/2531718361 conn(0x7f558807b4b0 msgr2=0x7f558807b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 -- 192.168.123.106:0/2531718361 shutdown_connections 2026-03-09T17:24:35.157 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 -- 192.168.123.106:0/2531718361 wait complete. 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.987+0000 7f558df90700 1 Processor -- start 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f558df90700 1 -- start start 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f558df90700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f558df90700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f558819c720 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f55877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f55877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35160/0 (socket says 192.168.123.106:35160) 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f55877fe700 1 -- 192.168.123.106:0/3872391001 learned_addr 
learned my addr 192.168.123.106:0/3872391001 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.988+0000 7f55877fe700 1 -- 192.168.123.106:0/3872391001 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5578009740 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f55877fe700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f55780037e0 tx=0x7f5578003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5578003fd0 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5578024460 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f557801b440 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f558819c920 con 0x7f558819bdd0 
2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.989+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f558807c560 con 0x7f558819bdd0 2026-03-09T17:24:35.157 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.990+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f557801b5a0 con 0x7f558819bdd0 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.990+0000 7f55857fa700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 0x7f557403a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.990+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f557804cfd0 con 0x7f558819bdd0 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.990+0000 7f557edff700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 0x7f557403a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.991+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f558819cab0 con 0x7f558819bdd0 2026-03-09T17:24:35.158 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.991+0000 7f557edff700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 0x7f557403a7f0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f5570006fd0 tx=0x7f5570006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:34.994+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f557801f030 con 0x7f558819bdd0 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.098+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f5588105df0 con 0x7f5574038340 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.101+0000 7f55857fa700 1 -- 192.168.123.106:0/3872391001 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f5588105df0 con 0x7f5574038340 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 msgr2=0x7f557403a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 0x7f557403a7f0 secure :-1 s=READY 
pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f5570006fd0 tx=0x7f5570006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 msgr2=0x7f558819c1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f55780037e0 tx=0x7f5578003b40 comp rx=0 tx=0).stop 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 shutdown_connections 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5574038340 0x7f557403a7f0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 --2- 192.168.123.106:0/3872391001 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f558819bdd0 0x7f558819c1e0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 >> 192.168.123.106:0/3872391001 conn(0x7f558807b4b0 msgr2=0x7f55881056e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 shutdown_connections 2026-03-09T17:24:35.158 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.104+0000 7f558df90700 1 -- 192.168.123.106:0/3872391001 wait complete. 2026-03-09T17:24:35.159 INFO:teuthology.orchestra.run.vm06.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-09T17:24:35.159 INFO:teuthology.orchestra.run.vm06.stdout:Adding key to root@localhost authorized_keys... 2026-03-09T17:24:35.159 INFO:teuthology.orchestra.run.vm06.stdout:Adding host vm06... 2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: Generating ssh key... 
2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:36.256 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:35 vm06 ceph-mon[57307]: mgrmap e8: vm06.pbgzei(active, since 2s) 2026-03-09T17:24:36.971 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:36 vm06 ceph-mon[57307]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm06", "addr": "192.168.123.106", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:36.971 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:36 vm06 ceph-mon[57307]: Deploying cephadm binary to vm06 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Added host 'vm06' with addr '192.168.123.106' 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.286+0000 7f190a44f700 1 Processor -- start 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f190a44f700 1 -- start start 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f190a44f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f1904105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f190a44f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1904105570 con 0x7f1904104c20 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f1903fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 
0x7f1904105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f1903fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f1904105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35162/0 (socket says 192.168.123.106:35162) 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.287+0000 7f1903fff700 1 -- 192.168.123.106:0/3168974696 learned_addr learned my addr 192.168.123.106:0/3168974696 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.288+0000 7f1903fff700 1 -- 192.168.123.106:0/3168974696 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19041056b0 con 0x7f1904104c20 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.288+0000 7f1903fff700 1 --2- 192.168.123.106:0/3168974696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f1904105030 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f18ec009a90 tx=0x7f18ec009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=fc08c0cfd53600c9 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.288+0000 7f1902ffd700 1 -- 192.168.123.106:0/3168974696 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ec004030 con 0x7f1904104c20 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:35.289+0000 7f1902ffd700 1 -- 192.168.123.106:0/3168974696 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f18ec00b7e0 con 0x7f1904104c20 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 -- 192.168.123.106:0/3168974696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 msgr2=0x7f1904105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.184 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 --2- 192.168.123.106:0/3168974696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f1904105030 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f18ec009a90 tx=0x7f18ec009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 -- 192.168.123.106:0/3168974696 shutdown_connections 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 --2- 192.168.123.106:0/3168974696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f1904105030 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 -- 192.168.123.106:0/3168974696 >> 192.168.123.106:0/3168974696 conn(0x7f1904100270 msgr2=0x7f19041026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 7f190a44f700 1 -- 192.168.123.106:0/3168974696 shutdown_connections 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.289+0000 
7f190a44f700 1 -- 192.168.123.106:0/3168974696 wait complete. 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.290+0000 7f190a44f700 1 Processor -- start 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.290+0000 7f190a44f700 1 -- start start 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.290+0000 7f190a44f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.290+0000 7f1903fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f1903fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35176/0 (socket says 192.168.123.106:35176) 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f1903fff700 1 -- 192.168.123.106:0/1428840153 learned_addr learned my addr 192.168.123.106:0/1428840153 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1904105570 
con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f1903fff700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18ec009740 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f1903fff700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f18ec003e90 tx=0x7f18ec003f70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ec0043d0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f18ec004530 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ec011670 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.291+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f190419c1d0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:35.291+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f190419c5f0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.292+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f18ec0117d0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.293+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f190419c8a0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.293+0000 7f19017fa700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 0x7f18f003a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.293+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f18ec04bea0 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.293+0000 7f19037fe700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 0x7f18f003a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.293+0000 7f19037fe700 1 --2- 
192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 0x7f18f003a7f0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f18f4006fd0 tx=0x7f18f4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.295+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f18ec029330 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.398+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm06", "addr": "192.168.123.106", "target": ["mon-mgr", ""]}) v1 -- 0x7f1904106d40 con 0x7f18f0038340 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:35.822+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f18ec02b430 con 0x7f1904104c20 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.139+0000 7f19017fa700 1 -- 192.168.123.106:0/1428840153 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f1904106d40 con 0x7f18f0038340 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 msgr2=0x7f18f003a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.185 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 0x7f18f003a7f0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f18f4006fd0 tx=0x7f18f4006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 msgr2=0x7f190419bc90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f18ec003e90 tx=0x7f18ec003f70 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 shutdown_connections 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f18f0038340 0x7f18f003a7f0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 --2- 192.168.123.106:0/1428840153 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1904104c20 0x7f190419bc90 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 >> 192.168.123.106:0/1428840153 conn(0x7f1904100270 msgr2=0x7f1904100f20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 shutdown_connections 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.143+0000 7f190a44f700 1 -- 192.168.123.106:0/1428840153 wait complete. 2026-03-09T17:24:37.185 INFO:teuthology.orchestra.run.vm06.stdout:Deploying mon service with default placement... 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled mon update... 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.329+0000 7f640aa2b700 1 Processor -- start 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.329+0000 7f640aa2b700 1 -- start start 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.329+0000 7f640aa2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.333+0000 7f6403fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.333+0000 7f6403fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35192/0 (socket says 192.168.123.106:35192) 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.333+0000 7f6403fff700 1 -- 192.168.123.106:0/2506816870 learned_addr learned my addr 192.168.123.106:0/2506816870 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.333+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64041053d0 con 0x7f6404104a80 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.345+0000 7f6403fff700 1 -- 192.168.123.106:0/2506816870 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6404105510 con 0x7f6404104a80 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.345+0000 7f6403fff700 1 --2- 192.168.123.106:0/2506816870 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f63f4009a90 tx=0x7f63f4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b56aa84a04f6515 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.345+0000 7f6402ffd700 1 -- 192.168.123.106:0/2506816870 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f63f4004030 con 0x7f6404104a80 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.345+0000 7f6402ffd700 1 -- 192.168.123.106:0/2506816870 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 
==== 1075+0+0 (secure 0 0 0) 0x7f63f400b7e0 con 0x7f6404104a80 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 msgr2=0x7f6404104e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 --2- 192.168.123.106:0/2506816870 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f63f4009a90 tx=0x7f63f4009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 shutdown_connections 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 --2- 192.168.123.106:0/2506816870 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6404104a80 0x7f6404104e90 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 >> 192.168.123.106:0/2506816870 conn(0x7f64041000f0 msgr2=0x7f6404102500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 shutdown_connections 2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.346+0000 7f640aa2b700 1 -- 192.168.123.106:0/2506816870 wait complete. 
2026-03-09T17:24:37.552 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f640aa2b700 1 Processor -- start 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f640aa2b700 1 -- start start 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f640aa2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f640aa2b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63f4014070 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f6403fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f6403fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35200/0 (socket says 192.168.123.106:35200) 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f6403fff700 1 -- 192.168.123.106:0/920489200 learned_addr learned my addr 192.168.123.106:0/920489200 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.553 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.347+0000 7f6403fff700 1 -- 192.168.123.106:0/920489200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63f4009740 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.348+0000 7f6403fff700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f63f4009130 tx=0x7f63f4010750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.348+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f63f4010af0 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.348+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6404197f10 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.348+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64041abc10 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.349+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f63f4010c50 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.349+0000 7f64017fa700 1 -- 
192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f63f40193f0 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.349+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f63f4019550 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.350+0000 7f64017fa700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 0x7f63ec03a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.350+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f63f404cc60 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.350+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f63f0005320 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.355+0000 7f64037fe700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 0x7f63ec03a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.355+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f63f402a430 con 0x7f64041975c0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.359+0000 7f64037fe700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 0x7f63ec03a9b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f63f8006fd0 tx=0x7f63f8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.471+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f63f0000bf0 con 0x7f63ec038500 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.476+0000 7f64017fa700 1 -- 192.168.123.106:0/920489200 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f63f0000bf0 con 0x7f63ec038500 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 msgr2=0x7f63ec03a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 0x7f63ec03a9b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f63f8006fd0 tx=0x7f63f8006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:37.480+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 msgr2=0x7f64041979d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f63f4009130 tx=0x7f63f4010750 comp rx=0 tx=0).stop 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 shutdown_connections 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63ec038500 0x7f63ec03a9b0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 --2- 192.168.123.106:0/920489200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f64041975c0 0x7f64041979d0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 >> 192.168.123.106:0/920489200 conn(0x7f64041000f0 msgr2=0x7f64041023d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.480+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 shutdown_connections 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: 
stderr 2026-03-09T17:24:37.481+0000 7f640aa2b700 1 -- 192.168.123.106:0/920489200 wait complete. 2026-03-09T17:24:37.553 INFO:teuthology.orchestra.run.vm06.stdout:Deploying mgr service with default placement... 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.699+0000 7f6f25441700 1 Processor -- start 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f25441700 1 -- start start 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f25441700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f25441700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f20079790 con 0x7f6f2007ad50 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f1effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f1effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35202/0 (socket says 192.168.123.106:35202) 
2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.700+0000 7f6f1effd700 1 -- 192.168.123.106:0/3141423442 learned_addr learned my addr 192.168.123.106:0/3141423442 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.701+0000 7f6f1effd700 1 -- 192.168.123.106:0/3141423442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f200798d0 con 0x7f6f2007ad50 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.701+0000 7f6f1effd700 1 --2- 192.168.123.106:0/3141423442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f6f08009a90 tx=0x7f6f08009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d384442b53216087 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.701+0000 7f6f1dffb700 1 -- 192.168.123.106:0/3141423442 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f08004030 con 0x7f6f2007ad50 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.701+0000 7f6f1dffb700 1 -- 192.168.123.106:0/3141423442 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f6f0800b7e0 con 0x7f6f2007ad50 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 -- 192.168.123.106:0/3141423442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 msgr2=0x7f6f20079250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:37.702+0000 7f6f25441700 1 --2- 192.168.123.106:0/3141423442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f6f08009a90 tx=0x7f6f08009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 -- 192.168.123.106:0/3141423442 shutdown_connections 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 --2- 192.168.123.106:0/3141423442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f2007ad50 0x7f6f20079250 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 -- 192.168.123.106:0/3141423442 >> 192.168.123.106:0/3141423442 conn(0x7f6f201013a0 msgr2=0x7f6f201037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 -- 192.168.123.106:0/3141423442 shutdown_connections 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.702+0000 7f6f25441700 1 -- 192.168.123.106:0/3141423442 wait complete. 
2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f25441700 1 Processor -- start 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f25441700 1 -- start start 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f25441700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f25441700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f20079790 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f1effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f1effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35212/0 (socket says 192.168.123.106:35212) 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.703+0000 7f6f1effd700 1 -- 192.168.123.106:0/3145990030 learned_addr learned my addr 192.168.123.106:0/3145990030 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:37.899 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.704+0000 7f6f1effd700 1 -- 192.168.123.106:0/3145990030 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f08009740 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.704+0000 7f6f1effd700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6f0800bd00 tx=0x7f6f0800bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.704+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f0801a670 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.704+0000 7f6f25441700 1 -- 192.168.123.106:0/3145990030 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f201a0b00 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.704+0000 7f6f25441700 1 -- 192.168.123.106:0/3145990030 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f201a3790 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.705+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f6f0801ac70 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.705+0000 7f6f17fff700 1 
-- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f080044e0 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.705+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f6f0801a7d0 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.706+0000 7f6f17fff700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 0x7f6f0c03a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.706+0000 7f6f1e7fc700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 0x7f6f0c03a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.706+0000 7f6f1e7fc700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 0x7f6f0c03a940 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f6f10006fd0 tx=0x7f6f10006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.706+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6f0804b670 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:37.706+0000 7f6f25441700 1 -- 192.168.123.106:0/3145990030 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f00005320 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.709+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6f08003df0 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.724+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mon.0 v2:192.168.123.106:3300/0 7 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6f08003c60 con 0x7f6f201a01b0 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.840+0000 7f6f25441700 1 -- 192.168.123.106:0/3145990030 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f00000bf0 con 0x7f6f0c038490 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.847+0000 7f6f17fff700 1 -- 192.168.123.106:0/3145990030 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f6f00000bf0 con 0x7f6f0c038490 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 msgr2=0x7f6f0c03a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 --2- 
192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 0x7f6f0c03a940 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f6f10006fd0 tx=0x7f6f10006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 msgr2=0x7f6f201a05c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6f0800bd00 tx=0x7f6f0800bde0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 shutdown_connections 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6f0c038490 0x7f6f0c03a940 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 --2- 192.168.123.106:0/3145990030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f201a01b0 0x7f6f201a05c0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.853+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 >> 192.168.123.106:0/3145990030 conn(0x7f6f201013a0 
msgr2=0x7f6f201037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.854+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 shutdown_connections 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:37.854+0000 7f6f15ffb700 1 -- 192.168.123.106:0/3145990030 wait complete. 2026-03-09T17:24:37.899 INFO:teuthology.orchestra.run.vm06.stdout:Deploying crash service with default placement... 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.041+0000 7faf262c0700 1 Processor -- start 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.042+0000 7faf262c0700 1 -- start start 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.042+0000 7faf262c0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.042+0000 7faf262c0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf20106660 con 0x7faf20105d10 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.042+0000 7faf1ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:38.042+0000 7faf1ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35222/0 (socket says 192.168.123.106:35222) 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.042+0000 7faf1ffff700 1 -- 192.168.123.106:0/3718099450 learned_addr learned my addr 192.168.123.106:0/3718099450 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf1ffff700 1 -- 192.168.123.106:0/3718099450 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf201067a0 con 0x7faf20105d10 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf1ffff700 1 --2- 192.168.123.106:0/3718099450 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7faf08009a90 tx=0x7faf08009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6f7bc6ff109dbd4f server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf1effd700 1 -- 192.168.123.106:0/3718099450 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf08004030 con 0x7faf20105d10 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf1effd700 1 -- 192.168.123.106:0/3718099450 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faf0800b7e0 con 0x7faf20105d10 2026-03-09T17:24:38.288 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf262c0700 1 -- 192.168.123.106:0/3718099450 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 msgr2=0x7faf20106120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.288 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.043+0000 7faf262c0700 1 --2- 192.168.123.106:0/3718099450 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7faf08009a90 tx=0x7faf08009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.044+0000 7faf262c0700 1 -- 192.168.123.106:0/3718099450 shutdown_connections 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.044+0000 7faf262c0700 1 --2- 192.168.123.106:0/3718099450 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20106120 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.044+0000 7faf262c0700 1 -- 192.168.123.106:0/3718099450 >> 192.168.123.106:0/3718099450 conn(0x7faf20101360 msgr2=0x7faf20103790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 -- 192.168.123.106:0/3718099450 shutdown_connections 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 -- 192.168.123.106:0/3718099450 wait complete. 
2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 Processor -- start 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 -- start start 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.045+0000 7faf262c0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf20106660 con 0x7faf20105d10 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.046+0000 7faf1ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.289 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.046+0000 7faf1ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35236/0 (socket says 192.168.123.106:35236) 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.046+0000 7faf1ffff700 1 -- 192.168.123.106:0/1772926555 learned_addr learned my addr 192.168.123.106:0/1772926555 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:38.290 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.046+0000 7faf1ffff700 1 -- 192.168.123.106:0/1772926555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf08009740 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.046+0000 7faf1ffff700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7faf0800bd80 tx=0x7faf0800be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.075+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf08003900 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.075+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faf080044c0 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.075+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faf08024d80 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.075+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf2019a3b0 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.075+0000 7faf262c0700 1 
-- 192.168.123.106:0/1772926555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf2019a850 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.077+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf00005320 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.081+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7faf0802b030 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.081+0000 7faf1d7fa700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 0x7faf0c03a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.081+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7faf0804c0d0 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.082+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7faf08024460 con 0x7faf20105d10 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.085+0000 7faf1f7fe700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 
0x7faf0c03a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.085+0000 7faf1f7fe700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 0x7faf0c03a910 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7faf10006fd0 tx=0x7faf10006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.204+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7faf00000bf0 con 0x7faf0c038460 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.208+0000 7faf1d7fa700 1 -- 192.168.123.106:0/1772926555 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7faf00000bf0 con 0x7faf0c038460 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 msgr2=0x7faf0c03a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 0x7faf0c03a910 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7faf10006fd0 tx=0x7faf10006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:38.290 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 msgr2=0x7faf20199e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7faf0800bd80 tx=0x7faf0800be60 comp rx=0 tx=0).stop 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 shutdown_connections 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7faf0c038460 0x7faf0c03a910 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 --2- 192.168.123.106:0/1772926555 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf20105d10 0x7faf20199e70 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 >> 192.168.123.106:0/1772926555 conn(0x7faf20101360 msgr2=0x7faf20103740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 shutdown_connections 
2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.217+0000 7faf262c0700 1 -- 192.168.123.106:0/1772926555 wait complete. 2026-03-09T17:24:38.290 INFO:teuthology.orchestra.run.vm06.stdout:Deploying ceph-exporter service with default placement... 2026-03-09T17:24:38.403 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: Added host vm06 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:38.404 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:38 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 
2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.437+0000 7f8eabc55700 1 Processor -- start 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.437+0000 7f8eabc55700 1 -- start start 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.438+0000 7f8eabc55700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.438+0000 7f8eabc55700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ea4072c90 con 0x7f8ea4071190 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.438+0000 7f8ea99f1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.438+0000 7f8ea99f1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35244/0 (socket says 192.168.123.106:35244) 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.438+0000 7f8ea99f1700 1 -- 192.168.123.106:0/3257052697 learned_addr learned my addr 192.168.123.106:0/3257052697 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:38.628 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8ea99f1700 1 -- 192.168.123.106:0/3257052697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ea4072dd0 con 0x7f8ea4071190 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8ea99f1700 1 --2- 192.168.123.106:0/3257052697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8ea000b5b0 tx=0x7f8ea000b8c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=fd4eb37a7e9b66e0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8ea89ef700 1 -- 192.168.123.106:0/3257052697 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ea0004030 con 0x7f8ea4071190 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8ea89ef700 1 -- 192.168.123.106:0/3257052697 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ea000c040 con 0x7f8ea4071190 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8eabc55700 1 -- 192.168.123.106:0/3257052697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 msgr2=0x7f8ea40715a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.440+0000 7f8eabc55700 1 --2- 192.168.123.106:0/3257052697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8ea000b5b0 tx=0x7f8ea000b8c0 comp rx=0 tx=0).stop 
2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- 192.168.123.106:0/3257052697 shutdown_connections 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 --2- 192.168.123.106:0/3257052697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea4071190 0x7f8ea40715a0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- 192.168.123.106:0/3257052697 >> 192.168.123.106:0/3257052697 conn(0x7f8ea406cc30 msgr2=0x7f8ea406f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- 192.168.123.106:0/3257052697 shutdown_connections 2026-03-09T17:24:38.628 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- 192.168.123.106:0/3257052697 wait complete. 
2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 Processor -- start 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- start start 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8eabc55700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ea0003870 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.441+0000 7f8ea99f1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8ea99f1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35246/0 (socket says 192.168.123.106:35246) 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8ea99f1700 1 -- 192.168.123.106:0/2697391862 learned_addr learned my addr 192.168.123.106:0/2697391862 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:38.629 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8ea99f1700 1 -- 192.168.123.106:0/2697391862 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ea000b260 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8ea99f1700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f8ea000d010 tx=0x7f8ea0011740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ea0011b60 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.442+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ea0011cc0 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.443+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ea001a5f0 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.443+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ea4089b80 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.443+0000 7f8eabc55700 1 
-- 192.168.123.106:0/2697391862 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ea4086f70 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.444+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f8ea0003c80 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.444+0000 7f8e9affd700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 0x7f8e9003c7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.444+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f8ea002a030 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.444+0000 7f8e98ff9700 1 -- 192.168.123.106:0/2697391862 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ea404f9e0 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.445+0000 7f8ea91f0700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 0x7f8e9003c7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.445+0000 7f8ea91f0700 1 --2- 192.168.123.106:0/2697391862 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 0x7f8e9003c7f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f8e94006fd0 tx=0x7f8e94006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.447+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8ea0020030 con 0x7f8ea40865a0 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.587+0000 7f8e98ff9700 1 -- 192.168.123.106:0/2697391862 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f8ea4061b60 con 0x7f8e9003a340 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.592+0000 7f8e9affd700 1 -- 192.168.123.106:0/2697391862 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f8ea4061b60 con 0x7f8e9003a340 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 msgr2=0x7f8e9003c7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 0x7f8e9003c7f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f8e94006fd0 tx=0x7f8e94006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:38.629 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 msgr2=0x7f8ea40869b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f8ea000d010 tx=0x7f8ea0011740 comp rx=0 tx=0).stop 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 shutdown_connections 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e9003a340 0x7f8e9003c7f0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 --2- 192.168.123.106:0/2697391862 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ea40865a0 0x7f8ea40869b0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.596+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 >> 192.168.123.106:0/2697391862 conn(0x7f8ea406cc30 msgr2=0x7f8ea406e8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.597+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 shutdown_connections 
2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.597+0000 7f8eabc55700 1 -- 192.168.123.106:0/2697391862 wait complete. 2026-03-09T17:24:38.629 INFO:teuthology.orchestra.run.vm06.stdout:Deploying prometheus service with default placement... 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.870+0000 7f6ddfb79700 1 Processor -- start 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddfb79700 1 -- start start 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddfb79700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddfb79700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6dd8072190 con 0x7f6dd8071840 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddd915700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.058 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddd915700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35262/0 (socket says 192.168.123.106:35262) 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.872+0000 7f6ddd915700 1 -- 192.168.123.106:0/1645575781 learned_addr learned my addr 192.168.123.106:0/1645575781 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.873+0000 7f6ddd915700 1 -- 192.168.123.106:0/1645575781 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6dd80722d0 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.873+0000 7f6ddd915700 1 --2- 192.168.123.106:0/1645575781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f6dd4009480 tx=0x7f6dd4009790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8630dad04a1a0550 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.873+0000 7f6ddc913700 1 -- 192.168.123.106:0/1645575781 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6dd4004030 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.873+0000 7f6ddc913700 1 -- 192.168.123.106:0/1645575781 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6dd400b7e0 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.873+0000 7f6ddc913700 1 -- 192.168.123.106:0/1645575781 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6dd4018c60 con 0x7f6dd8071840 2026-03-09T17:24:39.059 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.877+0000 7f6ddfb79700 1 -- 192.168.123.106:0/1645575781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 msgr2=0x7f6dd8071c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.877+0000 7f6ddfb79700 1 --2- 192.168.123.106:0/1645575781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f6dd4009480 tx=0x7f6dd4009790 comp rx=0 tx=0).stop 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.878+0000 7f6ddfb79700 1 -- 192.168.123.106:0/1645575781 shutdown_connections 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.878+0000 7f6ddfb79700 1 --2- 192.168.123.106:0/1645575781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd8071c50 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.878+0000 7f6ddfb79700 1 -- 192.168.123.106:0/1645575781 >> 192.168.123.106:0/1645575781 conn(0x7f6dd806cc30 msgr2=0x7f6dd806f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.878+0000 7f6ddfb79700 1 -- 192.168.123.106:0/1645575781 shutdown_connections 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.878+0000 7f6ddfb79700 1 -- 192.168.123.106:0/1645575781 wait complete. 
2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddfb79700 1 Processor -- start 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddfb79700 1 -- start start 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddfb79700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddfb79700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6dd81a05a0 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddd915700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddd915700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35268/0 (socket says 192.168.123.106:35268) 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddd915700 1 -- 192.168.123.106:0/2871188776 learned_addr learned my addr 192.168.123.106:0/2871188776 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.059 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddd915700 1 -- 192.168.123.106:0/2871188776 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6dd4009160 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.879+0000 7f6ddd915700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f6dd401a040 tx=0x7f6dd40106a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.880+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6dd4010a50 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.880+0000 7f6ddfb79700 1 -- 192.168.123.106:0/2871188776 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6dd81a07a0 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.880+0000 7f6ddfb79700 1 -- 192.168.123.106:0/2871188776 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6dd81a0c40 con 0x7f6dd8071840 2026-03-09T17:24:39.059 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.881+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6dd4010bb0 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.881+0000 7f6dceffd700 1 
-- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6dd4020560 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.881+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f6dd4020780 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.882+0000 7f6ddfb79700 1 -- 192.168.123.106:0/2871188776 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6dbc005320 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.886+0000 7f6dceffd700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 0x7f6dc403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.886+0000 7f6ddd114700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 0x7f6dc403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.886+0000 7f6ddd114700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 0x7f6dc403a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f6dd000ad30 tx=0x7f6dd00093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.060 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.886+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6dd4013070 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:38.886+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6dd4010d20 con 0x7f6dd8071840 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.004+0000 7f6ddfb79700 1 -- 192.168.123.106:0/2871188776 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7f6dbc000bf0 con 0x7f6dc4038510 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.011+0000 7f6dceffd700 1 -- 192.168.123.106:0/2871188776 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7f6dbc000bf0 con 0x7f6dc4038510 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 msgr2=0x7f6dc403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 0x7f6dc403a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f6dd000ad30 tx=0x7f6dd00093f0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.060 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 msgr2=0x7f6dd81a0060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f6dd401a040 tx=0x7f6dd40106a0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 shutdown_connections 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6dc4038510 0x7f6dc403a9c0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 --2- 192.168.123.106:0/2871188776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6dd8071840 0x7f6dd81a0060 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.015+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 >> 192.168.123.106:0/2871188776 conn(0x7f6dd806cc30 msgr2=0x7f6dd806e960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.016+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 shutdown_connections 
2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.016+0000 7f6dccff9700 1 -- 192.168.123.106:0/2871188776 wait complete. 2026-03-09T17:24:39.060 INFO:teuthology.orchestra.run.vm06.stdout:Deploying grafana service with default placement... 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: Saving service mon spec with placement count:5 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: Saving service mgr spec with placement count:2 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:39.318 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:39 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:39.402 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 
2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.180+0000 7f29eea04700 1 Processor -- start 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.180+0000 7f29eea04700 1 -- start start 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.180+0000 7f29eea04700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.180+0000 7f29eea04700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29e00a4d50 con 0x7f29e00a4370 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.181+0000 7f29eda02700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.181+0000 7f29eda02700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35278/0 (socket says 192.168.123.106:35278) 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.181+0000 7f29eda02700 1 -- 192.168.123.106:0/1613682639 learned_addr learned my addr 192.168.123.106:0/1613682639 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.403 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.181+0000 7f29eda02700 1 -- 192.168.123.106:0/1613682639 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29e00a5570 con 0x7f29e00a4370 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.181+0000 7f29eda02700 1 --2- 192.168.123.106:0/1613682639 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f29e4009480 tx=0x7f29e4009790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9989e594f25a051b server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.182+0000 7f29eca00700 1 -- 192.168.123.106:0/1613682639 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29e4004030 con 0x7f29e00a4370 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.182+0000 7f29eca00700 1 -- 192.168.123.106:0/1613682639 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f29e400b7e0 con 0x7f29e00a4370 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.182+0000 7f29eea04700 1 -- 192.168.123.106:0/1613682639 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 msgr2=0x7f29e00a4780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 --2- 192.168.123.106:0/1613682639 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f29e4009480 tx=0x7f29e4009790 comp rx=0 tx=0).stop 
2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 -- 192.168.123.106:0/1613682639 shutdown_connections 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 --2- 192.168.123.106:0/1613682639 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e00a4370 0x7f29e00a4780 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 -- 192.168.123.106:0/1613682639 >> 192.168.123.106:0/1613682639 conn(0x7f29e009f4b0 msgr2=0x7f29e00a1900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 -- 192.168.123.106:0/1613682639 shutdown_connections 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.183+0000 7f29eea04700 1 -- 192.168.123.106:0/1613682639 wait complete. 
2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.188+0000 7f29eea04700 1 Processor -- start 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.192+0000 7f29eea04700 1 -- start start 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.192+0000 7f29eea04700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.192+0000 7f29eea04700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29e4018ac0 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.193+0000 7f29eda02700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.193+0000 7f29eda02700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35288/0 (socket says 192.168.123.106:35288) 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.193+0000 7f29eda02700 1 -- 192.168.123.106:0/1779178521 learned_addr learned my addr 192.168.123.106:0/1779178521 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.403 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.196+0000 7f29eda02700 1 -- 192.168.123.106:0/1779178521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29e4009160 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.197+0000 7f29eda02700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f29e4009130 tx=0x7f29e400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.199+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29e4009e30 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.199+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f29e40036a0 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.199+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f29e4017760 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.199+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29e0140e60 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.200+0000 7f29eea04700 1 
-- 192.168.123.106:0/1779178521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29e0143b70 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.201+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f29cc005320 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.202+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f29e4003810 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.202+0000 7f29deffd700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 0x7f29d403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.202+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f29e402f080 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.205+0000 7f29ed201700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 0x7f29d403a8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.205+0000 7f29ed201700 1 --2- 192.168.123.106:0/1779178521 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 0x7f29d403a8f0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f29e80665b0 tx=0x7f29e8073360 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.205+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f29e4003ac0 con 0x7f29e0140510 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.315+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f29cc000bf0 con 0x7f29d4038440 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.360+0000 7f29deffd700 1 -- 192.168.123.106:0/1779178521 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f29cc000bf0 con 0x7f29d4038440 2026-03-09T17:24:39.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 msgr2=0x7f29d403a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 0x7f29d403a8f0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f29e80665b0 tx=0x7f29e8073360 comp rx=0 tx=0).stop 2026-03-09T17:24:39.404 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 msgr2=0x7f29e0140920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f29e4009130 tx=0x7f29e400bac0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 shutdown_connections 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f29d4038440 0x7f29d403a8f0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 --2- 192.168.123.106:0/1779178521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29e0140510 0x7f29e0140920 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.363+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 >> 192.168.123.106:0/1779178521 conn(0x7f29e009f4b0 msgr2=0x7f29e009ffd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.364+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 shutdown_connections 
2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.364+0000 7f29eea04700 1 -- 192.168.123.106:0/1779178521 wait complete. 2026-03-09T17:24:39.404 INFO:teuthology.orchestra.run.vm06.stdout:Deploying node-exporter service with default placement... 2026-03-09T17:24:39.740 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.539+0000 7f7ab479e700 1 Processor -- start 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab479e700 1 -- start start 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab479e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab479e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7aac071d80 con 0x7f7aac071430 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab253a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab253a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35290/0 (socket says 192.168.123.106:35290) 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab253a700 1 -- 192.168.123.106:0/67983682 learned_addr learned my addr 192.168.123.106:0/67983682 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.540+0000 7f7ab253a700 1 -- 192.168.123.106:0/67983682 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7aac071ec0 con 0x7f7aac071430 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.541+0000 7f7ab253a700 1 --2- 192.168.123.106:0/67983682 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f7aa8009cf0 tx=0x7f7aa800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=30602ab731a8b2e server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.541+0000 7f7ab1538700 1 -- 192.168.123.106:0/67983682 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7aa8004030 con 0x7f7aac071430 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.542+0000 7f7ab1538700 1 -- 192.168.123.106:0/67983682 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7aa800b810 con 0x7f7aac071430 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.542+0000 7f7ab1538700 1 -- 192.168.123.106:0/67983682 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7aa8003b10 con 0x7f7aac071430 2026-03-09T17:24:39.741 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.543+0000 7f7ab479e700 1 -- 192.168.123.106:0/67983682 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 msgr2=0x7f7aac071840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.543+0000 7f7ab479e700 1 --2- 192.168.123.106:0/67983682 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f7aa8009cf0 tx=0x7f7aa800b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.543+0000 7f7ab479e700 1 -- 192.168.123.106:0/67983682 shutdown_connections 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.543+0000 7f7ab479e700 1 --2- 192.168.123.106:0/67983682 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac071840 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.543+0000 7f7ab479e700 1 -- 192.168.123.106:0/67983682 >> 192.168.123.106:0/67983682 conn(0x7f7aac06c970 msgr2=0x7f7aac06eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 -- 192.168.123.106:0/67983682 shutdown_connections 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 -- 192.168.123.106:0/67983682 wait complete. 
2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 Processor -- start 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 -- start start 2026-03-09T17:24:39.741 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.544+0000 7f7ab479e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7aac1a0440 con 0x7f7aac071430 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.545+0000 7f7ab253a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.545+0000 7f7ab253a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35298/0 (socket says 192.168.123.106:35298) 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.545+0000 7f7ab253a700 1 -- 192.168.123.106:0/2975470577 learned_addr learned my addr 192.168.123.106:0/2975470577 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:39.742 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.545+0000 7f7ab253a700 1 -- 192.168.123.106:0/2975470577 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7aa8009740 con 0x7f7aac071430 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.545+0000 7f7ab253a700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f7aa8000c00 tx=0x7f7aa8011770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.546+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7aa8011a60 con 0x7f7aac071430 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.546+0000 7f7ab479e700 1 -- 192.168.123.106:0/2975470577 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7aac1a0640 con 0x7f7aac071430 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.547+0000 7f7ab479e700 1 -- 192.168.123.106:0/2975470577 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7aac1a0ae0 con 0x7f7aac071430 2026-03-09T17:24:39.742 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7aa8011bc0 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7aa37fe700 1 
-- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7aa801a520 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f7aa801a780 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7aa37fe700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 0x7f7a9803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7ab1d39700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 0x7f7a9803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.548+0000 7f7ab479e700 1 -- 192.168.123.106:0/2975470577 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7aac04efc0 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.552+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7aa804d250 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.552+0000 7f7ab1d39700 1 --2- 192.168.123.106:0/2975470577 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 0x7f7a9803a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f7aa400ad30 tx=0x7f7aa40093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.552+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7aa801aa30 con 0x7f7aac071430 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.672+0000 7f7ab479e700 1 -- 192.168.123.106:0/2975470577 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f7aac113550 con 0x7f7a98038510 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.677+0000 7f7aa37fe700 1 -- 192.168.123.106:0/2975470577 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f7aac113550 con 0x7f7a98038510 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 msgr2=0x7f7a9803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 0x7f7a9803a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f7aa400ad30 tx=0x7f7aa40093f0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.743 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 msgr2=0x7f7aac19ff00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f7aa8000c00 tx=0x7f7aa8011770 comp rx=0 tx=0).stop 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 shutdown_connections 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a98038510 0x7f7a9803a9c0 secure :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f7aa400ad30 tx=0x7f7aa40093f0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 --2- 192.168.123.106:0/2975470577 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7aac071430 0x7f7aac19ff00 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.681+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 >> 192.168.123.106:0/2975470577 conn(0x7f7aac06c970 msgr2=0x7f7aac06eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.682+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 
shutdown_connections 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.683+0000 7f7aa17fa700 1 -- 192.168.123.106:0/2975470577 wait complete. 2026-03-09T17:24:39.743 INFO:teuthology.orchestra.run.vm06.stdout:Deploying alertmanager service with default placement... 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba6632700 1 Processor -- start 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba6632700 1 -- start start 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba6632700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba6632700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feba0071d60 con 0x7feba0071410 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba5630700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba5630700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35302/0 (socket says 192.168.123.106:35302) 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.880+0000 7feba5630700 1 -- 192.168.123.106:0/2188424178 learned_addr learned my addr 192.168.123.106:0/2188424178 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.881+0000 7feba5630700 1 -- 192.168.123.106:0/2188424178 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feba0071ea0 con 0x7feba0071410 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.881+0000 7feba5630700 1 --2- 192.168.123.106:0/2188424178 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7feb9c009cf0 tx=0x7feb9c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=781fd704843d3843 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.881+0000 7feb97fff700 1 -- 192.168.123.106:0/2188424178 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feb9c004030 con 0x7feba0071410 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.881+0000 7feb97fff700 1 -- 192.168.123.106:0/2188424178 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb9c00b810 con 0x7feba0071410 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- 192.168.123.106:0/2188424178 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 msgr2=0x7feba0071820 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 --2- 192.168.123.106:0/2188424178 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7feb9c009cf0 tx=0x7feb9c00b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- 192.168.123.106:0/2188424178 shutdown_connections 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 --2- 192.168.123.106:0/2188424178 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba0071410 0x7feba0071820 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- 192.168.123.106:0/2188424178 >> 192.168.123.106:0/2188424178 conn(0x7feba006c9d0 msgr2=0x7feba006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- 192.168.123.106:0/2188424178 shutdown_connections 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- 192.168.123.106:0/2188424178 wait complete. 
2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 Processor -- start 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.882+0000 7feba6632700 1 -- start start 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba6632700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba6632700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feba0071d60 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba5630700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba5630700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35312/0 (socket says 192.168.123.106:35312) 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba5630700 1 -- 192.168.123.106:0/1681503145 learned_addr learned my addr 192.168.123.106:0/1681503145 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:40.497 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba5630700 1 -- 192.168.123.106:0/1681503145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb9c009740 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.883+0000 7feba5630700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7feb9c009cc0 tx=0x7feb9c00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.884+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feb9c003950 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.884+0000 7feba6632700 1 -- 192.168.123.106:0/1681503145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feba01a0980 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.884+0000 7feba6632700 1 -- 192.168.123.106:0/1681503145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feba01a3610 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.885+0000 7feba6632700 1 -- 192.168.123.106:0/1681503145 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feba004f030 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:39.888+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb9c0043c0 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.888+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feb9c01ac80 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.888+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7feb9c011420 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.888+0000 7feb967fc700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 0x7feb8c03a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.888+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7feb9c04c7d0 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.888+0000 7feba4e2f700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 0x7feb8c03a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.889+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mon.0 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7feb9c04ecc0 con 0x7feba01a0030 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:39.889+0000 7feba4e2f700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 0x7feb8c03a990 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7feb9800ad30 tx=0x7feb980093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.019+0000 7feba6632700 1 -- 192.168.123.106:0/1681503145 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7feba0199cb0 con 0x7feb8c0384e0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.132+0000 7feb967fc700 1 -- 192.168.123.106:0/1681503145 <== mgr.14120 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7feba0199cb0 con 0x7feb8c0384e0 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.134+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 msgr2=0x7feb8c03a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.134+0000 7feb8bfff700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 0x7feb8c03a990 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7feb9800ad30 tx=0x7feb980093f0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.497 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 msgr2=0x7feba01a0440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.497 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7feb9c009cc0 tx=0x7feb9c00bfa0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 shutdown_connections 2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7feb8c0384e0 0x7feb8c03a990 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 --2- 192.168.123.106:0/1681503145 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7feba01a0030 0x7feba01a0440 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 >> 192.168.123.106:0/1681503145 conn(0x7feba006c9d0 msgr2=0x7feba006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 shutdown_connections 
2026-03-09T17:24:40.498 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.135+0000 7feb8bfff700 1 -- 192.168.123.106:0/1681503145 wait complete. 2026-03-09T17:24:40.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: Saving service crash spec with placement * 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: Saving service ceph-exporter spec with placement * 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: Saving service prometheus spec with placement count:1 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='mgr.14120 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:40.751 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:40 vm06 ceph-mon[57307]: from='mgr.14120 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:40.814 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.627+0000 7f721b605700 1 Processor -- start 2026-03-09T17:24:40.814 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f721b605700 1 -- start start 2026-03-09T17:24:40.814 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f721b605700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.814 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f721b605700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7214106680 con 0x7f7214105d30 2026-03-09T17:24:40.814 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f72193a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f72193a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35324/0 (socket says 192.168.123.106:35324) 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f72193a1700 1 -- 192.168.123.106:0/393079171 learned_addr learned my addr 192.168.123.106:0/393079171 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:40.815 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.628+0000 7f72193a1700 1 -- 192.168.123.106:0/393079171 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72141067c0 con 0x7f7214105d30 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.629+0000 7f72193a1700 1 --2- 192.168.123.106:0/393079171 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f7204009a90 tx=0x7f7204009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=bf208486e33a3543 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.629+0000 7f720bfff700 1 -- 192.168.123.106:0/393079171 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f720400fbf0 con 0x7f7214105d30 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.629+0000 7f720bfff700 1 -- 192.168.123.106:0/393079171 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7204004510 con 0x7f7214105d30 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.629+0000 7f720bfff700 1 -- 192.168.123.106:0/393079171 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7204017450 con 0x7f7214105d30 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.629+0000 7f721b605700 1 -- 192.168.123.106:0/393079171 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 msgr2=0x7f7214106140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:40.629+0000 7f721b605700 1 --2- 192.168.123.106:0/393079171 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f7204009a90 tx=0x7f7204009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.815 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 -- 192.168.123.106:0/393079171 shutdown_connections 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 --2- 192.168.123.106:0/393079171 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214106140 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 -- 192.168.123.106:0/393079171 >> 192.168.123.106:0/393079171 conn(0x7f72141013a0 msgr2=0x7f72141037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 -- 192.168.123.106:0/393079171 shutdown_connections 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 -- 192.168.123.106:0/393079171 wait complete. 
2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 Processor -- start 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.630+0000 7f721b605700 1 -- start start 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f721b605700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f721b605700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7214197d70 con 0x7f7214105d30 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f72193a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f72193a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35326/0 (socket says 192.168.123.106:35326) 2026-03-09T17:24:40.816 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f72193a1700 1 -- 192.168.123.106:0/301508687 learned_addr learned my addr 192.168.123.106:0/301508687 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:40.816 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f72193a1700 1 -- 192.168.123.106:0/301508687 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7204009740 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.631+0000 7f72193a1700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f7204012040 tx=0x7f72040040e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.632+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7204017450 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.632+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7204017bb0 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.632+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7204020c10 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.632+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7214197f70 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.632+0000 7f721b605700 1 -- 
192.168.123.106:0/301508687 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7214198410 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.633+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f720401e070 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.633+0000 7f720a7fc700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 0x7f720003a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.633+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7204050f70 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.633+0000 7f7218ba0700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 0x7f720003a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.634+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f72141916c0 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.637+0000 7f7218ba0700 1 --2- 192.168.123.106:0/301508687 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 0x7f720003a970 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f7210006fd0 tx=0x7f7210006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.637+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7204024020 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.748+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f7214078ff0 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.755+0000 7f720a7fc700 1 -- 192.168.123.106:0/301508687 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f720404f0d0 con 0x7f7214105d30 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 msgr2=0x7f720003a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 0x7f720003a970 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f7210006fd0 tx=0x7f7210006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:40.817 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 msgr2=0x7f7214197830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f7204012040 tx=0x7f72040040e0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 shutdown_connections 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72000384c0 0x7f720003a970 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 --2- 192.168.123.106:0/301508687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7214105d30 0x7f7214197830 secure :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f7204012040 tx=0x7f72040040e0 comp rx=0 tx=0).stop 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 >> 192.168.123.106:0/301508687 conn(0x7f72141013a0 msgr2=0x7f7214102bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.759+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 
shutdown_connections 2026-03-09T17:24:40.817 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.760+0000 7f721b605700 1 -- 192.168.123.106:0/301508687 wait complete. 2026-03-09T17:24:41.124 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.945+0000 7fae5be6a700 1 Processor -- start 2026-03-09T17:24:41.124 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.945+0000 7fae5be6a700 1 -- start start 2026-03-09T17:24:41.124 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.945+0000 7fae5be6a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae54106a50 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:41.124 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.945+0000 7fae5be6a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae54079da0 con 0x7fae54104630 2026-03-09T17:24:41.124 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.946+0000 7fae59c06700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae54106a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.946+0000 7fae59c06700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae54106a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35334/0 (socket says 192.168.123.106:35334) 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.946+0000 7fae59c06700 1 -- 
192.168.123.106:0/282221655 learned_addr learned my addr 192.168.123.106:0/282221655 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.946+0000 7fae59c06700 1 -- 192.168.123.106:0/282221655 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae54079ee0 con 0x7fae54104630 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.946+0000 7fae59c06700 1 --2- 192.168.123.106:0/282221655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae54106a50 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fae50009cf0 tx=0x7fae5000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dcae432c76c2a337 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae58c04700 1 -- 192.168.123.106:0/282221655 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fae50004030 con 0x7fae54104630 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae58c04700 1 -- 192.168.123.106:0/282221655 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fae5000b810 con 0x7fae54104630 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae5be6a700 1 -- 192.168.123.106:0/282221655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 msgr2=0x7fae54106a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae5be6a700 1 --2- 192.168.123.106:0/282221655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fae54104630 0x7fae54106a50 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fae50009cf0 tx=0x7fae5000b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae5be6a700 1 -- 192.168.123.106:0/282221655 shutdown_connections 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae5be6a700 1 --2- 192.168.123.106:0/282221655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae54106a50 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.947+0000 7fae5be6a700 1 -- 192.168.123.106:0/282221655 >> 192.168.123.106:0/282221655 conn(0x7fae541002a0 msgr2=0x7fae541026b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.948+0000 7fae5be6a700 1 -- 192.168.123.106:0/282221655 shutdown_connections 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.948+0000 7fae5be6a700 1 -- 192.168.123.106:0/282221655 wait complete. 
2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.948+0000 7fae5be6a700 1 Processor -- start 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.948+0000 7fae5be6a700 1 -- start start 2026-03-09T17:24:41.125 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae5be6a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae5be6a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae54079da0 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae59c06700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae59c06700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35346/0 (socket says 192.168.123.106:35346) 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae59c06700 1 -- 192.168.123.106:0/2148852215 learned_addr learned my addr 192.168.123.106:0/2148852215 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:41.126 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae59c06700 1 -- 192.168.123.106:0/2148852215 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae50009740 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.949+0000 7fae59c06700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fae50006e90 tx=0x7fae5000bdb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.950+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fae50003950 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.950+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fae50004420 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.950+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fae5001ad80 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.950+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae541a09b0 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.951+0000 7fae5be6a700 1 
-- 192.168.123.106:0/2148852215 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae541a0e50 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.952+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae5404fa50 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.955+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fae50003f40 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.955+0000 7fae4affd700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 0x7fae4003a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.955+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fae50029020 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.955+0000 7fae59405700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 0x7fae4003a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.955+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fae5004c2f0 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:40.956+0000 7fae59405700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 0x7fae4003a940 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fae44006fd0 tx=0x7fae44006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.062+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7fae5402d020 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.068+0000 7fae4affd700 1 -- 192.168.123.106:0/2148852215 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7fae50004590 con 0x7fae54104630 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.072+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 msgr2=0x7fae4003a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.072+0000 7fae5be6a700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 0x7fae4003a940 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fae44006fd0 tx=0x7fae44006e40 comp rx=0 tx=0).stop 
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 msgr2=0x7fae541a0470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fae50006e90 tx=0x7fae5000bdb0 comp rx=0 tx=0).stop
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 shutdown_connections
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fae40038490 0x7fae4003a940 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 --2- 192.168.123.106:0/2148852215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae54104630 0x7fae541a0470 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 >> 192.168.123.106:0/2148852215 conn(0x7fae541002a0 msgr2=0x7fae54102680 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 shutdown_connections
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.073+0000 7fae5be6a700 1 -- 192.168.123.106:0/2148852215 wait complete.
2026-03-09T17:24:41.126 INFO:teuthology.orchestra.run.vm06.stdout:Enabling the dashboard module...
2026-03-09T17:24:41.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: Saving service grafana spec with placement count:1
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: Saving service node-exporter spec with placement *
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: Saving service alertmanager spec with placement count:1
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/301508687' entity='client.admin'
2026-03-09T17:24:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:41 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2148852215' entity='client.admin'
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.258+0000 7fbd9e7ca700 1 Processor -- start
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.258+0000 7fbd9e7ca700 1 -- start start
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd9e7ca700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd9e7ca700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd98105580 con 0x7fbd98104c30
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd97fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd97fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35356/0 (socket says 192.168.123.106:35356)
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd97fff700 1 -- 192.168.123.106:0/72660913 learned_addr learned my addr 192.168.123.106:0/72660913 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.259+0000 7fbd97fff700 1 -- 192.168.123.106:0/72660913 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd981056c0 con 0x7fbd98104c30
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.260+0000 7fbd97fff700 1 --2- 192.168.123.106:0/72660913 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fbd8c009a90 tx=0x7fbd8c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=36568dc27f903cc2 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.260+0000 7fbd96ffd700 1 -- 192.168.123.106:0/72660913 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbd8c004030 con 0x7fbd98104c30
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.260+0000 7fbd96ffd700 1 -- 192.168.123.106:0/72660913 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbd8c00b7e0 con 0x7fbd98104c30
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.260+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/72660913 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 msgr2=0x7fbd98105040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.260+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/72660913 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fbd8c009a90 tx=0x7fbd8c009da0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/72660913 shutdown_connections
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/72660913 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98105040 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/72660913 >> 192.168.123.106:0/72660913 conn(0x7fbd981002a0 msgr2=0x7fbd981026b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:42.502 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/72660913 shutdown_connections
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/72660913 wait complete.
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.261+0000 7fbd9e7ca700 1 Processor -- start
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd9e7ca700 1 -- start start
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd9e7ca700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd9e7ca700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd981981b0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd97fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd97fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35370/0 (socket says 192.168.123.106:35370)
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd97fff700 1 -- 192.168.123.106:0/47882574 learned_addr learned my addr 192.168.123.106:0/47882574 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.262+0000 7fbd97fff700 1 -- 192.168.123.106:0/47882574 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd8c009740 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd97fff700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fbd8c0106d0 tx=0x7fbd8c0107b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbd8c0038e0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbd8c00be50 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd981983b0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbd8c0193f0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.263+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd98198850 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.264+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fbd8c019550 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.264+0000 7fbd957fa700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 0x7fbd8003a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.265+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbd8c04ee60 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.265+0000 7fbd977fe700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 0x7fbd8003a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.265+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd98191560 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.268+0000 7fbd977fe700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 0x7fbd8003a960 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fbd88006fd0 tx=0x7fbd88006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.268+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbd8c02a400 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:41.420+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7fbd9804f9e0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.424+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fbd8c02a780 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.426+0000 7fbd957fa700 1 -- 192.168.123.106:0/47882574 <== mon.0 v2:192.168.123.106:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7fbd8c04e7b0 con 0x7fbd98104c30
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 msgr2=0x7fbd8003a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 0x7fbd8003a960 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fbd88006fd0 tx=0x7fbd88006e40 comp rx=0 tx=0).stop
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 msgr2=0x7fbd98197c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fbd8c0106d0 tx=0x7fbd8c0107b0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 shutdown_connections
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbd800384b0 0x7fbd8003a960 secure :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fbd88006fd0 tx=0x7fbd88006e40 comp rx=0 tx=0).stop
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 --2- 192.168.123.106:0/47882574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd98104c30 0x7fbd98197c70 secure :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fbd8c0106d0 tx=0x7fbd8c0107b0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 >> 192.168.123.106:0/47882574 conn(0x7fbd981002a0 msgr2=0x7fbd9818e6a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 shutdown_connections
2026-03-09T17:24:42.503 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.429+0000 7fbd9e7ca700 1 -- 192.168.123.106:0/47882574 wait complete.
2026-03-09T17:24:42.681 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:42 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/47882574' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout {
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "epoch": 9,
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "available": true,
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "active_name": "vm06.pbgzei",
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "num_standby": 0
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout }
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.647+0000 7fdb289c6700 1 Processor -- start
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb289c6700 1 -- start start
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb289c6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb289c6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb20071d80 con 0x7fdb20071430
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb26762700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb26762700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54342/0 (socket says 192.168.123.106:54342)
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.648+0000 7fdb26762700 1 -- 192.168.123.106:0/2871024902 learned_addr learned my addr 192.168.123.106:0/2871024902 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.649+0000 7fdb26762700 1 -- 192.168.123.106:0/2871024902 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb20071ec0 con 0x7fdb20071430
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.649+0000 7fdb26762700 1 --2- 192.168.123.106:0/2871024902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fdb1400ab30 tx=0x7fdb14010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=daeec31f76e2c24 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.649+0000 7fdb25760700 1 -- 192.168.123.106:0/2871024902 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb14010e00 con 0x7fdb20071430
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb25760700 1 -- 192.168.123.106:0/2871024902 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb14004510 con 0x7fdb20071430
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 -- 192.168.123.106:0/2871024902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 msgr2=0x7fdb20071840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 --2- 192.168.123.106:0/2871024902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fdb1400ab30 tx=0x7fdb14010730 comp rx=0 tx=0).stop
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 -- 192.168.123.106:0/2871024902 shutdown_connections
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 --2- 192.168.123.106:0/2871024902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb20071430 0x7fdb20071840 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 -- 192.168.123.106:0/2871024902 >> 192.168.123.106:0/2871024902 conn(0x7fdb2006c970 msgr2=0x7fdb2006eda0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.650+0000 7fdb289c6700 1 -- 192.168.123.106:0/2871024902 shutdown_connections
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.651+0000 7fdb289c6700 1 -- 192.168.123.106:0/2871024902 wait complete.
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.651+0000 7fdb289c6700 1 Processor -- start
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.651+0000 7fdb289c6700 1 -- start start
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.651+0000 7fdb289c6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.651+0000 7fdb289c6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb1401a4a0 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.652+0000 7fdb26762700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.652+0000 7fdb26762700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54344/0 (socket says 192.168.123.106:54344)
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.652+0000 7fdb26762700 1 -- 192.168.123.106:0/1705134565 learned_addr learned my addr 192.168.123.106:0/1705134565 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.652+0000 7fdb26762700 1 -- 192.168.123.106:0/1705134565 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb1400a7e0 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.652+0000 7fdb26762700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fdb14006b20 tx=0x7fdb14004650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.653+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb1401ae00 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.653+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb1400f070 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.653+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdb14022a50 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.655+0000 7fdb289c6700 1 -- 192.168.123.106:0/1705134565 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb201b1870 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.655+0000 7fdb289c6700 1 -- 192.168.123.106:0/1705134565 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb201b4580 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.655+0000 7fdb289c6700 1 -- 192.168.123.106:0/1705134565 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb2004f030 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fdb14018070 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb137fe700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 0x7fdb0c03a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb25f61700 1 -- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 msgr2=0x7fdb0c03a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb25f61700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 0x7fdb0c03a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fdb14031080 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.660+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdb1404b4a0 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.794+0000 7fdb289c6700 1 -- 192.168.123.106:0/1705134565 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7fdb201b4890 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.795+0000 7fdb137fe700 1 -- 192.168.123.106:0/1705134565 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7fdb14027030 con 0x7fdb201b0f20
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 msgr2=0x7fdb0c03a990 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 0x7fdb0c03a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 msgr2=0x7fdb201b1330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fdb14006b20 tx=0x7fdb14004650 comp rx=0 tx=0).stop
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 shutdown_connections
2026-03-09T17:24:42.832 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb0c0384e0 0x7fdb0c03a990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.798+0000 7fdb117fa700 1 --2- 192.168.123.106:0/1705134565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb201b0f20 0x7fdb201b1330 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.799+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 >> 192.168.123.106:0/1705134565 conn(0x7fdb2006c970 msgr2=0x7fdb2006d400 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.799+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 shutdown_connections
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.799+0000 7fdb117fa700 1 -- 192.168.123.106:0/1705134565 wait complete.
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for the mgr to restart...
2026-03-09T17:24:42.833 INFO:teuthology.orchestra.run.vm06.stdout:Waiting for mgr epoch 9...
2026-03-09T17:24:43.680 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:43 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/47882574' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
2026-03-09T17:24:43.681 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:43 vm06 ceph-mon[57307]: mgrmap e9: vm06.pbgzei(active, since 9s)
2026-03-09T17:24:43.681 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:43 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/1705134565' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: Active manager daemon vm06.pbgzei restarted
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: Activating manager daemon vm06.pbgzei
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: osdmap e3: 0 total, 0 up, 0 in
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: mgrmap e10: vm06.pbgzei(active, starting, since 0.111649s)
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: Manager daemon vm06.pbgzei is now available
2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar
09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:47.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:47 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout { 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout } 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.978+0000 7f10b8a52700 1 Processor -- start 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.978+0000 7f10b8a52700 1 -- start start 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.978+0000 7f10b8a52700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.978+0000 7f10b8a52700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10b4071d60 con 0x7f10b4071410 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.979+0000 7f10b2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.979+0000 7f10b2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54354/0 (socket says 192.168.123.106:54354) 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.979+0000 7f10b2d9d700 1 -- 192.168.123.106:0/2253849275 learned_addr learned my addr 192.168.123.106:0/2253849275 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.979+0000 7f10b2d9d700 1 -- 192.168.123.106:0/2253849275 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10b4071ea0 con 0x7f10b4071410 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.980+0000 7f10b2d9d700 1 --2- 192.168.123.106:0/2253849275 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f10a400d180 tx=0x7f10a400d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=507ad7451fabe218 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.980+0000 7f10b1d9b700 1 -- 192.168.123.106:0/2253849275 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f10a4010070 con 0x7f10b4071410 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.980+0000 7f10b1d9b700 1 -- 192.168.123.106:0/2253849275 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 
0x7f10a4004510 con 0x7f10b4071410 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.980+0000 7f10b8a52700 1 -- 192.168.123.106:0/2253849275 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 msgr2=0x7f10b4071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.980+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2253849275 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f10a400d180 tx=0x7f10a400d490 comp rx=0 tx=0).stop 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- 192.168.123.106:0/2253849275 shutdown_connections 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2253849275 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071410 0x7f10b4071820 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- 192.168.123.106:0/2253849275 >> 192.168.123.106:0/2253849275 conn(0x7f10b406c9d0 msgr2=0x7f10b406ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:48.486 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- 192.168.123.106:0/2253849275 shutdown_connections 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- 192.168.123.106:0/2253849275 wait complete. 
2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 Processor -- start 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- start start 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.981+0000 7f10b8a52700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10a4003c20 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.982+0000 7f10b2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.982+0000 7f10b2d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54356/0 (socket says 192.168.123.106:54356) 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.982+0000 7f10b2d9d700 1 -- 192.168.123.106:0/2226817147 learned_addr learned my addr 192.168.123.106:0/2226817147 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:48.487 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.982+0000 7f10b2d9d700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10a40087c0 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.982+0000 7f10b2d9d700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f10a4008c10 tx=0x7f10a4008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.983+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f10a4010050 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.983+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10b41a0890 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.983+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10b41a1510 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.983+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f10a400deb0 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.983+0000 7f109bfff700 1 
-- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f10a4016440 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.984+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f10a40041f0 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.985+0000 7f109bfff700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.985+0000 7f10b259c700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.985+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.985+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f109c03b0c0 con 0x7f109c038500 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:42.985+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 
v2:192.168.123.106:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f10a404b920 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:43.185+0000 7f10b259c700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:43.185+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:43.585+0000 7f10b259c700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:43.586+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:44.386+0000 7f10b259c700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:48.487 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:44.386+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:45.988+0000 7f10b259c700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:45.988+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:47.416+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mgrmap(e 10) v1 ==== 44859+0+0 (secure 0 0 0) 0x7f10a40044a0 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:47.417+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:47.417+0000 7f109bfff700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.421+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f10a4026c30 con 0x7f10b419ff40 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.421+0000 7f109bfff700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.421+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f109c03b0c0 con 0x7f109c038500 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.422+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.422+0000 7f10b259c700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f10ac003de0 tx=0x7f10ac0073e0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.424+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== 
command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f109c03b0c0 con 0x7f109c038500 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.427+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f10b41a12c0 con 0x7f109c038500 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.429+0000 7f109bfff700 1 -- 192.168.123.106:0/2226817147 <== mgr.14164 v2:192.168.123.106:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f10b41a12c0 con 0x7f109c038500 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.429+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 msgr2=0x7f109c03a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f10ac003de0 tx=0x7f10ac0073e0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 msgr2=0x7f10b41a0350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 
crypto rx=0x7f10a4008c10 tx=0x7f10a4008cf0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 shutdown_connections 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c038500 0x7f109c03a9b0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.487 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 --2- 192.168.123.106:0/2226817147 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b419ff40 0x7f10b41a0350 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.488 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 >> 192.168.123.106:0/2226817147 conn(0x7f10b406c9d0 msgr2=0x7f10b406d450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:48.488 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 shutdown_connections 2026-03-09T17:24:48.488 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.430+0000 7f10b8a52700 1 -- 192.168.123.106:0/2226817147 wait complete. 2026-03-09T17:24:48.488 INFO:teuthology.orchestra.run.vm06.stdout:mgr epoch 9 is available 2026-03-09T17:24:48.488 INFO:teuthology.orchestra.run.vm06.stdout:Generating a dashboard self-signed certificate... 
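The "Waiting for mgr epoch 9..." / "mgr epoch 9 is available" lines above come from the upgrade task polling `ceph mgr stat` (whose JSON output, including `mgrmap_epoch` and `initialized`, appears earlier in this log) until the mgrmap epoch catches up after the mgr restart. A minimal sketch of such a polling loop — the function name and retry parameters here are illustrative, not cephadm's actual implementation:

```python
import time

def wait_for_mgr_epoch(get_stat, target_epoch, timeout=60.0, interval=1.0,
                       clock=time.monotonic, sleep=time.sleep):
    """Poll get_stat() -- a callable returning the parsed `ceph mgr stat`
    JSON, e.g. {"mgrmap_epoch": 11, "initialized": true} -- until the
    mgrmap epoch reaches target_epoch and the mgr reports initialized,
    or raise TimeoutError. clock/sleep are injectable for testing."""
    deadline = clock() + timeout
    while True:
        stat = get_stat()
        if stat.get("mgrmap_epoch", 0) >= target_epoch and stat.get("initialized"):
            return stat
        if clock() >= deadline:
            raise TimeoutError(f"mgr epoch {target_epoch} not reached")
        sleep(interval)
```

Note that, as in the log (target epoch 9, observed epoch 11), the loop accepts any epoch at or beyond the target, since the mgrmap may advance further while the poller sleeps.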
2026-03-09T17:24:48.771 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:48 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:24:48.771 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:48 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:24:48.771 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:48 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:48.771 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:48 vm06 ceph-mon[57307]: mgrmap e11: vm06.pbgzei(active, since 1.11604s) 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.644+0000 7ff7a8ad1700 1 Processor -- start 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.644+0000 7ff7a8ad1700 1 -- start start 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.645+0000 7ff7a8ad1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.646+0000 7ff7a259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:48.898 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.646+0000 7ff7a259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54430/0 (socket says 192.168.123.106:54430) 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.646+0000 7ff7a259c700 1 -- 192.168.123.106:0/2761958818 learned_addr learned my addr 192.168.123.106:0/2761958818 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.646+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7a4104d00 con 0x7ff7a41043b0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.647+0000 7ff7a259c700 1 -- 192.168.123.106:0/2761958818 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7a4104e40 con 0x7ff7a41043b0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.647+0000 7ff7a259c700 1 --2- 192.168.123.106:0/2761958818 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff794009cf0 tx=0x7ff79400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a3861139df941c0b server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.649+0000 7ff7a159a700 1 -- 192.168.123.106:0/2761958818 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff794004030 con 0x7ff7a41043b0 
2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.649+0000 7ff7a159a700 1 -- 192.168.123.106:0/2761958818 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff79400b810 con 0x7ff7a41043b0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.649+0000 7ff7a159a700 1 -- 192.168.123.106:0/2761958818 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff794003b10 con 0x7ff7a41043b0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.650+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 msgr2=0x7ff7a41047c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.650+0000 7ff7a8ad1700 1 --2- 192.168.123.106:0/2761958818 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff794009cf0 tx=0x7ff79400b0e0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.650+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 shutdown_connections 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.650+0000 7ff7a8ad1700 1 --2- 192.168.123.106:0/2761958818 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a41047c0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.650+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 >> 192.168.123.106:0/2761958818 
conn(0x7ff7a40ffa60 msgr2=0x7ff7a4101e70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.651+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 shutdown_connections 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.651+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/2761958818 wait complete. 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.652+0000 7ff7a8ad1700 1 Processor -- start 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.652+0000 7ff7a8ad1700 1 -- start start 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.652+0000 7ff7a8ad1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.652+0000 7ff7a259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:48.898 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.652+0000 7ff7a259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54438/0 (socket says 192.168.123.106:54438) 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.653+0000 7ff7a259c700 1 -- 
192.168.123.106:0/638848840 learned_addr learned my addr 192.168.123.106:0/638848840 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.653+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7a4197a90 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.653+0000 7ff7a259c700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff794009740 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.653+0000 7ff7a259c700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff794009cc0 tx=0x7ff794011ac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.653+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff7940036a0 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.654+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff7a4197c90 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.654+0000 7ff7a8ad1700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff7a41980d0 con 
0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.655+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff794011cd0 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.655+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff79401a6d0 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.656+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7ff79401a8f0 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.656+0000 7ff7937fe700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 0x7ff78c03edb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.656+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff794052600 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.656+0000 7ff7a1d9b700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 0x7ff78c03edb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:48.657+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff784005320 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.660+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff794053050 con 0x7ff7a41043b0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.660+0000 7ff7a1d9b700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 0x7ff78c03edb0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff798006fd0 tx=0x7ff798006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.768+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7ff784000bf0 con 0x7ff78c03c900 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.848+0000 7ff7937fe700 1 -- 192.168.123.106:0/638848840 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff784000bf0 con 0x7ff78c03c900 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.850+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 msgr2=0x7ff78c03edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.850+0000 7ff7917fa700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 0x7ff78c03edb0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff798006fd0 tx=0x7ff798006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.850+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 msgr2=0x7ff7a4197550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.850+0000 7ff7917fa700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff794009cc0 tx=0x7ff794011ac0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 shutdown_connections 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78c03c900 0x7ff78c03edb0 secure :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff798006fd0 tx=0x7ff798006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 --2- 192.168.123.106:0/638848840 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff7a41043b0 0x7ff7a4197550 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:48.899 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 >> 192.168.123.106:0/638848840 conn(0x7ff7a40ffa60 msgr2=0x7ff7a418be10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 shutdown_connections 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:48.851+0000 7ff7917fa700 1 -- 192.168.123.106:0/638848840 wait complete. 2026-03-09T17:24:48.899 INFO:teuthology.orchestra.run.vm06.stdout:Creating initial admin user... 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$7fqTQVeoIi1QYv6f7al3nu6BDsdkuXFDagBGqqwRg6FSkaofrycK6", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773077089, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb337fff700 1 Processor -- start 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb337fff700 1 -- start start 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb337fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb337fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3380715a0 con 0x7fb338072a40 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: 
stderr 2026-03-09T17:24:49.044+0000 7fb336ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb336ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54454/0 (socket says 192.168.123.106:54454) 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.044+0000 7fb336ffd700 1 -- 192.168.123.106:0/2636755869 learned_addr learned my addr 192.168.123.106:0/2636755869 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.046+0000 7fb336ffd700 1 -- 192.168.123.106:0/2636755869 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3380716e0 con 0x7fb338072a40 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.046+0000 7fb336ffd700 1 --2- 192.168.123.106:0/2636755869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb33000b0d0 tx=0x7fb33000b490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=829bbafe71b658ef server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.046+0000 7fb335ffb700 1 -- 192.168.123.106:0/2636755869 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb33000e070 
con 0x7fb338072a40 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.046+0000 7fb335ffb700 1 -- 192.168.123.106:0/2636755869 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb330003a20 con 0x7fb338072a40 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 -- 192.168.123.106:0/2636755869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 msgr2=0x7fb338071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 --2- 192.168.123.106:0/2636755869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb33000b0d0 tx=0x7fb33000b490 comp rx=0 tx=0).stop 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 -- 192.168.123.106:0/2636755869 shutdown_connections 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 --2- 192.168.123.106:0/2636755869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb338072a40 0x7fb338071060 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 -- 192.168.123.106:0/2636755869 >> 192.168.123.106:0/2636755869 conn(0x7fb33806c9d0 msgr2=0x7fb33806ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 -- 192.168.123.106:0/2636755869 shutdown_connections 
2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.047+0000 7fb337fff700 1 -- 192.168.123.106:0/2636755869 wait complete. 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb337fff700 1 Processor -- start 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb337fff700 1 -- start start 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb337fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb337fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb330004510 con 0x7fb3381a8710 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb336ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb336ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54458/0 (socket says 192.168.123.106:54458) 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.048+0000 7fb336ffd700 1 -- 
192.168.123.106:0/1312778224 learned_addr learned my addr 192.168.123.106:0/1312778224 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb336ffd700 1 -- 192.168.123.106:0/1312778224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb330009d20 con 0x7fb3381a8710 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb336ffd700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb330003c60 tx=0x7fb330003d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.403 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb33000e070 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb330009ba0 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3381a9060 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.050+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7fb3381abd70 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.049+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb330012660 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fb330019040 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb31ffff700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 0x7fb32003a520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb330050600 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb3367fc700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 0x7fb32003a520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb3367fc700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 0x7fb32003a520 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb32c009990 tx=0x7fb32c006e30 comp rx=0 
tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.051+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb338062380 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.059+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb330017080 con 0x7fb3381a8710 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.187+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7fb3381abf60 con 0x7fb320038070 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.366+0000 7fb31ffff700 1 -- 192.168.123.106:0/1312778224 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7fb3381abf60 con 0x7fb320038070 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.368+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 msgr2=0x7fb32003a520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.368+0000 7fb337fff700 1 --2- 192.168.123.106:0/1312778224 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 0x7fb32003a520 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb32c009990 tx=0x7fb32c006e30 comp rx=0 tx=0).stop 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 msgr2=0x7fb3381a8b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb330003c60 tx=0x7fb330003d40 comp rx=0 tx=0).stop 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 shutdown_connections 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb320038070 0x7fb32003a520 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 --2- 192.168.123.106:0/1312778224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3381a8710 0x7fb3381a8b20 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 >> 192.168.123.106:0/1312778224 conn(0x7fb33806c9d0 msgr2=0x7fb33806d450 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 shutdown_connections 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.369+0000 7fb337fff700 1 -- 192.168.123.106:0/1312778224 wait complete. 2026-03-09T17:24:49.404 INFO:teuthology.orchestra.run.vm06.stdout:Fetching dashboard port number... 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:48] ENGINE Bus STARTING 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:48] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:48] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: [09/Mar/2026:17:24:48] ENGINE Bus STARTED 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:24:49.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:49 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stdout 8443 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.545+0000 7f54692eb700 1 Processor -- start 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.545+0000 7f54692eb700 1 -- start start 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f54692eb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f54692eb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5464106bc0 con 0x7f5464106270 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f5462ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f5462ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54460/0 (socket says 192.168.123.106:54460) 2026-03-09T17:24:49.718 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f5462ffd700 1 -- 192.168.123.106:0/2083391976 learned_addr learned my addr 192.168.123.106:0/2083391976 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.546+0000 7f5462ffd700 1 -- 192.168.123.106:0/2083391976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5464106d00 con 0x7f5464106270 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.547+0000 7f5462ffd700 1 --2- 192.168.123.106:0/2083391976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f544c009a90 tx=0x7f544c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9064eeeae371b096 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.547+0000 7f5461ffb700 1 -- 192.168.123.106:0/2083391976 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f544c004030 con 0x7f5464106270 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.547+0000 7f5461ffb700 1 -- 192.168.123.106:0/2083391976 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f544c00b7e0 con 0x7f5464106270 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.547+0000 7f5461ffb700 1 -- 192.168.123.106:0/2083391976 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f544c003ae0 con 0x7f5464106270 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 -- 
192.168.123.106:0/2083391976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 msgr2=0x7f5464106680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 --2- 192.168.123.106:0/2083391976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f544c009a90 tx=0x7f544c009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 -- 192.168.123.106:0/2083391976 shutdown_connections 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 --2- 192.168.123.106:0/2083391976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5464106270 0x7f5464106680 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 -- 192.168.123.106:0/2083391976 >> 192.168.123.106:0/2083391976 conn(0x7f54640ffa10 msgr2=0x7f5464101e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 -- 192.168.123.106:0/2083391976 shutdown_connections 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 -- 192.168.123.106:0/2083391976 wait complete. 
2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.548+0000 7f54692eb700 1 Processor -- start 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f54692eb700 1 -- start start 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f54692eb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f54692eb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f546419c7d0 con 0x7f546419be80 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f5462ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.718 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f5462ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54466/0 (socket says 192.168.123.106:54466) 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f5462ffd700 1 -- 192.168.123.106:0/1515162303 learned_addr learned my addr 192.168.123.106:0/1515162303 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:49.719 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f5462ffd700 1 -- 192.168.123.106:0/1515162303 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f544c009740 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.549+0000 7f5462ffd700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f544c0038b0 tx=0x7f544c00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.550+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f544c003fa0 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.550+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f544c01b440 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.550+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f544c011420 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.550+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f546419c9d0 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.550+0000 7f54692eb700 1 
-- 192.168.123.106:0/1515162303 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f546419f630 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.551+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f544c0115a0 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.551+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5464062380 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.551+0000 7f545bfff700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 0x7f545003a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.551+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f544c04d600 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.551+0000 7f54627fc700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 0x7f545003a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.553+0000 7f54627fc700 1 --2- 192.168.123.106:0/1515162303 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 0x7f545003a8a0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f5454006fd0 tx=0x7f5454006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.555+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f544c02e3e0 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.659+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7f546419fb60 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.660+0000 7f545bfff700 1 -- 192.168.123.106:0/1515162303 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7f544c02e3e0 con 0x7f546419be80 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 msgr2=0x7f545003a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 0x7f545003a8a0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f5454006fd0 tx=0x7f5454006e40 
comp rx=0 tx=0).stop 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 msgr2=0x7f546419c290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f544c0038b0 tx=0x7f544c00bfa0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 shutdown_connections 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f54500383f0 0x7f545003a8a0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 --2- 192.168.123.106:0/1515162303 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f546419be80 0x7f546419c290 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 >> 192.168.123.106:0/1515162303 conn(0x7f54640ffa10 msgr2=0x7f5464101020 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.663+0000 7f54692eb700 1 -- 
192.168.123.106:0/1515162303 shutdown_connections 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.664+0000 7f54692eb700 1 -- 192.168.123.106:0/1515162303 wait complete. 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:firewalld does not appear to be present 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:Ceph Dashboard is now available at: 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout: URL: https://vm06.local:8443/ 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout: User: admin 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout: Password: zmlj4vub4f 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:Saving cluster configuration to /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config directory 2026-03-09T17:24:49.719 INFO:teuthology.orchestra.run.vm06.stdout:Enabling autotune for osd_memory_target 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7fdbb700 1 Processor -- start 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7fdbb700 1 -- start start 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7fdbb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7fdbb700 1 -- --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d7807d130 con 0x7f4d7807c7e0 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7db57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7db57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54478/0 (socket says 192.168.123.106:54478) 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.855+0000 7f4d7db57700 1 -- 192.168.123.106:0/2122540591 learned_addr learned my addr 192.168.123.106:0/2122540591 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.856+0000 7f4d7db57700 1 -- 192.168.123.106:0/2122540591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d7807d270 con 0x7f4d7807c7e0 2026-03-09T17:24:50.031 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.856+0000 7f4d7db57700 1 --2- 192.168.123.106:0/2122540591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f4d68009a90 tx=0x7f4d68009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3494485113df13a5 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.032 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7cb55700 1 -- 192.168.123.106:0/2122540591 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d68004030 con 0x7f4d7807c7e0 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7cb55700 1 -- 192.168.123.106:0/2122540591 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d6800b7e0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7cb55700 1 -- 192.168.123.106:0/2122540591 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d68003ae0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/2122540591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 msgr2=0x7f4d7807cbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/2122540591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f4d68009a90 tx=0x7f4d68009da0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/2122540591 shutdown_connections 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/2122540591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7807cbf0 unknown :-1 s=CLOSED pgs=88 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/2122540591 >> 192.168.123.106:0/2122540591 conn(0x7f4d7807b4b0 msgr2=0x7f4d7807b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.857+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/2122540591 shutdown_connections 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/2122540591 wait complete. 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7fdbb700 1 Processor -- start 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7fdbb700 1 -- start start 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7fdbb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7fdbb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d781a04a0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7db57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.032 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7db57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54480/0 (socket says 192.168.123.106:54480) 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.858+0000 7f4d7db57700 1 -- 192.168.123.106:0/4011016871 learned_addr learned my addr 192.168.123.106:0/4011016871 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:50.032 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d7db57700 1 -- 192.168.123.106:0/4011016871 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d68009740 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d7db57700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d68009710 tx=0x7f4d6800bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d680041a0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d68004300 con 0x7f4d7807c7e0 
2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d680114a0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d781a06a0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.859+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d781a0b40 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.860+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f4d68011690 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.861+0000 7f4d6effd700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 0x7f4d6403aa80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.861+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f4d6804d1c0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.861+0000 7f4d7d356700 1 --2- 192.168.123.106:0/4011016871 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 0x7f4d6403aa80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.861+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d78199790 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.865+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4d6802b430 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.865+0000 7f4d7d356700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 0x7f4d6403aa80 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f4d74006fd0 tx=0x7f4d74006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.972+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f4d7804f9e0 con 0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.974+0000 7f4d6effd700 1 -- 192.168.123.106:0/4011016871 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f4d68018b40 con 
0x7f4d7807c7e0 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 msgr2=0x7f4d6403aa80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 0x7f4d6403aa80 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f4d74006fd0 tx=0x7f4d74006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 msgr2=0x7f4d7819ff60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d68009710 tx=0x7f4d6800bfa0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 shutdown_connections 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4d640385d0 0x7f4d6403aa80 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 
2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 --2- 192.168.123.106:0/4011016871 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d7807c7e0 0x7f4d7819ff60 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.977+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 >> 192.168.123.106:0/4011016871 conn(0x7f4d7807b4b0 msgr2=0x7f4d78106960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.978+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 shutdown_connections 2026-03-09T17:24:50.033 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:49.978+0000 7f4d7fdbb700 1 -- 192.168.123.106:0/4011016871 wait complete. 2026-03-09T17:24:50.640 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:50 vm06 ceph-mon[57307]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:50.640 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:50 vm06 ceph-mon[57307]: from='client.14178 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:24:50.640 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:50 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/1515162303' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-09T17:24:50.640 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:50 vm06 ceph-mon[57307]: mgrmap e12: vm06.pbgzei(active, since 2s) 2026-03-09T17:24:50.640 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:50 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/2834715927' entity='client.admin' 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.165+0000 7f7c3b231700 1 Processor -- start 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.165+0000 7f7c3b231700 1 -- start start 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c3b231700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c3b231700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c340745b0 con 0x7f7c34104620 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c38fcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c38fcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54486/0 (socket says 192.168.123.106:54486) 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c38fcd700 1 -- 192.168.123.106:0/4174906240 learned_addr learned my addr 192.168.123.106:0/4174906240 (peer_addr_for_me v2:192.168.123.106:0/0) 
2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.166+0000 7f7c38fcd700 1 -- 192.168.123.106:0/4174906240 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c340746f0 con 0x7f7c34104620 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.167+0000 7f7c38fcd700 1 --2- 192.168.123.106:0/4174906240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f7c24009cf0 tx=0x7f7c2400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9c2577dd4f2d6a3d server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.167+0000 7f7c337fe700 1 -- 192.168.123.106:0/4174906240 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7c24004030 con 0x7f7c34104620 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.167+0000 7f7c337fe700 1 -- 192.168.123.106:0/4174906240 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7c2400b810 con 0x7f7c34104620 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 -- 192.168.123.106:0/4174906240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 msgr2=0x7f7c34106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 --2- 192.168.123.106:0/4174906240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f7c24009cf0 tx=0x7f7c2400b0e0 comp rx=0 
tx=0).stop 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 -- 192.168.123.106:0/4174906240 shutdown_connections 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 --2- 192.168.123.106:0/4174906240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c34106a40 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 -- 192.168.123.106:0/4174906240 >> 192.168.123.106:0/4174906240 conn(0x7f7c34100270 msgr2=0x7f7c341026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 -- 192.168.123.106:0/4174906240 shutdown_connections 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.168+0000 7f7c3b231700 1 -- 192.168.123.106:0/4174906240 wait complete. 
2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.169+0000 7f7c3b231700 1 Processor -- start 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.169+0000 7f7c3b231700 1 -- start start 2026-03-09T17:24:50.646 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.169+0000 7f7c3b231700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.169+0000 7f7c38fcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.170+0000 7f7c38fcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54500/0 (socket says 192.168.123.106:54500) 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.170+0000 7f7c38fcd700 1 -- 192.168.123.106:0/2834715927 learned_addr learned my addr 192.168.123.106:0/2834715927 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.170+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c340745b0 con 0x7f7c34104620 2026-03-09T17:24:50.647 
INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.170+0000 7f7c38fcd700 1 -- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c24009740 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.170+0000 7f7c38fcd700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f7c24006e90 tx=0x7f7c24003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.171+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7c24003f40 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.171+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7c24004580 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.171+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7c2401ae60 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.171+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c341a09b0 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.171+0000 7f7c3b231700 1 
-- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c341a0e50 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.172+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f7c24011420 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.172+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c3419a910 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.172+0000 7f7c31ffb700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 0x7f7c1c03a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.172+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f7c2404bfd0 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.172+0000 7f7c33fff700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 0x7f7c1c03a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.175+0000 7f7c33fff700 1 --2- 192.168.123.106:0/2834715927 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 0x7f7c1c03a910 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f7c28006fd0 tx=0x7f7c28006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.175+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7c24010970 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.339+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f7c3402cce0 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.374+0000 7f7c31ffb700 1 -- 192.168.123.106:0/2834715927 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f7c24010380 con 0x7f7c34104620 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 msgr2=0x7f7c1c03a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 0x7f7c1c03a910 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto 
rx=0x7f7c28006fd0 tx=0x7f7c28006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 msgr2=0x7f7c341a0470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f7c24006e90 tx=0x7f7c24003d30 comp rx=0 tx=0).stop 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 shutdown_connections 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c1c038460 0x7f7c1c03a910 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.377+0000 7f7c3b231700 1 --2- 192.168.123.106:0/2834715927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7c34104620 0x7f7c341a0470 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.378+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 >> 192.168.123.106:0/2834715927 conn(0x7f7c34100270 msgr2=0x7f7c341026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.378+0000 
7f7c3b231700 1 -- 192.168.123.106:0/2834715927 shutdown_connections 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr 2026-03-09T17:24:50.378+0000 7f7c3b231700 1 -- 192.168.123.106:0/2834715927 wait complete. 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:Or, if you are only running a single cluster on this host: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: ceph telemetry on 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout:For more information see: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.647 INFO:teuthology.orchestra.run.vm06.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-09T17:24:50.648 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:24:50.648 INFO:teuthology.orchestra.run.vm06.stdout:Bootstrap 
complete. 2026-03-09T17:24:50.678 INFO:tasks.cephadm:Fetching config... 2026-03-09T17:24:50.678 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:24:50.678 DEBUG:teuthology.orchestra.run.vm06:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-09T17:24:50.758 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-09T17:24:50.758 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:24:50.758 DEBUG:teuthology.orchestra.run.vm06:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-09T17:24:50.820 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-09T17:24:50.820 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:24:50.820 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/keyring of=/dev/stdout 2026-03-09T17:24:50.890 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-09T17:24:50.890 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:24:50.890 DEBUG:teuthology.orchestra.run.vm06:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-09T17:24:50.950 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-09T17:24:50.950 DEBUG:teuthology.orchestra.run.vm06:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDODie6fu/M7FlzARj/8VPaLI20flGNbea4ppO4avQFGEGi0UoE07yY6gGjj3JAtobLEKkp0qbtJAkcmx8OcUTvAhKrU0SQhlqLu2yLmiEs4rPw3NAsNkqe27oW2EgbEENgk9lEFMiAmEo4cT0elziaRlMoxOhbhC0/LRq9gmIqNknxIXpJuy8owPK2VKLPI+6mfwUJKJ3Oq6fITi0u3BCOb0+GpQKHWZUNWk1q1Rp6OXCar7eya9PLtMRmQ3lLQWLbCapmSUU1hI43XyD8XqOcW1hADGTNpV1rktu4Bz3rqsE4lnXXfdlhrlL168suh9mTuUUWx/g8aF5cYJCjj5ou/+xghtBiGRF87eoXdR8UUnWQ5vmKsaHuCsuBXI9OzKVmc7S/nk7DDLU+vGxJAmSWB//fp+eu8XcKqDOEHxPGWmhY9Ze0HPt+Da33pZIZw30qZgYds30KcTDfv3sFnP7NIckMpPeN2X01uRUHZ0zWf8kZ69hjTRJh370z1zCoJT0= ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T17:24:51.031 INFO:teuthology.orchestra.run.vm06.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDODie6fu/M7FlzARj/8VPaLI20flGNbea4ppO4avQFGEGi0UoE07yY6gGjj3JAtobLEKkp0qbtJAkcmx8OcUTvAhKrU0SQhlqLu2yLmiEs4rPw3NAsNkqe27oW2EgbEENgk9lEFMiAmEo4cT0elziaRlMoxOhbhC0/LRq9gmIqNknxIXpJuy8owPK2VKLPI+6mfwUJKJ3Oq6fITi0u3BCOb0+GpQKHWZUNWk1q1Rp6OXCar7eya9PLtMRmQ3lLQWLbCapmSUU1hI43XyD8XqOcW1hADGTNpV1rktu4Bz3rqsE4lnXXfdlhrlL168suh9mTuUUWx/g8aF5cYJCjj5ou/+xghtBiGRF87eoXdR8UUnWQ5vmKsaHuCsuBXI9OzKVmc7S/nk7DDLU+vGxJAmSWB//fp+eu8XcKqDOEHxPGWmhY9Ze0HPt+Da33pZIZw30qZgYds30KcTDfv3sFnP7NIckMpPeN2X01uRUHZ0zWf8kZ69hjTRJh370z1zCoJT0= ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:24:51.043 DEBUG:teuthology.orchestra.run.vm09:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDODie6fu/M7FlzARj/8VPaLI20flGNbea4ppO4avQFGEGi0UoE07yY6gGjj3JAtobLEKkp0qbtJAkcmx8OcUTvAhKrU0SQhlqLu2yLmiEs4rPw3NAsNkqe27oW2EgbEENgk9lEFMiAmEo4cT0elziaRlMoxOhbhC0/LRq9gmIqNknxIXpJuy8owPK2VKLPI+6mfwUJKJ3Oq6fITi0u3BCOb0+GpQKHWZUNWk1q1Rp6OXCar7eya9PLtMRmQ3lLQWLbCapmSUU1hI43XyD8XqOcW1hADGTNpV1rktu4Bz3rqsE4lnXXfdlhrlL168suh9mTuUUWx/g8aF5cYJCjj5ou/+xghtBiGRF87eoXdR8UUnWQ5vmKsaHuCsuBXI9OzKVmc7S/nk7DDLU+vGxJAmSWB//fp+eu8XcKqDOEHxPGWmhY9Ze0HPt+Da33pZIZw30qZgYds30KcTDfv3sFnP7NIckMpPeN2X01uRUHZ0zWf8kZ69hjTRJh370z1zCoJT0= ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-09T17:24:51.078 INFO:teuthology.orchestra.run.vm09.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDODie6fu/M7FlzARj/8VPaLI20flGNbea4ppO4avQFGEGi0UoE07yY6gGjj3JAtobLEKkp0qbtJAkcmx8OcUTvAhKrU0SQhlqLu2yLmiEs4rPw3NAsNkqe27oW2EgbEENgk9lEFMiAmEo4cT0elziaRlMoxOhbhC0/LRq9gmIqNknxIXpJuy8owPK2VKLPI+6mfwUJKJ3Oq6fITi0u3BCOb0+GpQKHWZUNWk1q1Rp6OXCar7eya9PLtMRmQ3lLQWLbCapmSUU1hI43XyD8XqOcW1hADGTNpV1rktu4Bz3rqsE4lnXXfdlhrlL168suh9mTuUUWx/g8aF5cYJCjj5ou/+xghtBiGRF87eoXdR8UUnWQ5vmKsaHuCsuBXI9OzKVmc7S/nk7DDLU+vGxJAmSWB//fp+eu8XcKqDOEHxPGWmhY9Ze0HPt+Da33pZIZw30qZgYds30KcTDfv3sFnP7NIckMpPeN2X01uRUHZ0zWf8kZ69hjTRJh370z1zCoJT0= ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:24:51.088 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-09T17:24:51.260 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.637+0000 7f107ffff700 1 -- 192.168.123.106:0/4228646524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f10800713c0 msgr2=0x7f10800717d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.637+0000 7f107ffff700 1 --2- 192.168.123.106:0/4228646524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10800713c0 0x7f10800717d0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f1070009b50 tx=0x7f1070009e60 comp rx=0 tx=0).stop 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.637+0000 7f107ffff700 1 -- 192.168.123.106:0/4228646524 shutdown_connections 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.637+0000 7f107ffff700 1 --2- 192.168.123.106:0/4228646524 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10800713c0 0x7f10800717d0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.637+0000 7f107ffff700 1 -- 192.168.123.106:0/4228646524 >> 192.168.123.106:0/4228646524 conn(0x7f108006cd30 msgr2=0x7f108006f180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:51.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.638+0000 7f107ffff700 1 -- 192.168.123.106:0/4228646524 shutdown_connections 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.638+0000 7f107ffff700 1 -- 192.168.123.106:0/4228646524 wait complete. 
2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.638+0000 7f107ffff700 1 Processor -- start 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107ffff700 1 -- start start 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107ffff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1070023070 con 0x7f108019b5f0 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:51.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54512/0 (socket says 192.168.123.106:54512) 2026-03-09T17:24:51.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107effd700 1 -- 192.168.123.106:0/1553946060 learned_addr learned my addr 192.168.123.106:0/1553946060 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:51.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.639+0000 7f107effd700 1 -- 192.168.123.106:0/1553946060 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10700097e0 con 0x7f108019b5f0 2026-03-09T17:24:51.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.640+0000 7f107effd700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f1070009790 tx=0x7f10700049c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:51.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.641+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f107002e070 con 0x7f108019b5f0 2026-03-09T17:24:51.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.641+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f108019bf40 con 0x7f108019b5f0 2026-03-09T17:24:51.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.641+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f108019ebd0 con 0x7f108019b5f0 2026-03-09T17:24:51.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.642+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f107001cb50 con 0x7f108019b5f0 2026-03-09T17:24:51.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.642+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1080195210 con 0x7f108019b5f0 2026-03-09T17:24:51.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.645+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 
<== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f107001f3f0 con 0x7f108019b5f0 2026-03-09T17:24:51.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.646+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f107001f610 con 0x7f108019b5f0 2026-03-09T17:24:51.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.646+0000 7f1067fff700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 0x7f106803aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:51.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.646+0000 7f107e7fc700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 0x7f106803aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:51.649 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.646+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f107005f9e0 con 0x7f108019b5f0 2026-03-09T17:24:51.649 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.647+0000 7f107e7fc700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 0x7f106803aa00 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f107800ad30 tx=0x7f10780093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:51.649 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.647+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f107005fd50 con 0x7f108019b5f0 2026-03-09T17:24:51.756 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.754+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f1080062380 con 0x7f108019b5f0 2026-03-09T17:24:51.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.759+0000 7f1067fff700 1 -- 192.168.123.106:0/1553946060 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f107001ed90 con 0x7f108019b5f0 2026-03-09T17:24:51.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.769+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 msgr2=0x7f106803aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.769+0000 7f107ffff700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 0x7f106803aa00 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f107800ad30 tx=0x7f10780093f0 comp rx=0 tx=0).stop 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.769+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 msgr2=0x7f108019ba00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.769+0000 7f107ffff700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f1070009790 
tx=0x7f10700049c0 comp rx=0 tx=0).stop 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.770+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 shutdown_connections 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.770+0000 7f107ffff700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f1068038550 0x7f106803aa00 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.770+0000 7f107ffff700 1 --2- 192.168.123.106:0/1553946060 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f108019b5f0 0x7f108019ba00 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:51.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.770+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 >> 192.168.123.106:0/1553946060 conn(0x7f108006cd30 msgr2=0x7f108006e660 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:51.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.772+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 shutdown_connections 2026-03-09T17:24:51.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:51.772+0000 7f107ffff700 1 -- 192.168.123.106:0/1553946060 wait complete. 
2026-03-09T17:24:51.867 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-09T17:24:51.867 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-09T17:24:52.045 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.363+0000 7febfdfff700 1 -- 192.168.123.106:0/587591937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 msgr2=0x7febf8102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.363+0000 7febfdfff700 1 --2- 192.168.123.106:0/587591937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8102640 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7febe0009b00 tx=0x7febe0009e10 comp rx=0 tx=0).stop 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 -- 192.168.123.106:0/587591937 shutdown_connections 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 --2- 192.168.123.106:0/587591937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8102640 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 -- 192.168.123.106:0/587591937 >> 192.168.123.106:0/587591937 conn(0x7febf80fd8d0 msgr2=0x7febf80ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:52.366 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 -- 192.168.123.106:0/587591937 shutdown_connections 2026-03-09T17:24:52.366 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 -- 192.168.123.106:0/587591937 wait complete. 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 Processor -- start 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.364+0000 7febfdfff700 1 -- start start 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febfdfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febfdfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febf81978c0 con 0x7febf8102230 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febf77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febf77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:54530/0 (socket says 192.168.123.106:54530) 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febf77fe700 1 -- 192.168.123.106:0/2786605189 learned_addr learned my addr 
192.168.123.106:0/2786605189 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:24:52.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febf77fe700 1 -- 192.168.123.106:0/2786605189 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7febe00097e0 con 0x7febf8102230 2026-03-09T17:24:52.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.365+0000 7febf77fe700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7febe0004750 tx=0x7febe0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.366+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7febe001c070 con 0x7febf8102230 2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.366+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7febe0021470 con 0x7febf8102230 2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.366+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7febe000f460 con 0x7febf8102230 2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.366+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7febf8197ac0 con 0x7febf8102230 2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.366+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7febf8197f60 con 0x7febf8102230
2026-03-09T17:24:52.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.367+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7febe000f5c0 con 0x7febf8102230
2026-03-09T17:24:52.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.368+0000 7febf4ff9700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 0x7febe403a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:52.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.368+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7febe004d3c0 con 0x7febf8102230
2026-03-09T17:24:52.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.368+0000 7febf6ffd700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 0x7febe403a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:52.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.368+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7febf8191080 con 0x7febf8102230
2026-03-09T17:24:52.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.370+0000 7febf6ffd700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 0x7febe403a920 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7febe8006fd0 tx=0x7febe8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:52.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.371+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7febe0026070 con 0x7febf8102230
2026-03-09T17:24:52.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.503+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7febf8061190 con 0x7febe4038470
2026-03-09T17:24:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.528+0000 7febf4ff9700 1 -- 192.168.123.106:0/2786605189 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7febf8061190 con 0x7febe4038470
2026-03-09T17:24:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.530+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 msgr2=0x7febe403a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.530+0000 7febfdfff700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 0x7febe403a920 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7febe8006fd0 tx=0x7febe8006e40 comp rx=0 tx=0).stop
2026-03-09T17:24:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.530+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 msgr2=0x7febf8197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.530+0000 7febfdfff700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7febe0004750 tx=0x7febe0005dc0 comp rx=0 tx=0).stop
2026-03-09T17:24:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.532+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 shutdown_connections
2026-03-09T17:24:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.532+0000 7febfdfff700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febe4038470 0x7febe403a920 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.532+0000 7febfdfff700 1 --2- 192.168.123.106:0/2786605189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febf8102230 0x7febf8197380 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.532+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 >> 192.168.123.106:0/2786605189 conn(0x7febf80fd8d0 msgr2=0x7febf80fe530 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:52.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.534+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 shutdown_connections
2026-03-09T17:24:52.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:52.534+0000 7febfdfff700 1 -- 192.168.123.106:0/2786605189 wait complete.
2026-03-09T17:24:52.597 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm09
2026-03-09T17:24:52.597 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:24:52.597 DEBUG:teuthology.orchestra.run.vm09:> dd of=/etc/ceph/ceph.conf
2026-03-09T17:24:52.616 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:24:52.616 DEBUG:teuthology.orchestra.run.vm09:> dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-09T17:24:52.677 INFO:tasks.cephadm:Adding host vm09 to orchestrator...
2026-03-09T17:24:52.677 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch host add vm09
2026-03-09T17:24:52.859 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/1553946060' entity='client.admin'
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: Deploying daemon ceph-exporter.vm06 on vm06
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:24:52.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:52 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:52.923 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:24:53.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.474+0000 7f09ca75c700 1 -- 192.168.123.106:0/3097376208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 msgr2=0x7f09c40717d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:53.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.474+0000 7f09ca75c700 1 --2- 192.168.123.106:0/3097376208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c40717d0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f09bc009b00 tx=0x7f09bc009e10 comp rx=0 tx=0).stop
2026-03-09T17:24:53.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.476+0000 7f09ca75c700 1 -- 192.168.123.106:0/3097376208 shutdown_connections
2026-03-09T17:24:53.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.476+0000 7f09ca75c700 1 --2- 192.168.123.106:0/3097376208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c40717d0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:53.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.476+0000 7f09ca75c700 1 -- 192.168.123.106:0/3097376208 >> 192.168.123.106:0/3097376208 conn(0x7f09c406cd30 msgr2=0x7f09c406f180 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:53.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.477+0000 7f09ca75c700 1 -- 192.168.123.106:0/3097376208 shutdown_connections
2026-03-09T17:24:53.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.477+0000 7f09ca75c700 1 -- 192.168.123.106:0/3097376208 wait complete.
2026-03-09T17:24:53.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.478+0000 7f09ca75c700 1 Processor -- start
2026-03-09T17:24:53.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.478+0000 7f09ca75c700 1 -- start start
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.478+0000 7f09ca75c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.478+0000 7f09ca75c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09c41a4310 con 0x7f09c40713c0
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.479+0000 7f09c975a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.479+0000 7f09c975a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56834/0 (socket says 192.168.123.106:56834)
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.479+0000 7f09c975a700 1 -- 192.168.123.106:0/678246206 learned_addr learned my addr 192.168.123.106:0/678246206 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.479+0000 7f09c975a700 1 -- 192.168.123.106:0/678246206 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09bc0097e0 con 0x7f09c40713c0
2026-03-09T17:24:53.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.479+0000 7f09c975a700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f09bc000c00 tx=0x7f09bc005330 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:53.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.480+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09bc01d070 con 0x7f09c40713c0
2026-03-09T17:24:53.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.480+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f09bc004530 con 0x7f09c40713c0
2026-03-09T17:24:53.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.480+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09bc003c30 con 0x7f09c40713c0
2026-03-09T17:24:53.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.480+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09c41a4510 con 0x7f09c40713c0
2026-03-09T17:24:53.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.480+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09c41a49b0 con 0x7f09c40713c0
2026-03-09T17:24:53.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.481+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f09bc005450 con 0x7f09c40713c0
2026-03-09T17:24:53.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.481+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f09c419e040 con 0x7f09c40713c0
2026-03-09T17:24:53.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.481+0000 7f09c27fc700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 0x7f09ac03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:53.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.481+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f09bc04bf80 con 0x7f09c40713c0
2026-03-09T17:24:53.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.482+0000 7f09c8f59700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 0x7f09ac03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:53.488 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.486+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f09bc029540 con 0x7f09c40713c0
2026-03-09T17:24:53.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.487+0000 7f09c8f59700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 0x7f09ac03a900 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f09b8006fd0 tx=0x7f09b8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:53.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.611+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm09", "target": ["mon-mgr", ""]}) v1 -- 0x7f09c4061190 con 0x7f09ac038450
2026-03-09T17:24:53.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:53.762+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f09bc02acb0 con 0x7f09c40713c0
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
2026-03-09T17:24:54.732 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:24:54.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: mgrmap e13: vm06.pbgzei(active, since 6s)
2026-03-09T17:24:54.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:54.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:54 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stdout:Added host 'vm09' with addr '192.168.123.109'
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.519+0000 7f09c27fc700 1 -- 192.168.123.106:0/678246206 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f09c4061190 con 0x7f09ac038450
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 msgr2=0x7f09ac03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 0x7f09ac03a900 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f09b8006fd0 tx=0x7f09b8006e40 comp rx=0 tx=0).stop
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 msgr2=0x7f09c41a3dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f09bc000c00 tx=0x7f09bc005330 comp rx=0 tx=0).stop
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 shutdown_connections
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09ac038450 0x7f09ac03a900 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 --2- 192.168.123.106:0/678246206 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09c40713c0 0x7f09c41a3dd0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 >> 192.168.123.106:0/678246206 conn(0x7f09c406cd30 msgr2=0x7f09c406fa10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 shutdown_connections
2026-03-09T17:24:55.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.522+0000 7f09ca75c700 1 -- 192.168.123.106:0/678246206 wait complete.
2026-03-09T17:24:55.582 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch host ls --format=json
2026-03-09T17:24:55.734 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:24:55.762 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:55 vm06 ceph-mon[57307]: Deploying daemon crash.vm06 on vm06
2026-03-09T17:24:55.762 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:55 vm06 ceph-mon[57307]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm09", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:24:55.762 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:55 vm06 ceph-mon[57307]: Deploying cephadm binary to vm09
2026-03-09T17:24:55.762 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:55 vm06 ceph-mon[57307]: Deploying daemon node-exporter.vm06 on vm06
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.989+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/2197852030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 msgr2=0x7fd7e4102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.989+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/2197852030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4102640 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fd7e0009b00 tx=0x7fd7e0009e10 comp rx=0 tx=0).stop
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/2197852030 shutdown_connections
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/2197852030 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4102640 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/2197852030 >> 192.168.123.106:0/2197852030 conn(0x7fd7e40fd8d0 msgr2=0x7fd7e40ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/2197852030 shutdown_connections
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/2197852030 wait complete.
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 Processor -- start
2026-03-09T17:24:55.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.990+0000 7fd7eb6cc700 1 -- start start
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7eb6cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7eb6cc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7e41978c0 con 0x7fd7e4102230
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7e9468700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7e9468700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56862/0 (socket says 192.168.123.106:56862)
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7e9468700 1 -- 192.168.123.106:0/144363828 learned_addr learned my addr 192.168.123.106:0/144363828 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7e9468700 1 -- 192.168.123.106:0/144363828 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7e00097e0 con 0x7fd7e4102230
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7e9468700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fd7e0004750 tx=0x7fd7e0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:55.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd7e001c070 con 0x7fd7e4102230
2026-03-09T17:24:55.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd7e0021470 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.991+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7e4197ac0 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.992+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd7e000f460 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.992+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7e4197f60 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.992+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fd7e000f5c0 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.993+0000 7fd7da7fc700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 0x7fd7d003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.993+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fd7e004d4a0 con 0x7fd7e4102230
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.993+0000 7fd7e8c67700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 0x7fd7d003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:55.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.993+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7e4191080 con 0x7fd7e4102230
2026-03-09T17:24:55.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.996+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd7e0026070 con 0x7fd7e4102230
2026-03-09T17:24:55.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:55.996+0000 7fd7e8c67700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 0x7fd7d003a9c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fd7d4006fd0 tx=0x7fd7d4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:56.117 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.115+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fd7e4061190 con 0x7fd7d0038510
2026-03-09T17:24:56.119 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.116+0000 7fd7da7fc700 1 -- 192.168.123.106:0/144363828 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7fd7e4061190 con 0x7fd7d0038510
2026-03-09T17:24:56.119 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:24:56.119 INFO:teuthology.orchestra.run.vm06.stdout:[{"addr": "192.168.123.106", "hostname": "vm06", "labels": [], "status": ""}, {"addr": "192.168.123.109", "hostname": "vm09", "labels": [], "status": ""}]
2026-03-09T17:24:56.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.119+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 msgr2=0x7fd7d003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:56.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.119+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 0x7fd7d003a9c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fd7d4006fd0 tx=0x7fd7d4006e40 comp rx=0 tx=0).stop
2026-03-09T17:24:56.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.119+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 msgr2=0x7fd7e4197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:56.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.119+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fd7e0004750 tx=0x7fd7e0005dc0 comp rx=0 tx=0).stop
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.119+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 shutdown_connections
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.120+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd7d0038510 0x7fd7d003a9c0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.120+0000 7fd7eb6cc700 1 --2- 192.168.123.106:0/144363828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd7e4102230 0x7fd7e4197380 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.120+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 >> 192.168.123.106:0/144363828 conn(0x7fd7e40fd8d0 msgr2=0x7fd7e40fe530 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.120+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 shutdown_connections
2026-03-09T17:24:56.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.120+0000 7fd7eb6cc700 1 -- 192.168.123.106:0/144363828 wait complete.
2026-03-09T17:24:56.192 INFO:tasks.cephadm:Setting crush tunables to default
2026-03-09T17:24:56.193 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd crush tunables default
2026-03-09T17:24:56.348 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:24:56.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.611+0000 7f55f0fc0700 1 -- 192.168.123.106:0/4038037490 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 msgr2=0x7f55ec102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:24:56.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.611+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/4038037490 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec102650 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f55d4009b00 tx=0x7f55d4009e10 comp rx=0 tx=0).stop
2026-03-09T17:24:56.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.612+0000 7f55f0fc0700 1 -- 192.168.123.106:0/4038037490 shutdown_connections
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.612+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/4038037490 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec102650 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.612+0000 7f55f0fc0700 1 -- 192.168.123.106:0/4038037490 >> 192.168.123.106:0/4038037490 conn(0x7f55ec0fd8d0 msgr2=0x7f55ec0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.612+0000 7f55f0fc0700 1 -- 192.168.123.106:0/4038037490 shutdown_connections
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.612+0000 7f55f0fc0700 1 -- 192.168.123.106:0/4038037490 wait complete.
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55f0fc0700 1 Processor -- start
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55f0fc0700 1 -- start start
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55f0fc0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55f0fc0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55ec197990 con 0x7f55ec102240
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55ea59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55ea59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56874/0 (socket says 192.168.123.106:56874)
2026-03-09T17:24:56.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55ea59c700 1 -- 192.168.123.106:0/715097951 learned_addr learned my addr 192.168.123.106:0/715097951 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:24:56.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55ea59c700 1 -- 192.168.123.106:0/715097951 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55d40097e0 con 0x7f55ec102240
2026-03-09T17:24:56.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.613+0000 7f55ea59c700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f55d4004d40 tx=0x7f55d4004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.614+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f55d401c070 con 0x7f55ec102240
2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.614+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f55d40056f0 con 0x7f55ec102240
2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.614+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0
0 0) 0x7f55d4017440 con 0x7f55ec102240 2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.614+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55ec197b90 con 0x7f55ec102240 2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.614+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55ec198030 con 0x7f55ec102240 2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.615+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f55d40175a0 con 0x7f55ec102240 2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.615+0000 7f55e37fe700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 0x7f55d803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:56.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.615+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f55d404d0e0 con 0x7f55ec102240 2026-03-09T17:24:56.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.615+0000 7f55e9d9b700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 0x7f55d803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:56.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.616+0000 7f55e9d9b700 1 --2- 192.168.123.106:0/715097951 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 0x7f55d803a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f55dc006fd0 tx=0x7f55dc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:56.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.616+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55ec191090 con 0x7f55ec102240 2026-03-09T17:24:56.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.619+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f55d4025070 con 0x7f55ec102240 2026-03-09T17:24:56.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:56 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:56.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:56 vm06 ceph-mon[57307]: Added host vm09 2026-03-09T17:24:56.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:56 vm06 ceph-mon[57307]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T17:24:56.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:56.735+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f55ec062380 con 0x7f55ec102240 2026-03-09T17:24:57.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.528+0000 7f55e37fe700 1 -- 192.168.123.106:0/715097951 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile 
to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f55ec062380 con 0x7f55ec102240 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 msgr2=0x7f55d803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 0x7f55d803a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f55dc006fd0 tx=0x7f55dc006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 msgr2=0x7f55ec197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f55d4004d40 tx=0x7f55d4004e20 comp rx=0 tx=0).stop 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 shutdown_connections 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/715097951 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f55d8038510 0x7f55d803a9c0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.531+0000 7f55f0fc0700 1 --2- 192.168.123.106:0/715097951 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55ec102240 0x7f55ec197450 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.532+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 >> 192.168.123.106:0/715097951 conn(0x7f55ec0fd8d0 msgr2=0x7f55ec0fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.532+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 shutdown_connections 2026-03-09T17:24:57.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:24:57.532+0000 7f55f0fc0700 1 -- 192.168.123.106:0/715097951 wait complete. 2026-03-09T17:24:57.539 INFO:teuthology.orchestra.run.vm06.stderr:adjusted tunables profile to default 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/715097951' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 ceph-mon[57307]: Deploying daemon alertmanager.vm06 on vm06 2026-03-09T17:24:57.576 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:57 vm06 
ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:24:57.607 INFO:tasks.cephadm:Adding mon.vm06 on vm06 2026-03-09T17:24:57.608 INFO:tasks.cephadm:Adding mon.vm09 on vm09 2026-03-09T17:24:57.608 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch apply mon '2;vm06:192.168.123.106=vm06;vm09:192.168.123.109=vm09' 2026-03-09T17:24:57.760 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:24:57.800 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:24:58.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:58 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/715097951' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-09T17:24:58.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:24:58 vm06 ceph-mon[57307]: osdmap e4: 0 total, 0 up, 0 in 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.124+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/604846337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 msgr2=0x7fc4b4100430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.124+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/604846337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4100430 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fc49c009b00 tx=0x7fc49c009e10 comp rx=0 tx=0).stop 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.126+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/604846337 shutdown_connections 2026-03-09T17:24:59.127 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.126+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/604846337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4100430 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.126+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/604846337 >> 192.168.123.109:0/604846337 conn(0x7fc4b40fb5b0 msgr2=0x7fc4b40fda00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.126+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/604846337 shutdown_connections 2026-03-09T17:24:59.127 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.126+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/604846337 wait complete. 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.127+0000 7fc4ba5a0700 1 Processor -- start 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.127+0000 7fc4ba5a0700 1 -- start start 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.127+0000 7fc4ba5a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.127+0000 7fc4ba5a0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4b4193510 con 0x7fc4b4100020 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.127+0000 7fc4b3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.128+0000 7fc4b3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:41964/0 (socket says 192.168.123.109:41964) 2026-03-09T17:24:59.128 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.128+0000 7fc4b3fff700 1 -- 192.168.123.109:0/1747490587 learned_addr learned my addr 192.168.123.109:0/1747490587 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:24:59.129 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.128+0000 7fc4b3fff700 1 -- 192.168.123.109:0/1747490587 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc49c0097e0 con 0x7fc4b4100020 2026-03-09T17:24:59.129 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.128+0000 7fc4b3fff700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fc49c004750 tx=0x7fc49c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:59.129 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.128+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc49c01c070 con 0x7fc4b4100020 2026-03-09T17:24:59.129 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.129+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4b4193710 con 0x7fc4b4100020 2026-03-09T17:24:59.129 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.129+0000 7fc4ba5a0700 1 -- 
192.168.123.109:0/1747490587 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4b4193bb0 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.129+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc49c021470 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.129+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc49c00f460 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.130+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc49c00f620 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.130+0000 7fc4b17fa700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 0x7fc4a003ee10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.130+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc49c04d450 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.130+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc494005320 con 0x7fc4b4100020 2026-03-09T17:24:59.131 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.130+0000 7fc4b37fe700 1 --2- 192.168.123.109:0/1747490587 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 0x7fc4a003ee10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:59.132 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.131+0000 7fc4b37fe700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 0x7fc4a003ee10 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fc4a4006fd0 tx=0x7fc4a4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:59.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.135+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc49c026070 con 0x7fc4b4100020 2026-03-09T17:24:59.252 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.251+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm06:192.168.123.106=vm06;vm09:192.168.123.109=vm09", "target": ["mon-mgr", ""]}) v1 -- 0x7fc494000c90 con 0x7fc4a003c960 2026-03-09T17:24:59.267 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.266+0000 7fc4b17fa700 1 -- 192.168.123.109:0/1747490587 <== mgr.14164 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fc494000c90 con 0x7fc4a003c960 2026-03-09T17:24:59.267 INFO:teuthology.orchestra.run.vm09.stdout:Scheduled mon update... 
2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 msgr2=0x7fc4a003ee10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 0x7fc4a003ee10 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fc4a4006fd0 tx=0x7fc4a4006e40 comp rx=0 tx=0).stop 2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 msgr2=0x7fc4b4192fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fc49c004750 tx=0x7fc49c005dc0 comp rx=0 tx=0).stop 2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 shutdown_connections 2026-03-09T17:24:59.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc4a003c960 0x7fc4a003ee10 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:59.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 --2- 192.168.123.109:0/1747490587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4b4100020 0x7fc4b4192fd0 
unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:59.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.268+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 >> 192.168.123.109:0/1747490587 conn(0x7fc4b40fb5b0 msgr2=0x7fc4b40fc280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:59.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.269+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 shutdown_connections 2026-03-09T17:24:59.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.269+0000 7fc4ba5a0700 1 -- 192.168.123.109:0/1747490587 wait complete. 2026-03-09T17:24:59.338 DEBUG:teuthology.orchestra.run.vm09:mon.vm09> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm09.service 2026-03-09T17:24:59.339 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:24:59.339 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:24:59.532 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:24:59.575 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.864+0000 7f6c69aea700 1 -- 192.168.123.109:0/364366189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 msgr2=0x7f6c64100430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.864+0000 7f6c69aea700 1 --2- 192.168.123.109:0/364366189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64100430 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f6c54009b00 tx=0x7f6c54009e10 comp rx=0 
tx=0).stop 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.865+0000 7f6c69aea700 1 -- 192.168.123.109:0/364366189 shutdown_connections 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.865+0000 7f6c69aea700 1 --2- 192.168.123.109:0/364366189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64100430 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.865+0000 7f6c69aea700 1 -- 192.168.123.109:0/364366189 >> 192.168.123.109:0/364366189 conn(0x7f6c640fb5b0 msgr2=0x7f6c640fda00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.866+0000 7f6c69aea700 1 -- 192.168.123.109:0/364366189 shutdown_connections 2026-03-09T17:24:59.866 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.866+0000 7f6c69aea700 1 -- 192.168.123.109:0/364366189 wait complete. 
2026-03-09T17:24:59.867 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.866+0000 7f6c69aea700 1 Processor -- start 2026-03-09T17:24:59.867 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.866+0000 7f6c69aea700 1 -- start start 2026-03-09T17:24:59.867 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.867+0000 7f6c69aea700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:59.868 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.867+0000 7f6c69aea700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c641934e0 con 0x7f6c64100020 2026-03-09T17:24:59.868 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.867+0000 7f6c637fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:59.868 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.867+0000 7f6c637fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:41980/0 (socket says 192.168.123.109:41980) 2026-03-09T17:24:59.868 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.867+0000 7f6c637fe700 1 -- 192.168.123.109:0/1746875272 learned_addr learned my addr 192.168.123.109:0/1746875272 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:24:59.868 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c637fe700 1 -- 192.168.123.109:0/1746875272 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c540097e0 con 0x7f6c64100020 2026-03-09T17:24:59.869 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c637fe700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f6c54004750 tx=0x7f6c54005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:59.870 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6c5401c070 con 0x7f6c64100020 2026-03-09T17:24:59.870 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6c54021470 con 0x7f6c64100020 2026-03-09T17:24:59.870 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6c5400f460 con 0x7f6c64100020 2026-03-09T17:24:59.870 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c641936e0 con 0x7f6c64100020 2026-03-09T17:24:59.870 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.868+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c64193b80 con 0x7f6c64100020 2026-03-09T17:24:59.871 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.869+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f6c5400f5c0 con 0x7f6c64100020 2026-03-09T17:24:59.871 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.870+0000 7f6c617fa700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 0x7f6c5003a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:24:59.871 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.870+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f6c5404d420 con 0x7f6c64100020 2026-03-09T17:24:59.871 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.870+0000 7f6c5bfff700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 0x7f6c5003a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:24:59.872 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.871+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c6418cc90 con 0x7f6c64100020 2026-03-09T17:24:59.872 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.871+0000 7f6c5bfff700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 0x7f6c5003a9f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6c4c006fd0 tx=0x7f6c4c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:24:59.875 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:24:59.874+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6c54026070 con 0x7f6c64100020 2026-03-09T17:25:00.035 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.034+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6c64062380 con 0x7f6c64100020 2026-03-09T17:25:00.036 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.035+0000 7f6c617fa700 1 -- 192.168.123.109:0/1746875272 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f6c54029720 con 0x7f6c64100020 2026-03-09T17:25:00.036 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:00.036 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:00.039 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 msgr2=0x7f6c5003a9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:00.039 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 0x7f6c5003a9f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6c4c006fd0 tx=0x7f6c4c006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:00.039 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 msgr2=0x7f6c64192fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:00.039 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f6c54004750 tx=0x7f6c54005dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 shutdown_connections 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6c50038540 0x7f6c5003a9f0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 --2- 192.168.123.109:0/1746875272 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c64100020 0x7f6c64192fa0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.038+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 >> 192.168.123.109:0/1746875272 conn(0x7f6c640fb5b0 msgr2=0x7f6c640fc280 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.039+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 shutdown_connections 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:00.039+0000 7f6c69aea700 1 -- 192.168.123.109:0/1746875272 wait complete. 2026-03-09T17:25:00.040 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:00.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:00 vm06 ceph-mon[57307]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm06:192.168.123.106=vm06;vm09:192.168.123.109=vm09", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:25:00.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:00 vm06 ceph-mon[57307]: Saving service mon spec with placement vm06:192.168.123.106=vm06;vm09:192.168.123.109=vm09;count:2 2026-03-09T17:25:00.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:00 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:00.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:00 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1746875272' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:01.099 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:01.099 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:01.262 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:01.297 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:01.579 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.577+0000 7fda0f77b700 1 -- 192.168.123.109:0/2271931126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08102240 msgr2=0x7fda08102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:01.579 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.577+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2271931126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08102240 0x7fda08102650 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7fd9f8009b00 tx=0x7fd9f8009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.578+0000 7fda0f77b700 1 -- 192.168.123.109:0/2271931126 shutdown_connections 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.578+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2271931126 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08102240 0x7fda08102650 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.578+0000 7fda0f77b700 1 -- 192.168.123.109:0/2271931126 >> 192.168.123.109:0/2271931126 conn(0x7fda080fd8d0 msgr2=0x7fda080ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.579+0000 7fda0f77b700 1 -- 192.168.123.109:0/2271931126 
shutdown_connections 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.579+0000 7fda0f77b700 1 -- 192.168.123.109:0/2271931126 wait complete. 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.579+0000 7fda0f77b700 1 Processor -- start 2026-03-09T17:25:01.580 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.580+0000 7fda0f77b700 1 -- start start 2026-03-09T17:25:01.581 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.580+0000 7fda0f77b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:01.581 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.580+0000 7fda0f77b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda08197f80 con 0x7fda08197630 2026-03-09T17:25:01.581 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.580+0000 7fda0d517700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:01.582 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.581+0000 7fda0d517700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:41994/0 (socket says 192.168.123.109:41994) 2026-03-09T17:25:01.582 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.581+0000 7fda0d517700 1 -- 192.168.123.109:0/2527410826 learned_addr learned my addr 192.168.123.109:0/2527410826 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:01.582 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.581+0000 7fda0d517700 1 -- 192.168.123.109:0/2527410826 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd9f80097e0 con 0x7fda08197630 2026-03-09T17:25:01.582 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.581+0000 7fda0d517700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fda081032a0 tx=0x7fd9f8004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.582+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd9f801c070 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.582+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd9f80056f0 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.582+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd9f8021e00 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.582+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda08198180 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.582+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda0819ade0 con 
0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.583+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fd9f800f460 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.584+0000 7fd9fe7fc700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 0x7fd9f403a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.584+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fd9f804d410 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.584+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fda08191090 con 0x7fda08197630 2026-03-09T17:25:01.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.584+0000 7fda0cd16700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 0x7fd9f403a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:01.588 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.587+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd9f8026030 con 0x7fda08197630 2026-03-09T17:25:01.588 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.587+0000 7fda0cd16700 1 --2- 
192.168.123.109:0/2527410826 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 0x7fd9f403a970 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fda04006fd0 tx=0x7fda04006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:01.740 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.739+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fda08062380 con 0x7fda08197630 2026-03-09T17:25:01.741 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.740+0000 7fd9fe7fc700 1 -- 192.168.123.109:0/2527410826 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fd9f801f730 con 0x7fda08197630 2026-03-09T17:25:01.741 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:01.741 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.742+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 msgr2=0x7fd9f403a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.742+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 0x7fd9f403a970 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fda04006fd0 tx=0x7fda04006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 msgr2=0x7fda08197a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fda081032a0 tx=0x7fd9f8004dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 shutdown_connections 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd9f40384c0 0x7fd9f403a970 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 --2- 192.168.123.109:0/2527410826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda08197630 0x7fda08197a40 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:01.744 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 >> 192.168.123.109:0/2527410826 conn(0x7fda080fd8d0 msgr2=0x7fda080fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 shutdown_connections 2026-03-09T17:25:01.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:01.743+0000 7fda0f77b700 1 -- 192.168.123.109:0/2527410826 wait complete. 2026-03-09T17:25:01.745 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard 
set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: Deploying daemon grafana.vm06 on vm06 2026-03-09T17:25:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:01 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/2527410826' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:02.796 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:02.796 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:02.942 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:02.984 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.262+0000 7f09e2d5b700 1 -- 192.168.123.109:0/44978156 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc100aa0 msgr2=0x7f09dc102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.262+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/44978156 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc100aa0 0x7f09dc102e80 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f09cc009b00 tx=0x7f09cc009e10 comp rx=0 tx=0).stop 
2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.263+0000 7f09e2d5b700 1 -- 192.168.123.109:0/44978156 shutdown_connections 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.263+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/44978156 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc100aa0 0x7f09dc102e80 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.264+0000 7f09e2d5b700 1 -- 192.168.123.109:0/44978156 >> 192.168.123.109:0/44978156 conn(0x7f09dc0fa4a0 msgr2=0x7f09dc0fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.264+0000 7f09e2d5b700 1 -- 192.168.123.109:0/44978156 shutdown_connections 2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.264+0000 7f09e2d5b700 1 -- 192.168.123.109:0/44978156 wait complete. 
2026-03-09T17:25:03.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.264+0000 7f09e2d5b700 1 Processor -- start 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.265+0000 7f09e2d5b700 1 -- start start 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.265+0000 7f09e2d5b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.265+0000 7f09e2d5b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09dc1007d0 con 0x7f09dc1003c0 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.265+0000 7f09e0af7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.266+0000 7f09e0af7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:42010/0 (socket says 192.168.123.109:42010) 2026-03-09T17:25:03.266 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.266+0000 7f09e0af7700 1 -- 192.168.123.109:0/2689030496 learned_addr learned my addr 192.168.123.109:0/2689030496 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:03.267 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.266+0000 7f09e0af7700 1 -- 192.168.123.109:0/2689030496 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09cc0097e0 con 0x7f09dc1003c0 2026-03-09T17:25:03.267 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.266+0000 7f09e0af7700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f09cc00b5c0 tx=0x7f09cc005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:03.268 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.267+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09cc01c070 con 0x7f09dc1003c0 2026-03-09T17:25:03.268 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.267+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09dc0fef90 con 0x7f09dc1003c0 2026-03-09T17:25:03.268 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.267+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09dc0ff430 con 0x7f09dc1003c0 2026-03-09T17:25:03.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.268+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f09cc021470 con 0x7f09dc1003c0 2026-03-09T17:25:03.269 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.268+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09cc00f460 con 0x7f09dc1003c0 2026-03-09T17:25:03.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.269+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f09cc00f5e0 con 0x7f09dc1003c0 2026-03-09T17:25:03.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.269+0000 7f09d9ffb700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 0x7f09c403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:03.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.269+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f09cc04d340 con 0x7f09dc1003c0 2026-03-09T17:25:03.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.269+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f09dc18ee60 con 0x7f09dc1003c0 2026-03-09T17:25:03.270 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.269+0000 7f09dbfff700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 0x7f09c403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:03.275 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.273+0000 7f09dbfff700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 0x7f09c403a9c0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f09d0006fd0 tx=0x7f09d0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:03.275 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.274+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f09cc026070 con 0x7f09dc1003c0 2026-03-09T17:25:03.429 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.428+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f09dc100910 con 0x7f09dc1003c0 2026-03-09T17:25:03.429 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.428+0000 7f09d9ffb700 1 -- 192.168.123.109:0/2689030496 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f09cc017440 con 0x7f09dc1003c0 2026-03-09T17:25:03.430 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:03.430 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 msgr2=0x7f09c403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:03.433 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 0x7f09c403a9c0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f09d0006fd0 tx=0x7f09d0006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 msgr2=0x7f09dc0fea50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f09cc00b5c0 tx=0x7f09cc005e70 comp rx=0 tx=0).stop 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 shutdown_connections 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f09c4038510 0x7f09c403a9c0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.432+0000 7f09e2d5b700 1 --2- 192.168.123.109:0/2689030496 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f09dc1003c0 0x7f09dc0fea50 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:03.433 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.433+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 >> 192.168.123.109:0/2689030496 conn(0x7f09dc0fa4a0 msgr2=0x7f09dc0fb170 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:25:03.434 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.433+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 shutdown_connections 2026-03-09T17:25:03.434 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:03.433+0000 7f09e2d5b700 1 -- 192.168.123.109:0/2689030496 wait complete. 2026-03-09T17:25:03.435 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:03.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:03 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:03 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/2689030496' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:04.481 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:04.482 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:04.643 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:04.686 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.966+0000 7ff9765c7700 1 -- 192.168.123.109:0/3879725106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 msgr2=0x7ff970102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.966+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3879725106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970102650 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff960009b00 tx=0x7ff960009e10 comp rx=0 
tx=0).stop 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.967+0000 7ff9765c7700 1 -- 192.168.123.109:0/3879725106 shutdown_connections 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.967+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3879725106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970102650 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.967+0000 7ff9765c7700 1 -- 192.168.123.109:0/3879725106 >> 192.168.123.109:0/3879725106 conn(0x7ff9700fd8d0 msgr2=0x7ff9700ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.967+0000 7ff9765c7700 1 -- 192.168.123.109:0/3879725106 shutdown_connections 2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.967+0000 7ff9765c7700 1 -- 192.168.123.109:0/3879725106 wait complete. 
2026-03-09T17:25:04.968 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff9765c7700 1 Processor -- start 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff9765c7700 1 -- start start 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff9765c7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff9765c7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff970197990 con 0x7ff970102240 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff96ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff96ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:42026/0 (socket says 192.168.123.109:42026) 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.968+0000 7ff96ffff700 1 -- 192.168.123.109:0/3034844523 learned_addr learned my addr 192.168.123.109:0/3034844523 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:04.969 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.969+0000 7ff96ffff700 1 -- 192.168.123.109:0/3034844523 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9600097e0 con 0x7ff970102240 2026-03-09T17:25:04.970 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.969+0000 7ff96ffff700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7ff960004d40 tx=0x7ff960004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:04.970 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.969+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff96001c070 con 0x7ff970102240 2026-03-09T17:25:04.970 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.969+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff970197b90 con 0x7ff970102240 2026-03-09T17:25:04.970 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.969+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff970198030 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.970+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff9600054e0 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.970+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff960003b70 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.971+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7ff96000f460 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.971+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff954005320 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.971+0000 7ff96dffb700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 0x7ff95003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.971+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff96004d390 con 0x7ff970102240 2026-03-09T17:25:04.972 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.971+0000 7ff967fff700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 0x7ff95003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:04.973 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.972+0000 7ff967fff700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 0x7ff95003a9c0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff958006fd0 tx=0x7ff958006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:04.976 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:04.975+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff960029980 con 0x7ff970102240 2026-03-09T17:25:05.157 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.155+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff954005190 con 0x7ff970102240 2026-03-09T17:25:05.157 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.156+0000 7ff96dffb700 1 -- 192.168.123.109:0/3034844523 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff960029390 con 0x7ff970102240 2026-03-09T17:25:05.157 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:05.158 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:05.160 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 msgr2=0x7ff95003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:05.160 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 0x7ff95003a9c0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff958006fd0 tx=0x7ff958006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:05.160 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 msgr2=0x7ff970197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:05.160 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7ff960004d40 tx=0x7ff960004e20 comp rx=0 tx=0).stop 2026-03-09T17:25:05.160 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 shutdown_connections 2026-03-09T17:25:05.160 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff950038510 0x7ff95003a9c0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:05.161 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 --2- 192.168.123.109:0/3034844523 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff970102240 0x7ff970197450 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:05.161 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 >> 192.168.123.109:0/3034844523 conn(0x7ff9700fd8d0 msgr2=0x7ff9700fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:25:05.161 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 shutdown_connections 2026-03-09T17:25:05.161 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:05.159+0000 7ff9765c7700 1 -- 192.168.123.109:0/3034844523 wait complete. 2026-03-09T17:25:05.161 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:05.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:05 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3034844523' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:06.217 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:06.218 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:06.382 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:06.429 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.700+0000 7fc6cac33700 1 -- 192.168.123.109:0/554550179 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 msgr2=0x7fc6c4102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.700+0000 7fc6cac33700 1 --2- 192.168.123.109:0/554550179 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4102650 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fc6b0009b00 tx=0x7fc6b0009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.701+0000 7fc6cac33700 1 -- 192.168.123.109:0/554550179 shutdown_connections 
2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.701+0000 7fc6cac33700 1 --2- 192.168.123.109:0/554550179 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4102650 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.701+0000 7fc6cac33700 1 -- 192.168.123.109:0/554550179 >> 192.168.123.109:0/554550179 conn(0x7fc6c40fd8d0 msgr2=0x7fc6c40ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.701+0000 7fc6cac33700 1 -- 192.168.123.109:0/554550179 shutdown_connections 2026-03-09T17:25:06.702 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.701+0000 7fc6cac33700 1 -- 192.168.123.109:0/554550179 wait complete. 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.702+0000 7fc6cac33700 1 Processor -- start 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.702+0000 7fc6cac33700 1 -- start start 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.702+0000 7fc6cac33700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.702+0000 7fc6cac33700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6c4197990 con 0x7fc6c4102240 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.702+0000 7fc6c89cf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6c89cf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32914/0 (socket says 192.168.123.109:32914) 2026-03-09T17:25:06.703 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6c89cf700 1 -- 192.168.123.109:0/1732416861 learned_addr learned my addr 192.168.123.109:0/1732416861 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:06.704 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6c89cf700 1 -- 192.168.123.109:0/1732416861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6b00097e0 con 0x7fc6c4102240 2026-03-09T17:25:06.704 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6c89cf700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc6b0004d40 tx=0x7fc6b0004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:06.704 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc6b001c070 con 0x7fc6c4102240 2026-03-09T17:25:06.704 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.703+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6c4197b90 con 0x7fc6c4102240 2026-03-09T17:25:06.704 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.704+0000 7fc6cac33700 1 
-- 192.168.123.109:0/1732416861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6c4198030 con 0x7fc6c4102240 2026-03-09T17:25:06.705 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.704+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc6b00056f0 con 0x7fc6c4102240 2026-03-09T17:25:06.705 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.704+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc6b0017440 con 0x7fc6c4102240 2026-03-09T17:25:06.706 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.705+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc6b00175a0 con 0x7fc6c4102240 2026-03-09T17:25:06.706 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.705+0000 7fc6c1ffb700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 0x7fc6b403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:06.706 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.705+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc6b004d150 con 0x7fc6c4102240 2026-03-09T17:25:06.706 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.705+0000 7fc6c3fff700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 0x7fc6b403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:06.706 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.705+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc6a8005320 con 0x7fc6c4102240 2026-03-09T17:25:06.707 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.706+0000 7fc6c3fff700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 0x7fc6b403a9c0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc6b8006fd0 tx=0x7fc6b8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:06.710 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.709+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc6b0025070 con 0x7fc6c4102240 2026-03-09T17:25:06.858 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.856+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc6a8005190 con 0x7fc6c4102240 2026-03-09T17:25:06.858 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.857+0000 7fc6c1ffb700 1 -- 192.168.123.109:0/1732416861 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc6b0028b90 con 0x7fc6c4102240 2026-03-09T17:25:06.859 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:06.859 
INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.861+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 msgr2=0x7fc6b403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.861+0000 7fc6cac33700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 0x7fc6b403a9c0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc6b8006fd0 tx=0x7fc6b8006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.861+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 msgr2=0x7fc6c4197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.861+0000 7fc6cac33700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 secure :-1 s=READY pgs=114 cs=0 
l=1 rev1=1 crypto rx=0x7fc6b0004d40 tx=0x7fc6b0004e20 comp rx=0 tx=0).stop 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.861+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 shutdown_connections 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.862+0000 7fc6cac33700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc6b4038510 0x7fc6b403a9c0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:06.862 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.862+0000 7fc6cac33700 1 --2- 192.168.123.109:0/1732416861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc6c4102240 0x7fc6c4197450 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:06.863 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.862+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 >> 192.168.123.109:0/1732416861 conn(0x7fc6c40fd8d0 msgr2=0x7fc6c40fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:06.863 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.862+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 shutdown_connections 2026-03-09T17:25:06.863 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:06.862+0000 7fc6cac33700 1 -- 192.168.123.109:0/1732416861 wait complete. 2026-03-09T17:25:06.864 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:07.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:06 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1732416861' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:07.938 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:07.938 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:08.118 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:08.173 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.444+0000 7f767e80d700 1 -- 192.168.123.109:0/3393409220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7678100030 msgr2=0x7f7678100440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.444+0000 7f767e80d700 1 --2- 192.168.123.109:0/3393409220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7678100030 0x7f7678100440 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f7660009b00 tx=0x7f7660009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.445+0000 7f767e80d700 1 -- 192.168.123.109:0/3393409220 shutdown_connections 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.445+0000 7f767e80d700 1 --2- 192.168.123.109:0/3393409220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7678100030 0x7f7678100440 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.445+0000 7f767e80d700 1 -- 192.168.123.109:0/3393409220 >> 192.168.123.109:0/3393409220 conn(0x7f76780fb5e0 msgr2=0x7f76780fda10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:08.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.445+0000 7f767e80d700 1 -- 192.168.123.109:0/3393409220 
shutdown_connections 2026-03-09T17:25:08.447 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.446+0000 7f767e80d700 1 -- 192.168.123.109:0/3393409220 wait complete. 2026-03-09T17:25:08.447 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.446+0000 7f767e80d700 1 Processor -- start 2026-03-09T17:25:08.447 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.446+0000 7f767e80d700 1 -- start start 2026-03-09T17:25:08.447 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.446+0000 7f767e80d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:08.447 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.446+0000 7f767e80d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7678195d10 con 0x7f76781953c0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f7677fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f7677fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32930/0 (socket says 192.168.123.109:32930) 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f7677fff700 1 -- 192.168.123.109:0/1468055382 learned_addr learned my addr 192.168.123.109:0/1468055382 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:08.448 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f7677fff700 1 -- 192.168.123.109:0/1468055382 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f76600097e0 con 0x7f76781953c0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f7677fff700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f766000c010 tx=0x7f76600052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f766001c070 con 0x7f76781953c0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.447+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7678195f10 con 0x7f76781953c0 2026-03-09T17:25:08.448 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.448+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7678198b70 con 0x7f76781953c0 2026-03-09T17:25:08.450 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.448+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f766000b810 con 0x7f76781953c0 2026-03-09T17:25:08.450 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.448+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7660003a90 con 
0x7f76781953c0 2026-03-09T17:25:08.450 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.449+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f767818ee80 con 0x7f76781953c0 2026-03-09T17:25:08.450 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.449+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f7660003bf0 con 0x7f76781953c0 2026-03-09T17:25:08.450 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.450+0000 7f76757fa700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 0x7f766403a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:08.451 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.450+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f766004c2d0 con 0x7f76781953c0 2026-03-09T17:25:08.451 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.450+0000 7f76777fe700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 0x7f766403a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:08.451 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.450+0000 7f76777fe700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 0x7f766403a970 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f7668006fd0 tx=0x7f7668006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:08.454 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.453+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7660004050 con 0x7f76781953c0 2026-03-09T17:25:08.596 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.595+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f7678062380 con 0x7f76781953c0 2026-03-09T17:25:08.598 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.597+0000 7f76757fa700 1 -- 192.168.123.109:0/1468055382 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f7660025020 con 0x7f76781953c0 2026-03-09T17:25:08.598 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:08.598 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 msgr2=0x7f766403a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 0x7f766403a970 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f7668006fd0 tx=0x7f7668006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 msgr2=0x7f76781957d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f766000c010 tx=0x7f76600052e0 comp rx=0 tx=0).stop 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 shutdown_connections 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f76640384c0 0x7f766403a970 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 --2- 192.168.123.109:0/1468055382 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f76781953c0 0x7f76781957d0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:08.601 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 >> 192.168.123.109:0/1468055382 conn(0x7f76780fb5e0 msgr2=0x7f76780fc290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 shutdown_connections 2026-03-09T17:25:08.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:08.600+0000 7f767e80d700 1 -- 192.168.123.109:0/1468055382 wait complete. 2026-03-09T17:25:08.602 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:08.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:08 vm06 ceph-mon[57307]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:09.683 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:09.684 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:09.844 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:09.890 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:09.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:09 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/1468055382' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.167+0000 7f0d6e094700 1 -- 192.168.123.109:0/3769247402 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 msgr2=0x7f0d680fe660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.167+0000 7f0d6e094700 1 --2- 192.168.123.109:0/3769247402 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d680fe660 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f0d5c009b00 tx=0x7f0d5c009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.168+0000 7f0d6e094700 1 -- 192.168.123.109:0/3769247402 shutdown_connections 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.168+0000 7f0d6e094700 1 --2- 192.168.123.109:0/3769247402 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d680fe660 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.168+0000 7f0d6e094700 1 -- 192.168.123.109:0/3769247402 >> 192.168.123.109:0/3769247402 conn(0x7f0d680f9800 msgr2=0x7f0d680fbc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.168+0000 7f0d6e094700 1 -- 192.168.123.109:0/3769247402 shutdown_connections 2026-03-09T17:25:10.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.168+0000 7f0d6e094700 1 -- 192.168.123.109:0/3769247402 wait complete. 
2026-03-09T17:25:10.170 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.169+0000 7f0d6e094700 1 Processor -- start 2026-03-09T17:25:10.170 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.169+0000 7f0d6e094700 1 -- start start 2026-03-09T17:25:10.170 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.169+0000 7f0d6e094700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:10.170 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.169+0000 7f0d6e094700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d681956e0 con 0x7f0d680fe250 2026-03-09T17:25:10.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.170+0000 7f0d677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:10.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.170+0000 7f0d677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32936/0 (socket says 192.168.123.109:32936) 2026-03-09T17:25:10.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.170+0000 7f0d677fe700 1 -- 192.168.123.109:0/4054952343 learned_addr learned my addr 192.168.123.109:0/4054952343 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:10.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.170+0000 7f0d677fe700 1 -- 192.168.123.109:0/4054952343 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d5c0097e0 con 0x7f0d680fe250 2026-03-09T17:25:10.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.171+0000 7f0d677fe700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f0d5c004750 tx=0x7f0d5c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:10.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.171+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0d5c01c070 con 0x7f0d680fe250 2026-03-09T17:25:10.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.171+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d681958e0 con 0x7f0d680fe250 2026-03-09T17:25:10.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.171+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d68195d80 con 0x7f0d680fe250 2026-03-09T17:25:10.173 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.172+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0d5c021470 con 0x7f0d680fe250 2026-03-09T17:25:10.173 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.172+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0d5c00f460 con 0x7f0d680fe250 2026-03-09T17:25:10.173 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.172+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f0d5c00f600 con 0x7f0d680fe250 2026-03-09T17:25:10.173 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.172+0000 7f0d64ff9700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 0x7f0d5003ed80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:10.173 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.172+0000 7f0d66ffd700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 0x7f0d5003ed80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:10.174 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.173+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f0d5c0203d0 con 0x7f0d680fe250 2026-03-09T17:25:10.174 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.173+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d6818ee70 con 0x7f0d680fe250 2026-03-09T17:25:10.174 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.173+0000 7f0d66ffd700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 0x7f0d5003ed80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f0d58006fd0 tx=0x7f0d58006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:10.177 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.176+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0d5c017780 con 0x7f0d680fe250 2026-03-09T17:25:10.333 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.331+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0d68062380 con 0x7f0d680fe250 2026-03-09T17:25:10.333 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.332+0000 7f0d64ff9700 1 -- 192.168.123.109:0/4054952343 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f0d5c017780 con 0x7f0d680fe250 2026-03-09T17:25:10.334 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:10.334 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:10.336 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.335+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 msgr2=0x7f0d5003ed80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:10.336 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.336+0000 7f0d6e094700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 0x7f0d5003ed80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f0d58006fd0 tx=0x7f0d58006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:10.337 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.336+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 msgr2=0x7f0d681951a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:10.337 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.336+0000 7f0d6e094700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f0d5c004750 tx=0x7f0d5c005dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:10.337 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.336+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 shutdown_connections 2026-03-09T17:25:10.337 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.336+0000 7f0d6e094700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0d5003c8d0 0x7f0d5003ed80 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:10.337 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.337+0000 7f0d6e094700 1 --2- 192.168.123.109:0/4054952343 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0d680fe250 0x7f0d681951a0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:10.338 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.337+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 >> 192.168.123.109:0/4054952343 conn(0x7f0d680f9800 msgr2=0x7f0d680fa4b0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:25:10.338 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.337+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 shutdown_connections 2026-03-09T17:25:10.338 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:10.337+0000 7f0d6e094700 1 -- 192.168.123.109:0/4054952343 wait complete. 2026-03-09T17:25:10.339 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:11.138 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:10 vm06 ceph-mon[57307]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:11.138 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:10 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/4054952343' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:11.409 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:11.409 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:11.569 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:11.614 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:11.896 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.895+0000 7fe0d42d7700 1 -- 192.168.123.109:0/2568926559 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 msgr2=0x7fe0cc100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:11.896 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.895+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/2568926559 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc100420 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fe0c8009b00 tx=0x7fe0c8009e10 comp rx=0 tx=0).stop 
2026-03-09T17:25:11.896 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.895+0000 7fe0d42d7700 1 -- 192.168.123.109:0/2568926559 shutdown_connections 2026-03-09T17:25:11.896 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.895+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/2568926559 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc100420 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:11.897 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.895+0000 7fe0d42d7700 1 -- 192.168.123.109:0/2568926559 >> 192.168.123.109:0/2568926559 conn(0x7fe0cc0fb5a0 msgr2=0x7fe0cc0fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:11.897 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.896+0000 7fe0d42d7700 1 -- 192.168.123.109:0/2568926559 shutdown_connections 2026-03-09T17:25:11.897 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.896+0000 7fe0d42d7700 1 -- 192.168.123.109:0/2568926559 wait complete. 
2026-03-09T17:25:11.897 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.896+0000 7fe0d42d7700 1 Processor -- start 2026-03-09T17:25:11.897 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.897+0000 7fe0d42d7700 1 -- start start 2026-03-09T17:25:11.898 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.897+0000 7fe0d42d7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:11.898 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.897+0000 7fe0d42d7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0cc195700 con 0x7fe0cc100010 2026-03-09T17:25:11.898 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.897+0000 7fe0d2073700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:11.901 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.900+0000 7fe0d2073700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32946/0 (socket says 192.168.123.109:32946) 2026-03-09T17:25:11.901 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.900+0000 7fe0d2073700 1 -- 192.168.123.109:0/89672636 learned_addr learned my addr 192.168.123.109:0/89672636 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:11.901 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.900+0000 7fe0d2073700 1 -- 192.168.123.109:0/89672636 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0c80097e0 con 0x7fe0cc100010 2026-03-09T17:25:11.901 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.900+0000 7fe0d2073700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe0c8000c00 tx=0x7fe0c8004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:11.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.902+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe0c801c070 con 0x7fe0cc100010 2026-03-09T17:25:11.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.902+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe0c80053b0 con 0x7fe0cc100010 2026-03-09T17:25:11.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.902+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe0c800f460 con 0x7fe0cc100010 2026-03-09T17:25:11.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.902+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0cc195900 con 0x7fe0cc100010 2026-03-09T17:25:11.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.902+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0cc195da0 con 0x7fe0cc100010 2026-03-09T17:25:11.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.903+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe0cc18ee60 con 0x7fe0cc100010 2026-03-09T17:25:11.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.905+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fe0c8005520 con 0x7fe0cc100010 2026-03-09T17:25:11.906 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.905+0000 7fe0c37fe700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 0x7fe0b803ee20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:11.906 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.905+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fe0c804c900 con 0x7fe0cc100010 2026-03-09T17:25:11.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.906+0000 7fe0d1872700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 0x7fe0b803ee20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:11.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.906+0000 7fe0d1872700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 0x7fe0b803ee20 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fe0bc006fd0 tx=0x7fe0bc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:11.911 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:11.910+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe0c80209c0 con 0x7fe0cc100010 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:11 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:12.060 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.057+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fe0cc062380 con 0x7fe0cc100010 2026-03-09T17:25:12.062 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.060+0000 7fe0c37fe700 1 -- 192.168.123.109:0/89672636 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fe0c8026070 con 0x7fe0cc100010 2026-03-09T17:25:12.062 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:12.062 
INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:12.064 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.063+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 msgr2=0x7fe0b803ee20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:12.064 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.063+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 0x7fe0b803ee20 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fe0bc006fd0 tx=0x7fe0bc006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:12.064 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.063+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 msgr2=0x7fe0cc1951c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:12.064 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 
crypto rx=0x7fe0c8000c00 tx=0x7fe0c8004740 comp rx=0 tx=0).stop 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 shutdown_connections 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0b803c970 0x7fe0b803ee20 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 --2- 192.168.123.109:0/89672636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0cc100010 0x7fe0cc1951c0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 >> 192.168.123.109:0/89672636 conn(0x7fe0cc0fb5a0 msgr2=0x7fe0cc0fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 shutdown_connections 2026-03-09T17:25:12.065 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:12.064+0000 7fe0d42d7700 1 -- 192.168.123.109:0/89672636 wait complete. 2026-03-09T17:25:12.067 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:13.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:12 vm06 ceph-mon[57307]: Deploying daemon prometheus.vm06 on vm06 2026-03-09T17:25:13.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:12 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/89672636' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:13.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:12 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:13.142 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:13.142 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:13.292 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:13.338 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:13.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.599+0000 7f87b5604700 1 -- 192.168.123.109:0/2169660859 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 msgr2=0x7f87b0102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:13.601 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.599+0000 7f87b5604700 1 --2- 192.168.123.109:0/2169660859 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0102650 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f8798009b00 tx=0x7f8798009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.600+0000 7f87b5604700 1 -- 192.168.123.109:0/2169660859 shutdown_connections 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.600+0000 7f87b5604700 1 --2- 192.168.123.109:0/2169660859 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0102650 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:13.602 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.600+0000 7f87b5604700 1 -- 192.168.123.109:0/2169660859 >> 192.168.123.109:0/2169660859 conn(0x7f87b00fd8d0 msgr2=0x7f87b00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.600+0000 7f87b5604700 1 -- 192.168.123.109:0/2169660859 shutdown_connections 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.600+0000 7f87b5604700 1 -- 192.168.123.109:0/2169660859 wait complete. 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.601+0000 7f87b5604700 1 Processor -- start 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.601+0000 7f87b5604700 1 -- start start 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.601+0000 7f87b5604700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:13.602 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.601+0000 7f87b5604700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87b0197990 con 0x7f87b0102240 2026-03-09T17:25:13.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.602+0000 7f87aeffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:13.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.602+0000 7f87aeffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32954/0 (socket says 192.168.123.109:32954) 2026-03-09T17:25:13.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.602+0000 7f87aeffd700 1 -- 192.168.123.109:0/1948705938 learned_addr learned my addr 192.168.123.109:0/1948705938 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:13.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.602+0000 7f87aeffd700 1 -- 192.168.123.109:0/1948705938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87980097e0 con 0x7f87b0102240 2026-03-09T17:25:13.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.602+0000 7f87aeffd700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f8798004d40 tx=0x7f8798004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:13.604 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.603+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f879801c070 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.603+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87980056f0 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.603+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f87b0197b90 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.603+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 
v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8798017440 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.603+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87b0198030 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.604+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f879800f460 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.604+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87b0191090 con 0x7f87b0102240 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.604+0000 7f87acff9700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 0x7f879c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:13.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.604+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f879804bfe0 con 0x7f87b0102240 2026-03-09T17:25:13.606 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.605+0000 7f87a65ff700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 0x7f879c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:13.606 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.606+0000 7f87a65ff700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 0x7f879c03a9c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f87a0006fd0 tx=0x7f87a0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:13.610 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.609+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f879800f920 con 0x7f87b0102240 2026-03-09T17:25:13.762 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.761+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f87b0062380 con 0x7f87b0102240 2026-03-09T17:25:13.763 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.761+0000 7f87acff9700 1 -- 192.168.123.109:0/1948705938 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8798025070 con 0x7f87b0102240 2026-03-09T17:25:13.763 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:13.763 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:13.766 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.765+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 msgr2=0x7f879c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:13.766 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.765+0000 7f87b5604700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 0x7f879c03a9c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f87a0006fd0 tx=0x7f87a0006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:13.766 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.765+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 msgr2=0x7f87b0197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:13.766 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.765+0000 7f87b5604700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f8798004d40 tx=0x7f8798004e20 comp rx=0 tx=0).stop 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 shutdown_connections 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 
7f87b5604700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f879c038510 0x7f879c03a9c0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 7f87b5604700 1 --2- 192.168.123.109:0/1948705938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87b0102240 0x7f87b0197450 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 >> 192.168.123.109:0/1948705938 conn(0x7f87b00fd8d0 msgr2=0x7f87b00fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 shutdown_connections 2026-03-09T17:25:13.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:13.766+0000 7f87b5604700 1 -- 192.168.123.109:0/1948705938 wait complete. 2026-03-09T17:25:13.768 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:14.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:13 vm06 ceph-mon[57307]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:14.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:13 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1948705938' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:14.836 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:14.836 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:14.989 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:15.031 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:15.304 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.302+0000 7fab41b7c700 1 -- 192.168.123.109:0/107118329 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 msgr2=0x7fab3c102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:15.304 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.302+0000 7fab41b7c700 1 --2- 192.168.123.109:0/107118329 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c102e80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fab24009b00 tx=0x7fab24009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:15.304 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.303+0000 7fab41b7c700 1 -- 192.168.123.109:0/107118329 shutdown_connections 2026-03-09T17:25:15.304 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.303+0000 7fab41b7c700 1 --2- 192.168.123.109:0/107118329 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c102e80 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.303+0000 7fab41b7c700 1 -- 192.168.123.109:0/107118329 >> 192.168.123.109:0/107118329 conn(0x7fab3c0fa4a0 msgr2=0x7fab3c0fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.304+0000 7fab41b7c700 1 -- 192.168.123.109:0/107118329 
shutdown_connections 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.304+0000 7fab41b7c700 1 -- 192.168.123.109:0/107118329 wait complete. 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.304+0000 7fab41b7c700 1 Processor -- start 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab41b7c700 1 -- start start 2026-03-09T17:25:15.305 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab41b7c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:15.306 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab41b7c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab3c100940 con 0x7fab3c100aa0 2026-03-09T17:25:15.306 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab3b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:15.306 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab3b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:32986/0 (socket says 192.168.123.109:32986) 2026-03-09T17:25:15.306 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab3b7fe700 1 -- 192.168.123.109:0/2068900016 learned_addr learned my addr 192.168.123.109:0/2068900016 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:15.306 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.305+0000 7fab3b7fe700 1 -- 192.168.123.109:0/2068900016 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab240097e0 con 0x7fab3c100aa0 2026-03-09T17:25:15.306 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab3b7fe700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fab24004f40 tx=0x7fab24005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:15.307 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab2401c070 con 0x7fab3c100aa0 2026-03-09T17:25:15.307 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab240053b0 con 0x7fab3c100aa0 2026-03-09T17:25:15.308 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab2400f460 con 0x7fab3c100aa0 2026-03-09T17:25:15.308 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab3c0fea50 con 0x7fab3c100aa0 2026-03-09T17:25:15.308 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.306+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab3c0feef0 con 
0x7fab3c100aa0 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.307+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fab24005520 con 0x7fab3c100aa0 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.307+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab3c04f9e0 con 0x7fab3c100aa0 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.308+0000 7fab38ff9700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 0x7fab2803a5f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.308+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fab2404c470 con 0x7fab3c100aa0 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.308+0000 7fab3affd700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 0x7fab2803a5f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:15.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.309+0000 7fab3affd700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 0x7fab2803a5f0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fab2c006fd0 tx=0x7fab2c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:15.312 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.311+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fab2402aa30 con 0x7fab3c100aa0 2026-03-09T17:25:15.469 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.468+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fab3c062380 con 0x7fab3c100aa0 2026-03-09T17:25:15.470 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.469+0000 7fab38ff9700 1 -- 192.168.123.109:0/2068900016 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fab24026020 con 0x7fab3c100aa0 2026-03-09T17:25:15.470 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:15.470 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:15.472 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.471+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 msgr2=0x7fab2803a5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:15.472 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.471+0000 7fab41b7c700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 0x7fab2803a5f0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fab2c006fd0 tx=0x7fab2c006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:15.472 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.471+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 msgr2=0x7fab3c100400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:15.472 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.471+0000 7fab41b7c700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fab24004f40 tx=0x7fab24005e70 comp rx=0 tx=0).stop 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 shutdown_connections 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fab28038140 0x7fab2803a5f0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 --2- 192.168.123.109:0/2068900016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab3c100aa0 0x7fab3c100400 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:15.473 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 >> 192.168.123.109:0/2068900016 conn(0x7fab3c0fa4a0 msgr2=0x7fab3c0fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 shutdown_connections 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:15.472+0000 7fab41b7c700 1 -- 192.168.123.109:0/2068900016 wait complete. 2026-03-09T17:25:15.473 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:15 vm06 ceph-mon[57307]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:15.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:15 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/2068900016' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:16.551 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:16.551 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:16.709 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:16.751 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.007+0000 7f14a7dac700 1 -- 192.168.123.109:0/998066681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 msgr2=0x7f14a0102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.007+0000 7f14a7dac700 1 --2- 192.168.123.109:0/998066681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0102640 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f1490009b00 tx=0x7f1490009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 -- 192.168.123.109:0/998066681 shutdown_connections 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 --2- 192.168.123.109:0/998066681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0102640 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 -- 192.168.123.109:0/998066681 >> 192.168.123.109:0/998066681 conn(0x7f14a00fd8d0 msgr2=0x7f14a00ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 -- 192.168.123.109:0/998066681 
shutdown_connections 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 -- 192.168.123.109:0/998066681 wait complete. 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.008+0000 7f14a7dac700 1 Processor -- start 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a7dac700 1 -- start start 2026-03-09T17:25:17.009 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a7dac700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:17.010 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a7dac700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14a01978c0 con 0x7f14a0102230 2026-03-09T17:25:17.010 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a5b48700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:17.010 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a5b48700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40572/0 (socket says 192.168.123.109:40572) 2026-03-09T17:25:17.010 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a5b48700 1 -- 192.168.123.109:0/3142903907 learned_addr learned my addr 192.168.123.109:0/3142903907 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:17.010 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.009+0000 7f14a5b48700 1 -- 192.168.123.109:0/3142903907 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14900097e0 con 0x7f14a0102230 2026-03-09T17:25:17.010 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f14a5b48700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f1490004750 tx=0x7f1490005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:17.011 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f149001c070 con 0x7f14a0102230 2026-03-09T17:25:17.011 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f14a0197ac0 con 0x7f14a0102230 2026-03-09T17:25:17.011 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14a0197f60 con 0x7f14a0102230 2026-03-09T17:25:17.011 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1490021470 con 0x7f14a0102230 2026-03-09T17:25:17.012 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.010+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f149000f460 con 
0x7f14a0102230 2026-03-09T17:25:17.012 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.011+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f1490021ac0 con 0x7f14a0102230 2026-03-09T17:25:17.013 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.012+0000 7f1496ffd700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 0x7f148c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:17.013 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.012+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f149004c3b0 con 0x7f14a0102230 2026-03-09T17:25:17.013 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.012+0000 7f14a5347700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 0x7f148c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:17.013 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.011+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1484005320 con 0x7f14a0102230 2026-03-09T17:25:17.013 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.013+0000 7f14a5347700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 0x7f148c03a9c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f149c006fd0 tx=0x7f149c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:17.016 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.016+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1490026070 con 0x7f14a0102230 2026-03-09T17:25:17.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.167+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1484005190 con 0x7f14a0102230 2026-03-09T17:25:17.169 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.168+0000 7f1496ffd700 1 -- 192.168.123.109:0/3142903907 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1490029540 con 0x7f14a0102230 2026-03-09T17:25:17.169 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:17.170 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:17.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 msgr2=0x7f148c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:17.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 0x7f148c03a9c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f149c006fd0 tx=0x7f149c006e40 comp rx=0 tx=0).stop 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 msgr2=0x7f14a0197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f1490004750 tx=0x7f1490005dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 shutdown_connections 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f148c038510 0x7f148c03a9c0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 --2- 192.168.123.109:0/3142903907 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14a0102230 0x7f14a0197380 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:17.172 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 >> 192.168.123.109:0/3142903907 conn(0x7f14a00fd8d0 msgr2=0x7f14a00fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 shutdown_connections 2026-03-09T17:25:17.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:17.171+0000 7f14a7dac700 1 -- 192.168.123.109:0/3142903907 wait complete. 2026-03-09T17:25:17.173 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:17.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:17 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3142903907' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:18.235 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:18.235 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:18.405 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:18.446 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 
vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 vm06 ceph-mon[57307]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:18.487 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:18 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.721+0000 7f75daa57700 1 -- 192.168.123.109:0/796907437 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 msgr2=0x7f75d4100440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.721+0000 7f75daa57700 1 --2- 192.168.123.109:0/796907437 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4100440 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f75bc009b00 tx=0x7f75bc009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.722+0000 7f75daa57700 1 -- 192.168.123.109:0/796907437 shutdown_connections 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.722+0000 7f75daa57700 1 --2- 192.168.123.109:0/796907437 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4100440 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.722+0000 7f75daa57700 1 -- 192.168.123.109:0/796907437 >> 192.168.123.109:0/796907437 conn(0x7f75d40fb5e0 msgr2=0x7f75d40fda10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.722+0000 7f75daa57700 1 -- 192.168.123.109:0/796907437 
shutdown_connections 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.722+0000 7f75daa57700 1 -- 192.168.123.109:0/796907437 wait complete. 2026-03-09T17:25:18.723 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75daa57700 1 Processor -- start 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75daa57700 1 -- start start 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75daa57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75daa57700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f75d41934d0 con 0x7f75d4100030 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75d3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75d3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40576/0 (socket says 192.168.123.109:40576) 2026-03-09T17:25:18.724 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.723+0000 7f75d3fff700 1 -- 192.168.123.109:0/4018229584 learned_addr learned my addr 192.168.123.109:0/4018229584 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:18.725 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75d3fff700 1 -- 192.168.123.109:0/4018229584 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f75bc0097e0 con 0x7f75d4100030 2026-03-09T17:25:18.725 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75d3fff700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f75bc004750 tx=0x7f75bc005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:18.725 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f75bc01c070 con 0x7f75d4100030 2026-03-09T17:25:18.726 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f75bc021470 con 0x7f75d4100030 2026-03-09T17:25:18.726 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f75d41936d0 con 0x7f75d4100030 2026-03-09T17:25:18.726 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f75bc00f460 con 0x7f75d4100030 2026-03-09T17:25:18.726 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.724+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f75d4193b70 con 
0x7f75d4100030 2026-03-09T17:25:18.726 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.725+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f75bc00f5c0 con 0x7f75d4100030 2026-03-09T17:25:18.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.726+0000 7f75d17fa700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 0x7f75c003ee60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:18.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.726+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f75bc04d470 con 0x7f75d4100030 2026-03-09T17:25:18.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.726+0000 7f75d37fe700 1 -- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 msgr2=0x7f75c003ee60 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:25:18.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.726+0000 7f75d37fe700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 0x7f75c003ee60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:25:18.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.727+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f75d418cca0 con 0x7f75d4100030 2026-03-09T17:25:18.734 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.731+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== 
mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f75bc026070 con 0x7f75d4100030 2026-03-09T17:25:18.878 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.876+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f75d4062380 con 0x7f75d4100030 2026-03-09T17:25:18.878 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.877+0000 7f75d17fa700 1 -- 192.168.123.109:0/4018229584 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f75bc029720 con 0x7f75d4100030 2026-03-09T17:25:18.879 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:18.879 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:18.881 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 msgr2=0x7f75c003ee60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 0x7f75c003ee60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 msgr2=0x7f75d4192f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f75bc004750 tx=0x7f75bc005dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 shutdown_connections 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f75c003c9b0 0x7f75c003ee60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.881+0000 7f75daa57700 1 --2- 192.168.123.109:0/4018229584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f75d4100030 0x7f75d4192f90 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:18.882 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.882+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 >> 192.168.123.109:0/4018229584 conn(0x7f75d40fb5e0 msgr2=0x7f75d40fc290 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:25:18.883 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.882+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 shutdown_connections 2026-03-09T17:25:18.883 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:18.882+0000 7f75daa57700 1 -- 192.168.123.109:0/4018229584 wait complete. 2026-03-09T17:25:18.884 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:19.518 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:19 vm06 ceph-mon[57307]: from='mgr.14164 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-09T17:25:19.518 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:19 vm06 ceph-mon[57307]: mgrmap e14: vm06.pbgzei(active, since 30s) 2026-03-09T17:25:19.518 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:19 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/4018229584' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:19.958 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
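[editor's note] The cycle above shows the cephadm task repeatedly shelling out to `ceph mon dump -f json` and re-polling until the monmap reports 2 mons. A minimal sketch of that readiness check, assuming only the JSON shape visible in the log (the helper name and the trimmed sample monmap are illustrative, not taken from teuthology's source):

```python
import json

def mons_in_monmap(monmap_json: str) -> int:
    """Count monitors in the JSON printed by `ceph mon dump -f json`."""
    monmap = json.loads(monmap_json)
    # The monmap's "mons" array holds one entry per monitor in the map.
    return len(monmap.get("mons", []))

# Sample trimmed from the monmap shown in the log: epoch 1, a single mon (vm06).
sample = (
    '{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048",'
    '"min_mon_release_name":"reef",'
    '"mons":[{"rank":0,"name":"vm06"}],"quorum":[0]}'
)
```

With this sample, `mons_in_monmap(sample)` returns 1, which is why the task above keeps logging "Waiting for 2 mons in monmap..." and re-running the dump: only one of the two expected monitors has joined the map so far.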
2026-03-09T17:25:19.958 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:20.123 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:20.164 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.437+0000 7f71346db700 1 -- 192.168.123.109:0/668506215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 msgr2=0x7f712c100ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.437+0000 7f71346db700 1 --2- 192.168.123.109:0/668506215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c100ca0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f7128009b00 tx=0x7f7128009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.438+0000 7f71346db700 1 -- 192.168.123.109:0/668506215 shutdown_connections 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.438+0000 7f71346db700 1 --2- 192.168.123.109:0/668506215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c100ca0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.438+0000 7f71346db700 1 -- 192.168.123.109:0/668506215 >> 192.168.123.109:0/668506215 conn(0x7f712c0fa4b0 msgr2=0x7f712c0fc8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.438+0000 7f71346db700 1 -- 192.168.123.109:0/668506215 
shutdown_connections 2026-03-09T17:25:20.439 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.438+0000 7f71346db700 1 -- 192.168.123.109:0/668506215 wait complete. 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.439+0000 7f71346db700 1 Processor -- start 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.439+0000 7f71346db700 1 -- start start 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.439+0000 7f71346db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.439+0000 7f71346db700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f712c19bcb0 con 0x7f712c0fe880 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f7132477700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f7132477700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40594/0 (socket says 192.168.123.109:40594) 2026-03-09T17:25:20.440 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f7132477700 1 -- 192.168.123.109:0/2617464392 learned_addr learned my addr 192.168.123.109:0/2617464392 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:20.441 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f7132477700 1 -- 192.168.123.109:0/2617464392 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f71280097e0 con 0x7f712c0fe880 2026-03-09T17:25:20.441 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f7132477700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f7128004f40 tx=0x7f7128005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:20.442 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f712801c070 con 0x7f712c0fe880 2026-03-09T17:25:20.442 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f71280053b0 con 0x7f712c0fe880 2026-03-09T17:25:20.442 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f712800f460 con 0x7f712c0fe880 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f712c19beb0 con 0x7f712c0fe880 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.440+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f712c19c350 con 
0x7f712c0fe880 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f712800f5e0 con 0x7f712c0fe880 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f71237fe700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 0x7f711803aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f712804c8e0 con 0x7f712c0fe880 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f7131c76700 1 -- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 msgr2=0x7f711803aa10 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f7131c76700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 0x7f711803aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:25:20.443 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.442+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f712c195b70 con 0x7f712c0fe880 2026-03-09T17:25:20.446 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.446+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== 
mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7128026070 con 0x7f712c0fe880 2026-03-09T17:25:20.595 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.594+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f712c062380 con 0x7f712c0fe880 2026-03-09T17:25:20.595 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.594+0000 7f71237fe700 1 -- 192.168.123.109:0/2617464392 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f7128005520 con 0x7f712c0fe880 2026-03-09T17:25:20.595 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:20.596 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:20.598 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 msgr2=0x7f711803aa10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-09T17:25:20.598 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 0x7f711803aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:20.598 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 msgr2=0x7f712c19b770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:20.598 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f7128004f40 tx=0x7f7128005e70 comp rx=0 tx=0).stop 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 shutdown_connections 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7118038560 0x7f711803aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 --2- 192.168.123.109:0/2617464392 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f712c0fe880 0x7f712c19b770 secure :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f7128004f40 tx=0x7f7128005e70 comp rx=0 tx=0).stop 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.597+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 >> 192.168.123.109:0/2617464392 conn(0x7f712c0fa4b0 
msgr2=0x7f712c0fb160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.598+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 shutdown_connections 2026-03-09T17:25:20.599 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:20.598+0000 7f71346db700 1 -- 192.168.123.109:0/2617464392 wait complete. 2026-03-09T17:25:20.600 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:20 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/2617464392' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:21.662 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:21.663 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:21.796 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:21.834 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:22.096 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.094+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/448363804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 msgr2=0x7f8f08100cc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:22.096 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.094+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/448363804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f08100cc0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f8ef8009b00 tx=0x7f8ef8009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:22.097 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.096+0000 7f8f0fdaf700 1 -- 
192.168.123.109:0/448363804 shutdown_connections 2026-03-09T17:25:22.097 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.096+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/448363804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f08100cc0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:22.097 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.096+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/448363804 >> 192.168.123.109:0/448363804 conn(0x7f8f080fa4f0 msgr2=0x7f8f080fc900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:22.097 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.096+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/448363804 shutdown_connections 2026-03-09T17:25:22.097 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.096+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/448363804 wait complete. 2026-03-09T17:25:22.098 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.097+0000 7f8f0fdaf700 1 Processor -- start 2026-03-09T17:25:22.098 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.097+0000 7f8f0fdaf700 1 -- start start 2026-03-09T17:25:22.098 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.097+0000 7f8f0fdaf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:22.098 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.097+0000 7f8f0fdaf700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f0819bca0 con 0x7f8f080fe8a0 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.098+0000 7f8f0db4b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.098+0000 7f8f0db4b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40606/0 (socket says 192.168.123.109:40606) 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.098+0000 7f8f0db4b700 1 -- 192.168.123.109:0/897188976 learned_addr learned my addr 192.168.123.109:0/897188976 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.098+0000 7f8f0db4b700 1 -- 192.168.123.109:0/897188976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ef80097e0 con 0x7f8f080fe8a0 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.098+0000 7f8f0db4b700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f8ef8004f40 tx=0x7f8ef8005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:22.099 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.099+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef801c070 con 0x7f8f080fe8a0 2026-03-09T17:25:22.101 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.099+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ef80053b0 con 0x7f8f080fe8a0 2026-03-09T17:25:22.101 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.099+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f0819bea0 con 0x7f8f080fe8a0 2026-03-09T17:25:22.101 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.099+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f0819c340 con 0x7f8f080fe8a0 2026-03-09T17:25:22.101 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ef800f460 con 0x7f8f080fe8a0 2026-03-09T17:25:22.102 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f8ef8021470 con 0x7f8f080fe8a0 2026-03-09T17:25:22.102 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f08195b90 con 0x7f8f080fe8a0 2026-03-09T17:25:22.102 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8efeffd700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 0x7f8ef403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:22.102 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8ef804c400 con 0x7f8f080fe8a0 2026-03-09T17:25:22.102 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8f0d34a700 1 -- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 msgr2=0x7f8ef403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.106:6800/2 2026-03-09T17:25:22.102 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.100+0000 7f8f0d34a700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 0x7f8ef403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:25:22.104 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.103+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8ef8029b50 con 0x7f8f080fe8a0 2026-03-09T17:25:22.255 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.254+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8f08062380 con 0x7f8f080fe8a0 2026-03-09T17:25:22.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.255+0000 7f8efeffd700 1 -- 192.168.123.109:0/897188976 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8ef8026030 con 0x7f8f080fe8a0 2026-03-09T17:25:22.257 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:22.257 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:22.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.258+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 msgr2=0x7f8ef403a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:25:22.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.258+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 0x7f8ef403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:22.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.258+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 msgr2=0x7f8f0819b760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:22.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.258+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f8ef8004f40 tx=0x7f8ef8005e70 comp rx=0 tx=0).stop 2026-03-09T17:25:22.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 shutdown_connections 2026-03-09T17:25:22.260 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8ef4038510 0x7f8ef403a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:22.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 --2- 192.168.123.109:0/897188976 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8f080fe8a0 0x7f8f0819b760 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:22.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 >> 192.168.123.109:0/897188976 conn(0x7f8f080fa4f0 msgr2=0x7f8f080fb150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:22.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 shutdown_connections 2026-03-09T17:25:22.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:22.259+0000 7f8f0fdaf700 1 -- 192.168.123.109:0/897188976 wait complete. 2026-03-09T17:25:22.261 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:22.429 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:22 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/897188976' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:23.306 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:23.306 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:23.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: Activating manager daemon vm06.pbgzei 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: mgrmap e15: vm06.pbgzei(active, starting, since 0.00439951s) 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:25:23.392 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:25:23.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:23 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:25:23.489 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:23.538 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:23.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.903+0000 7f3986b6f700 1 -- 192.168.123.109:0/1756908033 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 msgr2=0x7f39800717d0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:23.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.903+0000 7f3986b6f700 1 --2- 192.168.123.109:0/1756908033 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39800717d0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f397800b3a0 tx=0x7f397800b6b0 comp rx=0 tx=0).stop 2026-03-09T17:25:23.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.904+0000 7f3986b6f700 1 -- 192.168.123.109:0/1756908033 shutdown_connections 2026-03-09T17:25:23.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.904+0000 7f3986b6f700 1 --2- 192.168.123.109:0/1756908033 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39800717d0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:23.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.904+0000 7f3986b6f700 1 -- 192.168.123.109:0/1756908033 >> 192.168.123.109:0/1756908033 conn(0x7f398006cd30 msgr2=0x7f398006f180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:23.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.905+0000 7f3986b6f700 1 -- 192.168.123.109:0/1756908033 shutdown_connections 2026-03-09T17:25:23.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.906+0000 7f3986b6f700 1 -- 192.168.123.109:0/1756908033 wait complete. 
2026-03-09T17:25:23.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.906+0000 7f3986b6f700 1 Processor -- start 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.907+0000 7f3986b6f700 1 -- start start 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.907+0000 7f3986b6f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.907+0000 7f3986b6f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3978007b10 con 0x7f39800713c0 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.907+0000 7f3985b6d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3985b6d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40616/0 (socket says 192.168.123.109:40616) 2026-03-09T17:25:23.908 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3985b6d700 1 -- 192.168.123.109:0/3835306690 learned_addr learned my addr 192.168.123.109:0/3835306690 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:23.909 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3985b6d700 1 -- 192.168.123.109:0/3835306690 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f397800b050 con 0x7f39800713c0 2026-03-09T17:25:23.909 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3985b6d700 1 --2- 192.168.123.109:0/3835306690 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f3978007920 tx=0x7f3978012850 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:23.909 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3978020070 con 0x7f39800713c0 2026-03-09T17:25:23.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f39801ad010 con 0x7f39800713c0 2026-03-09T17:25:23.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.908+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f39801ad4b0 con 0x7f39800713c0 2026-03-09T17:25:23.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.909+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3978012d10 con 0x7f39800713c0 2026-03-09T17:25:23.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.909+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f397800e040 con 0x7f39800713c0 2026-03-09T17:25:23.911 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.910+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 15) v1 ==== 44873+0+0 (secure 0 0 0) 0x7f397801a650 con 0x7f39800713c0 2026-03-09T17:25:23.911 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.910+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f397804bd80 con 0x7f39800713c0 2026-03-09T17:25:23.911 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.910+0000 7f3974ff9700 1 -- 192.168.123.109:0/3835306690 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f398004efc0 con 0x7f39800713c0 2026-03-09T17:25:23.915 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:23.914+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3978017030 con 0x7f39800713c0 2026-03-09T17:25:24.063 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.062+0000 7f3974ff9700 1 -- 192.168.123.109:0/3835306690 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3980062380 con 0x7f39800713c0 2026-03-09T17:25:24.064 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:24.064 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:24.064 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.062+0000 7f3976ffd700 1 -- 192.168.123.109:0/3835306690 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3978025410 con 0x7f39800713c0 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.065+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 msgr2=0x7f39801acad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.065+0000 7f3986b6f700 1 --2- 192.168.123.109:0/3835306690 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f3978007920 tx=0x7f3978012850 comp rx=0 tx=0).stop 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.066+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 shutdown_connections 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.066+0000 7f3986b6f700 1 --2- 192.168.123.109:0/3835306690 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f39800713c0 0x7f39801acad0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:24.077 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.066+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 >> 192.168.123.109:0/3835306690 conn(0x7f398006cd30 msgr2=0x7f398006fa10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.070+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 shutdown_connections 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:24.070+0000 7f3986b6f700 1 -- 192.168.123.109:0/3835306690 wait complete. 2026-03-09T17:25:24.077 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:25.138 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:25.138 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: [09/Mar/2026:17:25:23] ENGINE Bus STARTING 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: [09/Mar/2026:17:25:23] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: [09/Mar/2026:17:25:23] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: [09/Mar/2026:17:25:23] ENGINE Bus STARTED 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/3835306690' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: mgrmap e16: vm06.pbgzei(active, since 1.00756s) 2026-03-09T17:25:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:25.335 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:25.385 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T17:25:25.680 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.678+0000 7f01bcbb3700 1 -- 192.168.123.109:0/2856184313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b80716b0 msgr2=0x7f01b8071ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:25.680 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.678+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/2856184313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b80716b0 0x7f01b8071ac0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f01a8008790 tx=0x7f01a8008aa0 comp rx=0 tx=0).stop 2026-03-09T17:25:25.680 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.679+0000 7f01bcbb3700 1 -- 192.168.123.109:0/2856184313 shutdown_connections 2026-03-09T17:25:25.680 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.679+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/2856184313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b80716b0 0x7f01b8071ac0 secure :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f01a8008790 tx=0x7f01a8008aa0 comp rx=0 tx=0).stop 2026-03-09T17:25:25.680 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.679+0000 7f01bcbb3700 1 -- 192.168.123.109:0/2856184313 >> 192.168.123.109:0/2856184313 conn(0x7f01b806cf00 msgr2=0x7f01b806f350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.679+0000 7f01bcbb3700 1 -- 192.168.123.109:0/2856184313 shutdown_connections 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.679+0000 7f01bcbb3700 1 -- 192.168.123.109:0/2856184313 wait complete. 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01bcbb3700 1 Processor -- start 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01bcbb3700 1 -- start start 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01bcbb3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:25.681 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01bcbb3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f01a8016070 con 0x7f01b807e8d0 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01b659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01b659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:40628/0 (socket says 192.168.123.109:40628) 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.680+0000 7f01b659c700 1 -- 192.168.123.109:0/1682612239 learned_addr learned my addr 192.168.123.109:0/1682612239 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.681+0000 7f01b659c700 1 -- 192.168.123.109:0/1682612239 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f01a8008440 con 0x7f01b807e8d0 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.681+0000 7f01b659c700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f01a8008760 tx=0x7f01a80036a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:25.682 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.681+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f01a8008e80 con 0x7f01b807e8d0 2026-03-09T17:25:25.683 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.682+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f01b807ece0 con 0x7f01b807e8d0 2026-03-09T17:25:25.683 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.682+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f01b807d730 con 0x7f01b807e8d0 2026-03-09T17:25:25.684 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.683+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 
v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f01a800d2e0 con 0x7f01b807e8d0 2026-03-09T17:25:25.684 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.683+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f01a800a500 con 0x7f01b807e8d0 2026-03-09T17:25:25.684 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.683+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 16) v1 ==== 45000+0+0 (secure 0 0 0) 0x7f01a800a720 con 0x7f01b807e8d0 2026-03-09T17:25:25.684 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.683+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f01b804fa50 con 0x7f01b807e8d0 2026-03-09T17:25:25.685 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.684+0000 7f01a77fe700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 0x7f01a003a930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:25.685 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.684+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f01a804c8c0 con 0x7f01b807e8d0 2026-03-09T17:25:25.685 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.684+0000 7f01b5d9b700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 0x7f01a003a930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:25.686 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.685+0000 7f01b5d9b700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 0x7f01a003a930 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f01b000ad30 tx=0x7f01b00093f0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:25.687 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.686+0000 7f01a77fe700 1 -- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f01a8010c10 con 0x7f01b807e8d0 2026-03-09T17:25:25.834 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.832+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f01b807ece0 con 0x7f01b807e8d0 2026-03-09T17:25:25.858 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:25.858 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:25.858 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.855+0000 7f01a77fe700 1 
-- 192.168.123.109:0/1682612239 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f01b807ece0 con 0x7f01b807e8d0 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.858+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 msgr2=0x7f01a003a930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.858+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 0x7f01a003a930 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f01b000ad30 tx=0x7f01b00093f0 comp rx=0 tx=0).stop 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.858+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 msgr2=0x7f01b807cf60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.858+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f01a8008760 tx=0x7f01a80036a0 comp rx=0 tx=0).stop 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 shutdown_connections 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f01a0038480 0x7f01a003a930 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 --2- 192.168.123.109:0/1682612239 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f01b807e8d0 0x7f01b807cf60 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 >> 192.168.123.109:0/1682612239 conn(0x7f01b806cf00 msgr2=0x7f01b8081cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 shutdown_connections 2026-03-09T17:25:25.860 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:25.859+0000 7f01bcbb3700 1 -- 192.168.123.109:0/1682612239 wait complete. 2026-03-09T17:25:25.861 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:25:26.908 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:26.908 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:27.109 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1682612239' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: mgrmap e17: vm06.pbgzei(active, since 2s) 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:27.141 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:25:27.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:25:27.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:26 vm06 ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 -- 192.168.123.109:0/725899578 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 msgr2=0x7ff77c108380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 --2- 192.168.123.109:0/725899578 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c108380 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7ff778009b00 tx=0x7ff778009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 -- 192.168.123.109:0/725899578 shutdown_connections 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 --2- 192.168.123.109:0/725899578 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c108380 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 -- 192.168.123.109:0/725899578 >> 192.168.123.109:0/725899578 conn(0x7ff77c06c410 msgr2=0x7ff77c06c810 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 -- 192.168.123.109:0/725899578 shutdown_connections 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 -- 192.168.123.109:0/725899578 wait complete. 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.412+0000 7ff7831ce700 1 Processor -- start 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7831ce700 1 -- start start 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7831ce700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7831ce700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff77c137450 con 0x7ff77c107fb0 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7821cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7821cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:53756/0 (socket says 192.168.123.109:53756) 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7821cc700 1 
-- 192.168.123.109:0/438666721 learned_addr learned my addr 192.168.123.109:0/438666721 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.413+0000 7ff7821cc700 1 -- 192.168.123.109:0/438666721 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7780097e0 con 0x7ff77c107fb0 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.414+0000 7ff7821cc700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff778005950 tx=0x7ff778004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:27.415 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.414+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff77801c070 con 0x7ff77c107fb0 2026-03-09T17:25:27.416 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.414+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff77c134a10 con 0x7ff77c107fb0 2026-03-09T17:25:27.416 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.415+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff77c134f10 con 0x7ff77c107fb0 2026-03-09T17:25:27.416 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.415+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff7780056f0 con 0x7ff77c107fb0 2026-03-09T17:25:27.416 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.415+0000 7ff7737fe700 1 -- 
192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff77800f460 con 0x7ff77c107fb0 2026-03-09T17:25:27.417 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.416+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7ff77800f5e0 con 0x7ff77c107fb0 2026-03-09T17:25:27.417 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.416+0000 7ff7737fe700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 0x7ff76803ab40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:27.417 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.416+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7ff77804d510 con 0x7ff77c107fb0 2026-03-09T17:25:27.417 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.416+0000 7ff7819cb700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 0x7ff76803ab40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:27.418 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.417+0000 7ff7819cb700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 0x7ff76803ab40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff77400ad30 tx=0x7ff7740093f0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:27.418 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.417+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff760005320 con 0x7ff77c107fb0 2026-03-09T17:25:27.422 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.421+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff778026020 con 0x7ff77c107fb0 2026-03-09T17:25:27.581 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:27.581 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:27.581 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.576+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff760005190 con 0x7ff77c107fb0 2026-03-09T17:25:27.581 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.576+0000 7ff7737fe700 1 -- 192.168.123.109:0/438666721 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff778029720 con 0x7ff77c107fb0 2026-03-09T17:25:27.585 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 msgr2=0x7ff76803ab40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 0x7ff76803ab40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff77400ad30 tx=0x7ff7740093f0 comp rx=0 tx=0).stop 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 msgr2=0x7ff77c134470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff778005950 tx=0x7ff778004dc0 comp rx=0 tx=0).stop 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 shutdown_connections 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff768038690 0x7ff76803ab40 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 --2- 192.168.123.109:0/438666721 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff77c107fb0 0x7ff77c134470 unknown :-1 s=CLOSED pgs=147 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 >> 192.168.123.109:0/438666721 conn(0x7ff77c06c410 msgr2=0x7ff77c06fc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:27.585 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 shutdown_connections 2026-03-09T17:25:27.586 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:27.584+0000 7ff7831ce700 1 -- 192.168.123.109:0/438666721 wait complete. 2026-03-09T17:25:27.590 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:28.498 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: Deploying daemon ceph-exporter.vm09 on vm09 2026-03-09T17:25:28.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:28 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/438666721' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:28.649 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-09T17:25:28.650 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json
2026-03-09T17:25:28.834 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:29.132 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.131+0000 7fd54939b700 1 -- 192.168.123.109:0/3346301836 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 msgr2=0x7fd53c0a4360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:29.132 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.131+0000 7fd54939b700 1 --2- 192.168.123.109:0/3346301836 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0a4360 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fd538009b00 tx=0x7fd538009e10 comp rx=0 tx=0).stop
2026-03-09T17:25:29.133 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.132+0000 7fd54939b700 1 -- 192.168.123.109:0/3346301836 shutdown_connections
2026-03-09T17:25:29.133 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.132+0000 7fd54939b700 1 --2- 192.168.123.109:0/3346301836 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0a4360 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:29.133 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.132+0000 7fd54939b700 1 -- 192.168.123.109:0/3346301836 >> 192.168.123.109:0/3346301836 conn(0x7fd53c019ca0 msgr2=0x7fd53c01a0a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:29.133 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.132+0000 7fd54939b700 1 -- 192.168.123.109:0/3346301836 shutdown_connections
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.132+0000 7fd54939b700 1 -- 192.168.123.109:0/3346301836 wait complete.
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd54939b700 1 Processor -- start
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd54939b700 1 -- start start
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd54939b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd54939b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd538012070 con 0x7fd53c0a3f90
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd543fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd543fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:53776/0 (socket says 192.168.123.109:53776)
2026-03-09T17:25:29.134 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.133+0000 7fd543fff700 1 -- 192.168.123.109:0/1997567708 learned_addr learned my addr 192.168.123.109:0/1997567708 (peer_addr_for_me v2:192.168.123.109:0/0)
2026-03-09T17:25:29.135 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.134+0000 7fd543fff700 1 -- 192.168.123.109:0/1997567708 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5380097e0 con 0x7fd53c0a3f90
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.134+0000 7fd543fff700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fd53800c010 tx=0x7fd538005af0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.136+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd53801d070 con 0x7fd53c0a3f90
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.136+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd53c0b2500 con 0x7fd53c0a3f90
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.136+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd53c0b2980 con 0x7fd53c0a3f90
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.136+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd53800f460 con 0x7fd53c0a3f90
2026-03-09T17:25:29.137 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.136+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd5380175a0 con 0x7fd53c0a3f90
2026-03-09T17:25:29.140 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.137+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fd538017700 con 0x7fd53c0a3f90
2026-03-09T17:25:29.140 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.137+0000 7fd5417fa700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 0x7fd53403ab40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:29.140 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.137+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fd53804d170 con 0x7fd53c0a3f90
2026-03-09T17:25:29.140 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.138+0000 7fd5437fe700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 0x7fd53403ab40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:29.140 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.140+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd528005320 con 0x7fd53c0a3f90
2026-03-09T17:25:29.142 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.140+0000 7fd5437fe700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 0x7fd53403ab40 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd530006fd0 tx=0x7fd530006e40 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:29.143 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.143+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd538026070 con 0x7fd53c0a3f90
2026-03-09T17:25:29.358 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.356+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fd528005190 con 0x7fd53c0a3f90
2026-03-09T17:25:29.358 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.357+0000 7fd5417fa700 1 -- 192.168.123.109:0/1997567708 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fd53802b430 con 0x7fd53c0a3f90
2026-03-09T17:25:29.358 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:25:29.358 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T17:25:29.360 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.359+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 msgr2=0x7fd53403ab40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:29.360 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.359+0000 7fd54939b700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 0x7fd53403ab40 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd530006fd0 tx=0x7fd530006e40 comp rx=0 tx=0).stop
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 msgr2=0x7fd53c0b1f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fd53800c010 tx=0x7fd538005af0 comp rx=0 tx=0).stop
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 shutdown_connections
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd534038690 0x7fd53403ab40 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 --2- 192.168.123.109:0/1997567708 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd53c0a3f90 0x7fd53c0b1f60 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 >> 192.168.123.109:0/1997567708 conn(0x7fd53c019ca0 msgr2=0x7fd53c0aa680 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 shutdown_connections
2026-03-09T17:25:29.361 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:29.360+0000 7fd54939b700 1 -- 192.168.123.109:0/1997567708 wait complete.
2026-03-09T17:25:29.365 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: Deploying daemon crash.vm09 on vm09
2026-03-09T17:25:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:29 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1997567708' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T17:25:30.460 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T17:25:30.460 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json
2026-03-09T17:25:30.627 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.901+0000 7fb3c65b1700 1 -- 192.168.123.109:0/540174765 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 msgr2=0x7fb3c0102860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.901+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/540174765 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c0102860 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8009b00 tx=0x7fb3a8009e10 comp rx=0 tx=0).stop
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.902+0000 7fb3c65b1700 1 -- 192.168.123.109:0/540174765 shutdown_connections
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.902+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/540174765 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c0102860 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.902+0000 7fb3c65b1700 1 -- 192.168.123.109:0/540174765 >> 192.168.123.109:0/540174765 conn(0x7fb3c00fdd90 msgr2=0x7fb3c01001a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.902+0000 7fb3c65b1700 1 -- 192.168.123.109:0/540174765 shutdown_connections
2026-03-09T17:25:30.903 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.902+0000 7fb3c65b1700 1 -- 192.168.123.109:0/540174765 wait complete.
2026-03-09T17:25:30.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.903+0000 7fb3c65b1700 1 Processor -- start
2026-03-09T17:25:30.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.903+0000 7fb3c65b1700 1 -- start start
2026-03-09T17:25:30.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.903+0000 7fb3c65b1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:30.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3c65b1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3c019dad0 con 0x7fb3c0102490
2026-03-09T17:25:30.904 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3bffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3bffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:53798/0 (socket says 192.168.123.109:53798)
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3bffff700 1 -- 192.168.123.109:0/3851293181 learned_addr learned my addr 192.168.123.109:0/3851293181 (peer_addr_for_me v2:192.168.123.109:0/0)
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3bffff700 1 -- 192.168.123.109:0/3851293181 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3a80097e0 con 0x7fb3c0102490
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.904+0000 7fb3bffff700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8006010 tx=0x7fb3a8004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.905+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb3a801c070 con 0x7fb3c0102490
2026-03-09T17:25:30.905 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.905+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3a8021470 con 0x7fb3c0102490
2026-03-09T17:25:30.906 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.905+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3c019dcd0 con 0x7fb3c0102490
2026-03-09T17:25:30.906 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.905+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb3a800f460 con 0x7fb3c0102490
2026-03-09T17:25:30.906 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.905+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3c0197580 con 0x7fb3c0102490
2026-03-09T17:25:30.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.906+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fb3a80215e0 con 0x7fb3c0102490
2026-03-09T17:25:30.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.906+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3c019e0f0 con 0x7fb3c0102490
2026-03-09T17:25:30.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.906+0000 7fb3bd7fa700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 0x7fb3ac03aab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:30.907 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.906+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fb3a804c470 con 0x7fb3c0102490
2026-03-09T17:25:30.909 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.908+0000 7fb3bf7fe700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 0x7fb3ac03aab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:30.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.908+0000 7fb3bf7fe700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 0x7fb3ac03aab0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb3b0006fd0 tx=0x7fb3b0006e40 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:30.910 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:30.909+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb3a800f5c0 con 0x7fb3c0102490
2026-03-09T17:25:31.051 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.050+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb3c0198ab0 con 0x7fb3c0102490
2026-03-09T17:25:31.052 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.051+0000 7fb3bd7fa700 1 -- 192.168.123.109:0/3851293181 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb3a8026030 con 0x7fb3c0102490
2026-03-09T17:25:31.052 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:25:31.052 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T17:25:31.054 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 msgr2=0x7fb3ac03aab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 0x7fb3ac03aab0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb3b0006fd0 tx=0x7fb3b0006e40 comp rx=0 tx=0).stop
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 msgr2=0x7fb3c019d590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8006010 tx=0x7fb3a8004dc0 comp rx=0 tx=0).stop
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 shutdown_connections
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3ac038600 0x7fb3ac03aab0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 --2- 192.168.123.109:0/3851293181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c0102490 0x7fb3c019d590 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.054+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 >> 192.168.123.109:0/3851293181 conn(0x7fb3c00fdd90 msgr2=0x7fb3c0106670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:31.055 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.055+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 shutdown_connections
2026-03-09T17:25:31.056 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:31.055+0000 7fb3c65b1700 1 -- 192.168.123.109:0/3851293181 wait complete.
2026-03-09T17:25:31.056 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1
2026-03-09T17:25:31.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:31.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:31.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:31.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:31.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:30 vm06 ceph-mon[57307]: Deploying daemon node-exporter.vm09 on vm09
2026-03-09T17:25:32.128 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T17:25:32.128 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json
2026-03-09T17:25:32.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:31 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3851293181' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T17:25:32.321 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:32.737 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.735+0000 7f3e63fff700 1 -- 192.168.123.109:0/364095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 msgr2=0x7f3e6410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:32.738 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.735+0000 7f3e63fff700 1 --2- 192.168.123.109:0/364095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e6410edb0 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f3e54009b00 tx=0x7f3e54009e10 comp rx=0 tx=0).stop
2026-03-09T17:25:32.740 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.739+0000 7f3e63fff700 1 -- 192.168.123.109:0/364095901 shutdown_connections
2026-03-09T17:25:32.740 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.739+0000 7f3e63fff700 1 --2- 192.168.123.109:0/364095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e6410edb0 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:32.740 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.739+0000 7f3e63fff700 1 -- 192.168.123.109:0/364095901 >> 192.168.123.109:0/364095901 conn(0x7f3e6406c410 msgr2=0x7f3e6406c810 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:32.743 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.741+0000 7f3e63fff700 1 -- 192.168.123.109:0/364095901 shutdown_connections
2026-03-09T17:25:32.743 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.741+0000 7f3e63fff700 1 -- 192.168.123.109:0/364095901 wait complete.
2026-03-09T17:25:32.743 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.742+0000 7f3e63fff700 1 Processor -- start
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.742+0000 7f3e63fff700 1 -- start start
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.742+0000 7f3e63fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.742+0000 7f3e63fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e54012070 con 0x7f3e64072730
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e62ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e62ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:53816/0 (socket says 192.168.123.109:53816)
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e62ffd700 1 -- 192.168.123.109:0/1636654697 learned_addr learned my addr 192.168.123.109:0/1636654697 (peer_addr_for_me v2:192.168.123.109:0/0)
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e62ffd700 1 -- 192.168.123.109:0/1636654697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e540097e0 con 0x7f3e64072730
2026-03-09T17:25:32.744 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e62ffd700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f3e54005230 tx=0x7f3e54005310 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e5401d070 con 0x7f3e64072730
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e641a46e0 con 0x7f3e64072730
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.743+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e641a4b60 con 0x7f3e64072730
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.744+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3e54005670 con 0x7f3e64072730
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.744+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e5400f460 con 0x7f3e64072730
2026-03-09T17:25:32.745 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.745+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f3e5400f5c0 con 0x7f3e64072730
2026-03-09T17:25:32.746 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.745+0000 7f3e4bfff700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 0x7f3e4c03ab40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:32.746 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.745+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f3e5404d3d0 con 0x7f3e64072730
2026-03-09T17:25:32.746 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.745+0000 7f3e627fc700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 0x7f3e4c03ab40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:32.746 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.745+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e50005320 con 0x7f3e64072730
2026-03-09T17:25:32.747 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.746+0000 7f3e627fc700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 0x7f3e4c03ab40 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3e58006fd0 tx=0x7f3e58006e40 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:32.749 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.748+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3e54026020 con 0x7f3e64072730
2026-03-09T17:25:32.921 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.919+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3e500059f0 con 0x7f3e64072730
2026-03-09T17:25:32.923 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.922+0000 7f3e4bfff700 1 -- 192.168.123.109:0/1636654697 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3e54017440 con 0x7f3e64072730
2026-03-09T17:25:32.923 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:25:32.923 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T17:25:32.925 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 msgr2=0x7f3e4c03ab40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:32.925 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 0x7f3e4c03ab40 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3e58006fd0 tx=0x7f3e58006e40 comp rx=0 tx=0).stop
2026-03-09T17:25:32.925 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 msgr2=0x7f3e641a4140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:32.925 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f3e54005230 tx=0x7f3e54005310 comp rx=0 tx=0).stop
2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 shutdown_connections
2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f3e4c038690 0x7f3e4c03ab40 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 --2- 192.168.123.109:0/1636654697 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e64072730 0x7f3e641a4140 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.924+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 >> 192.168.123.109:0/1636654697 conn(0x7f3e6406c410 msgr2=0x7f3e6410e9b0 unknown :-1 s=STATE_NONE
l=0).mark_down 2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.925+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 shutdown_connections 2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:32.925+0000 7f3e63fff700 1 -- 192.168.123.109:0/1636654697 wait complete. 2026-03-09T17:25:32.926 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:33.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:32 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1636654697' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:33.986 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:33.986 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:34.244 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:25:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth 
get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: Deploying daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:25:34.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- 192.168.123.109:0/3696864351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e807ef60 msgr2=0x7f15e807f330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 --2- 192.168.123.109:0/3696864351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e807ef60 0x7f15e807f330 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f15d8009b00 tx=0x7f15d8009e10 comp rx=0 tx=0).stop 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- 192.168.123.109:0/3696864351 shutdown_connections 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 --2- 192.168.123.109:0/3696864351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e807ef60 0x7f15e807f330 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- 192.168.123.109:0/3696864351 >> 192.168.123.109:0/3696864351 conn(0x7f15e8075ca0 msgr2=0x7f15e80780b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- 192.168.123.109:0/3696864351 shutdown_connections 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- 192.168.123.109:0/3696864351 wait 
complete. 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 Processor -- start 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.764+0000 7f15ed480700 1 -- start start 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.765+0000 7f15ed480700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:34.767 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.765+0000 7f15ed480700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15d8012070 con 0x7f15e8071a50 2026-03-09T17:25:34.768 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.766+0000 7f15e7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:34.768 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.766+0000 7f15e7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.109:53858/0 (socket says 192.168.123.109:53858) 2026-03-09T17:25:34.768 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.766+0000 7f15e7fff700 1 -- 192.168.123.109:0/1582757435 learned_addr learned my addr 192.168.123.109:0/1582757435 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:34.768 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.766+0000 7f15e7fff700 1 -- 192.168.123.109:0/1582757435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15d80097e0 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15e7fff700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f15d800c010 tx=0x7f15d800bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15d801c070 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15e80723c0 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15e811b6a0 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15d8003e30 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.767+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f15d8017440 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.768+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f15d80175a0 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.768+0000 7f15e57fa700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 0x7f15d003aaf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.768+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f15d804d240 con 0x7f15e8071a50 2026-03-09T17:25:34.769 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.768+0000 7f15e77fe700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 0x7f15d003aaf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:34.770 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.769+0000 7f15e77fe700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 0x7f15d003aaf0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f15e000ad30 tx=0x7f15e00093f0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:34.770 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.770+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f15d4005320 con 0x7f15e8071a50 2026-03-09T17:25:34.776 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:34.775+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f15d802a430 con 0x7f15e8071a50 2026-03-09T17:25:35.095 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.092+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f15d4005190 con 0x7f15e8071a50 2026-03-09T17:25:35.095 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.095+0000 7f15e57fa700 1 -- 192.168.123.109:0/1582757435 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f15d801fb60 con 0x7f15e8071a50 2026-03-09T17:25:35.096 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:35.096 INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":1,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:24:17.098520Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T17:25:35.103 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.101+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 msgr2=0x7f15d003aaf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:35.103 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.101+0000 7f15ed480700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 0x7f15d003aaf0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f15e000ad30 tx=0x7f15e00093f0 comp rx=0 tx=0).stop 2026-03-09T17:25:35.103 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.101+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 msgr2=0x7f15e8071e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:35.103 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.101+0000 7f15ed480700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f15d800c010 tx=0x7f15d800bba0 comp rx=0 tx=0).stop 2026-03-09T17:25:35.105 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.102+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 shutdown_connections 2026-03-09T17:25:35.105 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.102+0000 7f15ed480700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f15d0038640 0x7f15d003aaf0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:35.105 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.102+0000 7f15ed480700 1 --2- 192.168.123.109:0/1582757435 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f15e8071a50 0x7f15e8071e20 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:35.106 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.102+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 >> 192.168.123.109:0/1582757435 conn(0x7f15e8075ca0 msgr2=0x7f15e8076900 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:25:35.110 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.109+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 shutdown_connections 2026-03-09T17:25:35.112 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:35.109+0000 7f15ed480700 1 -- 192.168.123.109:0/1582757435 wait complete. 2026-03-09T17:25:35.112 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 1 2026-03-09T17:25:35.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:35 vm06 ceph-mon[57307]: Deploying daemon mon.vm09 on vm09 2026-03-09T17:25:35.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:35 vm09 ceph-mon[62061]: mon.vm09@-1(synchronizing) e1 handle_conf_change mon_allow_pool_delete,mon_cluster_log_to_file 2026-03-09T17:25:36.340 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T17:25:36.341 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mon dump -f json 2026-03-09T17:25:36.510 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:25:40.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: mon.vm06 calling monitor election 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: mon.vm09 calling monitor election 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.? 
192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: monmap e2: 2 mons at {vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0],vm09=[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0]} removed_ranks: {} 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: fsmap 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: mgrmap e17: vm06.pbgzei(active, since 17s) 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.? 
192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: overall HEALTH_OK 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.? 192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.? 192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: mon.vm06 calling monitor election 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 
2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: mon.vm09 calling monitor election 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.? 
192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: monmap e2: 2 mons at {vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0],vm09=[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0]} removed_ranks: {} 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: fsmap 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: mgrmap e17: vm06.pbgzei(active, since 17s) 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.? 
192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: overall HEALTH_OK 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.? 192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.? 192.168.123.109:0/429123994' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- 192.168.123.109:0/3683272378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 msgr2=0x7fe0a8005610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3683272378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0a8005610 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto 
rx=0x7fe0b0005fa0 tx=0x7fe0b000ff70 comp rx=0 tx=0).stop 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- 192.168.123.109:0/3683272378 shutdown_connections 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3683272378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0a8005610 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- 192.168.123.109:0/3683272378 >> 192.168.123.109:0/3683272378 conn(0x7fe0c006c6f0 msgr2=0x7fe0c006caf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- 192.168.123.109:0/3683272378 shutdown_connections 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- 192.168.123.109:0/3683272378 wait complete. 
2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 Processor -- start 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- start start 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0c01add70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.252+0000 7fe0c653a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0c01b2f40 con 0x7fe0c006db80 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.253+0000 7fe0c653a700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0c01b30b0 con 0x7fe0c01ae2b0 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.253+0000 7fe0bf7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.253+0000 7fe0bf7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.109:37548/0 (socket says 192.168.123.109:37548) 2026-03-09T17:25:41.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.253+0000 7fe0bf7fe700 1 -- 192.168.123.109:0/3910720336 learned_addr learned my addr 192.168.123.109:0/3910720336 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bf7fe700 1 -- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 msgr2=0x7fe0c01b2a00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bf7fe700 1 -- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 msgr2=0x7fe0c01b2a00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bf7fe700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bf7fe700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bffff700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0c01add70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bffff700 1 -- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 msgr2=0x7fe0c01b2a00 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bffff700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bffff700 1 -- 192.168.123.109:0/3910720336 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0b000f970 con 0x7fe0c006db80 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.255+0000 7fe0bffff700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0c01add70 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fe0b00127e0 tx=0x7fe0b0014f90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.256+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0b0012a30 con 0x7fe0c006db80 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.256+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe0b001dd50 con 0x7fe0c006db80 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.256+0000 7fe0bd7fa700 1 
-- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0b0005600 con 0x7fe0c006db80 2026-03-09T17:25:41.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.256+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0c01b3330 con 0x7fe0c006db80 2026-03-09T17:25:41.258 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.256+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0c01b3880 con 0x7fe0c006db80 2026-03-09T17:25:41.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.258+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fe0b0024070 con 0x7fe0c006db80 2026-03-09T17:25:41.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.258+0000 7fe0bd7fa700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 0x7fe0ac06ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:41.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.258+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fe0b008e6d0 con 0x7fe0c006db80 2026-03-09T17:25:41.259 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.258+0000 7fe0bf7fe700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 0x7fe0ac06ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:41.260 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.259+0000 7fe0bf7fe700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 0x7fe0ac06ecd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe0b4009180 tx=0x7fe0b4008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:41.262 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.261+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe0a8002170 con 0x7fe0c006db80 2026-03-09T17:25:41.265 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.264+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe0b0059fa0 con 0x7fe0c006db80 2026-03-09T17:25:41.422 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.421+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fe0a8001fe0 con 0x7fe0c006db80 2026-03-09T17:25:41.422 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.421+0000 7fe0bd7fa700 1 -- 192.168.123.109:0/3910720336 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7fe0b002b020 con 0x7fe0c006db80 2026-03-09T17:25:41.423 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:25:41.423 
INFO:teuthology.orchestra.run.vm09.stdout:{"epoch":2,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","modified":"2026-03-09T17:25:35.571460Z","created":"2026-03-09T17:24:17.098520Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm09","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:3300","nonce":0},{"type":"v1","addr":"192.168.123.109:6789","nonce":0}]},"addr":"192.168.123.109:6789/0","public_addr":"192.168.123.109:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-09T17:25:41.425 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 msgr2=0x7fe0ac06ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:41.425 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 0x7fe0ac06ecd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe0b4009180 tx=0x7fe0b4008040 comp rx=0 tx=0).stop 2026-03-09T17:25:41.425 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 msgr2=0x7fe0c01add70 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-09T17:25:41.425 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0c01add70 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fe0b00127e0 tx=0x7fe0b0014f90 comp rx=0 tx=0).stop 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 shutdown_connections 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe0ac06c820 0x7fe0ac06ecd0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0c006db80 0x7fe0c01add70 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.424+0000 7fe0c653a700 1 --2- 192.168.123.109:0/3910720336 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe0c01ae2b0 0x7fe0c01b2a00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.425+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 >> 192.168.123.109:0/3910720336 conn(0x7fe0c006c6f0 msgr2=0x7fe0c010b7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.425+0000 7fe0c653a700 1 -- 192.168.123.109:0/3910720336 shutdown_connections 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:25:41.425+0000 
7fe0c653a700 1 -- 192.168.123.109:0/3910720336 wait complete. 2026-03-09T17:25:41.426 INFO:teuthology.orchestra.run.vm09.stderr:dumped monmap epoch 2 2026-03-09T17:25:41.488 INFO:tasks.cephadm:Generating final ceph.conf file... 2026-03-09T17:25:41.488 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph config generate-minimal-conf 2026-03-09T17:25:41.684 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: mgrmap e18: vm06.pbgzei(active, since 17s), standbys: vm09.lqzvkh 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3910720336' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:41.777 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: mgrmap e18: vm06.pbgzei(active, since 17s), standbys: vm09.lqzvkh 2026-03-09T17:25:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:25:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:41.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:41.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-09T17:25:41.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='client.? 192.168.123.109:0/3910720336' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T17:25:41.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:25:42.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.997+0000 7f832adb6700 1 -- 192.168.123.106:0/282527019 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c0a4220 msgr2=0x7f831c0a45f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:42.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.997+0000 7f832adb6700 1 --2- 192.168.123.106:0/282527019 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c0a4220 0x7f831c0a45f0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f8318008790 tx=0x7f8318008aa0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.998+0000 7f832adb6700 1 -- 192.168.123.106:0/282527019 shutdown_connections 2026-03-09T17:25:42.001 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.998+0000 7f832adb6700 1 --2- 192.168.123.106:0/282527019 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c0a4220 0x7f831c0a45f0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.001 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.998+0000 7f832adb6700 1 -- 192.168.123.106:0/282527019 >> 192.168.123.106:0/282527019 conn(0x7f831c019cd0 msgr2=0x7f831c01a0d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.998+0000 7f832adb6700 1 -- 192.168.123.106:0/282527019 shutdown_connections 
2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.998+0000 7f832adb6700 1 -- 192.168.123.106:0/282527019 wait complete. 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 Processor -- start 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 -- start start 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 0x7f831c008e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f831c009380 0x7f831c0097f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f831c00d9c0 con 0x7f831c00fe20 2026-03-09T17:25:42.002 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f832adb6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f831c00db30 con 0x7f831c009380 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f8328b52700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 0x7f831c008e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f8328b52700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 
0x7f831c008e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:34756/0 (socket says 192.168.123.106:34756) 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:41.999+0000 7f8328b52700 1 -- 192.168.123.106:0/3573936894 learned_addr learned my addr 192.168.123.106:0/3573936894 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.000+0000 7f8323fff700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f831c009380 0x7f831c0097f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.000+0000 7f8328b52700 1 -- 192.168.123.106:0/3573936894 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f831c009380 msgr2=0x7f831c0097f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.000+0000 7f8328b52700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f831c009380 0x7f831c0097f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.000+0000 7f8328b52700 1 -- 192.168.123.106:0/3573936894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8318008440 con 0x7f831c00fe20 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.000+0000 7f8328b52700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 0x7f831c008e40 secure :-1 s=READY 
pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f831800bf60 tx=0x7f831800bf90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:42.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.001+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8318003f00 con 0x7f831c00fe20 2026-03-09T17:25:42.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.001+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f831c00ddb0 con 0x7f831c00fe20 2026-03-09T17:25:42.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.001+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f831c00e300 con 0x7f831c00fe20 2026-03-09T17:25:42.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.002+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8318004540 con 0x7f831c00fe20 2026-03-09T17:25:42.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.002+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8318012660 con 0x7f831c00fe20 2026-03-09T17:25:42.005 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.003+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f83180127c0 con 0x7f831c00fe20 2026-03-09T17:25:42.005 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.003+0000 7f8321ffb700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 0x7f831406ecd0 
unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:42.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.004+0000 7f8323fff700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 0x7f831406ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:42.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.004+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f831808c960 con 0x7f831c00fe20 2026-03-09T17:25:42.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.004+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f831c004030 con 0x7f831c00fe20 2026-03-09T17:25:42.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.007+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f831801f080 con 0x7f831c00fe20 2026-03-09T17:25:42.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.007+0000 7f8323fff700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 0x7f831406ecd0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f832404f8a0 tx=0x7f832406c040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:42.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.130+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "config 
generate-minimal-conf"} v 0) v1 -- 0x7f831c00df40 con 0x7f831c00fe20 2026-03-09T17:25:42.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.132+0000 7f8321ffb700 1 -- 192.168.123.106:0/3573936894 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f831c00df40 con 0x7f831c00fe20 2026-03-09T17:25:42.134 INFO:teuthology.orchestra.run.vm06.stdout:# minimal ceph.conf for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:25:42.135 INFO:teuthology.orchestra.run.vm06.stdout:[global] 2026-03-09T17:25:42.135 INFO:teuthology.orchestra.run.vm06.stdout: fsid = bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:25:42.135 INFO:teuthology.orchestra.run.vm06.stdout: mon_host = [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 2026-03-09T17:25:42.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.134+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 msgr2=0x7f831406ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:42.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.134+0000 7f832adb6700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 0x7f831406ecd0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f832404f8a0 tx=0x7f832406c040 comp rx=0 tx=0).stop 2026-03-09T17:25:42.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.134+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 msgr2=0x7f831c008e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.134+0000 7f832adb6700 1 --2- 192.168.123.106:0/3573936894 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 0x7f831c008e40 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f831800bf60 tx=0x7f831800bf90 comp rx=0 tx=0).stop 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 shutdown_connections 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f831406c820 0x7f831406ecd0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f831c00fe20 0x7f831c008e40 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 --2- 192.168.123.106:0/3573936894 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f831c009380 0x7f831c0097f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 >> 192.168.123.106:0/3573936894 conn(0x7f831c019cd0 msgr2=0x7f831c0a2710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 shutdown_connections 2026-03-09T17:25:42.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:42.135+0000 7f832adb6700 1 -- 192.168.123.106:0/3573936894 wait complete. 2026-03-09T17:25:42.202 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 
2026-03-09T17:25:42.202 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:25:42.202 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T17:25:42.245 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:25:42.245 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:25:42.312 DEBUG:teuthology.orchestra.run.vm09:> set -ex 2026-03-09T17:25:42.312 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T17:25:42.337 DEBUG:teuthology.orchestra.run.vm09:> set -ex 2026-03-09T17:25:42.337 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:25:42.399 INFO:tasks.cephadm:Deploying OSDs... 2026-03-09T17:25:42.399 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:25:42.399 DEBUG:teuthology.orchestra.run.vm06:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T17:25:42.420 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T17:25:42.420 DEBUG:teuthology.orchestra.run.vm06:> ls /dev/[sv]d? 
2026-03-09T17:25:42.481 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vda
2026-03-09T17:25:42.481 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdb
2026-03-09T17:25:42.481 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdc
2026-03-09T17:25:42.481 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdd
2026-03-09T17:25:42.481 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vde
2026-03-09T17:25:42.481 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-09T17:25:42.481 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-09T17:25:42.481 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdb
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdb
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 220 Links: 1 Device type: fc,10
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 17:24:51.952332678 +0000
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 17:24:51.827332341 +0000
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 17:24:51.827332341 +0000
2026-03-09T17:25:42.541 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-09 17:17:51.203000000 +0000
2026-03-09T17:25:42.541 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-09T17:25:42.611 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-09T17:25:42.612 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-09T17:25:42.612 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000148138 s, 3.5 MB/s
2026-03-09T17:25:42.612 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-09T17:25:42.692 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdc
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdc
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 223 Links: 1 Device type: fc,20
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 17:24:52.014332845 +0000
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 17:24:51.827332341 +0000
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 17:24:51.827332341 +0000
2026-03-09T17:25:42.755 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-09 17:17:51.209000000 +0000
2026-03-09T17:25:42.755 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-09T17:25:42.827 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-09T17:25:42.827 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-09T17:25:42.827 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000201548 s, 2.5 MB/s
2026-03-09T17:25:42.830 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-09T17:25:42.889 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdd
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdd
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,30
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:42.948 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 17:24:52.080333023 +0000
2026-03-09T17:25:42.949 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 17:24:51.832332355 +0000
2026-03-09T17:25:42.949 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 17:24:51.832332355 +0000
2026-03-09T17:25:42.949 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-09 17:17:51.213000000 +0000
2026-03-09T17:25:42.949 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-09T17:25:43.037 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-09T17:25:43.037 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-09T17:25:43.037 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000136565 s, 3.7 MB/s
2026-03-09T17:25:43.039 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-09T17:25:43.072 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vde
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vde
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 225 Links: 1 Device type: fc,40
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 17:24:52.155333225 +0000
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 17:24:51.835332363 +0000
2026-03-09T17:25:43.134 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 17:24:51.835332363 +0000
2026-03-09T17:25:43.135 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-09 17:17:51.216000000 +0000
2026-03-09T17:25:43.135 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-09T17:25:43.208 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in
2026-03-09T17:25:43.208 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out
2026-03-09T17:25:43.208 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000145172 s, 3.5 MB/s
2026-03-09T17:25:43.209 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-09T17:25:43.267 DEBUG:teuthology.orchestra.run.vm09:> set -ex
2026-03-09T17:25:43.267 DEBUG:teuthology.orchestra.run.vm09:> dd if=/scratch_devs of=/dev/stdout
2026-03-09T17:25:43.282 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:25:43.282 DEBUG:teuthology.orchestra.run.vm09:> ls /dev/[sv]d?
2026-03-09T17:25:43.337 INFO:teuthology.orchestra.run.vm09.stdout:/dev/vda
2026-03-09T17:25:43.337 INFO:teuthology.orchestra.run.vm09.stdout:/dev/vdb
2026-03-09T17:25:43.337 INFO:teuthology.orchestra.run.vm09.stdout:/dev/vdc
2026-03-09T17:25:43.337 INFO:teuthology.orchestra.run.vm09.stdout:/dev/vdd
2026-03-09T17:25:43.337 INFO:teuthology.orchestra.run.vm09.stdout:/dev/vde
2026-03-09T17:25:43.337 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-09T17:25:43.337 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-09T17:25:43.337 DEBUG:teuthology.orchestra.run.vm09:> stat /dev/vdb
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.conf
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.conf
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3573936894' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T17:25:43.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:42 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.conf
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.conf
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3573936894' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T17:25:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:42 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout: File: /dev/vdb
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Device: 6h/6d Inode: 223 Links: 1 Device type: fc,10
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Access: 2026-03-09 17:25:25.641088022 +0000
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Modify: 2026-03-09 17:25:25.538087937 +0000
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout:Change: 2026-03-09 17:25:25.538087937 +0000
2026-03-09T17:25:43.395 INFO:teuthology.orchestra.run.vm09.stdout: Birth: 2026-03-09 17:16:45.259000000 +0000
2026-03-09T17:25:43.395 DEBUG:teuthology.orchestra.run.vm09:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-09T17:25:43.464 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records in
2026-03-09T17:25:43.464 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records out
2026-03-09T17:25:43.464 INFO:teuthology.orchestra.run.vm09.stderr:512 bytes copied, 0.000140824 s, 3.6 MB/s
2026-03-09T17:25:43.465 DEBUG:teuthology.orchestra.run.vm09:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-09T17:25:43.521 DEBUG:teuthology.orchestra.run.vm09:> stat /dev/vdc
2026-03-09T17:25:43.576 INFO:teuthology.orchestra.run.vm09.stdout: File: /dev/vdc
2026-03-09T17:25:43.576 INFO:teuthology.orchestra.run.vm09.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:43.576 INFO:teuthology.orchestra.run.vm09.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,20
2026-03-09T17:25:43.576 INFO:teuthology.orchestra.run.vm09.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:43.576 INFO:teuthology.orchestra.run.vm09.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:43.577 INFO:teuthology.orchestra.run.vm09.stdout:Access: 2026-03-09 17:25:25.704088074 +0000
2026-03-09T17:25:43.577 INFO:teuthology.orchestra.run.vm09.stdout:Modify: 2026-03-09 17:25:25.541087940 +0000
2026-03-09T17:25:43.577 INFO:teuthology.orchestra.run.vm09.stdout:Change: 2026-03-09 17:25:25.541087940 +0000
2026-03-09T17:25:43.577 INFO:teuthology.orchestra.run.vm09.stdout: Birth: 2026-03-09 17:16:45.274000000 +0000
2026-03-09T17:25:43.577 DEBUG:teuthology.orchestra.run.vm09:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-09T17:25:43.644 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records in
2026-03-09T17:25:43.644 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records out
2026-03-09T17:25:43.644 INFO:teuthology.orchestra.run.vm09.stderr:512 bytes copied, 0.00014931 s, 3.4 MB/s
2026-03-09T17:25:43.645 DEBUG:teuthology.orchestra.run.vm09:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-09T17:25:43.701 DEBUG:teuthology.orchestra.run.vm09:> stat /dev/vdd
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout: File: /dev/vdd
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Device: 6h/6d Inode: 253 Links: 1 Device type: fc,30
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Access: 2026-03-09 17:25:25.757088118 +0000
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Modify: 2026-03-09 17:25:25.549087946 +0000
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout:Change: 2026-03-09 17:25:25.549087946 +0000
2026-03-09T17:25:43.758 INFO:teuthology.orchestra.run.vm09.stdout: Birth: 2026-03-09 17:16:45.291000000 +0000
2026-03-09T17:25:43.758 DEBUG:teuthology.orchestra.run.vm09:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-09T17:25:43.824 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records in
2026-03-09T17:25:43.824 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records out
2026-03-09T17:25:43.824 INFO:teuthology.orchestra.run.vm09.stderr:512 bytes copied, 0.000150532 s, 3.4 MB/s
2026-03-09T17:25:43.825 DEBUG:teuthology.orchestra.run.vm09:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-09T17:25:43.888 DEBUG:teuthology.orchestra.run.vm09:> stat /dev/vde
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout: File: /dev/vde
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-09T17:25:43.946 INFO:teuthology.orchestra.run.vm09.stdout:Access: 2026-03-09 17:25:25.805088157 +0000
2026-03-09T17:25:43.947 INFO:teuthology.orchestra.run.vm09.stdout:Modify: 2026-03-09 17:25:25.544087942 +0000
2026-03-09T17:25:43.947 INFO:teuthology.orchestra.run.vm09.stdout:Change: 2026-03-09 17:25:25.544087942 +0000
2026-03-09T17:25:43.947 INFO:teuthology.orchestra.run.vm09.stdout: Birth: 2026-03-09 17:16:45.343000000 +0000
2026-03-09T17:25:43.947 DEBUG:teuthology.orchestra.run.vm09:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-09T17:25:44.012 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records in
2026-03-09T17:25:44.012 INFO:teuthology.orchestra.run.vm09.stderr:1+0 records out
2026-03-09T17:25:44.012 INFO:teuthology.orchestra.run.vm09.stderr:512 bytes copied, 0.000161592 s, 3.2 MB/s
2026-03-09T17:25:44.013 DEBUG:teuthology.orchestra.run.vm09:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-09T17:25:44.069 INFO:tasks.cephadm:Deploying osd.0 on vm06 with /dev/vde...
2026-03-09T17:25:44.069 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vde
2026-03-09T17:25:44.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring mon.vm06 (unknown last config time)...
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring mgr.vm06.pbgzei (unknown last config time)...
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring daemon mgr.vm06.pbgzei on vm06
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.100 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:44 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.281 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:25:44.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring mon.vm06 (unknown last config time)...
2026-03-09T17:25:44.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T17:25:44.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring mgr.vm06.pbgzei (unknown last config time)...
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring daemon mgr.vm06.pbgzei on vm06
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:44.349 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:44 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: Reconfiguring alertmanager.vm06 (dependencies changed)...
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: Reconfiguring daemon alertmanager.vm06 on vm06
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:45.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:45 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:45.199 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:25:45.212 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm06:/dev/vde
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: Reconfiguring alertmanager.vm06 (dependencies changed)...
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: Reconfiguring daemon alertmanager.vm06 on vm06
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:45 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:45.435 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:25:45.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.782+0000 7f28ad41d700 1 -- 192.168.123.106:0/1106143162 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8071db0 msgr2=0x7f28a80721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:45.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.782+0000 7f28ad41d700 1 --2- 192.168.123.106:0/1106143162 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8071db0 0x7f28a80721c0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f2898009b00 tx=0x7f2898009e10 comp rx=0 tx=0).stop
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- 192.168.123.106:0/1106143162 shutdown_connections
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 --2- 192.168.123.106:0/1106143162 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8107d50 0x7f28a81081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 --2- 192.168.123.106:0/1106143162 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8071db0 0x7f28a80721c0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- 192.168.123.106:0/1106143162 >> 192.168.123.106:0/1106143162 conn(0x7f28a806d3e0 msgr2=0x7f28a806f830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- 192.168.123.106:0/1106143162 shutdown_connections
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- 192.168.123.106:0/1106143162 wait complete.
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 Processor -- start
2026-03-09T17:25:45.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- start start
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8107d50 0x7f28a81a5150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f28a81a5770 con 0x7f28a8107d50
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28ad41d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f28a81aa180 con 0x7f28a8071db0
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.784+0000 7f28a67fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8107d50 0x7f28a81a5150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34730/0 (socket says 192.168.123.106:34730)
2026-03-09T17:25:45.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 -- 192.168.123.106:0/3877565608 learned_addr learned my addr 192.168.123.106:0/3877565608 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 -- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8107d50 msgr2=0x7f28a81a5150 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8107d50 0x7f28a81a5150 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 -- 192.168.123.106:0/3877565608 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f28980097e0 con 0x7f28a8071db0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28a6ffd700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f2898005f50 tx=0x7f2898004a60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f289801d070 con 0x7f28a8071db0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f289800bc50 con 0x7f28a8071db0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f289800f8d0 con 0x7f28a8071db0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.785+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f28a81aa320 con 0x7f28a8071db0
2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.786+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f28a81aa7e0 con
0x7f28a8071db0 2026-03-09T17:25:45.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.786+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f28a8066e40 con 0x7f28a8071db0 2026-03-09T17:25:45.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.787+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f289800fa30 con 0x7f28a8071db0 2026-03-09T17:25:45.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.788+0000 7f289ffff700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 0x7f289406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:45.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.788+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f289808caa0 con 0x7f28a8071db0 2026-03-09T17:25:45.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.788+0000 7f28a67fc700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 0x7f289406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:45.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.788+0000 7f28a67fc700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 0x7f289406eb80 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2890009910 tx=0x7f2890008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:45.792 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.790+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2898058370 con 0x7f28a8071db0 2026-03-09T17:25:45.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:45.912+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f28a810c6a0 con 0x7f289406c6d0 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: Reconfiguring grafana.vm06 (dependencies changed)... 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: Reconfiguring daemon grafana.vm06 on vm06 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
2026-03-09T17:25:46.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: Reconfiguring grafana.vm06 (dependencies changed)... 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: Reconfiguring daemon grafana.vm06 on vm06 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:47.698 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: from='client.24099 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:25:47 vm06 ceph-mon[57307]: Reconfiguring prometheus.vm06 (dependencies changed)... 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3663870830' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "7b23c291-c26f-47f6-aa9d-2b35b2448578"}]: dispatch 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3663870830' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "7b23c291-c26f-47f6-aa9d-2b35b2448578"}]': finished 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: osdmap e6: 1 total, 0 up, 1 in 2026-03-09T17:25:47.699 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:47 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: from='client.24099 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: Reconfiguring prometheus.vm06 (dependencies changed)... 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/3663870830' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "7b23c291-c26f-47f6-aa9d-2b35b2448578"}]: dispatch 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3663870830' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "7b23c291-c26f-47f6-aa9d-2b35b2448578"}]': finished 2026-03-09T17:25:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: osdmap e6: 1 total, 0 up, 1 in 2026-03-09T17:25:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:47 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:25:48.793 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:48 vm09 ceph-mon[62061]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:48.793 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:48 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/1256807634' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:25:48.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:48 vm06 ceph-mon[57307]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:48.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:48 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/1256807634' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:25:49.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:49 vm06 ceph-mon[57307]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:49.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:49 vm09 ceph-mon[62061]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:52.273 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:52.273 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T17:25:52.528 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:52.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T17:25:52.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: Reconfiguring ceph-exporter.vm09 (monmap changed)... 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: Reconfiguring daemon ceph-exporter.vm09 on vm09 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: Reconfiguring crash.vm09 (monmap changed)... 
2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: Reconfiguring daemon crash.vm09 on vm09 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: Deploying daemon osd.0 on vm06 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 
ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: Reconfiguring ceph-exporter.vm09 (monmap changed)... 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: Reconfiguring daemon ceph-exporter.vm09 on vm09 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: Reconfiguring crash.vm09 (monmap changed)... 
2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: Reconfiguring daemon crash.vm09 on vm09 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: Deploying daemon osd.0 on vm06 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 
ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:53.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: Reconfiguring mgr.vm09.lqzvkh (monmap changed)... 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: Reconfiguring daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: Reconfiguring mon.vm09 (monmap changed)... 
2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: Reconfiguring daemon mon.vm09 on vm09 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm06.local:9093"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm06.local:9093"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm06.local:3000"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm06.local:3000"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm06.local:9095"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm06.local:9095"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.332 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:54 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: Reconfiguring mgr.vm09.lqzvkh (monmap changed)... 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: Reconfiguring daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: Reconfiguring mon.vm09 (monmap changed)... 
2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: Reconfiguring daemon mon.vm09 on vm09 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm06.local:9093"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm06.local:9093"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm06.local:3000"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm06.local:3000"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm06.local:9095"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm06.local:9095"}]: dispatch 2026-03-09T17:25:54.507 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:54.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:54.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:54 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:55.327 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 0 on host 'vm06' 2026-03-09T17:25:55.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.323+0000 7f289ffff700 1 -- 192.168.123.106:0/3877565608 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f28a810c6a0 con 0x7f289406c6d0 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 msgr2=0x7f289406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 --2- 192.168.123.106:0/3877565608 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 0x7f289406eb80 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2890009910 tx=0x7f2890008040 comp rx=0 tx=0).stop 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 msgr2=0x7f28a81a4c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f2898005f50 tx=0x7f2898004a60 comp rx=0 tx=0).stop 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 shutdown_connections 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f28a8071db0 0x7f28a81a4c10 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f289406c6d0 0x7f289406eb80 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 --2- 192.168.123.106:0/3877565608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f28a8107d50 0x7f28a81a5150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:55.329 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 >> 192.168.123.106:0/3877565608 conn(0x7f28a806d3e0 msgr2=0x7f28a810af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 shutdown_connections 2026-03-09T17:25:55.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:55.326+0000 7f28ad41d700 1 -- 192.168.123.106:0/3877565608 wait complete. 2026-03-09T17:25:55.395 DEBUG:teuthology.orchestra.run.vm06:osd.0> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0.service 2026-03-09T17:25:55.397 INFO:tasks.cephadm:Deploying osd.1 on vm06 with /dev/vdd... 2026-03-09T17:25:55.397 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vdd 2026-03-09T17:25:55.620 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:25:56.189 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:25:56.201 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm06:/dev/vdd 2026-03-09T17:25:56.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:56 vm06 ceph-mon[57307]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:56.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:56 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:56 vm06 ceph-mon[57307]: 
from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:56 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:56 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.433 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:25:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:56 vm09 ceph-mon[62061]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:56 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:56 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:56 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:56 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.784+0000 7f5cacbab700 1 -- 192.168.123.106:0/423684094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0094850 msgr2=0x7f5ca0094ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.784+0000 7f5cacbab700 1 --2- 192.168.123.106:0/423684094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0094850 0x7f5ca0094ca0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f5c94008680 tx=0x7f5c94008990 comp rx=0 
tx=0).stop 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 -- 192.168.123.106:0/423684094 shutdown_connections 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 --2- 192.168.123.106:0/423684094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0094850 0x7f5ca0094ca0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 --2- 192.168.123.106:0/423684094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0093650 0x7f5ca0093a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 -- 192.168.123.106:0/423684094 >> 192.168.123.106:0/423684094 conn(0x7f5ca008ebe0 msgr2=0x7f5ca0091030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 -- 192.168.123.106:0/423684094 shutdown_connections 2026-03-09T17:25:56.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.787+0000 7f5cacbab700 1 -- 192.168.123.106:0/423684094 wait complete. 
2026-03-09T17:25:56.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 Processor -- start 2026-03-09T17:25:56.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 -- start start 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0093650 0x7f5ca00a3d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ca00a4880 con 0x7f5ca0094850 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5cacbab700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ca00a49c0 con 0x7f5ca0093650 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5ca6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5ca6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:55790/0 (socket says 192.168.123.106:55790) 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.788+0000 7f5ca6ffd700 1 -- 192.168.123.106:0/1497990345 learned_addr learned my addr 192.168.123.106:0/1497990345 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.789+0000 7f5ca6ffd700 1 -- 192.168.123.106:0/1497990345 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0093650 msgr2=0x7f5ca00a3d20 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.789+0000 7f5ca6ffd700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0093650 0x7f5ca00a3d20 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:25:56.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.789+0000 7f5ca6ffd700 1 -- 192.168.123.106:0/1497990345 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5c94008360 con 0x7f5ca0094850 2026-03-09T17:25:56.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.789+0000 7f5ca6ffd700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f5c94000c00 tx=0x7f5c9400db60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:56.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.790+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c940046d0 con 0x7f5ca0094850 2026-03-09T17:25:56.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.790+0000 7f5cacbab700 1 -- 
192.168.123.106:0/1497990345 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ca00a53f0 con 0x7f5ca0094850 2026-03-09T17:25:56.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.790+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ca00a5940 con 0x7f5ca0094850 2026-03-09T17:25:56.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.791+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5ca009b4e0 con 0x7f5ca0094850 2026-03-09T17:25:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.794+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5c9400a760 con 0x7f5ca0094850 2026-03-09T17:25:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.794+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c94011c70 con 0x7f5ca0094850 2026-03-09T17:25:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.794+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f5c9402b430 con 0x7f5ca0094850 2026-03-09T17:25:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.795+0000 7f5ca4ff9700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 0x7f5c9806eca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:25:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.795+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 
v2:192.168.123.106:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1313+0+0 (secure 0 0 0) 0x7f5c94030080 con 0x7f5ca0094850 2026-03-09T17:25:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.795+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5c9408d620 con 0x7f5ca0094850 2026-03-09T17:25:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.795+0000 7f5ca77fe700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 0x7f5c9806eca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:25:56.801 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.799+0000 7f5ca77fe700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 0x7f5c9806eca0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f5c9c005950 tx=0x7f5c9c0058e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:25:56.984 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:25:56.981+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f5ca00a5580 con 0x7f5c9806c7f0 2026-03-09T17:25:57.531 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:25:57 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:25:57.447+0000 7f155e426640 -1 osd.0 0 log_to_monitors true 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: Detected new or changed devices on vm06 2026-03-09T17:25:58.144 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:57 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:25:58 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:25:58.072+0000 7f155329c700 -1 osd.0 0 waiting for initial osdmap 2026-03-09T17:25:58.144 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:25:58 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:25:58.081+0000 7f154e08f700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: Detected new or changed devices on vm06 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:25:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:57 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: osdmap e7: 1 total, 0 up, 1 in 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: 
dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2407313066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "e31bb3c2-190d-419c-bb90-f0909a02113b"}]: dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2407313066' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "e31bb3c2-190d-419c-bb90-f0909a02113b"}]': finished 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: osdmap e8: 2 total, 0 up, 2 in 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/561510761' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:25:59.007 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:25:59 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:59.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-09T17:25:59.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: osdmap e7: 1 total, 0 up, 1 in
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2407313066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "e31bb3c2-190d-419c-bb90-f0909a02113b"}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2407313066' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "e31bb3c2-190d-419c-bb90-f0909a02113b"}]': finished
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: osdmap e8: 2 total, 0 up, 2 in
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/561510761' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:25:59.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:25:59 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:00.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:00 vm06 ceph-mon[57307]: osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] boot
2026-03-09T17:26:00.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:00 vm06 ceph-mon[57307]: osdmap e9: 2 total, 1 up, 2 in
2026-03-09T17:26:00.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:00 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:26:00.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:00 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:00.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:00 vm06 ceph-mon[57307]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:00 vm09 ceph-mon[62061]: osd.0 [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] boot
2026-03-09T17:26:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:00 vm09 ceph-mon[62061]: osdmap e9: 2 total, 1 up, 2 in
2026-03-09T17:26:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:00 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:26:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:00 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:00 vm09 ceph-mon[62061]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:01.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:01 vm06 ceph-mon[57307]: purged_snaps scrub starts
2026-03-09T17:26:01.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:01 vm06 ceph-mon[57307]: purged_snaps scrub ok
2026-03-09T17:26:01.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:01 vm06 ceph-mon[57307]: osdmap e10: 2 total, 1 up, 2 in
2026-03-09T17:26:01.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:01 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:01.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:01 vm09 ceph-mon[62061]: purged_snaps scrub starts
2026-03-09T17:26:01.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:01 vm09 ceph-mon[62061]: purged_snaps scrub ok
2026-03-09T17:26:01.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:01 vm09 ceph-mon[62061]: osdmap e10: 2 total, 1 up, 2 in
2026-03-09T17:26:01.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:01 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:02.333 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:02 vm06 ceph-mon[57307]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:02.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:02 vm09 ceph-mon[62061]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:03.441 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:03 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T17:26:03.441 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:03 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:03 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T17:26:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:03 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:04 vm06 ceph-mon[57307]: Deploying daemon osd.1 on vm06
2026-03-09T17:26:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:04 vm06 ceph-mon[57307]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:04.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:04 vm09 ceph-mon[62061]: Deploying daemon osd.1 on vm06
2026-03-09T17:26:04.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:04 vm09 ceph-mon[62061]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 1 on host 'vm06'
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.076+0000 7f5ca4ff9700 1 -- 192.168.123.106:0/1497990345 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f5ca00a5580 con 0x7f5c9806c7f0
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 msgr2=0x7f5c9806eca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 0x7f5c9806eca0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f5c9c005950 tx=0x7f5c9c0058e0 comp rx=0 tx=0).stop
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 msgr2=0x7f5ca00a4260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f5c94000c00 tx=0x7f5c9400db60 comp rx=0 tx=0).stop
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 shutdown_connections
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5ca0093650 0x7f5ca00a3d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5c9806c7f0 0x7f5c9806eca0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 --2- 192.168.123.106:0/1497990345 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ca0094850 0x7f5ca00a4260 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:06.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 >> 192.168.123.106:0/1497990345 conn(0x7f5ca008ebe0 msgr2=0x7f5ca008f840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:26:06.083 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.080+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 shutdown_connections
2026-03-09T17:26:06.083 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:06.081+0000 7f5cacbab700 1 -- 192.168.123.106:0/1497990345 wait complete.
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:05 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.183 DEBUG:teuthology.orchestra.run.vm06:osd.1> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service
2026-03-09T17:26:06.185 INFO:tasks.cephadm:Deploying osd.2 on vm06 with /dev/vdc...
2026-03-09T17:26:06.185 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vdc
2026-03-09T17:26:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:26:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:06.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:05 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:06.596 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:26:07.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:07 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:07 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:07 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:07 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:07 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:07 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:07 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:07 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:07.477 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:26:07.492 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm06:/dev/vdc
2026-03-09T17:26:07.674 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:07 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:26:07.491+0000 7f60ab5ef640 -1 osd.1 0 log_to_monitors true
2026-03-09T17:26:07.731 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 -- 192.168.123.106:0/198124589 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58072330 msgr2=0x7f7e580770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/198124589 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58072330 0x7f7e580770b0 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f7e5000ab20 tx=0x7f7e5000ae30 comp rx=0 tx=0).stop
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 -- 192.168.123.106:0/198124589 shutdown_connections
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/198124589 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58072330 0x7f7e580770b0 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/198124589 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58071950 0x7f7e58071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 -- 192.168.123.106:0/198124589 >> 192.168.123.106:0/198124589 conn(0x7f7e5806d1a0 msgr2=0x7f7e5806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.074+0000 7f7e5ebee700 1 -- 192.168.123.106:0/198124589 shutdown_connections
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 -- 192.168.123.106:0/198124589 wait complete.
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 Processor -- start
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 -- start start
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58082af0 0x7f7e58082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e5812dd80 con 0x7f7e58071950
2026-03-09T17:26:08.077 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5ebee700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e5812def0 con 0x7f7e58082af0
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.075+0000 7f7e5dbec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59664/0 (socket says 192.168.123.106:59664)
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 -- 192.168.123.106:0/848026028 learned_addr learned my addr 192.168.123.106:0/848026028 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5d3eb700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58082af0 0x7f7e58082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 -- 192.168.123.106:0/848026028 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58082af0 msgr2=0x7f7e58082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58082af0 0x7f7e58082f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 -- 192.168.123.106:0/848026028 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e5000a780 con 0x7f7e58071950
2026-03-09T17:26:08.078 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5dbec700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f7e5400ba70 tx=0x7f7e5400be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:26:08.079 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e5400c760 con 0x7f7e58071950
2026-03-09T17:26:08.079 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.076+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e5812e1d0 con 0x7f7e58071950
2026-03-09T17:26:08.079 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.077+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7e5812e720 con 0x7f7e58071950
2026-03-09T17:26:08.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.078+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e5806fd90 con 0x7f7e58071950
2026-03-09T17:26:08.084 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.081+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7e5400cda0 con 0x7f7e58071950
2026-03-09T17:26:08.084 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.081+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e54012550 con 0x7f7e58071950
2026-03-09T17:26:08.084 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.081+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f7e54012770 con 0x7f7e58071950
2026-03-09T17:26:08.084 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.082+0000 7f7e4effd700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 0x7f7e4406eca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:26:08.084 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.082+0000 7f7e5d3eb700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 0x7f7e4406eca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:26:08.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.083+0000 7f7e5d3eb700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 0x7f7e4406eca0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f7e5000c5c0 tx=0x7f7e5001a040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:26:08.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.083+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 1915+0+0 (secure 0 0 0) 0x7f7e5408ccd0 con 0x7f7e58071950
2026-03-09T17:26:08.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.083+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7e5408f460 con 0x7f7e58071950
2026-03-09T17:26:08.219 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:08 vm06 ceph-mon[57307]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:08.219 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:08 vm06 ceph-mon[57307]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T17:26:08.219 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:08.215+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f7e58061190 con 0x7f7e4406c7f0
2026-03-09T17:26:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:08 vm09 ceph-mon[62061]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:08 vm09 ceph-mon[62061]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T17:26:09.134 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T17:26:09.134 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: osdmap e11: 2 total, 1 up, 2 in
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.135 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:26:09.135 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:09 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:26:09.123+0000 7f60a1c68700 -1 osd.1 0 waiting for initial osdmap
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: osdmap e11: 2 total, 1 up, 2 in
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:09.345 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:26:09.392 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:09 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:26:09.138+0000 7f609c25a700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: Detected new or changed devices on vm06
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: osdmap e12: 2 total, 1 up, 2 in
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: pgmap v23: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3097413166' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] boot
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]': finished
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: osdmap e13: 3 total, 2 up, 3 in
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:26:10.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:10 vm09 ceph-mon[62061]: from='client.?
192.168.123.106:0/902012901' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: Detected new or changed devices on vm06 2026-03-09T17:26:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T17:26:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: osdmap e12: 2 total, 1 up, 2 in 2026-03-09T17:26:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: pgmap v23: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3097413166' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: osd.1 [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] boot 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4aa1d45d-b786-45ab-97d1-aef76daa15f5"}]': finished 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: osdmap e13: 3 total, 2 up, 3 in 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:10.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:10 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/902012901' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: purged_snaps scrub starts 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: purged_snaps scrub ok 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:11.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: purged_snaps scrub starts 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: purged_snaps scrub ok 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:26:11 vm06 ceph-mon[57307]: osdmap e14: 3 total, 2 up, 3 in 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:11.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:11.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:11.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:12.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:12 vm06 ceph-mon[57307]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:12 vm09 ceph-mon[62061]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:14.515 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:14 vm06 ceph-mon[57307]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:14.515 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T17:26:14.515 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:14.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:14 vm09 ceph-mon[62061]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:14.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-09T17:26:14.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:15.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:15 vm06 ceph-mon[57307]: Deploying daemon osd.2 on vm06 2026-03-09T17:26:15.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:15 vm06 ceph-mon[57307]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:15 vm09 ceph-mon[62061]: Deploying daemon osd.2 on vm06 2026-03-09T17:26:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:15 vm09 ceph-mon[62061]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:17.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.421+0000 7f7e4effd700 1 -- 192.168.123.106:0/848026028 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f7e58061190 con 0x7f7e4406c7f0 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 2 on host 'vm06' 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.423+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 msgr2=0x7f7e4406eca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.423+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 0x7f7e4406eca0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f7e5000c5c0 tx=0x7f7e5001a040 comp rx=0 tx=0).stop 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 msgr2=0x7f7e580825b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f7e5400ba70 tx=0x7f7e5400be30 comp rx=0 tx=0).stop 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 shutdown_connections 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7e4406c7f0 0x7f7e4406eca0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e58071950 0x7f7e580825b0 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:17.426 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 --2- 192.168.123.106:0/848026028 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7e58082af0 0x7f7e58082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 >> 192.168.123.106:0/848026028 conn(0x7f7e5806d1a0 msgr2=0x7f7e58074510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 shutdown_connections 2026-03-09T17:26:17.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:17.424+0000 7f7e5ebee700 1 -- 192.168.123.106:0/848026028 wait complete. 2026-03-09T17:26:17.505 DEBUG:teuthology.orchestra.run.vm06:osd.2> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.2.service 2026-03-09T17:26:17.506 INFO:tasks.cephadm:Deploying osd.3 on vm09 with /dev/vde... 
2026-03-09T17:26:17.506 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vde 2026-03-09T17:26:17.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:17.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:17.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:17.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:17.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:17.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:17.671 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:18.387 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:26:18.403 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm09:/dev/vde 2026-03-09T17:26:18.565 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config 
/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:18.593 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:18 vm09 ceph-mon[62061]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:18.593 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.593 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.593 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.593 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:18 vm06 ceph-mon[57307]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:18.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.612 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.836+0000 7fd1a6c96700 1 -- 192.168.123.109:0/576235942 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a0074d80 msgr2=0x7fd1a00731e0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.836+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/576235942 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a0074d80 0x7fd1a00731e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fd194009b00 tx=0x7fd194009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 -- 192.168.123.109:0/576235942 shutdown_connections 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/576235942 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1a00737b0 0x7fd1a0073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/576235942 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a0074d80 0x7fd1a00731e0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 -- 192.168.123.109:0/576235942 >> 192.168.123.109:0/576235942 conn(0x7fd1a00fbaa0 msgr2=0x7fd1a00fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 -- 192.168.123.109:0/576235942 shutdown_connections 2026-03-09T17:26:18.838 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.837+0000 7fd1a6c96700 1 -- 192.168.123.109:0/576235942 wait complete. 
2026-03-09T17:26:18.839 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 Processor -- start 2026-03-09T17:26:18.839 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 -- start start 2026-03-09T17:26:18.839 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:18.840 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1a0074d80 0x7fd1a019c8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:18.840 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1a019ce60 con 0x7fd1a0074d80 2026-03-09T17:26:18.840 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a6c96700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1a019cfa0 con 0x7fd1a00737b0 2026-03-09T17:26:18.840 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:18.841 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.109:58292/0 (socket says 192.168.123.109:58292) 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 -- 192.168.123.109:0/2226498096 learned_addr learned my addr 192.168.123.109:0/2226498096 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 -- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1a0074d80 msgr2=0x7fd1a019c8d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1a0074d80 0x7fd1a019c8d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.838+0000 7fd1a4a32700 1 -- 192.168.123.109:0/2226498096 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd1940097e0 con 0x7fd1a00737b0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd1a4a32700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd194004990 tx=0x7fd194004a70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd19401d070 con 0x7fd1a00737b0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd1a6c96700 1 -- 
192.168.123.109:0/2226498096 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1a01a1a00 con 0x7fd1a00737b0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1a01a1ec0 con 0x7fd1a00737b0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd19400bd90 con 0x7fd1a00737b0 2026-03-09T17:26:18.842 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.839+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd19400f980 con 0x7fd1a00737b0 2026-03-09T17:26:18.843 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.840+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fd19400fae0 con 0x7fd1a00737b0 2026-03-09T17:26:18.843 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.841+0000 7fd19dffb700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 0x7fd18806eab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:18.843 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.841+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2347+0+0 (secure 0 0 0) 0x7fd19408d580 con 0x7fd1a00737b0 2026-03-09T17:26:18.843 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.841+0000 7fd19ffff700 1 --2- 192.168.123.109:0/2226498096 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 0x7fd18806eab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:18.848 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.843+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd18c005320 con 0x7fd1a00737b0 2026-03-09T17:26:18.848 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.847+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd19405c3e0 con 0x7fd1a00737b0 2026-03-09T17:26:18.848 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.847+0000 7fd19ffff700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 0x7fd18806eab0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fd19000a9b0 tx=0x7fd190005cb0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:18.891 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:18 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:26:18.754+0000 7eff351df640 -1 osd.2 0 log_to_monitors true 2026-03-09T17:26:18.959 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:18.958+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fd18c000bf0 con 0x7fd18806c600 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='osd.2 
[v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='client.24137 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: Detected new or changed devices on vm06 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.368 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:19 vm06 ceph-mon[57307]: pgmap v30: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='client.24137 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:26:19 vm09 ceph-mon[62061]: Detected new or changed devices on vm06 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:19 vm09 ceph-mon[62061]: pgmap v30: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T17:26:20.098 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:20 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:26:20.093+0000 7eff2a055700 -1 osd.2 0 waiting for initial osdmap 2026-03-09T17:26:20.379 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:20 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:26:20.099+0000 7eff2664b700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:26:20.379 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3760547054' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]': finished 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: osdmap e16: 4 total, 2 up, 4 in 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:20.379 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:20 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:20.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 
ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='client.? 192.168.123.109:0/3760547054' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "733bc53e-8727-4119-a70f-00c09a625789"}]': finished 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: osdmap e16: 4 total, 2 up, 4 in 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:20.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:20 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/3856753984' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] boot 2026-03-09T17:26:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: osdmap e17: 4 total, 3 up, 4 in 2026-03-09T17:26:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:21 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='client.? 192.168.123.109:0/3856753984' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: osd.2 [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] boot 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: osdmap e17: 4 total, 3 up, 4 in 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:21.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:21 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: purged_snaps scrub starts 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: purged_snaps scrub ok 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T17:26:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:22 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: purged_snaps scrub starts 2026-03-09T17:26:22.894 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: purged_snaps scrub ok 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: pgmap v34: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:22.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:22 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-09T17:26:23.870 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90021]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vde 2026-03-09T17:26:23.870 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90021]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T17:26:23.870 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90021]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-09T17:26:23.870 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90021]: pam_unix(sudo:session): session closed for user root 2026-03-09T17:26:23.871 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 
17:26:23 vm06 sudo[90024]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdd 2026-03-09T17:26:23.871 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90024]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T17:26:23.871 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90024]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-09T17:26:23.871 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90024]: pam_unix(sudo:session): session closed for user root 2026-03-09T17:26:24.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 sudo[90030]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-09T17:26:24.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 sudo[90030]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T17:26:24.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 sudo[90030]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-09T17:26:24.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 sudo[90030]: pam_unix(sudo:session): session closed for user root 2026-03-09T17:26:24.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90027]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdc 2026-03-09T17:26:24.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90027]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T17:26:24.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90027]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-09T17:26:24.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:26:23 vm06 sudo[90027]: pam_unix(sudo:session): session closed for user root 2026-03-09T17:26:24.641 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T17:26:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T17:26:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:26:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:24 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 sudo[67420]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 sudo[67420]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 sudo[67420]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 sudo[67420]: pam_unix(sudo:session): session closed for user root 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:26:24.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:24 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd 
metadata", "id": 3}]: dispatch 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:26:25.622 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:25 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:26:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:25 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:25.779 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:25.775+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd194058d20 con 0x7fd1a00737b0 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 
ceph-mon[62061]: Deploying daemon osd.3 on vm09 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 ceph-mon[62061]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 ceph-mon[62061]: mgrmap e19: vm06.pbgzei(active, since 62s), standbys: vm09.lqzvkh 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:26.294 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:26 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:26.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: Deploying daemon osd.3 on vm09 2026-03-09T17:26:26.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:26.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: mgrmap e19: vm06.pbgzei(active, since 62s), standbys: vm09.lqzvkh 2026-03-09T17:26:26.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:26.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:26.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:26 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": 
"json"}]: dispatch 2026-03-09T17:26:27.193 INFO:teuthology.orchestra.run.vm09.stdout:Created osd(s) 3 on host 'vm09' 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.188+0000 7fd19dffb700 1 -- 192.168.123.109:0/2226498096 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fd18c000bf0 con 0x7fd18806c600 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 msgr2=0x7fd18806eab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 0x7fd18806eab0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fd19000a9b0 tx=0x7fd190005cb0 comp rx=0 tx=0).stop 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 msgr2=0x7fd1a019c390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd194004990 tx=0x7fd194004a70 comp rx=0 tx=0).stop 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 shutdown_connections 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/2226498096 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd1a00737b0 0x7fd1a019c390 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd18806c600 0x7fd18806eab0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 --2- 192.168.123.109:0/2226498096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1a0074d80 0x7fd1a019c8d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 >> 192.168.123.109:0/2226498096 conn(0x7fd1a00fbaa0 msgr2=0x7fd1a0101ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 shutdown_connections 2026-03-09T17:26:27.194 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:27.191+0000 7fd1a6c96700 1 -- 192.168.123.109:0/2226498096 wait complete. 2026-03-09T17:26:27.288 DEBUG:teuthology.orchestra.run.vm09:osd.3> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.3.service 2026-03-09T17:26:27.289 INFO:tasks.cephadm:Deploying osd.4 on vm09 with /dev/vdd... 
2026-03-09T17:26:27.290 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vdd 2026-03-09T17:26:27.524 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:28.121 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:26:28.148 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm09:/dev/vdd 2026-03-09T17:26:28.313 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:28 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.313 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:28 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.313 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:28 vm09 ceph-mon[62061]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:28.313 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:28 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.313 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:28 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.495 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.641 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:28 vm06 ceph-mon[57307]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:28 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:28.883 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.879+0000 7f6971ca5700 1 -- 192.168.123.109:0/3742494192 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 msgr2=0x7f696c0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:28.883 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.879+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3742494192 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c0731e0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f695c009b80 tx=0x7f695c009e90 comp rx=0 tx=0).stop 2026-03-09T17:26:28.884 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 -- 192.168.123.109:0/3742494192 shutdown_connections 2026-03-09T17:26:28.884 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3742494192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 0x7f696c073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:28.884 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3742494192 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f696c074d80 0x7f696c0731e0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:28.884 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 -- 192.168.123.109:0/3742494192 >> 192.168.123.109:0/3742494192 conn(0x7f696c0fba80 msgr2=0x7f696c0fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:28.884 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 -- 192.168.123.109:0/3742494192 shutdown_connections 2026-03-09T17:26:28.885 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.882+0000 7f6971ca5700 1 -- 192.168.123.109:0/3742494192 wait complete. 2026-03-09T17:26:28.885 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 Processor -- start 2026-03-09T17:26:28.885 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 -- start start 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 0x7f696c0725a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f696c072bc0 con 0x7f696c0737b0 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.883+0000 7f6971ca5700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f696c072d00 con 0x7f696c074d80 
2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.109:43034/0 (socket says 192.168.123.109:43034) 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 -- 192.168.123.109:0/3956714584 learned_addr learned my addr 192.168.123.109:0/3956714584 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696b7fe700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 0x7f696c0725a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 -- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 msgr2=0x7f696c0725a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 0x7f696c0725a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.884+0000 7f696affd700 1 -- 192.168.123.109:0/3956714584 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f695c0097e0 con 0x7f696c074d80 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.885+0000 7f696affd700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f696000b700 tx=0x7f696000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.885+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6960010820 con 0x7f696c074d80 2026-03-09T17:26:28.887 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.885+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6960010e60 con 0x7f696c074d80 2026-03-09T17:26:28.889 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.887+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6960017570 con 0x7f696c074d80 2026-03-09T17:26:28.890 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.887+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f696c071190 con 0x7f696c074d80 2026-03-09T17:26:28.890 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.887+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7f696c071660 con 0x7f696c074d80 2026-03-09T17:26:28.893 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.888+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f696c066e40 con 0x7f696c074d80 2026-03-09T17:26:28.893 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.889+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6960010980 con 0x7f696c074d80 2026-03-09T17:26:28.893 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.890+0000 7f6968ff9700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 0x7f695406e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:28.893 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.890+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(20..20 src has 1..20) v4 ==== 3165+0+0 (secure 0 0 0) 0x7f696008a740 con 0x7f696c074d80 2026-03-09T17:26:28.894 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.893+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6960059190 con 0x7f696c074d80 2026-03-09T17:26:28.895 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.893+0000 7f696b7fe700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 0x7f695406e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:28.898 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:28.896+0000 7f696b7fe700 1 
--2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 0x7f695406e980 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f695c009b50 tx=0x7f695c000bc0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:29.038 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:29.035+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f696c1b3600 con 0x7f695406c4d0 2026-03-09T17:26:29.393 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:26:29 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:26:29.166+0000 7f768bdbd640 -1 osd.3 0 log_to_monitors true 2026-03-09T17:26:30.365 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: Detected new or changed devices on vm09 2026-03-09T17:26:30.365 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:30.365 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.365 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 
vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: 
from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='client.? 192.168.123.109:0/1150641081' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]': finished 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: osdmap e21: 5 total, 3 up, 5 in 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:30.366 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:30 vm09 ceph-mon[62061]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: Detected new or changed devices on vm09 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='osd.3 
' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1150641081' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "24911c64-9b6a-4862-9972-34f73f6f3c13"}]': finished 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: osdmap e21: 5 total, 3 up, 5 in 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:30 vm06 ceph-mon[57307]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/813320325' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: osdmap e22: 5 total, 3 up, 5 in 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:31.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:31 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:31.394 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:26:31 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:26:31.015+0000 7f7680c33700 -1 osd.3 0 waiting for initial osdmap 2026-03-09T17:26:31.394 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:26:31 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:26:31.032+0000 7f767d229700 -1 osd.3 22 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: from='client.? 
192.168.123.109:0/813320325' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: osdmap e22: 5 total, 3 up, 5 in 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:31 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:32.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:32 vm06 ceph-mon[57307]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:32.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:32 vm06 ceph-mon[57307]: osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] boot 2026-03-09T17:26:32.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:32 vm06 ceph-mon[57307]: osdmap e23: 5 total, 4 up, 5 in 2026-03-09T17:26:32.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:32 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:32.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:32 vm06 ceph-mon[57307]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:32 vm09 ceph-mon[62061]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T17:26:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:32 vm09 ceph-mon[62061]: osd.3 [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] boot 2026-03-09T17:26:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:32 vm09 ceph-mon[62061]: osdmap e23: 5 total, 4 up, 5 in 2026-03-09T17:26:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:32 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:26:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:32 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:33.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:33 vm06 ceph-mon[57307]: purged_snaps scrub starts 2026-03-09T17:26:33.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:33 vm06 ceph-mon[57307]: purged_snaps scrub ok 2026-03-09T17:26:33.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:33 vm06 ceph-mon[57307]: osdmap e24: 5 total, 4 up, 5 in 2026-03-09T17:26:33.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:33 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:33.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:33 vm09 ceph-mon[62061]: purged_snaps scrub starts 2026-03-09T17:26:33.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:33 vm09 ceph-mon[62061]: purged_snaps scrub ok 2026-03-09T17:26:33.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:33 vm09 
ceph-mon[62061]: osdmap e24: 5 total, 4 up, 5 in 2026-03-09T17:26:33.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:33 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:34.098 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:34 vm09 ceph-mon[62061]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T17:26:34.098 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:34 vm09 ceph-mon[62061]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T17:26:34.098 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:34 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:34 vm06 ceph-mon[57307]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T17:26:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:34 vm06 ceph-mon[57307]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T17:26:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:34 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:35.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:35 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:26:35.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:35 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:35.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:35 vm09 ceph-mon[62061]: Deploying daemon osd.4 on vm09 2026-03-09T17:26:35.391 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:35 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:26:35.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:35 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:35.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:35 vm06 ceph-mon[57307]: Deploying daemon osd.4 on vm09 2026-03-09T17:26:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:36 vm06 ceph-mon[57307]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 107 KiB/s, 0 objects/s recovering 2026-03-09T17:26:36.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:36 vm09 ceph-mon[62061]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 107 KiB/s, 0 objects/s recovering 2026-03-09T17:26:37.171 INFO:teuthology.orchestra.run.vm09.stdout:Created osd(s) 4 on host 'vm09' 2026-03-09T17:26:37.171 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.166+0000 7f6968ff9700 1 -- 192.168.123.109:0/3956714584 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f696c1b3600 con 0x7f695406c4d0 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 msgr2=0x7f695406e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 0x7f695406e980 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f695c009b50 
tx=0x7f695c000bc0 comp rx=0 tx=0).stop 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 msgr2=0x7f696c070bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f696000b700 tx=0x7f696000bac0 comp rx=0 tx=0).stop 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 shutdown_connections 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f695406c4d0 0x7f695406e980 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f696c0737b0 0x7f696c0725a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 --2- 192.168.123.109:0/3956714584 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f696c074d80 0x7f696c070bf0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 >> 192.168.123.109:0/3956714584 conn(0x7f696c0fba80 
msgr2=0x7f696c1053f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.169+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 shutdown_connections 2026-03-09T17:26:37.172 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:37.170+0000 7f6971ca5700 1 -- 192.168.123.109:0/3956714584 wait complete. 2026-03-09T17:26:37.229 DEBUG:teuthology.orchestra.run.vm09:osd.4> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service 2026-03-09T17:26:37.230 INFO:tasks.cephadm:Deploying osd.5 on vm09 with /dev/vdc... 2026-03-09T17:26:37.230 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- lvm zap /dev/vdc 2026-03-09T17:26:37.271 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:37 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.271 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:37 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.271 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:37 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:37.271 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:37 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.271 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:37 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.459 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:26:37 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:37 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:37 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:37 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:37 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:38.052 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:26:38.066 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph orch daemon add osd vm09:/dev/vdc 2026-03-09T17:26:38.253 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:38.344 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:38 vm09 ceph-mon[62061]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T17:26:38.344 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:38 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:38.344 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:38 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:38.345 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:26:38 vm09 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:26:38.325+0000 7fcfa3635640 -1 osd.4 0 log_to_monitors true 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.599+0000 7fbfd2934700 1 -- 192.168.123.109:0/3669638326 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc0feda0 msgr2=0x7fbfcc1011c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.599+0000 7fbfd2934700 1 --2- 192.168.123.109:0/3669638326 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc0feda0 0x7fbfcc1011c0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fbfbc009b50 tx=0x7fbfbc009e60 comp rx=0 tx=0).stop 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 -- 192.168.123.109:0/3669638326 shutdown_connections 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 --2- 192.168.123.109:0/3669638326 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbfcc101700 0x7fbfcc103b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 --2- 192.168.123.109:0/3669638326 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc0feda0 0x7fbfcc1011c0 secure :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fbfbc009b50 tx=0x7fbfbc009e60 comp rx=0 tx=0).stop 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 -- 192.168.123.109:0/3669638326 >> 192.168.123.109:0/3669638326 conn(0x7fbfcc0fa9b0 msgr2=0x7fbfcc0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 -- 192.168.123.109:0/3669638326 
shutdown_connections 2026-03-09T17:26:38.603 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.601+0000 7fbfd2934700 1 -- 192.168.123.109:0/3669638326 wait complete. 2026-03-09T17:26:38.604 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.602+0000 7fbfd2934700 1 Processor -- start 2026-03-09T17:26:38.604 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.602+0000 7fbfd2934700 1 -- start start 2026-03-09T17:26:38.604 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfd2934700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbfcc101700 0x7fbfcc19c690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfd2934700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfd2934700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfcc19d0d0 con 0x7fbfcc101700 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfd2934700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfcc19d240 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.109:60808/0 (socket says 192.168.123.109:60808) 2026-03-09T17:26:38.605 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 -- 192.168.123.109:0/2972971251 learned_addr learned my addr 192.168.123.109:0/2972971251 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 -- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbfcc101700 msgr2=0x7fbfcc19c690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbfcc101700 0x7fbfcc19c690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 -- 192.168.123.109:0/2972971251 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbfbc0097e0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc3fff700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fbfb400d8d0 tx=0x7fbfb400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 
==== 327+0+0 (secure 0 0 0) 0x7fbfb4009940 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.603+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbfb4010460 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.607 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.604+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbfb400f5d0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.608 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.605+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbfcc1a21e0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.610 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.605+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbfcc1a26b0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.610 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.609+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfcc066e40 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.612+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbfb4009af0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.613+0000 7fbfc9ffb700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 0x7fbfac06e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).connect 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.613+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(25..25 src has 1..25) v4 ==== 3697+0+0 (secure 0 0 0) 0x7fbfb408b000 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.613+0000 7fbfcbfff700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 0x7fbfac06e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.614+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbfb40595a0 con 0x7fbfcc19cbd0 2026-03-09T17:26:38.616 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.614+0000 7fbfcbfff700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 0x7fbfac06e930 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fbfbc006010 tx=0x7fbfbc0058e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:38.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:38 vm06 ceph-mon[57307]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T17:26:38.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:38 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:38.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:38 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:38.741 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:38.739+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7fbfcc1a2960 con 0x7fbfac06c480 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.189 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:39.189 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:39 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:39.396 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:39 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:40.063 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:26:39 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:26:39.706+0000 7fcf984ab700 -1 osd.4 0 waiting for initial osdmap 2026-03-09T17:26:40.063 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:26:39 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:26:39.728+0000 7fcf92a9d700 -1 osd.4 27 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:26:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 
ceph-mon[62061]: from='client.24177 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: Detected new or changed devices on vm09 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 72 KiB/s, 0 objects/s recovering 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='client.? 
192.168.123.109:0/1102065589' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]': finished 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: osdmap e27: 6 total, 4 up, 6 in 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:40.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:40 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:40.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='client.24177 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm09:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:26:40 vm06 ceph-mon[57307]: Detected new or changed devices on vm09 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 72 KiB/s, 0 objects/s recovering 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/1102065589' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4a80decf-2a05-4525-b2be-269b4a9ba65c"}]': finished 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: osdmap e27: 6 total, 4 up, 6 in 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:40 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: from='client.? 
192.168.123.109:0/1994863445' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] boot 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: osdmap e28: 6 total, 5 up, 6 in 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:41.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:41 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: from='client.? 
192.168.123.109:0/1994863445' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: osd.4 [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] boot 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: osdmap e28: 6 total, 5 up, 6 in 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:26:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:41 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:42.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:42 vm06 ceph-mon[57307]: purged_snaps scrub starts 2026-03-09T17:26:42.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:42 vm06 ceph-mon[57307]: purged_snaps scrub ok 2026-03-09T17:26:42.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:42 vm06 ceph-mon[57307]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T17:26:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:42 vm09 ceph-mon[62061]: purged_snaps scrub starts 2026-03-09T17:26:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:42 vm09 ceph-mon[62061]: purged_snaps scrub ok 2026-03-09T17:26:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:42 vm09 ceph-mon[62061]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T17:26:43.554 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:43 vm09 ceph-mon[62061]: osdmap e29: 6 total, 5 up, 6 in 2026-03-09T17:26:43.554 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:43 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:43.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:43 vm06 ceph-mon[57307]: osdmap e29: 6 total, 5 up, 6 in 2026-03-09T17:26:43.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:43 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:44.565 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:44 vm09 ceph-mon[62061]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:44.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:44 vm06 ceph-mon[57307]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:45.480 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:45 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:26:45.480 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:45 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:45.480 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:45 vm09 ceph-mon[62061]: Deploying daemon osd.5 on vm09 2026-03-09T17:26:45.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:45 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:26:45.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:45 vm06 ceph-mon[57307]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:45.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:45 vm06 ceph-mon[57307]: Deploying daemon osd.5 on vm09 2026-03-09T17:26:46.513 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:46 vm09 ceph-mon[62061]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:46.513 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:46.513 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:46.513 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:46 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:46.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:46 vm06 ceph-mon[57307]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:46.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:46.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:46.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:46 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stdout:Created osd(s) 5 on host 'vm09' 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.175+0000 7fbfc9ffb700 1 -- 192.168.123.109:0/2972971251 <== mgr.14221 
v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fbfcc1a2960 con 0x7fbfac06c480 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 msgr2=0x7fbfac06e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 0x7fbfac06e930 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fbfbc006010 tx=0x7fbfbc0058e0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 msgr2=0x7fbfcc1a1c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fbfb400d8d0 tx=0x7fbfb400dc90 comp rx=0 tx=0).stop 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 shutdown_connections 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbfac06c480 0x7fbfac06e930 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 --2- 
192.168.123.109:0/2972971251 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbfcc101700 0x7fbfcc19c690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 --2- 192.168.123.109:0/2972971251 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbfcc19cbd0 0x7fbfcc1a1c40 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 >> 192.168.123.109:0/2972971251 conn(0x7fbfcc0fa9b0 msgr2=0x7fbfcc0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 shutdown_connections 2026-03-09T17:26:47.180 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:47.177+0000 7fbfd2934700 1 -- 192.168.123.109:0/2972971251 wait complete. 2026-03-09T17:26:47.232 DEBUG:teuthology.orchestra.run.vm09:osd.5> sudo journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.5.service 2026-03-09T17:26:47.233 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 
2026-03-09T17:26:47.233 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd stat -f json 2026-03-09T17:26:47.412 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.684+0000 7f41c19f9700 1 -- 192.168.123.106:0/3390399298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc074d80 msgr2=0x7f41bc0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.684+0000 7f41c19f9700 1 --2- 192.168.123.106:0/3390399298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc074d80 0x7f41bc0731e0 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f41ac009b00 tx=0x7f41ac009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 -- 192.168.123.106:0/3390399298 shutdown_connections 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 --2- 192.168.123.106:0/3390399298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f41bc0737b0 0x7f41bc073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 --2- 192.168.123.106:0/3390399298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc074d80 0x7f41bc0731e0 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 -- 192.168.123.106:0/3390399298 
>> 192.168.123.106:0/3390399298 conn(0x7f41bc0fbaa0 msgr2=0x7f41bc0fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 -- 192.168.123.106:0/3390399298 shutdown_connections 2026-03-09T17:26:47.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.685+0000 7f41c19f9700 1 -- 192.168.123.106:0/3390399298 wait complete. 2026-03-09T17:26:47.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.686+0000 7f41c19f9700 1 Processor -- start 2026-03-09T17:26:47.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.686+0000 7f41c19f9700 1 -- start start 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.686+0000 7f41c19f9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.686+0000 7f41c19f9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f41bc074d80 0x7f41bc19c8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51548/0 (socket says 
192.168.123.106:51548) 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41c19f9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f41bc19cf10 con 0x7f41bc0737b0 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41c19f9700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f41bc19d050 con 0x7f41bc074d80 2026-03-09T17:26:47.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 -- 192.168.123.106:0/1338839938 learned_addr learned my addr 192.168.123.106:0/1338839938 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 -- 192.168.123.106:0/1338839938 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f41bc074d80 msgr2=0x7f41bc19c8f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f41bc074d80 0x7f41bc19c8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 -- 192.168.123.106:0/1338839938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f41ac0097e0 con 0x7f41bc0737b0 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.687+0000 7f41baffd700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f41ac00bb70 tx=0x7f41ac004690 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.688+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f41ac01d070 con 0x7f41bc0737b0 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.688+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f41ac022470 con 0x7f41bc0737b0 2026-03-09T17:26:47.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.688+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f41ac00f740 con 0x7f41bc0737b0 2026-03-09T17:26:47.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.688+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f41bc1a1aa0 con 0x7f41bc0737b0 2026-03-09T17:26:47.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.689+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f41bc1a1f10 con 0x7f41bc0737b0 2026-03-09T17:26:47.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.690+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f41bc066e40 con 0x7f41bc0737b0 2026-03-09T17:26:47.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.691+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f41ac0225e0 con 0x7f41bc0737b0 2026-03-09T17:26:47.696 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.691+0000 7f41c09f7700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 0x7f41a806e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:47.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.692+0000 7f41b3fff700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 0x7f41a806e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:47.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.692+0000 7f41b3fff700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 0x7f41a806e920 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f41a4005950 tx=0x7f41a40058e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:47.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.692+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f41ac08d510 con 0x7f41bc0737b0 2026-03-09T17:26:47.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.694+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f41ac05aa30 con 0x7f41bc0737b0 2026-03-09T17:26:47.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.811+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f41bc1a22b0 con 0x7f41bc0737b0 
2026-03-09T17:26:47.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.812+0000 7f41c09f7700 1 -- 192.168.123.106:0/1338839938 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f41ac027020 con 0x7f41bc0737b0 2026-03-09T17:26:47.815 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:47.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.815+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 msgr2=0x7f41a806e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.815+0000 7f41c19f9700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 0x7f41a806e920 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f41a4005950 tx=0x7f41a40058e0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.815+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 msgr2=0x7f41bc19c3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:47.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.815+0000 7f41c19f9700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f41ac00bb70 tx=0x7f41ac004690 comp rx=0 tx=0).stop 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 shutdown_connections 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 --2- 192.168.123.106:0/1338839938 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f41a806c470 0x7f41a806e920 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f41bc0737b0 0x7f41bc19c3b0 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 --2- 192.168.123.106:0/1338839938 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f41bc074d80 0x7f41bc19c8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 >> 192.168.123.106:0/1338839938 conn(0x7f41bc0fbaa0 msgr2=0x7f41bc101ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 shutdown_connections 2026-03-09T17:26:47.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:47.816+0000 7f41c19f9700 1 -- 192.168.123.106:0/1338839938 wait complete. 
2026-03-09T17:26:47.890 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773077200,"num_in_osds":6,"osd_in_since":1773077199,"num_remapped_pgs":0} 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:48 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/1338839938' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:48 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/1338839938' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:48.891 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd stat -f json 2026-03-09T17:26:49.060 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:49.144 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:26:48 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:26:48.743+0000 7fad44858640 -1 osd.5 0 log_to_monitors true 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.338+0000 7f9db5b96700 1 -- 192.168.123.106:0/2568425208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db0100fb0 msgr2=0x7f9db0103390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.338+0000 7f9db5b96700 1 --2- 192.168.123.106:0/2568425208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db0100fb0 0x7f9db0103390 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f9d98009b00 tx=0x7f9d98009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 -- 192.168.123.106:0/2568425208 shutdown_connections 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 --2- 192.168.123.106:0/2568425208 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db01038d0 0x7f9db0105cb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 --2- 
192.168.123.106:0/2568425208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db0100fb0 0x7f9db0103390 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 -- 192.168.123.106:0/2568425208 >> 192.168.123.106:0/2568425208 conn(0x7f9db00fa990 msgr2=0x7f9db00fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 -- 192.168.123.106:0/2568425208 shutdown_connections 2026-03-09T17:26:49.341 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.339+0000 7f9db5b96700 1 -- 192.168.123.106:0/2568425208 wait complete. 2026-03-09T17:26:49.342 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.340+0000 7f9db5b96700 1 Processor -- start 2026-03-09T17:26:49.342 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.340+0000 7f9db5b96700 1 -- start start 2026-03-09T17:26:49.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9db5b96700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 0x7f9db0193df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:49.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9db5b96700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:49.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9db5b96700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9db0194830 con 0x7f9db01038d0 2026-03-09T17:26:49.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9db5b96700 1 -- --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9db01949a0 con 0x7f9db0194330 2026-03-09T17:26:49.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9daeffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9daeffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:39902/0 (socket says 192.168.123.106:39902) 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.341+0000 7f9daeffd700 1 -- 192.168.123.106:0/673181568 learned_addr learned my addr 192.168.123.106:0/673181568 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daeffd700 1 -- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 msgr2=0x7f9db0193df0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daf7fe700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 0x7f9db0193df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daeffd700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 
0x7f9db0193df0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daeffd700 1 -- 192.168.123.106:0/673181568 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d980097e0 con 0x7f9db0194330 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daf7fe700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 0x7f9db0193df0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9daeffd700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9da000d8d0 tx=0x7f9da000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:49.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.342+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9da0009940 con 0x7f9db0194330 2026-03-09T17:26:49.345 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.343+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9db0199940 con 0x7f9db0194330 2026-03-09T17:26:49.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.343+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9da0010460 con 0x7f9db0194330 2026-03-09T17:26:49.346 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.343+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9da000f5d0 con 0x7f9db0194330 2026-03-09T17:26:49.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.343+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9db0199e60 con 0x7f9db0194330 2026-03-09T17:26:49.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.344+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9db018ddd0 con 0x7f9db0194330 2026-03-09T17:26:49.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.344+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9da00105d0 con 0x7f9db0194330 2026-03-09T17:26:49.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.345+0000 7f9dacff9700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 0x7f9d9c072f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:49.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.345+0000 7f9daf7fe700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 0x7f9d9c072f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:49.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.345+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 
(secure 0 0 0) 0x7f9da008a990 con 0x7f9db0194330 2026-03-09T17:26:49.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.345+0000 7f9daf7fe700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 0x7f9d9c072f40 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f9d9800b5c0 tx=0x7f9d98005fd0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.348+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9da0058d80 con 0x7f9db0194330 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: Detected new or changed devices on vm09 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:49.467 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:26:49.467 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:49 vm06 ceph-mon[57307]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:49.467 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.464+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f9db0066e40 con 0x7f9db0194330 
2026-03-09T17:26:49.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.466+0000 7f9dacff9700 1 -- 192.168.123.106:0/673181568 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f9da0058ba0 con 0x7f9db0194330 2026-03-09T17:26:49.468 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 msgr2=0x7f9d9c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 0x7f9d9c072f40 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f9d9800b5c0 tx=0x7f9d98005fd0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 msgr2=0x7f9db01993a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9da000d8d0 tx=0x7f9da000dc90 comp rx=0 tx=0).stop 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 shutdown_connections 2026-03-09T17:26:49.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 --2- 192.168.123.106:0/673181568 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9d9c070a90 0x7f9d9c072f40 secure :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f9d9800b5c0 tx=0x7f9d98005fd0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9db01038d0 0x7f9db0193df0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:49.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 --2- 192.168.123.106:0/673181568 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9db0194330 0x7f9db01993a0 secure :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9da000d8d0 tx=0x7f9da000dc90 comp rx=0 tx=0).stop 2026-03-09T17:26:49.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 >> 192.168.123.106:0/673181568 conn(0x7f9db00fa990 msgr2=0x7f9db00fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:49.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 shutdown_connections 2026-03-09T17:26:49.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:49.469+0000 7f9db5b96700 1 -- 192.168.123.106:0/673181568 wait complete. 
2026-03-09T17:26:49.541 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773077200,"num_in_osds":6,"osd_in_since":1773077199,"num_remapped_pgs":0} 2026-03-09T17:26:49.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: Detected new or changed devices on vm09 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: 
dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:26:49.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:49 vm09 ceph-mon[62061]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:50.541 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd stat -f json 2026-03-09T17:26:50.709 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/673181568' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: from='osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:50.740 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:50 vm06 ceph-mon[57307]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:50.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/673181568' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:50.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T17:26:50.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T17:26:50.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:50.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: from='osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:50.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:50 vm09 ceph-mon[62061]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:26:50.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:26:50 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:26:50.558+0000 7fad396ce700 -1 osd.5 0 waiting for initial osdmap 2026-03-09T17:26:50.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:26:50 vm09 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:26:50.566+0000 7fad354c3700 -1 osd.5 31 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:26:50.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.988+0000 7f40b6e5e700 1 -- 192.168.123.106:0/1713639032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 
msgr2=0x7f40b01081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:50.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.988+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/1713639032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 0x7f40b01081c0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7f40a4009b00 tx=0x7f40a4009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 -- 192.168.123.106:0/1713639032 shutdown_connections 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/1713639032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 0x7f40b01081c0 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/1713639032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b0107370 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 -- 192.168.123.106:0/1713639032 >> 192.168.123.106:0/1713639032 conn(0x7f40b0075b50 msgr2=0x7f40b0077f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 -- 192.168.123.106:0/1713639032 shutdown_connections 2026-03-09T17:26:50.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.990+0000 7f40b6e5e700 1 -- 192.168.123.106:0/1713639032 wait complete. 
2026-03-09T17:26:50.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 Processor -- start 2026-03-09T17:26:50.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 -- start start 2026-03-09T17:26:50.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:50.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 0x7f40b019c950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:50.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40b019cf70 con 0x7f40b0107d70 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.991+0000 7f40b6e5e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40b019d0b0 con 0x7f40b0106f60 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:39924/0 (socket says 192.168.123.106:39924) 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 -- 192.168.123.106:0/787510009 learned_addr learned my addr 192.168.123.106:0/787510009 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 -- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 msgr2=0x7f40b019c950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 0x7f40b019c950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 -- 192.168.123.106:0/787510009 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40a40097e0 con 0x7f40b0106f60 2026-03-09T17:26:50.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.992+0000 7f40b4bfa700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f40a000d900 tx=0x7f40a000dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:50.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.993+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40a00041d0 con 0x7f40b0106f60 2026-03-09T17:26:50.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.993+0000 7f40b6e5e700 1 -- 
192.168.123.106:0/787510009 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40b01a1b60 con 0x7f40b0106f60 2026-03-09T17:26:50.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.993+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40b01a20b0 con 0x7f40b0106f60 2026-03-09T17:26:50.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.993+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f40a0004330 con 0x7f40b0106f60 2026-03-09T17:26:50.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.993+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40a0003da0 con 0x7f40b0106f60 2026-03-09T17:26:50.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.995+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f40a0009730 con 0x7f40b0106f60 2026-03-09T17:26:50.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.995+0000 7f40adffb700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 0x7f409806eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:50.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.995+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4166+0+0 (secure 0 0 0) 0x7f40a0021030 con 0x7f40b0106f60 2026-03-09T17:26:50.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.996+0000 7f40affff700 1 --2- 192.168.123.106:0/787510009 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 0x7f409806eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:50.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.996+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f409c005320 con 0x7f40b0106f60 2026-03-09T17:26:50.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.996+0000 7f40affff700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 0x7f409806eae0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f40a400b5c0 tx=0x7f40a4009f90 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:51.001 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:50.999+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f40a0059750 con 0x7f40b0106f60 2026-03-09T17:26:51.111 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.109+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f409c005190 con 0x7f40b0106f60 2026-03-09T17:26:51.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.109+0000 7f40adffb700 1 -- 192.168.123.106:0/787510009 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7f40a00592e0 con 0x7f40b0106f60 2026-03-09T17:26:51.112 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:51.114 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.112+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 msgr2=0x7f409806eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:51.114 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.112+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 0x7f409806eae0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f40a400b5c0 tx=0x7f40a4009f90 comp rx=0 tx=0).stop 2026-03-09T17:26:51.114 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.112+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 msgr2=0x7f40b019c410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:51.114 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.112+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f40a000d900 tx=0x7f40a000dcc0 comp rx=0 tx=0).stop 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 shutdown_connections 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f40b0106f60 0x7f40b019c410 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f409806c630 0x7f409806eae0 unknown :-1 s=CLOSED pgs=58 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 --2- 192.168.123.106:0/787510009 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f40b0107d70 0x7f40b019c950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 >> 192.168.123.106:0/787510009 conn(0x7f40b0075b50 msgr2=0x7f40b010afa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 shutdown_connections 2026-03-09T17:26:51.115 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:51.113+0000 7f40b6e5e700 1 -- 192.168.123.106:0/787510009 wait complete. 2026-03-09T17:26:51.168 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773077200,"num_in_osds":6,"osd_in_since":1773077199,"num_remapped_pgs":0} 2026-03-09T17:26:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T17:26:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:51.891 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/787510009' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:51.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:51 vm06 ceph-mon[57307]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]': finished 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/787510009' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:51 vm09 ceph-mon[62061]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T17:26:52.169 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd stat -f json 2026-03-09T17:26:52.333 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.600+0000 7f34915e8700 1 -- 192.168.123.106:0/1460113810 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 msgr2=0x7f348c1012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.600+0000 7f34915e8700 1 --2- 192.168.123.106:0/1460113810 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c1012a0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f3474009b00 tx=0x7f3474009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 -- 192.168.123.106:0/1460113810 shutdown_connections 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 --2- 192.168.123.106:0/1460113810 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 0x7f348c103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 --2- 192.168.123.106:0/1460113810 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c1012a0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 -- 192.168.123.106:0/1460113810 >> 192.168.123.106:0/1460113810 conn(0x7f348c0faa70 msgr2=0x7f348c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 -- 192.168.123.106:0/1460113810 shutdown_connections 2026-03-09T17:26:52.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.601+0000 7f34915e8700 1 -- 192.168.123.106:0/1460113810 wait complete. 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 Processor -- start 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 -- start start 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 0x7f348c19c960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f348c19cf80 con 0x7f348c0fee80 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f34915e8700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f348c19d0c0 con 0x7f348c1017e0 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f348affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f348affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51600/0 (socket says 192.168.123.106:51600) 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f348affd700 1 -- 192.168.123.106:0/1244792256 learned_addr learned my addr 192.168.123.106:0/1244792256 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:52.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.602+0000 7f348a7fc700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 0x7f348c19c960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f348affd700 1 -- 192.168.123.106:0/1244792256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 msgr2=0x7f348c19c960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f348affd700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 0x7f348c19c960 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f348affd700 1 -- 192.168.123.106:0/1244792256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34740097e0 con 0x7f348c0fee80 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f348affd700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f3474009fd0 tx=0x7f3474005eb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f347401d070 con 0x7f348c0fee80 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f348c1a1b10 con 0x7f348c0fee80 2026-03-09T17:26:52.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.603+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f348c1a2000 con 0x7f348c0fee80 2026-03-09T17:26:52.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.604+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f347400bcb0 con 0x7f348c0fee80 2026-03-09T17:26:52.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.604+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f347400f910 con 0x7f348c0fee80 2026-03-09T17:26:52.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.605+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f347400fb30 con 0x7f348c0fee80 2026-03-09T17:26:52.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.605+0000 7f3483fff700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 0x7f347806e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:52.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.605+0000 7f348a7fc700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 0x7f347806e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:52.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.605+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f347408cc40 con 0x7f348c0fee80 2026-03-09T17:26:52.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.605+0000 7f348a7fc700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 0x7f347806e980 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f347c005950 tx=0x7f347c0058e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:52.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.606+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f348c196590 con 0x7f348c0fee80 2026-03-09T17:26:52.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.608+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f347405c3f0 con 0x7f348c0fee80 2026-03-09T17:26:52.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.716+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f348c061960 con 0x7f348c0fee80 2026-03-09T17:26:52.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.717+0000 7f3483fff700 1 -- 192.168.123.106:0/1244792256 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v33) v1 ==== 74+0+130 (secure 0 0 0) 0x7f3474027070 con 0x7f348c0fee80 2026-03-09T17:26:52.719 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.719+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 msgr2=0x7f347806e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.719+0000 7f34915e8700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 0x7f347806e980 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f347c005950 tx=0x7f347c0058e0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.719+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 msgr2=0x7f348c19c420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:52.722 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.719+0000 7f34915e8700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f3474009fd0 tx=0x7f3474005eb0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 shutdown_connections 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f347806c4d0 0x7f347806e980 secure :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f347c005950 tx=0x7f347c0058e0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f348c0fee80 0x7f348c19c420 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 --2- 192.168.123.106:0/1244792256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f348c1017e0 0x7f348c19c960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 >> 192.168.123.106:0/1244792256 conn(0x7f348c0faa70 msgr2=0x7f348c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 -- 192.168.123.106:0/1244792256 shutdown_connections 2026-03-09T17:26:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:52.720+0000 7f34915e8700 1 -- 
192.168.123.106:0/1244792256 wait complete. 2026-03-09T17:26:52.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:52 vm06 ceph-mon[57307]: purged_snaps scrub starts 2026-03-09T17:26:52.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:52 vm06 ceph-mon[57307]: purged_snaps scrub ok 2026-03-09T17:26:52.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:52 vm06 ceph-mon[57307]: osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] boot 2026-03-09T17:26:52.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:52 vm06 ceph-mon[57307]: osdmap e32: 6 total, 6 up, 6 in 2026-03-09T17:26:52.733 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:52 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:52.778 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":33,"num_osds":6,"num_up_osds":6,"osd_up_since":1773077211,"num_in_osds":6,"osd_in_since":1773077199,"num_remapped_pgs":0} 2026-03-09T17:26:52.778 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd dump --format=json 2026-03-09T17:26:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:52 vm09 ceph-mon[62061]: purged_snaps scrub starts 2026-03-09T17:26:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:52 vm09 ceph-mon[62061]: purged_snaps scrub ok 2026-03-09T17:26:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:52 vm09 ceph-mon[62061]: osd.5 [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] boot 2026-03-09T17:26:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:52 vm09 ceph-mon[62061]: osdmap e32: 6 total, 6 up, 6 in 2026-03-09T17:26:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:52 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:26:52.930 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.180+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/3111659634 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 msgr2=0x7f6d48108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.180+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/3111659634 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48108b50 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f6d38009b00 tx=0x7f6d38009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/3111659634 shutdown_connections 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/3111659634 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 0x7f6d48102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/3111659634 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48108b50 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/3111659634 >> 192.168.123.106:0/3111659634 conn(0x7f6d480fe280 msgr2=0x7f6d48100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:53.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 
7f6d4f0d0700 1 -- 192.168.123.106:0/3111659634 shutdown_connections 2026-03-09T17:26:53.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.181+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/3111659634 wait complete. 2026-03-09T17:26:53.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.182+0000 7f6d4f0d0700 1 Processor -- start 2026-03-09T17:26:53.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.182+0000 7f6d4f0d0700 1 -- start start 2026-03-09T17:26:53.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.182+0000 7f6d4f0d0700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 0x7f6d481983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d4f0d0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d4f0d0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d48199000 con 0x7f6d48108780 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d4f0d0700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d4819cd90 con 0x7f6d48102780 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43664/0 (socket says 192.168.123.106:43664) 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 -- 192.168.123.106:0/2326712420 learned_addr learned my addr 192.168.123.106:0/2326712420 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:53.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d4ce6c700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 0x7f6d481983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:53.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 -- 192.168.123.106:0/2326712420 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 msgr2=0x7f6d481983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 0x7f6d481983e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.183+0000 7f6d47fff700 1 -- 192.168.123.106:0/2326712420 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d380097e0 con 0x7f6d48108780 2026-03-09T17:26:53.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d47fff700 1 --2- 192.168.123.106:0/2326712420 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f6d3c00ba70 tx=0x7f6d3c00bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:53.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d3c00c700 con 0x7f6d48108780 2026-03-09T17:26:53.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d3c00cd40 con 0x7f6d48108780 2026-03-09T17:26:53.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d3c012340 con 0x7f6d48108780 2026-03-09T17:26:53.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d4819d070 con 0x7f6d48108780 2026-03-09T17:26:53.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.184+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d4819d5c0 con 0x7f6d48108780 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.187+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d4804ea50 con 0x7f6d48108780 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.190+0000 
7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6d3c0124e0 con 0x7f6d48108780 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.190+0000 7f6d45ffb700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 0x7f6d3006ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.190+0000 7f6d4ce6c700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 0x7f6d3006ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.190+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f6d3c08a400 con 0x7f6d48108780 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.191+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6d3c04e720 con 0x7f6d48108780 2026-03-09T17:26:53.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.191+0000 7f6d4ce6c700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 0x7f6d3006ec00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6d38006010 tx=0x7f6d3800b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:53.308 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.305+0000 7f6d4f0d0700 1 -- 
192.168.123.106:0/2326712420 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f6d48066e40 con 0x7f6d48108780 2026-03-09T17:26:53.309 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.307+0000 7f6d45ffb700 1 -- 192.168.123.106:0/2326712420 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7f6d3c056140 con 0x7f6d48108780 2026-03-09T17:26:53.309 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:53.309 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":33,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","created":"2026-03-09T17:24:18.262302+0000","modified":"2026-03-09T17:26:52.561455+0000","last_up_change":"2026-03-09T17:26:51.551725+0000","last_in_change":"2026-03-09T17:26:39.694485+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T17:26:21.405377+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_rese
nd":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"7b23c291-c26f-47f6-aa9d-2b35b2448578","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6803","nonce":1643543004}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6805","nonce":1643543004}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6809","nonce":1643543004}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6807","nonce":1643543004}]},"public_addr":"192.168.123.106:6803/1643
543004","cluster_addr":"192.168.123.106:6805/1643543004","heartbeat_back_addr":"192.168.123.106:6809/1643543004","heartbeat_front_addr":"192.168.123.106:6807/1643543004","state":["exists","up"]},{"osd":1,"uuid":"e31bb3c2-190d-419c-bb90-f0909a02113b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6811","nonce":1344357563}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6813","nonce":1344357563}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6817","nonce":1344357563}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6815","nonce":1344357563}]},"public_addr":"192.168.123.106:6811/1344357563","cluster_addr":"192.168.123.106:6813/1344357563","heartbeat_back_addr":"192.168.123.106:6817/1344357563","heartbeat_front_addr":"192.168.123.106:6815/1344357563","state":["exists","up"]},{"osd":2,"uuid":"4aa1d45d-b786-45ab-97d1-aef76daa15f5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6819","nonce":1615959701}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6821","nonce":1615959701}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6825","nonce":1615959701}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822
","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6823","nonce":1615959701}]},"public_addr":"192.168.123.106:6819/1615959701","cluster_addr":"192.168.123.106:6821/1615959701","heartbeat_back_addr":"192.168.123.106:6825/1615959701","heartbeat_front_addr":"192.168.123.106:6823/1615959701","state":["exists","up"]},{"osd":3,"uuid":"733bc53e-8727-4119-a70f-00c09a625789","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6800","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6801","nonce":1175302832}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6802","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6803","nonce":1175302832}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6806","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6807","nonce":1175302832}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6804","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6805","nonce":1175302832}]},"public_addr":"192.168.123.109:6801/1175302832","cluster_addr":"192.168.123.109:6803/1175302832","heartbeat_back_addr":"192.168.123.109:6807/1175302832","heartbeat_front_addr":"192.168.123.109:6805/1175302832","state":["exists","up"]},{"osd":4,"uuid":"24911c64-9b6a-4862-9972-34f73f6f3c13","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6808","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6809","nonce":153890079}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6810","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6811","nonce":153890079}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6814","nonce":153890079},{"type":"v1","addr":
"192.168.123.109:6815","nonce":153890079}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6812","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6813","nonce":153890079}]},"public_addr":"192.168.123.109:6809/153890079","cluster_addr":"192.168.123.109:6811/153890079","heartbeat_back_addr":"192.168.123.109:6815/153890079","heartbeat_front_addr":"192.168.123.109:6813/153890079","state":["exists","up"]},{"osd":5,"uuid":"4a80decf-2a05-4525-b2be-269b4a9ba65c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6816","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6817","nonce":764831086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6818","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6819","nonce":764831086}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6822","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6823","nonce":764831086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6820","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6821","nonce":764831086}]},"public_addr":"192.168.123.109:6817/764831086","cluster_addr":"192.168.123.109:6819/764831086","heartbeat_back_addr":"192.168.123.109:6823/764831086","heartbeat_front_addr":"192.168.123.109:6821/764831086","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:25:58.468126+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:08.482089+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"
old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:19.719460+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:30.196744+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:39.365544+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:49.759361+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.106:0/3698608392":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/687723179":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/3891378720":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/795371929":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/1414971993":"2026-03-10T17:25:23.192748+0000","192.168.123.106:6800/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2965444141":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/3673999554":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2700778697":"2026-03-10T17:24:32.871932+0000","192.168.123.106:6801/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2798770433":"2026-03-10T17:24:47.307802+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 
7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 msgr2=0x7f6d3006ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 0x7f6d3006ec00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6d38006010 tx=0x7f6d3800b540 comp rx=0 tx=0).stop 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 msgr2=0x7f6d48198920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f6d3c00ba70 tx=0x7f6d3c00bd80 comp rx=0 tx=0).stop 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 shutdown_connections 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d48102780 0x7f6d481983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d3006c750 0x7f6d3006ec00 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 --2- 192.168.123.106:0/2326712420 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d48108780 0x7f6d48198920 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 >> 192.168.123.106:0/2326712420 conn(0x7f6d480fe280 msgr2=0x7f6d480ffa60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:53.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 shutdown_connections 2026-03-09T17:26:53.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.310+0000 7f6d4f0d0700 1 -- 192.168.123.106:0/2326712420 wait complete. 2026-03-09T17:26:53.386 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-09T17:26:21.405377+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '20', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 
0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-09T17:26:53.387 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd pool get .mgr pg_num 2026-03-09T17:26:53.548 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:53.848 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:53 vm06 ceph-mon[57307]: osdmap e33: 6 total, 6 up, 6 in 2026-03-09T17:26:53.848 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:53 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/1244792256' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:53.848 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:53 vm06 ceph-mon[57307]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:53.848 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:53 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:53.848 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:53 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2326712420' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:53.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.845+0000 7f23d52db700 1 -- 192.168.123.106:0/793035651 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d0068d20 msgr2=0x7f23d0069170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.845+0000 7f23d52db700 1 --2- 192.168.123.106:0/793035651 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d0068d20 0x7f23d0069170 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f23c0009b00 tx=0x7f23c0009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:53.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.846+0000 7f23d52db700 1 -- 192.168.123.106:0/793035651 shutdown_connections 2026-03-09T17:26:53.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.846+0000 7f23d52db700 1 --2- 192.168.123.106:0/793035651 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d0068d20 0x7f23d0069170 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.846+0000 7f23d52db700 1 --2- 
192.168.123.106:0/793035651 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d0105810 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.846+0000 7f23d52db700 1 -- 192.168.123.106:0/793035651 >> 192.168.123.106:0/793035651 conn(0x7f23d0077530 msgr2=0x7f23d0077930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:53.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.847+0000 7f23d52db700 1 -- 192.168.123.106:0/793035651 shutdown_connections 2026-03-09T17:26:53.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.847+0000 7f23d52db700 1 -- 192.168.123.106:0/793035651 wait complete. 2026-03-09T17:26:53.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 Processor -- start 2026-03-09T17:26:53.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 -- start start 2026-03-09T17:26:53.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d010c200 0x7f23d010c670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23d010fd40 con 0x7f23d010c200 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.848+0000 7f23d52db700 1 -- --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23d010feb0 con 0x7f23d0105440 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46650/0 (socket says 192.168.123.106:46650) 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 -- 192.168.123.106:0/3882599571 learned_addr learned my addr 192.168.123.106:0/3882599571 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 -- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d010c200 msgr2=0x7f23d010c670 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d010c200 0x7f23d010c670 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 -- 192.168.123.106:0/3882599571 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23c00097e0 con 
0x7f23d0105440 2026-03-09T17:26:53.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23ceffd700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f23c400c8f0 tx=0x7f23c400ccb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:53.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23c4007a50 con 0x7f23d0105440 2026-03-09T17:26:53.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.849+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f23c4007bb0 con 0x7f23d0105440 2026-03-09T17:26:53.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.850+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23d0071b90 con 0x7f23d0105440 2026-03-09T17:26:53.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.850+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23d00720e0 con 0x7f23d0105440 2026-03-09T17:26:53.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.850+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23c4007a50 con 0x7f23d0105440 2026-03-09T17:26:53.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.851+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f23d004ea50 con 0x7f23d0105440 2026-03-09T17:26:53.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.851+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f23c401f050 con 0x7f23d0105440 2026-03-09T17:26:53.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.852+0000 7f23b7fff700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f23b806c710 0x7f23b806ebc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:53.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.852+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f23c401d070 con 0x7f23d0105440 2026-03-09T17:26:53.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.852+0000 7f23ce7fc700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f23b806c710 0x7f23b806ebc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:53.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.852+0000 7f23ce7fc700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f23b806c710 0x7f23b806ebc0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f23c0000c00 tx=0x7f23c0005fb0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:53.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.855+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 
0 0) 0x7f23c4055630 con 0x7f23d0105440 2026-03-09T17:26:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:53 vm09 ceph-mon[62061]: osdmap e33: 6 total, 6 up, 6 in 2026-03-09T17:26:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:53 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/1244792256' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T17:26:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:53 vm09 ceph-mon[62061]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:53 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:26:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:53 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2326712420' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:53.964 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.961+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7f23d010d1f0 con 0x7f23d0105440 2026-03-09T17:26:53.965 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.963+0000 7f23b7fff700 1 -- 192.168.123.106:0/3882599571 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v33) v1 ==== 93+0+10 (secure 0 0 0) 0x7f23c4055630 con 0x7f23d0105440 2026-03-09T17:26:53.966 INFO:teuthology.orchestra.run.vm06.stdout:pg_num: 1 2026-03-09T17:26:53.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.966+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] 
conn(0x7f23b806c710 msgr2=0x7f23b806ebc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.966+0000 7f23d52db700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f23b806c710 0x7f23b806ebc0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f23c0000c00 tx=0x7f23c0005fb0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.966+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 msgr2=0x7f23d010bcc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f23c400c8f0 tx=0x7f23c400ccb0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 shutdown_connections 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f23d0105440 0x7f23d010bcc0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 --2- 192.168.123.106:0/3882599571 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f23b806c710 0x7f23b806ebc0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 --2- 
192.168.123.106:0/3882599571 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f23d010c200 0x7f23d010c670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:53.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.967+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 >> 192.168.123.106:0/3882599571 conn(0x7f23d0077530 msgr2=0x7f23d00fd1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:53.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.968+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 shutdown_connections 2026-03-09T17:26:53.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:53.968+0000 7f23d52db700 1 -- 192.168.123.106:0/3882599571 wait complete. 2026-03-09T17:26:54.058 INFO:tasks.cephadm:Setting up client nodes... 2026-03-09T17:26:54.058 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T17:26:54.242 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.520+0000 7f4fd1d11700 1 -- 192.168.123.106:0/3285804750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc103a00 msgr2=0x7f4fcc103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.520+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/3285804750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc103a00 0x7f4fcc103e70 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f4fbc009b00 tx=0x7f4fbc009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:54.523 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 -- 192.168.123.106:0/3285804750 shutdown_connections 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/3285804750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc103a00 0x7f4fcc103e70 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/3285804750 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4fcc102760 0x7f4fcc102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 -- 192.168.123.106:0/3285804750 >> 192.168.123.106:0/3285804750 conn(0x7f4fcc0fddb0 msgr2=0x7f4fcc1001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 -- 192.168.123.106:0/3285804750 shutdown_connections 2026-03-09T17:26:54.523 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.521+0000 7f4fd1d11700 1 -- 192.168.123.106:0/3285804750 wait complete. 
2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 Processor -- start 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 -- start start 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fcb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fcb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43690/0 (socket says 192.168.123.106:43690) 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4fcc103a00 0x7f4fcc198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:54.524 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fcb7fe700 1 -- 192.168.123.106:0/462237467 learned_addr learned my addr 192.168.123.106:0/462237467 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4fcc198b20 con 0x7f4fcc102760 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.522+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4fcc198c60 con 0x7f4fcc103a00 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fcb7fe700 1 -- 192.168.123.106:0/462237467 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4fcc103a00 msgr2=0x7f4fcc198500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fcb7fe700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4fcc103a00 0x7f4fcc198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fcb7fe700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4fbc0097e0 con 0x7f4fcc102760 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fcb7fe700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f4fb400eb10 tx=0x7f4fb400eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:54.525 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4fb400cca0 con 0x7f4fcc102760 2026-03-09T17:26:54.527 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4fcc19d710 con 0x7f4fcc102760 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4fcc19dc60 con 0x7f4fcc102760 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4fb400ce00 con 0x7f4fcc102760 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.523+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4fb4018910 con 0x7f4fcc102760 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.524+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4fb4018b50 con 0x7f4fcc102760 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.525+0000 7f4fc8ff9700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 0x7f4fb806eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:54.527 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.525+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4fb4014070 con 0x7f4fcc102760 2026-03-09T17:26:54.527 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.525+0000 7f4fcaffd700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 0x7f4fb806eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:54.528 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.526+0000 7f4fcaffd700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 0x7f4fb806eba0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4fbc009ad0 tx=0x7f4fbc005c00 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:54.530 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.526+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4fac005320 con 0x7f4fcc102760 2026-03-09T17:26:54.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.529+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4fb405b5f0 con 0x7f4fcc102760 2026-03-09T17:26:54.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.679+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f4fac005190 con 0x7f4fcc102760 2026-03-09T17:26:54.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.685+0000 7f4fc8ff9700 1 -- 192.168.123.106:0/462237467 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "auth 
get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7f4fb405b180 con 0x7f4fcc102760 2026-03-09T17:26:54.688 INFO:teuthology.orchestra.run.vm06.stdout:[client.0] 2026-03-09T17:26:54.688 INFO:teuthology.orchestra.run.vm06.stdout: key = AQDeAq9pX2adKBAA+0vWLXI9TiRaiCE/IWTNyA== 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 msgr2=0x7f4fb806eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 0x7f4fb806eba0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4fbc009ad0 tx=0x7f4fbc005c00 comp rx=0 tx=0).stop 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 msgr2=0x7f4fcc197fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f4fb400eb10 tx=0x7f4fb400eed0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 shutdown_connections 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/462237467 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4fb806c6f0 0x7f4fb806eba0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4fcc102760 0x7f4fcc197fc0 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 --2- 192.168.123.106:0/462237467 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4fcc103a00 0x7f4fcc198500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:54.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 >> 192.168.123.106:0/462237467 conn(0x7f4fcc0fddb0 msgr2=0x7f4fcc100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:54.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 shutdown_connections 2026-03-09T17:26:54.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:54.689+0000 7f4fd1d11700 1 -- 192.168.123.106:0/462237467 wait complete. 2026-03-09T17:26:54.765 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:26:54.765 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-09T17:26:54.766 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-09T17:26:54.792 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:54 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/3882599571' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T17:26:54.807 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T17:26:54.835 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:54 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3882599571' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T17:26:54.962 INFO:teuthology.orchestra.run.vm09.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm09/config 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.247+0000 7fbbb95e4700 1 -- 192.168.123.109:0/1714966240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4100570 msgr2=0x7fbbb4100980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.247+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/1714966240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4100570 0x7fbbb4100980 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7fbb9c009b00 tx=0x7fbb9c009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.247+0000 7fbbb1ffb700 1 -- 192.168.123.109:0/1714966240 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbb9c005670 con 0x7fbbb4100570 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.248+0000 7fbbb95e4700 1 -- 192.168.123.109:0/1714966240 shutdown_connections 2026-03-09T17:26:55.250 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.248+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/1714966240 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4101770 0x7fbbb4101bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.248+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/1714966240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4100570 0x7fbbb4100980 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.250 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.248+0000 7fbbb95e4700 1 -- 192.168.123.109:0/1714966240 >> 192.168.123.109:0/1714966240 conn(0x7fbbb40fbb00 msgr2=0x7fbbb40fdf50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:55.251 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.249+0000 7fbbb95e4700 1 -- 192.168.123.109:0/1714966240 shutdown_connections 2026-03-09T17:26:55.251 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.249+0000 7fbbb95e4700 1 -- 192.168.123.109:0/1714966240 wait complete. 
2026-03-09T17:26:55.251 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.249+0000 7fbbb95e4700 1 Processor -- start 2026-03-09T17:26:55.251 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb95e4700 1 -- start start 2026-03-09T17:26:55.252 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb95e4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:55.252 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb95e4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 0x7fbbb4196310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:55.252 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb95e4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbb4196930 con 0x7fbbb4101770 2026-03-09T17:26:55.252 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb95e4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbb4196a70 con 0x7fbbb4100570 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb2ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb2ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.109:56260/0 (socket says 192.168.123.109:56260) 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb2ffd700 1 -- 192.168.123.109:0/3806954774 learned_addr learned my addr 192.168.123.109:0/3806954774 (peer_addr_for_me v2:192.168.123.109:0/0) 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.250+0000 7fbbb27fc700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 0x7fbbb4196310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb2ffd700 1 -- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 msgr2=0x7fbbb4196310 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb2ffd700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 0x7fbbb4196310 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb2ffd700 1 -- 192.168.123.109:0/3806954774 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbba4009710 con 0x7fbbb4100570 2026-03-09T17:26:55.253 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb27fc700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 0x7fbbb4196310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb2ffd700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fbb9c00b5c0 tx=0x7fbb9c004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbb9c01d070 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.251+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbb9c0097e0 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.252+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbb40732e0 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.252+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbb9c004da0 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.252+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbb9c00f670 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.253+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbb9c00f7d0 con 
0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.253+0000 7fbbabfff700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 0x7fbba0077390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.253+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fbb9c08dca0 con 0x7fbbb4100570 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.254+0000 7fbbb27fc700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 0x7fbba0077390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:55.256 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.255+0000 7fbbb27fc700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 0x7fbba0077390 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fbba4009f60 tx=0x7fbba4009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:55.257 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.255+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbb94005320 con 0x7fbbb4100570 2026-03-09T17:26:55.260 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.258+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbb9c05c2a0 con 
0x7fbbb4100570 2026-03-09T17:26:55.404 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.402+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fbb94005190 con 0x7fbbb4100570 2026-03-09T17:26:55.410 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.408+0000 7fbbabfff700 1 -- 192.168.123.109:0/3806954774 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7fbb9c027020 con 0x7fbbb4100570 2026-03-09T17:26:55.411 INFO:teuthology.orchestra.run.vm09.stdout:[client.1] 2026-03-09T17:26:55.411 INFO:teuthology.orchestra.run.vm09.stdout: key = AQDfAq9pOJMiGBAA+XfJJofXjYXzAR280u1kcA== 2026-03-09T17:26:55.413 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.411+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 msgr2=0x7fbba0077390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:55.413 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.411+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 0x7fbba0077390 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fbba4009f60 tx=0x7fbba4009450 comp rx=0 tx=0).stop 2026-03-09T17:26:55.413 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.411+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 msgr2=0x7fbbb4195dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:55.413 
INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.411+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fbb9c00b5c0 tx=0x7fbb9c004970 comp rx=0 tx=0).stop 2026-03-09T17:26:55.413 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 shutdown_connections 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbb4100570 0x7fbbb4195dd0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbba0074ee0 0x7fbba0077390 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 --2- 192.168.123.109:0/3806954774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbb4101770 0x7fbbb4196310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 >> 192.168.123.109:0/3806954774 conn(0x7fbbb40fbb00 msgr2=0x7fbbb41049a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 shutdown_connections 2026-03-09T17:26:55.414 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:26:55.412+0000 7fbbb95e4700 1 -- 192.168.123.109:0/3806954774 wait 
complete. 2026-03-09T17:26:55.491 DEBUG:teuthology.orchestra.run.vm09:> set -ex 2026-03-09T17:26:55.491 DEBUG:teuthology.orchestra.run.vm09:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-09T17:26:55.491 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-09T17:26:55.571 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-09T17:26:55.571 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-09T17:26:55.571 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mgr dump --format=json 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/462237467' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/462237467' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: from='client.? 192.168.123.109:0/3806954774' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.586 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:55 vm06 ceph-mon[57307]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T17:26:55.741 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/462237467' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/462237467' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T17:26:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: from='client.? 192.168.123.109:0/3806954774' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T17:26:55.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:55 vm09 ceph-mon[62061]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T17:26:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.024+0000 7fd103ee1700 1 -- 192.168.123.106:0/129134270 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 msgr2=0x7fd0fc068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.024+0000 7fd103ee1700 1 --2- 192.168.123.106:0/129134270 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc068900 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7fd0f8009b00 tx=0x7fd0f8009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 -- 192.168.123.106:0/129134270 shutdown_connections 2026-03-09T17:26:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 --2- 192.168.123.106:0/129134270 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc068900 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 --2- 192.168.123.106:0/129134270 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 0x7fd0fc101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.027 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 -- 192.168.123.106:0/129134270 >> 192.168.123.106:0/129134270 conn(0x7fd0fc0754a0 msgr2=0x7fd0fc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:56.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 -- 192.168.123.106:0/129134270 shutdown_connections 2026-03-09T17:26:56.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.025+0000 7fd103ee1700 1 -- 192.168.123.106:0/129134270 wait complete. 2026-03-09T17:26:56.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 Processor -- start 2026-03-09T17:26:56.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 -- start start 2026-03-09T17:26:56.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 0x7fd0fc102720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0fc1062e0 con 0x7fd0fc068490 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.026+0000 7fd103ee1700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0fc102c60 con 0x7fd0fc1013a0 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43700/0 (socket says 192.168.123.106:43700) 2026-03-09T17:26:56.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 -- 192.168.123.106:0/4057289213 learned_addr learned my addr 192.168.123.106:0/4057289213 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:56.030 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd10147c700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 0x7fd0fc102720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 -- 192.168.123.106:0/4057289213 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 msgr2=0x7fd0fc102720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 0x7fd0fc102720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 -- 
192.168.123.106:0/4057289213 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0f80097e0 con 0x7fd0fc068490 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.027+0000 7fd101c7d700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fd0f000cc60 tx=0x7fd0f00074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.028+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0f0007af0 con 0x7fd0fc068490 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.028+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd0fc102f40 con 0x7fd0fc068490 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.028+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd0fc1a6a70 con 0x7fd0fc068490 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.028+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0f0007c50 con 0x7fd0fc068490 2026-03-09T17:26:56.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.028+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0f0018770 con 0x7fd0fc068490 2026-03-09T17:26:56.032 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.029+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd0f00188d0 con 0x7fd0fc068490 2026-03-09T17:26:56.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.030+0000 7fd0eeffd700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 0x7fd0e806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.030+0000 7fd10147c700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 0x7fd0e806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.030+0000 7fd10147c700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 0x7fd0e806ec50 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd0fc103a80 tx=0x7fd0f8005c00 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:56.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.030+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd0f008c800 con 0x7fd0fc068490 2026-03-09T17:26:56.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.032+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd0e0005320 con 0x7fd0fc068490 2026-03-09T17:26:56.037 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.034+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd0f0057340 con 0x7fd0fc068490 2026-03-09T17:26:56.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.173+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7fd0e0005190 con 0x7fd0fc068490 2026-03-09T17:26:56.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.175+0000 7fd0eeffd700 1 -- 192.168.123.106:0/4057289213 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+172843 (secure 0 0 0) 0x7fd0f005a960 con 0x7fd0fc068490 2026-03-09T17:26:56.178 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:56.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.180+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 msgr2=0x7fd0e806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.180+0000 7fd103ee1700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 0x7fd0e806ec50 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd0fc103a80 tx=0x7fd0f8005c00 comp rx=0 tx=0).stop 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.180+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 msgr2=0x7fd0fc1021e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.183 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fd0f000cc60 tx=0x7fd0f00074a0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 shutdown_connections 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd0e806c7a0 0x7fd0e806ec50 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd0fc068490 0x7fd0fc1021e0 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 --2- 192.168.123.106:0/4057289213 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd0fc1013a0 0x7fd0fc102720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 >> 192.168.123.106:0/4057289213 conn(0x7fd0fc0754a0 msgr2=0x7fd0fc0fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:56.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.181+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 shutdown_connections 2026-03-09T17:26:56.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.182+0000 7fd103ee1700 1 -- 192.168.123.106:0/4057289213 
wait complete. 2026-03-09T17:26:56.252 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":19,"active_gid":14221,"active_name":"vm06.pbgzei","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":2},{"type":"v1","addr":"192.168.123.106:6801","nonce":2}]},"active_addr":"192.168.123.106:6801/2","active_change":"2026-03-09T17:25:23.192848+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14248,"name":"vm09.lqzvkh","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts 
to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format 
HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.106:8443/","prometheus":"http://192.168.123.106:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.106:0","nonce":4245576415}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.106:0","nonce":85092552}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.106:0","nonce":1363105795}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.106:0","nonce":3292834485}]}]} 2026-03-09T17:26:56.253 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 2026-03-09T17:26:56.253 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-09T17:26:56.253 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd dump --format=json 2026-03-09T17:26:56.472 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:56.747 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:56 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/4057289213' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.831+0000 7fcbc5757700 1 -- 192.168.123.106:0/1481020403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 msgr2=0x7fcbc01009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.831+0000 7fcbc5757700 1 --2- 192.168.123.106:0/1481020403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01009b0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fcbb0009b00 tx=0x7fcbb0009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 -- 192.168.123.106:0/1481020403 shutdown_connections 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 --2- 192.168.123.106:0/1481020403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01009b0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 --2- 192.168.123.106:0/1481020403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 -- 192.168.123.106:0/1481020403 >> 192.168.123.106:0/1481020403 conn(0x7fcbc00fbfc0 msgr2=0x7fcbc00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:56.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 -- 192.168.123.106:0/1481020403 shutdown_connections 2026-03-09T17:26:56.834 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.832+0000 7fcbc5757700 1 -- 192.168.123.106:0/1481020403 wait complete. 2026-03-09T17:26:56.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.833+0000 7fcbc5757700 1 Processor -- start 2026-03-09T17:26:56.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.833+0000 7fcbc5757700 1 -- start start 2026-03-09T17:26:56.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.833+0000 7fcbc5757700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01961e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.833+0000 7fcbc5757700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbc5757700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcbc0196e00 con 0x7fcbc0100540 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbc5757700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcbc019ab90 con 0x7fcbc0106560 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46726/0 (socket says 192.168.123.106:46726) 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 -- 192.168.123.106:0/590605919 learned_addr learned my addr 192.168.123.106:0/590605919 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 -- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 msgr2=0x7fcbc01961e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:26:56.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbeffd700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01961e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01961e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbe7fc700 1 -- 192.168.123.106:0/590605919 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcbb00097e0 con 0x7fcbc0106560 2026-03-09T17:26:56.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbbe7fc700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fcbb00048c0 
tx=0x7fcbb00049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:56.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.834+0000 7fcbbeffd700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01961e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:26:56.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcbb001d070 con 0x7fcbc0106560 2026-03-09T17:26:56.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcbb000bc50 con 0x7fcbc0106560 2026-03-09T17:26:56.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcbb000f700 con 0x7fcbc0106560 2026-03-09T17:26:56.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcbc019ae10 con 0x7fcbc0106560 2026-03-09T17:26:56.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.835+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcbc019b300 con 0x7fcbc0106560 2026-03-09T17:26:56.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.836+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcbc01089f0 con 0x7fcbc0106560 2026-03-09T17:26:56.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.837+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcbb0022a50 con 0x7fcbc0106560 2026-03-09T17:26:56.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.838+0000 7fcbb7fff700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 0x7fcbac06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:56.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.838+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fcbb008d690 con 0x7fcbc0106560 2026-03-09T17:26:56.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.838+0000 7fcbbeffd700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 0x7fcbac06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:56.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.839+0000 7fcbbeffd700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 0x7fcbac06eae0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcba8005d90 tx=0x7fcba8005d00 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:56.842 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.840+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+177933 (secure 0 0 0) 0x7fcbb00581d0 con 0x7fcbc0106560 2026-03-09T17:26:56.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.947+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fcbc004ea50 con 0x7fcbc0106560 2026-03-09T17:26:56.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.948+0000 7fcbb7fff700 1 -- 192.168.123.106:0/590605919 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7fcbb0027020 con 0x7fcbc0106560 2026-03-09T17:26:56.951 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:56.951 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":33,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","created":"2026-03-09T17:24:18.262302+0000","modified":"2026-03-09T17:26:52.561455+0000","last_up_change":"2026-03-09T17:26:51.551725+0000","last_in_change":"2026-03-09T17:26:39.694485+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T17:26:21.405377+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_
meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"7b23c291-c26f-47f6-aa9d-2b35b2448578","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6803","nonce":1643543004}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6805","nonce":1643543004}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6809","nonce":1643543004}]},"heartbeat_front_addrs":{"addrve
c":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6807","nonce":1643543004}]},"public_addr":"192.168.123.106:6803/1643543004","cluster_addr":"192.168.123.106:6805/1643543004","heartbeat_back_addr":"192.168.123.106:6809/1643543004","heartbeat_front_addr":"192.168.123.106:6807/1643543004","state":["exists","up"]},{"osd":1,"uuid":"e31bb3c2-190d-419c-bb90-f0909a02113b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6811","nonce":1344357563}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6813","nonce":1344357563}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6817","nonce":1344357563}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6815","nonce":1344357563}]},"public_addr":"192.168.123.106:6811/1344357563","cluster_addr":"192.168.123.106:6813/1344357563","heartbeat_back_addr":"192.168.123.106:6817/1344357563","heartbeat_front_addr":"192.168.123.106:6815/1344357563","state":["exists","up"]},{"osd":2,"uuid":"4aa1d45d-b786-45ab-97d1-aef76daa15f5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6819","nonce":1615959701}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6821","nonce":1615959701}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.
106:6824","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6825","nonce":1615959701}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6823","nonce":1615959701}]},"public_addr":"192.168.123.106:6819/1615959701","cluster_addr":"192.168.123.106:6821/1615959701","heartbeat_back_addr":"192.168.123.106:6825/1615959701","heartbeat_front_addr":"192.168.123.106:6823/1615959701","state":["exists","up"]},{"osd":3,"uuid":"733bc53e-8727-4119-a70f-00c09a625789","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6800","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6801","nonce":1175302832}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6802","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6803","nonce":1175302832}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6806","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6807","nonce":1175302832}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6804","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6805","nonce":1175302832}]},"public_addr":"192.168.123.109:6801/1175302832","cluster_addr":"192.168.123.109:6803/1175302832","heartbeat_back_addr":"192.168.123.109:6807/1175302832","heartbeat_front_addr":"192.168.123.109:6805/1175302832","state":["exists","up"]},{"osd":4,"uuid":"24911c64-9b6a-4862-9972-34f73f6f3c13","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6808","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6809","nonce":153890079}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6810","nonce":153890079},{"type":"
v1","addr":"192.168.123.109:6811","nonce":153890079}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6814","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6815","nonce":153890079}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6812","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6813","nonce":153890079}]},"public_addr":"192.168.123.109:6809/153890079","cluster_addr":"192.168.123.109:6811/153890079","heartbeat_back_addr":"192.168.123.109:6815/153890079","heartbeat_front_addr":"192.168.123.109:6813/153890079","state":["exists","up"]},{"osd":5,"uuid":"4a80decf-2a05-4525-b2be-269b4a9ba65c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6816","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6817","nonce":764831086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6818","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6819","nonce":764831086}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6822","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6823","nonce":764831086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6820","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6821","nonce":764831086}]},"public_addr":"192.168.123.109:6817/764831086","cluster_addr":"192.168.123.109:6819/764831086","heartbeat_back_addr":"192.168.123.109:6823/764831086","heartbeat_front_addr":"192.168.123.109:6821/764831086","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:25:58.468126+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purg
ed_snaps_scrub":"2026-03-09T17:26:08.482089+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:19.719460+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:30.196744+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:39.365544+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:49.759361+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.106:0/3698608392":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/687723179":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/3891378720":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/795371929":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/1414971993":"2026-03-10T17:25:23.192748+0000","192.168.123.106:6800/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2965444141":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/3673999554":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2700778697":"2026-03-10T17:24:32.871932+0000","192.168.123.106:6801/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2798770433":"2026-03-10T17:24:47.307802+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degrade
d_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T17:26:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 msgr2=0x7fcbac06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 0x7fcbac06eae0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcba8005d90 tx=0x7fcba8005d00 comp rx=0 tx=0).stop 2026-03-09T17:26:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 msgr2=0x7fcbc0196720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fcbb00048c0 tx=0x7fcbb00049a0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 shutdown_connections 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.951+0000 7fcbc5757700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcbac06c630 0x7fcbac06eae0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.952+0000 7fcbc5757700 1 --2- 192.168.123.106:0/590605919 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcbc0100540 0x7fcbc01961e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.952+0000 7fcbc5757700 1 --2- 192.168.123.106:0/590605919 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcbc0106560 0x7fcbc0196720 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.952+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 >> 192.168.123.106:0/590605919 conn(0x7fcbc00fbfc0 msgr2=0x7fcbc00fd910 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.952+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 shutdown_connections 2026-03-09T17:26:56.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:56.952+0000 7fcbc5757700 1 -- 192.168.123.106:0/590605919 wait complete. 2026-03-09T17:26:57.025 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-09T17:26:57.026 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd dump --format=json 2026-03-09T17:26:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:56 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/4057289213' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T17:26:57.197 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:57.464 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.461+0000 7f91ce0bb700 1 -- 192.168.123.106:0/2435413607 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 msgr2=0x7f91c8100ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:57.464 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.461+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/2435413607 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c8100ae0 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f91b8009b00 tx=0x7f91b8009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.462+0000 7f91ce0bb700 1 -- 192.168.123.106:0/2435413607 shutdown_connections 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.462+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/2435413607 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c8100ae0 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.462+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/2435413607 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8106a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.462+0000 7f91ce0bb700 1 -- 192.168.123.106:0/2435413607 >> 192.168.123.106:0/2435413607 conn(0x7f91c8078580 msgr2=0x7f91c8078980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:57.465 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.463+0000 7f91ce0bb700 1 -- 192.168.123.106:0/2435413607 shutdown_connections 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.463+0000 7f91ce0bb700 1 -- 192.168.123.106:0/2435413607 wait complete. 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.463+0000 7f91ce0bb700 1 Processor -- start 2026-03-09T17:26:57.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.463+0000 7f91ce0bb700 1 -- start start 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91ce0bb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c81983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91ce0bb700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91ce0bb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91c8198f70 con 0x7f91c8100670 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91ce0bb700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91c81990b0 con 0x7f91c8106640 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:57.466 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46740/0 (socket says 192.168.123.106:46740) 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c6ffd700 1 -- 192.168.123.106:0/3214690199 learned_addr learned my addr 192.168.123.106:0/3214690199 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c77fe700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c81983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c77fe700 1 -- 192.168.123.106:0/3214690199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 msgr2=0x7f91c8198920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c77fe700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c77fe700 1 -- 192.168.123.106:0/3214690199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91b80097e0 con 0x7f91c8100670 2026-03-09T17:26:57.466 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.464+0000 7f91c6ffd700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:26:57.467 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91c77fe700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c81983e0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f91b000c8f0 tx=0x7f91b000cc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:57.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91b00043f0 con 0x7f91c8100670 2026-03-09T17:26:57.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f91b0004550 con 0x7f91c8100670 2026-03-09T17:26:57.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91b0003890 con 0x7f91c8100670 2026-03-09T17:26:57.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91c819cf00 con 0x7f91c8100670 2026-03-09T17:26:57.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.465+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f91c819d400 con 0x7f91c8100670 2026-03-09T17:26:57.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.466+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f91b0003b00 con 0x7f91c8100670 2026-03-09T17:26:57.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.467+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f91c804ea50 con 0x7f91c8100670 2026-03-09T17:26:57.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.467+0000 7f91c4ff9700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 0x7f91b406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:57.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.467+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f91b008ace0 con 0x7f91c8100670 2026-03-09T17:26:57.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.469+0000 7f91c6ffd700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 0x7f91b406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:57.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.470+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f91b00568e0 con 0x7f91c8100670 2026-03-09T17:26:57.472 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.470+0000 7f91c6ffd700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 0x7f91b406eb30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f91c8199a00 tx=0x7f91b800b560 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:57.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.573+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f91c8066e40 con 0x7f91c8100670 2026-03-09T17:26:57.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.577+0000 7f91c4ff9700 1 -- 192.168.123.106:0/3214690199 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7f91b0059f00 con 0x7f91c8100670 2026-03-09T17:26:57.579 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:26:57.579 
INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":33,"fsid":"bcd3bcc2-1bdc-11f1-97b3-3f61613e7048","created":"2026-03-09T17:24:18.262302+0000","modified":"2026-03-09T17:26:52.561455+0000","last_up_change":"2026-03-09T17:26:51.551725+0000","last_in_change":"2026-03-09T17:26:39.694485+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T17:26:21.405377+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"7b23c291-c26f-47f6-aa9d-2b35b2448578","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6803","nonce":1643543004}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6805","nonce":1643543004}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6809","nonce":1643543004}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1643543004},{"type":"v1","addr":"192.168.123.106:6807","nonce":1643543004}]},"public_addr":"192.168.123.106:6803/1643543004","cluster_addr":"192.168.123.106:6805/1643543004","heartbeat_back_addr":"192.168.123.106:6809/1643543004","heartbeat_front_addr":"192.168.123.106:6807/1643543004","state":["exists","up"]},{"osd":1,"uuid":"e31bb3c2-190d-419c-bb90-f0909a02113b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6811","nonce":1344357563}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.
123.106:6812","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6813","nonce":1344357563}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6817","nonce":1344357563}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1344357563},{"type":"v1","addr":"192.168.123.106:6815","nonce":1344357563}]},"public_addr":"192.168.123.106:6811/1344357563","cluster_addr":"192.168.123.106:6813/1344357563","heartbeat_back_addr":"192.168.123.106:6817/1344357563","heartbeat_front_addr":"192.168.123.106:6815/1344357563","state":["exists","up"]},{"osd":2,"uuid":"4aa1d45d-b786-45ab-97d1-aef76daa15f5","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6819","nonce":1615959701}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6821","nonce":1615959701}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6825","nonce":1615959701}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":1615959701},{"type":"v1","addr":"192.168.123.106:6823","nonce":1615959701}]},"public_addr":"192.168.123.106:6819/1615959701","cluster_addr":"192.168.123.106:6821/1615959701","heartbeat_back_addr":"192.168.123.106:6825/1615959701","heartbeat_front_addr":"192.168.123.106:6823/1615959701","state":["exists","up"]},{"osd":3,"uuid":"733bc53e-8727-4119-a70f-00c09a625789","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6800","nonce":1175302
832},{"type":"v1","addr":"192.168.123.109:6801","nonce":1175302832}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6802","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6803","nonce":1175302832}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6806","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6807","nonce":1175302832}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6804","nonce":1175302832},{"type":"v1","addr":"192.168.123.109:6805","nonce":1175302832}]},"public_addr":"192.168.123.109:6801/1175302832","cluster_addr":"192.168.123.109:6803/1175302832","heartbeat_back_addr":"192.168.123.109:6807/1175302832","heartbeat_front_addr":"192.168.123.109:6805/1175302832","state":["exists","up"]},{"osd":4,"uuid":"24911c64-9b6a-4862-9972-34f73f6f3c13","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6808","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6809","nonce":153890079}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6810","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6811","nonce":153890079}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6814","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6815","nonce":153890079}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6812","nonce":153890079},{"type":"v1","addr":"192.168.123.109:6813","nonce":153890079}]},"public_addr":"192.168.123.109:6809/153890079","cluster_addr":"192.168.123.109:6811/153890079","heartbeat_back_addr":"192.168.123.109:6815/153890079","heartbeat_front_addr":"192.168.123.109:6813/153890079","state":["exists","up"]},{"osd":5,"uuid":"4a80decf-2a05-4525-b2be-269b4a9ba65c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6816","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6817","nonce":764831086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6818","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6819","nonce":764831086}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6822","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6823","nonce":764831086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6820","nonce":764831086},{"type":"v1","addr":"192.168.123.109:6821","nonce":764831086}]},"public_addr":"192.168.123.109:6817/764831086","cluster_addr":"192.168.123.109:6819/764831086","heartbeat_back_addr":"192.168.123.109:6823/764831086","heartbeat_front_addr":"192.168.123.109:6821/764831086","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:25:58.468126+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:08.482089+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:19.719460+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:30.196744+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T17:26:39.365544+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-09T17:26:49.759361+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.106:0/3698608392":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/687723179":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/3891378720":"2026-03-10T17:25:23.192748+0000","192.168.123.106:0/795371929":"2026-03-10T17:24:47.307802+0000","192.168.123.106:0/1414971993":"2026-03-10T17:25:23.192748+0000","192.168.123.106:6800/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2965444141":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/3673999554":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2700778697":"2026-03-10T17:24:32.871932+0000","192.168.123.106:6801/2":"2026-03-10T17:24:32.871932+0000","192.168.123.106:0/2798770433":"2026-03-10T17:24:47.307802+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T17:26:57.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.579+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 msgr2=0x7f91b406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:57.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.579+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 0x7f91b406eb30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f91c8199a00 tx=0x7f91b800b560 comp rx=0 tx=0).stop 2026-03-09T17:26:57.582 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.579+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 msgr2=0x7f91c81983e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.579+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c81983e0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f91b000c8f0 tx=0x7f91b000cc00 comp rx=0 tx=0).stop 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 shutdown_connections 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f91b406c680 0x7f91b406eb30 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8100670 0x7f91c81983e0 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 --2- 192.168.123.106:0/3214690199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8106640 0x7f91c8198920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 >> 192.168.123.106:0/3214690199 conn(0x7f91c8078580 msgr2=0x7f91c80fec10 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 shutdown_connections 2026-03-09T17:26:57.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:57.580+0000 7f91ce0bb700 1 -- 192.168.123.106:0/3214690199 wait complete. 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.0 flush_pg_stats 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.1 flush_pg_stats 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.2 flush_pg_stats 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.3 flush_pg_stats 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.4 flush_pg_stats 2026-03-09T17:26:57.655 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph tell osd.5 flush_pg_stats 2026-03-09T17:26:57.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:57 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/590605919' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:57.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:57 vm06 ceph-mon[57307]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:57.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:26:57 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3214690199' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:58.087 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.129 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:57 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/590605919' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:57 vm09 ceph-mon[62061]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:26:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:26:57 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/3214690199' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T17:26:58.301 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.303 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.308 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.399 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.849+0000 7f3154f00700 1 -- 192.168.123.106:0/3583340588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 msgr2=0x7f3150071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.849+0000 7f3154f00700 1 --2- 192.168.123.106:0/3583340588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150071fd0 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f3144009b00 tx=0x7f3144009e10 comp rx=0 tx=0).stop 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.850+0000 7f3154f00700 1 -- 192.168.123.106:0/3583340588 shutdown_connections 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.850+0000 7f3154f00700 1 --2- 192.168.123.106:0/3583340588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150071fd0 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.850+0000 7f3154f00700 1 --2- 192.168.123.106:0/3583340588 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f315010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.850+0000 7f3154f00700 1 -- 192.168.123.106:0/3583340588 >> 192.168.123.106:0/3583340588 conn(0x7f315006c6c0 msgr2=0x7f315006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 -- 192.168.123.106:0/3583340588 shutdown_connections 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 -- 192.168.123.106:0/3583340588 wait complete. 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 Processor -- start 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 -- start start 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150119580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f3150114580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3150114ac0 con 0x7f3150071b60 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.851+0000 7f3154f00700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f3150114c30 con 0x7f315010e9e0 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314effd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f3150114580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150119580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314effd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f3150114580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46746/0 (socket says 192.168.123.106:46746) 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314effd700 1 -- 192.168.123.106:0/565784316 learned_addr learned my addr 192.168.123.106:0/565784316 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314f7fe700 1 -- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 msgr2=0x7f3150114580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314f7fe700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f3150114580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:26:58.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.852+0000 7f314f7fe700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31440097e0 con 0x7f3150071b60 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.854+0000 7f314f7fe700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150119580 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f314000d8d0 tx=0x7f314000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.854+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f314000f840 con 0x7f3150071b60 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.854+0000 7f3154f00700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3150114f10 con 0x7f3150071b60 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.854+0000 7f3154f00700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31501b7a80 con 0x7f3150071b60 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.855+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f314000fe80 con 0x7f3150071b60 2026-03-09T17:26:58.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.855+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f314000e5c0 con 0x7f3150071b60 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.856+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f314000e770 con 0x7f3150071b60 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.858+0000 7f3154f00700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f313c000ff0 con 0x7f3150071b60 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314cff9700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 0x7f313806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f314008bfb0 con 0x7f3150071b60 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314cff9700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 0x7f3138074620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f3138074cd0 con 0x7f3138072210 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== mon.0 v2:192.168.123.106:3300/0 6 
==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f314000f9f0 con 0x7f3150071b60 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314effd700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 0x7f313806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314ffff700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 0x7f3138074620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.859+0000 7f314effd700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 0x7f313806eb30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3150115d70 tx=0x7f3144005fd0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.860+0000 7f314ffff700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 0x7f3138074620 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.868 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.863+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== osd.4 v2:192.168.123.109:6808/153890079 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f3138074cd0 con 0x7f3138072210 2026-03-09T17:26:58.894 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.889+0000 7f3154f00700 1 -- 192.168.123.106:0/565784316 --> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f313c002da0 con 0x7f3138072210 2026-03-09T17:26:58.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.897+0000 7f314cff9700 1 -- 192.168.123.106:0/565784316 <== osd.4 v2:192.168.123.109:6808/153890079 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f313c002da0 con 0x7f3138072210 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 msgr2=0x7f3138074620 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 0x7f3138074620 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 msgr2=0x7f313806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 0x7f313806eb30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3150115d70 tx=0x7f3144005fd0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 msgr2=0x7f3150119580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150119580 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f314000d8d0 tx=0x7f314000dbe0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 shutdown_connections 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f313806c680 0x7f313806eb30 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3150071b60 0x7f3150119580 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:6808/153890079,v1:192.168.123.109:6809/153890079] conn(0x7f3138072210 0x7f3138074620 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 --2- 192.168.123.106:0/565784316 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f315010e9e0 0x7f3150114580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.923 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.903+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 >> 192.168.123.106:0/565784316 conn(0x7f315006c6c0 msgr2=0x7f315006cfa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.917+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 shutdown_connections 2026-03-09T17:26:58.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.917+0000 7f31367fc700 1 -- 192.168.123.106:0/565784316 wait complete. 2026-03-09T17:26:58.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.928+0000 7f9ad0879700 1 -- 192.168.123.106:0/2534730459 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc071e40 msgr2=0x7f9acc0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.928+0000 7f9ad0879700 1 --2- 192.168.123.106:0/2534730459 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc071e40 0x7f9acc0722b0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9ac400d3f0 tx=0x7f9ac400d700 comp rx=0 tx=0).stop 2026-03-09T17:26:58.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.929+0000 7f9ad0879700 1 -- 192.168.123.106:0/2534730459 shutdown_connections 2026-03-09T17:26:58.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.929+0000 7f9ad0879700 1 --2- 192.168.123.106:0/2534730459 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc071e40 0x7f9acc0722b0 secure :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9ac400d3f0 tx=0x7f9ac400d700 comp rx=0 tx=0).stop 2026-03-09T17:26:58.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.929+0000 7f9ad0879700 1 --2- 192.168.123.106:0/2534730459 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc10ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:26:58.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.929+0000 7f9ad0879700 1 -- 192.168.123.106:0/2534730459 >> 192.168.123.106:0/2534730459 conn(0x7f9acc06c6c0 msgr2=0x7f9acc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 -- 192.168.123.106:0/2534730459 shutdown_connections 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 -- 192.168.123.106:0/2534730459 wait complete. 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 Processor -- start 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 -- start start 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc083240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 0x7f9acc07d230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9acc083d40 con 0x7f9acc10c8f0 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.954+0000 7f9ad0879700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9acc07d770 con 0x7f9acc083780 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.955+0000 7f9aca59c700 1 
--2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 0x7f9acc07d230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.957+0000 7f9acad9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc083240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.957+0000 7f9acad9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc083240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43760/0 (socket says 192.168.123.106:43760) 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.957+0000 7f9acad9d700 1 -- 192.168.123.106:0/1752732403 learned_addr learned my addr 192.168.123.106:0/1752732403 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.959+0000 7f9aca59c700 1 -- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 msgr2=0x7f9acc083240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.959+0000 7f9aca59c700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc083240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.959+0000 7f9aca59c700 1 -- 192.168.123.106:0/1752732403 
--> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9ac4007ed0 con 0x7f9acc083780 2026-03-09T17:26:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.961+0000 7f9aca59c700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 0x7f9acc07d230 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f9ac400a000 tx=0x7f9ac40045d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.965 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.963+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ac4004980 con 0x7f9acc083780 2026-03-09T17:26:58.965 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.963+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9ac400fb40 con 0x7f9acc083780 2026-03-09T17:26:58.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.964+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ac4020bd0 con 0x7f9acc083780 2026-03-09T17:26:58.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.964+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9acc07d9f0 con 0x7f9acc083780 2026-03-09T17:26:58.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.964+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9acc07deb0 con 0x7f9acc083780 2026-03-09T17:26:58.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.965+0000 7f9ab1ffb700 1 -- 
192.168.123.106:0/1752732403 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f9acc0797f0 con 0x7f9acc083780 2026-03-09T17:26:58.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.966+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9ac400fcb0 con 0x7f9acc083780 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9ab3fff700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 0x7f9ab406e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f9ac4013070 con 0x7f9acc083780 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9ab3fff700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 0x7f9ab4074300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 --> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f9ab40749b0 con 0x7f9ab4071ef0 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f9ac4090420 con 0x7f9acc083780 2026-03-09T17:26:58.969 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9acad9d700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 0x7f9ab406e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.967+0000 7f9acb59e700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 0x7f9ab4074300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:58.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.968+0000 7f9acb59e700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 0x7f9ab4074300 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.968+0000 7f9acad9d700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 0x7f9ab406e980 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9abc009cc0 tx=0x7f9abc009480 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:58.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.969+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== osd.0 v2:192.168.123.106:6802/1643543004 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f9ab40749b0 con 0x7f9ab4071ef0 2026-03-09T17:26:58.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.985+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 --> 
[v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f9aac000f80 con 0x7f9ab4071ef0 2026-03-09T17:26:58.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.989+0000 7f9ab3fff700 1 -- 192.168.123.106:0/1752732403 <== osd.0 v2:192.168.123.106:6802/1643543004 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f9aac000f80 con 0x7f9ab4071ef0 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.994+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 msgr2=0x7f9ab4074300 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.994+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 0x7f9ab4074300 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 msgr2=0x7f9ab406e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 0x7f9ab406e980 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9abc009cc0 tx=0x7f9abc009480 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 msgr2=0x7f9acc07d230 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 0x7f9acc07d230 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f9ac400a000 tx=0x7f9ac40045d0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 shutdown_connections 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6802/1643543004,v1:192.168.123.106:6803/1643543004] conn(0x7f9ab4071ef0 0x7f9ab4074300 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ab406c4d0 0x7f9ab406e980 secure :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9abc009cc0 tx=0x7f9abc009480 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9acc10c8f0 0x7f9acc083240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 --2- 192.168.123.106:0/1752732403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9acc083780 0x7f9acc07d230 secure :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f9ac400a000 tx=0x7f9ac40045d0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:58.996+0000 7f9ad0879700 1 
-- 192.168.123.106:0/1752732403 >> 192.168.123.106:0/1752732403 conn(0x7f9acc06c6c0 msgr2=0x7f9acc10bc70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.001+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 shutdown_connections 2026-03-09T17:26:59.061 INFO:teuthology.orchestra.run.vm06.stdout:120259084294 2026-03-09T17:26:59.061 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.4 2026-03-09T17:26:59.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.051+0000 7f9ad0879700 1 -- 192.168.123.106:0/1752732403 wait complete. 2026-03-09T17:26:59.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.192+0000 7f0b7ca49700 1 -- 192.168.123.106:0/1381587915 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 msgr2=0x7f0b70006680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.192+0000 7f0b7ca49700 1 --2- 192.168.123.106:0/1381587915 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70006680 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f0b780669f0 tx=0x7f0b7806a460 comp rx=0 tx=0).stop 2026-03-09T17:26:59.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.197+0000 7f0b7ca49700 1 -- 192.168.123.106:0/1381587915 shutdown_connections 2026-03-09T17:26:59.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.197+0000 7f0b7ca49700 1 --2- 192.168.123.106:0/1381587915 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0b70006bc0 0x7f0b700a19d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.197+0000 7f0b7ca49700 1 --2- 
192.168.123.106:0/1381587915 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70006680 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.197+0000 7f0b7ca49700 1 -- 192.168.123.106:0/1381587915 >> 192.168.123.106:0/1381587915 conn(0x7f0b7000b180 msgr2=0x7f0b7000b580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.219 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.209+0000 7f0b7ca49700 1 -- 192.168.123.106:0/1381587915 shutdown_connections 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.209+0000 7f0b7ca49700 1 -- 192.168.123.106:0/1381587915 wait complete. 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.213+0000 7f0b7ca49700 1 Processor -- start 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.213+0000 7f0b7ca49700 1 -- start start 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b7ca49700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b7ca49700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0b70006bc0 0x7f0b70096b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b7ca49700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b7009a7d0 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b7ca49700 1 -- --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b700970c0 con 0x7f0b70006bc0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b76d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b76d9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43770/0 (socket says 192.168.123.106:43770) 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b76d9d700 1 -- 192.168.123.106:0/307366203 learned_addr learned my addr 192.168.123.106:0/307366203 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.214+0000 7f0b7659c700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0b70006bc0 0x7f0b70096b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b76d9d700 1 -- 192.168.123.106:0/307366203 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0b70006bc0 msgr2=0x7f0b70096b80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b76d9d700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f0b70006bc0 0x7f0b70096b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b76d9d700 1 -- 192.168.123.106:0/307366203 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b78067050 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b76d9d700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f0b78050040 tx=0x7f0b78069b20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b78067d20 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0b7807d900 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b78088a90 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b7ca49700 1 -- 192.168.123.106:0/307366203 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0b700973a0 con 0x7f0b700062b0 2026-03-09T17:26:59.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.215+0000 7f0b7ca49700 1 -- 
192.168.123.106:0/307366203 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0b7013b9c0 con 0x7f0b700062b0 2026-03-09T17:26:59.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.222+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0b78085070 con 0x7f0b700062b0 2026-03-09T17:26:59.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.223+0000 7f0b6ffff700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 0x7f0b6406ed30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.223+0000 7f0b7659c700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 0x7f0b6406ed30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.245+0000 7f0b7659c700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 0x7f0b6406ed30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f0b68005fd0 tx=0x7f0b6800a560 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.245+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0b78078050 con 0x7f0b700062b0 2026-03-09T17:26:59.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.245+0000 7f0b7ca49700 1 --2- 192.168.123.106:0/307366203 >> 
[v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 0x7f0b58003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.245+0000 7f0b7ca49700 1 -- 192.168.123.106:0/307366203 --> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f0b58006bf0 con 0x7f0b58001610 2026-03-09T17:26:59.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.247+0000 7f0b7759e700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 0x7f0b58003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 -- 192.168.123.106:0/3055584620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a58d0 msgr2=0x7fa52c0a8e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 --2- 192.168.123.106:0/3055584620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a58d0 0x7fa52c0a8e90 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fa5340669f0 tx=0x7fa5340671e0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 -- 192.168.123.106:0/3055584620 shutdown_connections 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 --2- 192.168.123.106:0/3055584620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a58d0 0x7fa52c0a8e90 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 --2- 192.168.123.106:0/3055584620 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c0a4f30 0x7fa52c0a5300 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.251+0000 7fa53ad83700 1 -- 192.168.123.106:0/3055584620 >> 192.168.123.106:0/3055584620 conn(0x7fa52c01a290 msgr2=0x7fa52c01a690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 -- 192.168.123.106:0/3055584620 shutdown_connections 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 -- 192.168.123.106:0/3055584620 wait complete. 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 Processor -- start 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 -- start start 2026-03-09T17:26:59.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c00fcb0 0x7fa52c010120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa52c0142f0 con 0x7fa52c0a4f30 
2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa53ad83700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa52c014430 con 0x7fa52c00fcb0 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa539d81700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa539d81700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43796/0 (socket says 192.168.123.106:43796) 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa539d81700 1 -- 192.168.123.106:0/48258702 learned_addr learned my addr 192.168.123.106:0/48258702 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.252+0000 7fa539580700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c00fcb0 0x7fa52c010120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa539d81700 1 -- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c00fcb0 msgr2=0x7fa52c010120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 
7fa539d81700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c00fcb0 0x7fa52c010120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa539d81700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa534067050 con 0x7fa52c0a4f30 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa539d81700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7fa53000d8d0 tx=0x7fa53000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa530009880 con 0x7fa52c0a4f30 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa53ad83700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa52c0146b0 con 0x7fa52c0a4f30 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa53ad83700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa52c014c00 con 0x7fa52c0a4f30 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa530010460 con 0x7fa52c0a4f30 
2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.253+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa53000f5d0 con 0x7fa52c0a4f30 2026-03-09T17:26:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.254+0000 7fa53ad83700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fa518000ff0 con 0x7fa52c0a4f30 2026-03-09T17:26:59.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.256+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa5300105d0 con 0x7fa52c0a4f30 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa52affd700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 0x7fa52006ec60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa53008b710 con 0x7fa52c0a4f30 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa539580700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 0x7fa52006ec60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa52affd700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] 
conn(0x7fa520072340 0x7fa520074750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fa520074e00 con 0x7fa520072340 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7fa53008bac0 con 0x7fa52c0a4f30 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa539580700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 0x7fa52006ec60 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fa534063df0 tx=0x7fa534070040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.257+0000 7fa53a582700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] conn(0x7fa520072340 0x7fa520074750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.258+0000 7fa53a582700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] conn(0x7fa520072340 0x7fa520074750 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.270 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.258+0000 7f0b7759e700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 0x7f0b58003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.271 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.258+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== osd.2 v2:192.168.123.106:6818/1615959701 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f0b58006bf0 con 0x7f0b58001610 2026-03-09T17:26:59.271 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.266+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== osd.5 v2:192.168.123.109:6816/764831086 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fa520074e00 con 0x7fa520072340 2026-03-09T17:26:59.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.270+0000 7f0b7ca49700 1 -- 192.168.123.106:0/307366203 --> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f0b58005cd0 con 0x7f0b58001610 2026-03-09T17:26:59.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.270+0000 7f0b6ffff700 1 -- 192.168.123.106:0/307366203 <== osd.2 v2:192.168.123.106:6818/1615959701 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f0b58005cd0 con 0x7f0b58001610 2026-03-09T17:26:59.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.271+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 msgr2=0x7f0b58003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.271+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> 
[v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 0x7f0b58003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.276+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 msgr2=0x7f0b6406ed30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.276+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 0x7f0b6406ed30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f0b68005fd0 tx=0x7f0b6800a560 comp rx=0 tx=0).stop 2026-03-09T17:26:59.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.276+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 msgr2=0x7f0b70096640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.276+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f0b78050040 tx=0x7f0b78069b20 comp rx=0 tx=0).stop 2026-03-09T17:26:59.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.278+0000 7fa53ad83700 1 -- 192.168.123.106:0/48258702 --> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fa518002da0 con 0x7fa520072340 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 shutdown_connections 2026-03-09T17:26:59.283 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6818/1615959701,v1:192.168.123.106:6819/1615959701] conn(0x7f0b58001610 0x7f0b58003ac0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f0b6406c880 0x7f0b6406ed30 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b700062b0 0x7f0b70096640 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 --2- 192.168.123.106:0/307366203 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0b70006bc0 0x7f0b70096b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 >> 192.168.123.106:0/307366203 conn(0x7f0b7000b180 msgr2=0x7f0b70092830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.280+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 shutdown_connections 2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.280+0000 7f0b6dffb700 1 -- 192.168.123.106:0/307366203 wait complete. 
2026-03-09T17:26:59.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.279+0000 7fa52affd700 1 -- 192.168.123.106:0/48258702 <== osd.5 v2:192.168.123.109:6816/764831086 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7fa518002da0 con 0x7fa520072340 2026-03-09T17:26:59.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.287+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] conn(0x7fa520072340 msgr2=0x7fa520074750 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] conn(0x7fa520072340 0x7fa520074750 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 msgr2=0x7fa52006ec60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 0x7fa52006ec60 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fa534063df0 tx=0x7fa534070040 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 msgr2=0x7fa52c00f770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7fa53000d8d0 tx=0x7fa53000dbe0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 shutdown_connections 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:6816/764831086,v1:192.168.123.109:6817/764831086] conn(0x7fa520072340 0x7fa520074750 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.288+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa52006c7b0 0x7fa52006ec60 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.289+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa52c0a4f30 0x7fa52c00f770 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.289+0000 7fa528ff9700 1 --2- 192.168.123.106:0/48258702 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa52c00fcb0 0x7fa52c010120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.289+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 >> 192.168.123.106:0/48258702 conn(0x7fa52c01a290 msgr2=0x7fa52c0a3cc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.289+0000 7fa528ff9700 1 -- 
192.168.123.106:0/48258702 shutdown_connections 2026-03-09T17:26:59.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.290+0000 7fa528ff9700 1 -- 192.168.123.106:0/48258702 wait complete. 2026-03-09T17:26:59.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 -- 192.168.123.106:0/3621676057 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071e40 msgr2=0x7f10b40722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 --2- 192.168.123.106:0/3621676057 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071e40 0x7f10b40722b0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f10ac00b3a0 tx=0x7f10ac00b6b0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 -- 192.168.123.106:0/3621676057 shutdown_connections 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 --2- 192.168.123.106:0/3621676057 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b4071e40 0x7f10b40722b0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 --2- 192.168.123.106:0/3621676057 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 0x7f10b410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 -- 192.168.123.106:0/3621676057 >> 192.168.123.106:0/3621676057 conn(0x7f10b406c6c0 msgr2=0x7f10b406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 
7f10b8a7e700 1 -- 192.168.123.106:0/3621676057 shutdown_connections 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.378+0000 7f10b8a7e700 1 -- 192.168.123.106:0/3621676057 wait complete. 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 Processor -- start 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 -- start start 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 0x7f10b407d1c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10b407dc00 con 0x7f10b407d700 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b8a7e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10b407dd70 con 0x7f10b410c8b0 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43802/0 (socket says 192.168.123.106:43802) 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 -- 192.168.123.106:0/1912720624 learned_addr learned my addr 192.168.123.106:0/1912720624 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b259c700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 0x7f10b407d1c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 -- 192.168.123.106:0/1912720624 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 msgr2=0x7f10b407d1c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 0x7f10b407d1c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.379+0000 7f10b1d9b700 1 -- 192.168.123.106:0/1912720624 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10ac00b050 con 0x7f10b407d700 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.380+0000 7f10b1d9b700 1 --2- 192.168.123.106:0/1912720624 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f10ac007b90 tx=0x7f10ac0095a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.380+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10ac00e050 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.380+0000 7f10b8a7e700 1 -- 192.168.123.106:0/1912720624 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10b40820b0 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.380+0000 7f10b8a7e700 1 -- 192.168.123.106:0/1912720624 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10b4082600 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.382+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f10ac004730 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.382+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10ac01bd20 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.383+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f10ac019040 con 0x7f10b407d700 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.384+0000 7f10a37fe700 1 --2- 
192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 0x7f109c070d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.384+0000 7f10b259c700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 0x7f109c070d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.384+0000 7f10b259c700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 0x7f109c070d90 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f10a4009910 tx=0x7f10a4008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.384+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f10ac08d790 con 0x7f10b407d700 2026-03-09T17:26:59.389 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.385+0000 7f10b8a7e700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 0x7f1094003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.389 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.385+0000 7f10b8a7e700 1 -- 192.168.123.106:0/1912720624 --> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f1094006bf0 con 0x7f1094001610 2026-03-09T17:26:59.389 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.385+0000 7f10b2d9d700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 0x7f1094003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.390 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.387+0000 7f10b2d9d700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 0x7f1094003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.399 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.388+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== osd.1 v2:192.168.123.106:6810/1344357563 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f1094006bf0 con 0x7f1094001610 2026-03-09T17:26:59.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.397+0000 7f989089c700 1 -- 192.168.123.106:0/4058383454 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 msgr2=0x7f988c1081b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.397+0000 7f989089c700 1 --2- 192.168.123.106:0/4058383454 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1081b0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f987c009b80 tx=0x7f987c009e90 comp rx=0 tx=0).stop 2026-03-09T17:26:59.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.399+0000 7f989089c700 1 -- 192.168.123.106:0/4058383454 shutdown_connections 2026-03-09T17:26:59.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.399+0000 7f989089c700 1 --2- 192.168.123.106:0/4058383454 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 0x7f988c110dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.399+0000 7f989089c700 1 --2- 192.168.123.106:0/4058383454 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1081b0 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.399+0000 7f989089c700 1 -- 192.168.123.106:0/4058383454 >> 192.168.123.106:0/4058383454 conn(0x7f988c06c6c0 msgr2=0x7f988c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.405+0000 7f10b8a7e700 1 -- 192.168.123.106:0/1912720624 --> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f1094005cd0 con 0x7f1094001610 2026-03-09T17:26:59.409 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.406+0000 7f989089c700 1 -- 192.168.123.106:0/4058383454 shutdown_connections 2026-03-09T17:26:59.409 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.406+0000 7f989089c700 1 -- 192.168.123.106:0/4058383454 wait complete. 
2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.407+0000 7f10a37fe700 1 -- 192.168.123.106:0/1912720624 <== osd.1 v2:192.168.123.106:6810/1344357563 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f1094005cd0 con 0x7f1094001610 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.407+0000 7f989089c700 1 Processor -- start 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.407+0000 7f989089c700 1 -- start start 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.407+0000 7f989089c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43818/0 (socket says 192.168.123.106:43818) 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 -- 192.168.123.106:0/2327541241 learned_addr learned my addr 192.168.123.106:0/2327541241 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f989089c700 1 --2- 192.168.123.106:0/2327541241 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 0x7f988c1a2dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f988c1a3450 con 0x7f988c107de0 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f988c19c900 con 0x7f988c108780 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988affd700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 0x7f988c1a2dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 -- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 msgr2=0x7f988c1a2dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 0x7f988c1a2dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.408+0000 7f988b7fe700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f987c0097e0 con 0x7f988c107de0 2026-03-09T17:26:59.417 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.409+0000 7f988b7fe700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f987c00b580 tx=0x7f987c0049f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.409+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f987c01d070 con 0x7f988c107de0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.409+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f988c19cbe0 con 0x7f988c107de0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.409+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f988c19d130 con 0x7f988c107de0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.411+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f987c004cf0 con 0x7f988c107de0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.411+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f987c00f740 con 0x7f988c107de0 2026-03-09T17:26:59.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.411+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f987c00f960 con 0x7f988c107de0 
2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 msgr2=0x7f1094003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 0x7f1094003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 msgr2=0x7f109c070d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 0x7f109c070d90 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f10a4009910 tx=0x7f10a4008040 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 msgr2=0x7f10b4081b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f10ac007b90 tx=0x7f10ac0095a0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 shutdown_connections 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f10b410c8b0 0x7f10b407d1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6810/1344357563,v1:192.168.123.106:6811/1344357563] conn(0x7f1094001610 0x7f1094003ac0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f109c06e8e0 0x7f109c070d90 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 --2- 192.168.123.106:0/1912720624 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10b407d700 0x7f10b4081b70 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 >> 192.168.123.106:0/1912720624 conn(0x7f10b406c6c0 msgr2=0x7f10b4070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 shutdown_connections 2026-03-09T17:26:59.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.417+0000 7f10a17fa700 1 -- 192.168.123.106:0/1912720624 wait 
complete. 2026-03-09T17:26:59.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.412+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f9870000fc0 con 0x7f988c107de0 2026-03-09T17:26:59.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.426+0000 7f9888ff9700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 0x7f987406ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:26:59.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.428+0000 7f988affd700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 0x7f987406ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.430+0000 7f988affd700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 0x7f987406ed20 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f988c070170 tx=0x7f9880009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.430+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f987c08d8f0 con 0x7f988c107de0 2026-03-09T17:26:59.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.430+0000 7f9888ff9700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 0x7f9874074810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).connect 2026-03-09T17:26:59.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.430+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f9874074ec0 con 0x7f9874072400 2026-03-09T17:26:59.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.430+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f987c0224c0 con 0x7f988c107de0 2026-03-09T17:26:59.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.431+0000 7f988bfff700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 0x7f9874074810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:26:59.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.434+0000 7f988bfff700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 0x7f9874074810 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:26:59.438 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.434+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== osd.3 v2:192.168.123.109:6800/1175302832 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f9874074ec0 con 0x7f9874072400 2026-03-09T17:26:59.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.454+0000 7f989089c700 1 -- 192.168.123.106:0/2327541241 --> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f9870002d40 con 0x7f9874072400 
2026-03-09T17:26:59.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.458+0000 7f9888ff9700 1 -- 192.168.123.106:0/2327541241 <== osd.3 v2:192.168.123.109:6800/1175302832 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f9870002d40 con 0x7f9874072400 2026-03-09T17:26:59.471 INFO:teuthology.orchestra.run.vm06.stdout:73014444041 2026-03-09T17:26:59.471 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.2 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.462+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 msgr2=0x7f9874074810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.462+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 0x7f9874074810 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 msgr2=0x7f987406ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 0x7f987406ed20 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f988c070170 tx=0x7f9880009450 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 msgr2=0x7f988c1a2880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f987c00b580 tx=0x7f987c0049f0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 shutdown_connections 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:6800/1175302832,v1:192.168.123.109:6801/1175302832] conn(0x7f9874072400 0x7f9874074810 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f987406c870 0x7f987406ed20 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f988c107de0 0x7f988c1a2880 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 --2- 192.168.123.106:0/2327541241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f988c108780 0x7f988c1a2dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:26:59.474 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.470+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 >> 192.168.123.106:0/2327541241 conn(0x7f988c06c6c0 msgr2=0x7f988c10b600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.471+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 shutdown_connections 2026-03-09T17:26:59.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:26:59.471+0000 7f987a7fc700 1 -- 192.168.123.106:0/2327541241 wait complete. 2026-03-09T17:26:59.550 INFO:teuthology.orchestra.run.vm06.stdout:38654705677 2026-03-09T17:26:59.550 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.0 2026-03-09T17:26:59.553 INFO:teuthology.orchestra.run.vm06.stdout:137438953475 2026-03-09T17:26:59.554 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.5 2026-03-09T17:26:59.569 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:26:59.578 INFO:teuthology.orchestra.run.vm06.stdout:98784247815 2026-03-09T17:26:59.579 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.3 2026-03-09T17:26:59.579 INFO:teuthology.orchestra.run.vm06.stdout:55834574859 2026-03-09T17:26:59.579 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd last-stat-seq osd.1 2026-03-09T17:27:00.083 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config 
/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:00.094 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:00.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.125+0000 7f5d6e42d700 1 -- 192.168.123.106:0/2467900590 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 msgr2=0x7f5d68105840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.125+0000 7f5d6e42d700 1 --2- 192.168.123.106:0/2467900590 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68105840 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f5d58009b00 tx=0x7f5d58009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:00.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.126+0000 7f5d6e42d700 1 -- 192.168.123.106:0/2467900590 shutdown_connections 2026-03-09T17:27:00.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.126+0000 7f5d6e42d700 1 --2- 192.168.123.106:0/2467900590 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 0x7f5d680ff930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.126+0000 7f5d6e42d700 1 --2- 192.168.123.106:0/2467900590 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68105840 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.126+0000 7f5d6e42d700 1 -- 192.168.123.106:0/2467900590 >> 192.168.123.106:0/2467900590 conn(0x7f5d68076330 msgr2=0x7f5d68076730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:00.131 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.127+0000 7f5d6e42d700 1 -- 192.168.123.106:0/2467900590 shutdown_connections 2026-03-09T17:27:00.131 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 -- 192.168.123.106:0/2467900590 wait complete. 2026-03-09T17:27:00.131 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 Processor -- start 2026-03-09T17:27:00.131 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 -- start start 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 0x7f5d681983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d68199000 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.128+0000 7f5d6e42d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d6819cd90 con 0x7f5d680ff4c0 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.137 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43826/0 (socket says 192.168.123.106:43826) 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 -- 192.168.123.106:0/1924594979 learned_addr learned my addr 192.168.123.106:0/1924594979 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d67fff700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 0x7f5d681983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 -- 192.168.123.106:0/1924594979 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 msgr2=0x7f5d681983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 0x7f5d681983e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.129+0000 7f5d677fe700 1 -- 192.168.123.106:0/1924594979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d580097e0 con 0x7f5d68105470 2026-03-09T17:27:00.137 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d677fe700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f5d5c00b700 tx=0x7f5d5c00ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d5c0107c0 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d6e42d700 1 -- 192.168.123.106:0/1924594979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d6819d070 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d6e42d700 1 -- 192.168.123.106:0/1924594979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d6819d5c0 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5d5c010e00 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.130+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d5c00f360 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.131+0000 7f5d6e42d700 1 -- 192.168.123.106:0/1924594979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f5d540052f0 con 0x7f5d68105470 2026-03-09T17:27:00.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.134+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5d5c010920 con 0x7f5d68105470 2026-03-09T17:27:00.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.135+0000 7f5d657fa700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 0x7f5d5006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.135+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5d5c08afb0 con 0x7f5d68105470 2026-03-09T17:27:00.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.135+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5d5c08b390 con 0x7f5d68105470 2026-03-09T17:27:00.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.138+0000 7f5d67fff700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 0x7f5d5006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.138+0000 7f5d67fff700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 0x7f5d5006eb30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f5d58006010 tx=0x7f5d5800b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:27:00.207 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:00.272 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:00 vm06 ceph-mon[57307]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:00.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.278+0000 7f5d6e42d700 1 -- 192.168.123.106:0/1924594979 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f5d54005160 con 0x7f5d68105470 2026-03-09T17:27:00.285 INFO:teuthology.orchestra.run.vm06.stdout:120259084294 2026-03-09T17:27:00.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.281+0000 7f5d657fa700 1 -- 192.168.123.106:0/1924594979 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f5d5c01e090 con 0x7f5d68105470 2026-03-09T17:27:00.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 msgr2=0x7f5d5006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 0x7f5d5006eb30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f5d58006010 tx=0x7f5d5800b540 comp rx=0 tx=0).stop 2026-03-09T17:27:00.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 msgr2=0x7f5d68198920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.286 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f5d5c00b700 tx=0x7f5d5c00ba10 comp rx=0 tx=0).stop 2026-03-09T17:27:00.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 shutdown_connections 2026-03-09T17:27:00.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d680ff4c0 0x7f5d681983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.284+0000 7f5d4effd700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f5d5006c680 0x7f5d5006eb30 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.285+0000 7f5d4effd700 1 --2- 192.168.123.106:0/1924594979 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d68105470 0x7f5d68198920 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.285+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 >> 192.168.123.106:0/1924594979 conn(0x7f5d68076330 msgr2=0x7f5d68068d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:00.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.286+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 shutdown_connections 2026-03-09T17:27:00.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.286+0000 7f5d4effd700 1 -- 192.168.123.106:0/1924594979 
wait complete. 2026-03-09T17:27:00.402 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084294 got 120259084294 for osd.4 2026-03-09T17:27:00.402 DEBUG:teuthology.parallel:result is None 2026-03-09T17:27:00.470 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:00.471 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:00 vm09 ceph-mon[62061]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 -- 192.168.123.106:0/3359859153 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c1016e0 msgr2=0x7f703c101ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 --2- 192.168.123.106:0/3359859153 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c1016e0 0x7f703c101ab0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f703400b3a0 tx=0x7f703400b6b0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 -- 192.168.123.106:0/3359859153 shutdown_connections 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 --2- 192.168.123.106:0/3359859153 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c101ff0 0x7f703c10a4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 --2- 192.168.123.106:0/3359859153 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f703c1016e0 0x7f703c101ab0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 -- 192.168.123.106:0/3359859153 >> 192.168.123.106:0/3359859153 conn(0x7f703c0faf00 msgr2=0x7f703c0fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 -- 192.168.123.106:0/3359859153 shutdown_connections 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.920+0000 7f7041101700 1 -- 192.168.123.106:0/3359859153 wait complete. 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 Processor -- start 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 -- start start 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c101ff0 0x7f703c109ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f703c104540 con 0x7f703c10a400 2026-03-09T17:27:00.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f7041101700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f703c1046b0 con 0x7f703c101ff0 
2026-03-09T17:27:00.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f703a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f703a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43842/0 (socket says 192.168.123.106:43842) 2026-03-09T17:27:00.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.921+0000 7f703a59c700 1 -- 192.168.123.106:0/1070341522 learned_addr learned my addr 192.168.123.106:0/1070341522 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:00.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.928+0000 7f703ad9d700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c101ff0 0x7f703c109ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.928+0000 7f703a59c700 1 -- 192.168.123.106:0/1070341522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c101ff0 msgr2=0x7f703c109ec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.928+0000 7f703a59c700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c101ff0 0x7f703c109ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:27:00.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.928+0000 7f703a59c700 1 -- 192.168.123.106:0/1070341522 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f703400b050 con 0x7f703c10a400 2026-03-09T17:27:00.931 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f703a59c700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f702c00eb10 tx=0x7f702c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:00.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f702c00cca0 con 0x7f703c10a400 2026-03-09T17:27:00.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f7041101700 1 -- 192.168.123.106:0/1070341522 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f703c104990 con 0x7f703c10a400 2026-03-09T17:27:00.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f7041101700 1 -- 192.168.123.106:0/1070341522 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f703c1a7120 con 0x7f703c10a400 2026-03-09T17:27:00.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f702c00ce00 con 0x7f703c10a400 2026-03-09T17:27:00.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.929+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f702c018910 con 0x7f703c10a400 2026-03-09T17:27:00.943 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.940+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f702c018a70 con 0x7f703c10a400 2026-03-09T17:27:00.943 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.940+0000 7f7023fff700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 0x7f702406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.941+0000 7f19a9ebc700 1 -- 192.168.123.106:0/386919615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a406d7a0 msgr2=0x7f19a406dc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.941+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/386919615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a406d7a0 0x7f19a406dc10 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f1998009b50 tx=0x7f1998009e60 comp rx=0 tx=0).stop 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 -- 192.168.123.106:0/386919615 shutdown_connections 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/386919615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a406d7a0 0x7f19a406dc10 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/386919615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f19a410ed80 0x7f19a406d260 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 -- 192.168.123.106:0/386919615 >> 192.168.123.106:0/386919615 conn(0x7f19a406c830 msgr2=0x7f19a4071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:00.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 -- 192.168.123.106:0/386919615 shutdown_connections 2026-03-09T17:27:00.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.942+0000 7f19a9ebc700 1 -- 192.168.123.106:0/386919615 wait complete. 2026-03-09T17:27:00.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.948+0000 7f19a9ebc700 1 Processor -- start 2026-03-09T17:27:00.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.949+0000 7f703ad9d700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 0x7f702406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.949+0000 7f703ad9d700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 0x7f702406ec50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f7034009750 tx=0x7f70340096e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.949+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f702c014070 con 0x7f703c10a400 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.950+0000 7f19a9ebc700 1 -- start start 2026-03-09T17:27:00.952 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.950+0000 7f19a9ebc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.950+0000 7f19a9ebc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f19a4112520 0x7f19a4112990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.950+0000 7f19a9ebc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f19a4112ed0 con 0x7f19a410ed80 2026-03-09T17:27:00.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.950+0000 7f19a9ebc700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f19a4113040 con 0x7f19a4112520 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.952+0000 7f7041101700 1 -- 192.168.123.106:0/1070341522 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7028005320 con 0x7f703c10a400 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43872/0 (socket says 192.168.123.106:43872) 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 -- 192.168.123.106:0/121242077 learned_addr learned my addr 192.168.123.106:0/121242077 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 -- 192.168.123.106:0/121242077 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f19a4112520 msgr2=0x7f19a4112990 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f19a4112520 0x7f19a4112990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:00.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.956+0000 7f19a37fe700 1 -- 192.168.123.106:0/121242077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19980097e0 con 0x7f19a410ed80 2026-03-09T17:27:00.959 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.957+0000 7f19a37fe700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f199400ed70 tx=0x7f199400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:00.959 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.957+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1994009980 con 0x7f19a410ed80 2026-03-09T17:27:00.960 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.957+0000 
7f19a9ebc700 1 -- 192.168.123.106:0/121242077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f19a4113320 con 0x7f19a410ed80
2026-03-09T17:27:00.960 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.957+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f19a41b7da0 con 0x7f19a410ed80
2026-03-09T17:27:00.960 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.958+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f199400cd70 con 0x7f19a410ed80
2026-03-09T17:27:00.960 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.958+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f19940189c0 con 0x7f19a410ed80
2026-03-09T17:27:00.962 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.960+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1994018b20 con 0x7f19a410ed80
2026-03-09T17:27:00.962 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.960+0000 7f19a0ff9700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 0x7f198c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:00.962 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.955+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f702c057b50 con 0x7f703c10a400
2026-03-09T17:27:00.962 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.960+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1994014070 con 0x7f19a410ed80
2026-03-09T17:27:00.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.969+0000 7f19a2ffd700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 0x7f198c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:00.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.971+0000 7f198a7fc700 1 -- 192.168.123.106:0/121242077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1990005320 con 0x7f19a410ed80
2026-03-09T17:27:00.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.976+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1994057c60 con 0x7f19a410ed80
2026-03-09T17:27:00.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.979+0000 7f19a2ffd700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 0x7f198c06eb80 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f1998005cb0 tx=0x7f1998005be0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:00.989 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.986+0000 7f785978e700 1 -- 192.168.123.106:0/2232899588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 msgr2=0x7f785410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:00.989 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.986+0000 7f785978e700 1 --2- 192.168.123.106:0/2232899588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f785410edb0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f7844009b50 tx=0x7f7844009e60 comp rx=0 tx=0).stop
2026-03-09T17:27:00.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 -- 192.168.123.106:0/2232899588 shutdown_connections
2026-03-09T17:27:00.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 --2- 192.168.123.106:0/2232899588 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 0x7f7854071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:00.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 --2- 192.168.123.106:0/2232899588 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f785410edb0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:00.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 -- 192.168.123.106:0/2232899588 >> 192.168.123.106:0/2232899588 conn(0x7f785406c6c0 msgr2=0x7f785406cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:00.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 -- 192.168.123.106:0/2232899588 shutdown_connections
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.990+0000 7f785978e700 1 -- 192.168.123.106:0/2232899588 wait complete.
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 Processor -- start
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 -- start start
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 0x7f7854119590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7854114ad0 con 0x7f785410e9e0
2026-03-09T17:27:00.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f785978e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7854114c10 con 0x7f7854071b60
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f7853fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 0x7f7854119590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f78537fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f78537fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43890/0 (socket says 192.168.123.106:43890)
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.991+0000 7f78537fe700 1 -- 192.168.123.106:0/137103369 learned_addr learned my addr 192.168.123.106:0/137103369 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78537fe700 1 -- 192.168.123.106:0/137103369 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 msgr2=0x7f7854119590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78537fe700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 0x7f7854119590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78537fe700 1 -- 192.168.123.106:0/137103369 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78440097e0 con 0x7f785410e9e0
2026-03-09T17:27:00.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78537fe700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f784800ba70 tx=0x7f784800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:00.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f784800c700 con 0x7f785410e9e0
2026-03-09T17:27:00.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.992+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f784800cd40 con 0x7f785410e9e0
2026-03-09T17:27:00.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.993+0000 7f785978e700 1 -- 192.168.123.106:0/137103369 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7854114ef0 con 0x7f785410e9e0
2026-03-09T17:27:00.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.993+0000 7f785978e700 1 -- 192.168.123.106:0/137103369 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78541b7b20 con 0x7f785410e9e0
2026-03-09T17:27:00.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.995+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7848012340 con 0x7f785410e9e0
2026-03-09T17:27:00.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.995+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7848012580 con 0x7f785410e9e0
2026-03-09T17:27:00.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.995+0000 7f78517fa700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 0x7f783c06ead0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:00.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.995+0000 7f7853fff700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 0x7f783c06ead0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:00.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.995+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f784808b640 con 0x7f785410e9e0
2026-03-09T17:27:01.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.996+0000 7f7853fff700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 0x7f783c06ead0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f7844000c00 tx=0x7f784400b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:01.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.996+0000 7f785978e700 1 -- 192.168.123.106:0/137103369 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f785404f2a0 con 0x7f785410e9e0
2026-03-09T17:27:01.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:00.998+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f784804e720 con 0x7f785410e9e0
2026-03-09T17:27:01.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.095+0000 7f7041101700 1 -- 192.168.123.106:0/1070341522 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f7028005190 con 0x7f703c10a400
2026-03-09T17:27:01.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.100+0000 7f7023fff700 1 -- 192.168.123.106:0/1070341522 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f702c05b170 con 0x7f703c10a400
2026-03-09T17:27:01.105 INFO:teuthology.orchestra.run.vm06.stdout:38654705678
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 msgr2=0x7f702406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 0x7f702406ec50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f7034009750 tx=0x7f70340096e0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 msgr2=0x7f703c103eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f702c00eb10 tx=0x7f702c00eed0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 shutdown_connections
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f703c101ff0 0x7f703c109ec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f702406c7a0 0x7f702406ec50 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 --2- 192.168.123.106:0/1070341522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f703c10a400 0x7f703c103eb0 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 >> 192.168.123.106:0/1070341522 conn(0x7f703c0faf00 msgr2=0x7f703c0fbb60 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 shutdown_connections
2026-03-09T17:27:01.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.109+0000 7f7021ffb700 1 -- 192.168.123.106:0/1070341522 wait complete.
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.159+0000 7f7503536700 1 -- 192.168.123.106:0/432870602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc10c8b0 msgr2=0x7f74fc10cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.159+0000 7f7503536700 1 --2- 192.168.123.106:0/432870602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc10c8b0 0x7f74fc10cc80 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f74f4009230 tx=0x7f74f4009260 comp rx=0 tx=0).stop
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- 192.168.123.106:0/432870602 shutdown_connections
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 --2- 192.168.123.106:0/432870602 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc071e40 0x7f74fc0722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 --2- 192.168.123.106:0/432870602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc10c8b0 0x7f74fc10cc80 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- 192.168.123.106:0/432870602 >> 192.168.123.106:0/432870602 conn(0x7f74fc06c6c0 msgr2=0x7f74fc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- 192.168.123.106:0/432870602 shutdown_connections
2026-03-09T17:27:01.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- 192.168.123.106:0/432870602 wait complete.
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 Processor -- start
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- start start
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 0x7f74fc07ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74fc081a30 con 0x7f74fc071e40
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7503536700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74fc081ba0 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f75012d2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 0x7f74fc07ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.161+0000 7f7500ad1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f7500ad1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46892/0 (socket says 192.168.123.106:46892)
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f7500ad1700 1 -- 192.168.123.106:0/2537804587 learned_addr learned my addr 192.168.123.106:0/2537804587 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f7500ad1700 1 -- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 msgr2=0x7f74fc07ceb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f7500ad1700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 0x7f74fc07ceb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f7500ad1700 1 -- 192.168.123.106:0/2537804587 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74f4008ee0 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.162+0000 7f75012d2700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 0x7f74fc07ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.163+0000 7f7500ad1700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f74f8007c00 tx=0x7f74f8007f10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.163+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74f8010040 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.163+0000 7f7503536700 1 -- 192.168.123.106:0/2537804587 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74fc081e80 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.163+0000 7f7503536700 1 -- 192.168.123.106:0/2537804587 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74fc0823d0 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.164+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f74f8015470 con 0x7f74fc07d3f0
2026-03-09T17:27:01.169 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.164+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74f80145e0 con 0x7f74fc07d3f0
2026-03-09T17:27:01.174 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.167+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f74f800fa70 con 0x7f74fc07d3f0
2026-03-09T17:27:01.174 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.167+0000 7f74f27fc700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 0x7f74e806eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.167+0000 7f75012d2700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 0x7f74e806eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.169+0000 7f75012d2700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 0x7f74e806eb80 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f74f400bfd0 tx=0x7f74f4007660 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:01.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.169+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f74f808b910 con 0x7f74fc07d3f0
2026-03-09T17:27:01.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.169+0000 7f7503536700 1 -- 192.168.123.106:0/2537804587 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f74e0005320 con 0x7f74fc07d3f0
2026-03-09T17:27:01.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.172+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f74f80563d0 con 0x7f74fc07d3f0
2026-03-09T17:27:01.177 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705678 for osd.0
2026-03-09T17:27:01.177 DEBUG:teuthology.parallel:result is None
2026-03-09T17:27:01.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.204+0000 7fdc4b389700 1 -- 192.168.123.106:0/2435444494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc4410e9e0 msgr2=0x7fdc4410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.204+0000 7fdc4b389700 1 --2- 192.168.123.106:0/2435444494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc4410e9e0 0x7fdc4410edb0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7fdc40009b00 tx=0x7fdc40009e10 comp rx=0 tx=0).stop
2026-03-09T17:27:01.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.205+0000 7fdc4b389700 1 -- 192.168.123.106:0/2435444494 shutdown_connections
2026-03-09T17:27:01.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.205+0000 7fdc4b389700 1 --2- 192.168.123.106:0/2435444494 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc44071b60 0x7fdc44071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.205+0000 7fdc4b389700 1 --2- 192.168.123.106:0/2435444494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc4410e9e0 0x7fdc4410edb0 secure :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7fdc40009b00 tx=0x7fdc40009e10 comp rx=0 tx=0).stop
2026-03-09T17:27:01.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.205+0000 7fdc4b389700 1 -- 192.168.123.106:0/2435444494 >> 192.168.123.106:0/2435444494 conn(0x7fdc4406c6c0 msgr2=0x7fdc4406cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:01.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.208+0000 7fdc4b389700 1 -- 192.168.123.106:0/2435444494 shutdown_connections
2026-03-09T17:27:01.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.208+0000 7fdc4b389700 1 -- 192.168.123.106:0/2435444494 wait complete.
2026-03-09T17:27:01.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.208+0000 7fdc4b389700 1 Processor -- start
2026-03-09T17:27:01.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.208+0000 7fdc4b389700 1 -- start start
2026-03-09T17:27:01.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.209+0000 7fdc4b389700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 0x7fdc44115b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.209+0000 7fdc4b389700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 0x7fdc44116520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.209+0000 7fdc4b389700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc441b7730 con 0x7fdc44071b60
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.209+0000 7fdc4b389700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc441b7870 con 0x7fdc441160b0
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.210+0000 7fdc4a387700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 0x7fdc44115b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.210+0000 7fdc4a387700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 0x7fdc44115b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43902/0 (socket says 192.168.123.106:43902)
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.210+0000 7fdc4a387700 1 -- 192.168.123.106:0/4291778629 learned_addr learned my addr 192.168.123.106:0/4291778629 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:01.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.210+0000 7fdc49b86700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 0x7fdc44116520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.212+0000 7fdc49b86700 1 -- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 msgr2=0x7fdc44115b70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.212+0000 7fdc49b86700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 0x7fdc44115b70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.212+0000 7fdc49b86700 1 -- 192.168.123.106:0/4291778629 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc400097e0 con 0x7fdc441160b0
2026-03-09T17:27:01.215 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.212+0000 7fdc49b86700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 0x7fdc44116520 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fdc3400eab0 tx=0x7fdc3400edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:01.216 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.214+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc3400cb20 con 0x7fdc441160b0
2026-03-09T17:27:01.216 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.214+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdc3400cc80 con 0x7fdc441160b0
2026-03-09T17:27:01.216 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.214+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdc34018930 con 0x7fdc441160b0
2026-03-09T17:27:01.216 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.214+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc441b7a10 con 0x7fdc441160b0
2026-03-09T17:27:01.216 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.214+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc441b7f10 con 0x7fdc441160b0
2026-03-09T17:27:01.221 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.219+0000 7fdc397fa700 1 -- 192.168.123.106:0/4291778629 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdc4404f2a0 con 0x7fdc441160b0
2026-03-09T17:27:01.222 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.220+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdc34018a90 con 0x7fdc441160b0
2026-03-09T17:27:01.223 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.220+0000 7fdc3b7fe700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 0x7fdc3006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:01.223 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.220+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fdc34014070 con 0x7fdc441160b0
2026-03-09T17:27:01.223 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.221+0000 7fdc4a387700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 0x7fdc3006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:01.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.224+0000 7fdc4a387700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 0x7fdc3006ec50 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fdc4000b5c0 tx=0x7fdc40005fb0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:01.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.225+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdc340567d0 con 0x7fdc441160b0
2026-03-09T17:27:01.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.277+0000 7f198a7fc700 1 -- 192.168.123.106:0/121242077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f1990005190 con 0x7f19a410ed80
2026-03-09T17:27:01.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.278+0000 7f19a0ff9700 1 -- 192.168.123.106:0/121242077 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f199405b280 con 0x7f19a410ed80
2026-03-09T17:27:01.281 INFO:teuthology.orchestra.run.vm06.stdout:137438953475
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 msgr2=0x7f198c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 0x7f198c06eb80 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f1998005cb0 tx=0x7f1998005be0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 msgr2=0x7f19a4117520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f199400ed70 tx=0x7f199400c5b0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 shutdown_connections
2026-03-09T17:27:01.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f198c06c6d0 0x7f198c06eb80 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f19a410ed80 0x7f19a4117520 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 --2- 192.168.123.106:0/121242077 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f19a4112520 0x7f19a4112990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:01.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.289+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 >> 192.168.123.106:0/121242077 conn(0x7f19a406c830 msgr2=0x7f19a4117d90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:01.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.290+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 shutdown_connections
2026-03-09T17:27:01.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.290+0000 7f19a9ebc700 1 -- 192.168.123.106:0/121242077 wait complete.
2026-03-09T17:27:01.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.319+0000 7f785978e700 1 -- 192.168.123.106:0/137103369 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f785404ea50 con 0x7f785410e9e0 2026-03-09T17:27:01.323 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.321+0000 7f78517fa700 1 -- 192.168.123.106:0/137103369 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f7848019070 con 0x7f785410e9e0 2026-03-09T17:27:01.324 INFO:teuthology.orchestra.run.vm06.stdout:73014444042 2026-03-09T17:27:01.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.324+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 msgr2=0x7f783c06ead0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.324+0000 7f783affd700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 0x7f783c06ead0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f7844000c00 tx=0x7f784400b540 comp rx=0 tx=0).stop 2026-03-09T17:27:01.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.324+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 msgr2=0x7f7854114590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.324+0000 7f783affd700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f784800ba70 tx=0x7f784800bd80 comp rx=0 tx=0).stop 2026-03-09T17:27:01.328 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 shutdown_connections 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7854071b60 0x7f7854119590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f783c06c620 0x7f783c06ead0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 --2- 192.168.123.106:0/137103369 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f785410e9e0 0x7f7854114590 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 >> 192.168.123.106:0/137103369 conn(0x7f785406c6c0 msgr2=0x7f785406cf30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 shutdown_connections 2026-03-09T17:27:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.325+0000 7f783affd700 1 -- 192.168.123.106:0/137103369 wait complete. 2026-03-09T17:27:01.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:01 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/1924594979' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T17:27:01.356 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:01 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/1070341522' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T17:27:01.389 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444041 got 73014444042 for osd.2 2026-03-09T17:27:01.389 DEBUG:teuthology.parallel:result is None 2026-03-09T17:27:01.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.428+0000 7fdc397fa700 1 -- 192.168.123.106:0/4291778629 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7fdc4404ea50 con 0x7fdc441160b0 2026-03-09T17:27:01.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.429+0000 7fdc3b7fe700 1 -- 192.168.123.106:0/4291778629 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fdc34059df0 con 0x7fdc441160b0 2026-03-09T17:27:01.432 INFO:teuthology.orchestra.run.vm06.stdout:55834574860 2026-03-09T17:27:01.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.434+0000 7f7503536700 1 -- 192.168.123.106:0/2537804587 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f74e0005190 con 0x7f74fc07d3f0 2026-03-09T17:27:01.438 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.434+0000 7f74f27fc700 1 -- 192.168.123.106:0/2537804587 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f74f8019070 con 0x7f74fc07d3f0 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.436+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 msgr2=0x7fdc3006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.436+0000 7fdc4b389700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 0x7fdc3006ec50 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fdc4000b5c0 tx=0x7fdc40005fb0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.436+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 msgr2=0x7fdc44116520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.436+0000 7fdc4b389700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 0x7fdc44116520 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fdc3400eab0 tx=0x7fdc3400edc0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.444+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 shutdown_connections 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.444+0000 7fdc4b389700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdc3006c7a0 0x7fdc3006ec50 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.444+0000 7fdc4b389700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdc44071b60 0x7fdc44115b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.447 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.444+0000 7fdc4b389700 1 --2- 192.168.123.106:0/4291778629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdc441160b0 0x7fdc44116520 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.444+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 >> 192.168.123.106:0/4291778629 conn(0x7fdc4406c6c0 msgr2=0x7fdc4406fb70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.445+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 shutdown_connections 2026-03-09T17:27:01.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.445+0000 7fdc4b389700 1 -- 192.168.123.106:0/4291778629 wait complete. 2026-03-09T17:27:01.451 INFO:teuthology.orchestra.run.vm06.stdout:98784247815 2026-03-09T17:27:01.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 msgr2=0x7f74e806eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 0x7f74e806eb80 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f74f400bfd0 tx=0x7f74f4007660 comp rx=0 tx=0).stop 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 msgr2=0x7f74fc07d860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 --2- 
192.168.123.106:0/2537804587 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f74f8007c00 tx=0x7f74f8007f10 comp rx=0 tx=0).stop 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 shutdown_connections 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f74e806c6d0 0x7f74e806eb80 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74fc071e40 0x7f74fc07ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 --2- 192.168.123.106:0/2537804587 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f74fc07d3f0 0x7f74fc07d860 secure :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f74f8007c00 tx=0x7f74f8007f10 comp rx=0 tx=0).stop 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 >> 192.168.123.106:0/2537804587 conn(0x7f74fc06c6c0 msgr2=0x7f74fc06ffd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:01.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.452+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 shutdown_connections 2026-03-09T17:27:01.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:01.453+0000 7f74e7fff700 1 -- 192.168.123.106:0/2537804587 wait complete. 
2026-03-09T17:27:01.462 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953475 got 137438953475 for osd.5 2026-03-09T17:27:01.462 DEBUG:teuthology.parallel:result is None 2026-03-09T17:27:01.535 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247815 for osd.3 2026-03-09T17:27:01.536 DEBUG:teuthology.parallel:result is None 2026-03-09T17:27:01.546 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574859 got 55834574860 for osd.1 2026-03-09T17:27:01.546 DEBUG:teuthology.parallel:result is None 2026-03-09T17:27:01.546 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-09T17:27:01.546 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph pg dump --format=json 2026-03-09T17:27:01.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:01 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/1924594979' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T17:27:01.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:01 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/1070341522' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T17:27:01.742 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.015+0000 7f6d7b147700 1 -- 192.168.123.106:0/1452726499 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74102780 msgr2=0x7f6d74102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.015+0000 7f6d7b147700 1 --2- 192.168.123.106:0/1452726499 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74102780 0x7f6d74102bf0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f6d68009b00 tx=0x7f6d68009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 -- 192.168.123.106:0/1452726499 shutdown_connections 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 --2- 192.168.123.106:0/1452726499 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74102780 0x7f6d74102bf0 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 --2- 192.168.123.106:0/1452726499 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d74108780 0x7f6d74108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 -- 192.168.123.106:0/1452726499 >> 192.168.123.106:0/1452726499 conn(0x7f6d740fe280 msgr2=0x7f6d74100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:02.019 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 -- 192.168.123.106:0/1452726499 shutdown_connections 2026-03-09T17:27:02.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.016+0000 7f6d7b147700 1 -- 192.168.123.106:0/1452726499 wait complete. 2026-03-09T17:27:02.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 Processor -- start 2026-03-09T17:27:02.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 -- start start 2026-03-09T17:27:02.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d74102780 0x7f6d741983b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d74198fd0 con 0x7f6d74108780 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d7b147700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d7419cd10 con 0x7f6d74102780 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d73fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:02.020 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d73fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43914/0 (socket says 192.168.123.106:43914) 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.017+0000 7f6d73fff700 1 -- 192.168.123.106:0/688562983 learned_addr learned my addr 192.168.123.106:0/688562983 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.018+0000 7f6d73fff700 1 -- 192.168.123.106:0/688562983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d74102780 msgr2=0x7f6d741983b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.018+0000 7f6d73fff700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d74102780 0x7f6d741983b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.018+0000 7f6d73fff700 1 -- 192.168.123.106:0/688562983 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d680097e0 con 0x7f6d74108780 2026-03-09T17:27:02.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.018+0000 7f6d73fff700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f6d68009ad0 tx=0x7f6d680052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:02.022 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.018+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d6801d070 con 0x7f6d74108780 2026-03-09T17:27:02.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.019+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d7419cf90 con 0x7f6d74108780 2026-03-09T17:27:02.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.019+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d6800bc50 con 0x7f6d74108780 2026-03-09T17:27:02.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.019+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d6800f790 con 0x7f6d74108780 2026-03-09T17:27:02.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.019+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d7419d480 con 0x7f6d74108780 2026-03-09T17:27:02.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.021+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d7404ea50 con 0x7f6d74108780 2026-03-09T17:27:02.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.021+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6d68022470 con 0x7f6d74108780 2026-03-09T17:27:02.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.026+0000 7f6d71ffb700 1 --2- 
192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 0x7f6d5c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.026+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f6d6808cd80 con 0x7f6d74108780 2026-03-09T17:27:02.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.026+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6d68091330 con 0x7f6d74108780 2026-03-09T17:27:02.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.027+0000 7f6d78ee3700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 0x7f6d5c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:02.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.027+0000 7f6d78ee3700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 0x7f6d5c06ec50 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f6d741038c0 tx=0x7f6d64005c30 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:02.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.135+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f6d7419d760 con 0x7f6d5c06c7a0 2026-03-09T17:27:02.138 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.136+0000 7f6d71ffb700 1 -- 192.168.123.106:0/688562983 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19146 (secure 0 0 0) 0x7f6d7419d760 con 0x7f6d5c06c7a0 2026-03-09T17:27:02.139 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:02.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 msgr2=0x7f6d5c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 0x7f6d5c06ec50 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f6d741038c0 tx=0x7f6d64005c30 comp rx=0 tx=0).stop 2026-03-09T17:27:02.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 msgr2=0x7f6d741988f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.143 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f6d68009ad0 tx=0x7f6d680052e0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 shutdown_connections 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.141+0000 7f6d7b147700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6d74102780 
0x7f6d741983b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.142+0000 7f6d7b147700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6d5c06c7a0 0x7f6d5c06ec50 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.142+0000 7f6d7b147700 1 --2- 192.168.123.106:0/688562983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6d74108780 0x7f6d741988f0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.142+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 >> 192.168.123.106:0/688562983 conn(0x7f6d740fe280 msgr2=0x7f6d740ffbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.142+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 shutdown_connections 2026-03-09T17:27:02.144 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.142+0000 7f6d7b147700 1 -- 192.168.123.106:0/688562983 wait complete. 
2026-03-09T17:27:02.145 INFO:teuthology.orchestra.run.vm06.stderr:dumped all 2026-03-09T17:27:02.190 INFO:teuthology.orchestra.run.vm06.stdout:{"pg_ready":true,"pg_map":{"version":70,"stamp":"2026-03-09T17:27:01.213977+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2042718,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0
,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.655635"},"pg_stats":[{"pgid":"1.0","version":"19'76","reported_seq":137,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-09T17:26:51.557744+0000","last_change":"2026-03-09T17:26:40.719741+0000","last_active":"2026-03-09T17:26:51.557744+0000","last_peered":"2026-03-09T17:26:51.557744+0000","last_clean":"2026-03-09T17:26:51.557744+0000","last_became_active":"2026-03-09T17:26:40.719514+0000","last_became_peered":"2026-03-09T17:26:40.719514+0000","last_unstale":"2026-03-09T17:26:51.557744+0000","last_undegraded":"2026-03-09T17:26:51.557744+0000","last_fullsize
d":"2026-03-09T17:26:51.557744+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T17:26:22.090189+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T17:26:22.090189+0000","last_clean_scrub_stamp":"2026-03-09T17:26:22.090189+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T19:01:26.617769+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_obj
ect_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47099999999999997}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51400000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56899999999999995}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58499999999999996}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.93300000000000005}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.373}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41399999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51500000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34999999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30299999999999999}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64200000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.626}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44700000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46200000000000002}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38800000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40899999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67500000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51400000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57799999999999996}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.9580000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.8239999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":3.0979999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.8639999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.948}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34399999999999997}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.81299999999999994}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65200000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49399999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadat
a":0}]}} 2026-03-09T17:27:02.190 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph pg dump --format=json 2026-03-09T17:27:02.364 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:02.410 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:02 vm06 ceph-mon[57307]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:02.410 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:02 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/121242077' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T17:27:02.410 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:02 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/137103369' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T17:27:02.410 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:02 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/4291778629' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T17:27:02.410 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:02 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/2537804587' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.633+0000 7ff79a522700 1 -- 192.168.123.106:0/4140101847 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 msgr2=0x7ff794073420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.633+0000 7ff79a522700 1 --2- 192.168.123.106:0/4140101847 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff794073420 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7ff77c009b30 tx=0x7ff77c009e40 comp rx=0 tx=0).stop 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.634+0000 7ff79a522700 1 -- 192.168.123.106:0/4140101847 shutdown_connections 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.634+0000 7ff79a522700 1 --2- 192.168.123.106:0/4140101847 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff79410c9f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.634+0000 7ff79a522700 1 --2- 192.168.123.106:0/4140101847 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff794073420 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.634+0000 7ff79a522700 1 -- 192.168.123.106:0/4140101847 >> 192.168.123.106:0/4140101847 conn(0x7ff794078580 msgr2=0x7ff794078980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 -- 192.168.123.106:0/4140101847 shutdown_connections 2026-03-09T17:27:02.637 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 -- 192.168.123.106:0/4140101847 wait complete. 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 Processor -- start 2026-03-09T17:27:02.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 -- start start 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff794198a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7941990f0 con 0x7ff794073050 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.635+0000 7ff79a522700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff79419ce80 con 0x7ff794073960 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:43930/0 (socket says 192.168.123.106:43930) 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 -- 192.168.123.106:0/4151090199 learned_addr learned my addr 192.168.123.106:0/4151090199 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:02.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff7937fe700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff794198a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:02.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 -- 192.168.123.106:0/4151090199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 msgr2=0x7ff794198a10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff794198a10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff793fff700 1 -- 192.168.123.106:0/4151090199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff77c0097e0 con 0x7ff794073050 2026-03-09T17:27:02.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.636+0000 7ff7937fe700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff794198a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:27:02.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff793fff700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7ff77c004a30 tx=0x7ff77c004b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff77c01d070 con 0x7ff794073050 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff77c00bc10 con 0x7ff794073050 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff77c00f790 con 0x7ff794073050 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff79419d100 con 0x7ff794073050 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.637+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff79419d510 con 0x7ff794073050 2026-03-09T17:27:02.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.638+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 4 ==== 
mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7ff77c00f8f0 con 0x7ff794073050 2026-03-09T17:27:02.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.639+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff79410a0f0 con 0x7ff794073050 2026-03-09T17:27:02.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.639+0000 7ff7917fa700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 0x7ff78006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:02.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.639+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff77c08dc00 con 0x7ff794073050 2026-03-09T17:27:02.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.640+0000 7ff7937fe700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 0x7ff78006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:02.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.640+0000 7ff7937fe700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 0x7ff78006eb30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff794199af0 tx=0x7ff784008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:02 vm09 ceph-mon[62061]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:02.644 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:02 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/121242077' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T17:27:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:02 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/137103369' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T17:27:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:02 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/4291778629' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T17:27:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:02 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2537804587' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T17:27:02.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.642+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff77c058770 con 0x7ff794073050 2026-03-09T17:27:02.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.744+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7ff794199830 con 0x7ff78006c680 2026-03-09T17:27:02.750 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.748+0000 7ff7917fa700 1 -- 192.168.123.106:0/4151090199 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19146 (secure 0 0 0) 0x7ff794199830 con 0x7ff78006c680 2026-03-09T17:27:02.750 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 
7ff79a522700 1 -- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 msgr2=0x7ff78006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 0x7ff78006eb30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff794199af0 tx=0x7ff784008040 comp rx=0 tx=0).stop 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 msgr2=0x7ff7941984d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7ff77c004a30 tx=0x7ff77c004b10 comp rx=0 tx=0).stop 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 shutdown_connections 2026-03-09T17:27:02.753 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ff78006c680 0x7ff78006eb30 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff794073050 0x7ff7941984d0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:27:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 --2- 192.168.123.106:0/4151090199 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff794073960 0x7ff794198a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 >> 192.168.123.106:0/4151090199 conn(0x7ff794078580 msgr2=0x7ff794107230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.751+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 shutdown_connections 2026-03-09T17:27:02.754 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:02.752+0000 7ff79a522700 1 -- 192.168.123.106:0/4151090199 wait complete. 2026-03-09T17:27:02.755 INFO:teuthology.orchestra.run.vm06.stderr:dumped all 2026-03-09T17:27:02.801 INFO:teuthology.orchestra.run.vm06.stdout:{"pg_ready":true,"pg_map":{"version":70,"stamp":"2026-03-09T17:27:01.213977+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_mani
fest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2042718,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_r
epaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.655635"},"pg_stats":[{"pgid":"1.0","version":"19'76","reported_seq":137,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-09T17:26:51.557744+0000","last_change":"2026-03-09T17:26:40.719741+0000","last_active":"2026-03-09T17:26:51.557744+0000","last_peered":"2026-03-09T17:26:51.557744+0000","last_clean":"2026-03-09T17:26:51.557744+0000","last_became_active":"2026-03-09T17:26:40.719514+0000","last_became_peered":"2026-03-09T17:26:40.719514+0000","last_unstale":"2026-03-09T17:26:51.557744+0000","last_undegraded":"2026-03-09T17:26:51.557744+0000","last_fullsized":"2026-03-09T17:26:51.557744+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T17:26:22.090189+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T17:26:22.090189+0000","last_clean_scrub_stamp":"2026-03-09T17:26:22.090189+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-03-10T19:01:26.617769+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_obje
cts_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47099999999999997}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51400000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56899999999999995}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58499999999999996}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.93300000000000005}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.373}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41399999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51500000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34999999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30299999999999999}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64200000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.626}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44700000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46200000000000002}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38800000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40899999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67500000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51400000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57799999999999996}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.9580000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.8239999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":3.0979999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.8639999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":2.948}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34399999999999997}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.81299999999999994}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65200000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55600000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49399999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T17:27:02.802 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-09T17:27:02.802 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-09T17:27:02.802 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-09T17:27:02.802 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph health --format=json 2026-03-09T17:27:02.968 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 -- 192.168.123.106:0/1723245353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 msgr2=0x7f8ec80734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 --2- 192.168.123.106:0/1723245353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec80734c0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0009b00 tx=0x7f8eb0009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 -- 192.168.123.106:0/1723245353 shutdown_connections 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 --2- 192.168.123.106:0/1723245353 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec8111040 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 --2- 192.168.123.106:0/1723245353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec80734c0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 -- 
192.168.123.106:0/1723245353 >> 192.168.123.106:0/1723245353 conn(0x7f8ec80fc090 msgr2=0x7f8ec80fe4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 -- 192.168.123.106:0/1723245353 shutdown_connections 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.232+0000 7f8eccd76700 1 -- 192.168.123.106:0/1723245353 wait complete. 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 Processor -- start 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 -- start start 2026-03-09T17:27:03.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec81a24f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ec81a30c0 con 0x7f8ec80730f0 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.233+0000 7f8eccd76700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ec819c570 con 0x7f8ec8073a00 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56270/0 (socket says 192.168.123.106:56270) 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 -- 192.168.123.106:0/3952613173 learned_addr learned my addr 192.168.123.106:0/3952613173 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec659c700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec81a24f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 -- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 msgr2=0x7f8ec81a24f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec81a24f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 -- 192.168.123.106:0/3952613173 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8eb00097e0 con 0x7f8ec8073a00 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec659c700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec81a24f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:27:03.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.234+0000 7f8ec5d9b700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8eb800d8d0 tx=0x7f8eb800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:03.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.235+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb8009940 con 0x7f8ec8073a00 2026-03-09T17:27:03.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.235+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ec819c850 con 0x7f8ec8073a00 2026-03-09T17:27:03.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.235+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ec819cda0 con 0x7f8ec8073a00 2026-03-09T17:27:03.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.235+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8eb8010460 con 0x7f8ec8073a00 2026-03-09T17:27:03.238 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.235+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb800f5d0 con 0x7f8ec8073a00 2026-03-09T17:27:03.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.236+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8eb800f7c0 con 0x7f8ec8073a00 2026-03-09T17:27:03.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.236+0000 7f8ebf7fe700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 0x7f8eb406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.236+0000 7f8ec659c700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 0x7f8eb406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.237+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ea8005320 con 0x7f8ec8073a00 2026-03-09T17:27:03.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.237+0000 7f8ec659c700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 0x7f8eb406eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0006010 tx=0x7f8eb000b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:03.239 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.237+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f8eb808c470 con 0x7f8ec8073a00 2026-03-09T17:27:03.242 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.240+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8eb805a9c0 con 0x7f8ec8073a00 2026-03-09T17:27:03.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.378+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f8ea8005f70 con 0x7f8ec8073a00 2026-03-09T17:27:03.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.379+0000 7f8ebf7fe700 1 -- 192.168.123.106:0/3952613173 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f8eb8020040 con 0x7f8ec8073a00 2026-03-09T17:27:03.381 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:03.381 INFO:teuthology.orchestra.run.vm06.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 msgr2=0x7f8eb406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 0x7f8eb406eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0006010 tx=0x7f8eb000b540 comp rx=0 
tx=0).stop 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 msgr2=0x7f8ec81a2a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8eb800d8d0 tx=0x7f8eb800dc90 comp rx=0 tx=0).stop 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 shutdown_connections 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8eb406c680 0x7f8eb406eb30 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8ec80730f0 0x7f8ec81a24f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 --2- 192.168.123.106:0/3952613173 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8ec8073a00 0x7f8ec81a2a30 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.382+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 >> 192.168.123.106:0/3952613173 conn(0x7f8ec80fc090 msgr2=0x7f8ec8102b50 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:27:03.385 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.383+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 shutdown_connections 2026-03-09T17:27:03.385 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.383+0000 7f8eccd76700 1 -- 192.168.123.106:0/3952613173 wait complete. 2026-03-09T17:27:03.531 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-09T17:27:03.531 INFO:tasks.cephadm:Setup complete, yielding 2026-03-09T17:27:03.531 INFO:teuthology.run_tasks:Running task print... 2026-03-09T17:27:03.533 INFO:teuthology.task.print:**** done end installing v18.2.0 cephadm ... 2026-03-09T17:27:03.533 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:27:03.535 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:03.535 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-09T17:27:03.557 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:03 vm06 ceph-mon[57307]: from='client.14428 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T17:27:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:03 vm09 ceph-mon[62061]: from='client.14428 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T17:27:03.687 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:03.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.947+0000 7f0062bd4700 1 -- 192.168.123.106:0/1305525 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c103960 
msgr2=0x7f005c103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.947+0000 7f0062bd4700 1 --2- 192.168.123.106:0/1305525 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c103960 0x7f005c103db0 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f0050009b00 tx=0x7f0050009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 -- 192.168.123.106:0/1305525 shutdown_connections 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 --2- 192.168.123.106:0/1305525 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c103960 0x7f005c103db0 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 --2- 192.168.123.106:0/1305525 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c102760 0x7f005c102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 -- 192.168.123.106:0/1305525 >> 192.168.123.106:0/1305525 conn(0x7f005c0fdcf0 msgr2=0x7f005c100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 -- 192.168.123.106:0/1305525 shutdown_connections 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 -- 192.168.123.106:0/1305525 wait complete. 
2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 Processor -- start 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 -- start start 2026-03-09T17:27:03.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.948+0000 7f0062bd4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40464/0 (socket says 192.168.123.106:40464) 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0062bd4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c103960 0x7f005c198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0062bd4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f005c198b80 con 0x7f005c102760 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0062bd4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f005c198cc0 con 0x7f005c103960 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 -- 192.168.123.106:0/3478816466 learned_addr learned my addr 192.168.123.106:0/3478816466 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f005bfff700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c103960 0x7f005c198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 -- 192.168.123.106:0/3478816466 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c103960 msgr2=0x7f005c198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c103960 0x7f005c198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 -- 192.168.123.106:0/3478816466 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00500097e0 con 0x7f005c102760 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0060970700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f004c00da40 tx=0x7f004c00de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004c0041d0 con 0x7f005c102760 2026-03-09T17:27:03.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f004c009c70 con 0x7f005c102760 2026-03-09T17:27:03.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f005c19d770 con 0x7f005c102760 2026-03-09T17:27:03.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.949+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f005c19dcc0 con 0x7f005c102760 2026-03-09T17:27:03.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.951+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004c003e40 con 0x7f005c102760 2026-03-09T17:27:03.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.951+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f004c010460 con 0x7f005c102760 2026-03-09T17:27:03.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.951+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0048005320 con 0x7f005c102760 2026-03-09T17:27:03.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.954+0000 
7f0059ffb700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 0x7f004406ea50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:03.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.954+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f004c021030 con 0x7f005c102760 2026-03-09T17:27:03.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.954+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f004c009de0 con 0x7f005c102760 2026-03-09T17:27:03.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.955+0000 7f005bfff700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 0x7f004406ea50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:03.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:03.955+0000 7f005bfff700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 0x7f004406ea50 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f0050000c00 tx=0x7f00500053a0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:04.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.060+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f00480059f0 con 0x7f005c102760 2026-03-09T17:27:04.069 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.066+0000 7f0059ffb700 1 -- 192.168.123.106:0/3478816466 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7f004c05a9f0 con 0x7f005c102760 2026-03-09T17:27:04.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.071+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 msgr2=0x7f004406ea50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.071+0000 7f0062bd4700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 0x7f004406ea50 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f0050000c00 tx=0x7f00500053a0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.071+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 msgr2=0x7f005c198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.071+0000 7f0062bd4700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f004c00da40 tx=0x7f004c00de00 comp rx=0 tx=0).stop 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.072+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 shutdown_connections 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.072+0000 7f0062bd4700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f004406c5a0 0x7f004406ea50 unknown 
:-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.072+0000 7f0062bd4700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f005c102760 0x7f005c198020 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.072+0000 7f0062bd4700 1 --2- 192.168.123.106:0/3478816466 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f005c103960 0x7f005c198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.072+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 >> 192.168.123.106:0/3478816466 conn(0x7f005c0fdcf0 msgr2=0x7f005c106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.073+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 shutdown_connections 2026-03-09T17:27:04.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.073+0000 7f0062bd4700 1 -- 192.168.123.106:0/3478816466 wait complete. 2026-03-09T17:27:04.172 INFO:teuthology.run_tasks:Running task print... 2026-03-09T17:27:04.173 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-09T17:27:04.173 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T17:27:04.175 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:04.175 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph orch status' 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='client.14432 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3952613173' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/3478816466' entity='client.admin' 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:04.288 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:04 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:04.371 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='client.14432 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3952613173' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/3478816466' entity='client.admin' 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:04.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:04 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:04.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.641+0000 7f640d8c2700 1 -- 192.168.123.106:0/3748839263 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 msgr2=0x7f6408100920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.641+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3748839263 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408100920 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f63f0009b00 tx=0x7f63f0009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.642+0000 7f640d8c2700 1 -- 192.168.123.106:0/3748839263 shutdown_connections 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.642+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3748839263 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6408101710 0x7f6408101b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.642+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3748839263 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408100920 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.642+0000 7f640d8c2700 1 -- 192.168.123.106:0/3748839263 >> 192.168.123.106:0/3748839263 conn(0x7f64080fba80 msgr2=0x7f64080fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.643+0000 7f640d8c2700 1 -- 192.168.123.106:0/3748839263 shutdown_connections 2026-03-09T17:27:04.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.643+0000 7f640d8c2700 1 -- 192.168.123.106:0/3748839263 wait complete. 2026-03-09T17:27:04.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 Processor -- start 2026-03-09T17:27:04.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 -- start start 2026-03-09T17:27:04.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:04.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6408101710 0x7f64080721b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64080727b0 con 0x7f6408100510 
2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f640d8c2700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6408072920 con 0x7f6408101710 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f6406ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f6406ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40492/0 (socket says 192.168.123.106:40492) 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f6406ffd700 1 -- 192.168.123.106:0/3956579866 learned_addr learned my addr 192.168.123.106:0/3956579866 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f6406ffd700 1 -- 192.168.123.106:0/3956579866 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6408101710 msgr2=0x7f64080721b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.644+0000 7f6406ffd700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6408101710 0x7f64080721b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.645+0000 7f6406ffd700 1 -- 192.168.123.106:0/3956579866 
--> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63f00097e0 con 0x7f6408100510 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.645+0000 7f6406ffd700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f63f000bb70 tx=0x7f63f0005ef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.645+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63f001d070 con 0x7f6408100510 2026-03-09T17:27:04.647 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.645+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f63f0022470 con 0x7f6408100510 2026-03-09T17:27:04.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.645+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63f000f650 con 0x7f6408100510 2026-03-09T17:27:04.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.646+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64081a5ef0 con 0x7f6408100510 2026-03-09T17:27:04.648 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.646+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64081a63b0 con 0x7f6408100510 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.648+0000 7f640d8c2700 1 -- 
192.168.123.106:0/3956579866 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6408066e40 con 0x7f6408100510 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.648+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f63f000f7b0 con 0x7f6408100510 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.648+0000 7f640c8c0700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 0x7f63f406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.648+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f63f008ca10 con 0x7f6408100510 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.650+0000 7f64067fc700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 0x7f63f406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.650+0000 7f64067fc700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 0x7f63f406eb30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f63f8005950 tx=0x7f63f80058e0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:04.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.653+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mon.0 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f63f00586e0 con 0x7f6408100510 2026-03-09T17:27:04.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.765+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6408106060 con 0x7f63f406c680 2026-03-09T17:27:04.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.766+0000 7f640c8c0700 1 -- 192.168.123.106:0/3956579866 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7f6408106060 con 0x7f63f406c680 2026-03-09T17:27:04.768 INFO:teuthology.orchestra.run.vm06.stdout:Backend: cephadm 2026-03-09T17:27:04.768 INFO:teuthology.orchestra.run.vm06.stdout:Available: Yes 2026-03-09T17:27:04.768 INFO:teuthology.orchestra.run.vm06.stdout:Paused: No 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.768+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 msgr2=0x7f63f406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 0x7f63f406eb30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f63f8005950 tx=0x7f63f80058e0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 msgr2=0x7f6408071c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:04.771 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f63f000bb70 tx=0x7f63f0005ef0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 shutdown_connections 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f63f406c680 0x7f63f406eb30 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6408100510 0x7f6408071c70 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 --2- 192.168.123.106:0/3956579866 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6408101710 0x7f64080721b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 >> 192.168.123.106:0/3956579866 conn(0x7f64080fba80 msgr2=0x7f6408104940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 shutdown_connections 2026-03-09T17:27:04.771 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:04.769+0000 7f640d8c2700 1 -- 192.168.123.106:0/3956579866 
wait complete. 2026-03-09T17:27:04.827 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph orch ps' 2026-03-09T17:27:04.988 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.238+0000 7fd6f1893700 1 -- 192.168.123.106:0/1231564398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec0fee80 msgr2=0x7fd6ec0ff290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.238+0000 7fd6f1893700 1 --2- 192.168.123.106:0/1231564398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec0fee80 0x7fd6ec0ff290 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7fd6d4009b00 tx=0x7fd6d4009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.239+0000 7fd6f1893700 1 -- 192.168.123.106:0/1231564398 shutdown_connections 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.239+0000 7fd6f1893700 1 --2- 192.168.123.106:0/1231564398 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6ec0ff7d0 0x7fd6ec0ffc40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.239+0000 7fd6f1893700 1 --2- 192.168.123.106:0/1231564398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec0fee80 0x7fd6ec0ff290 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.239+0000 7fd6f1893700 1 -- 
192.168.123.106:0/1231564398 >> 192.168.123.106:0/1231564398 conn(0x7fd6ec0faa70 msgr2=0x7fd6ec0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:05.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.239+0000 7fd6f1893700 1 -- 192.168.123.106:0/1231564398 shutdown_connections 2026-03-09T17:27:05.242 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.240+0000 7fd6f1893700 1 -- 192.168.123.106:0/1231564398 wait complete. 2026-03-09T17:27:05.242 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.240+0000 7fd6f1893700 1 Processor -- start 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6f1893700 1 -- start start 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6f1893700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6ec0ff7d0 0x7fd6ec198230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6f1893700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6ea7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6ea7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:40518/0 (socket says 192.168.123.106:40518) 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6ea7fc700 1 -- 192.168.123.106:0/522936949 learned_addr learned my addr 192.168.123.106:0/522936949 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6f1893700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6ec198c70 con 0x7fd6ec198770 2026-03-09T17:27:05.243 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.241+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6ec198de0 con 0x7fd6ec0ff7d0 2026-03-09T17:27:05.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6ea7fc700 1 -- 192.168.123.106:0/522936949 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6ec0ff7d0 msgr2=0x7fd6ec198230 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6ea7fc700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6ec0ff7d0 0x7fd6ec198230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6ea7fc700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6d40097e0 con 0x7fd6ec198770 2026-03-09T17:27:05.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6ea7fc700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto 
rx=0x7fd6dc00d900 tx=0x7fd6dc00dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:05.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6dc0041d0 con 0x7fd6ec198770 2026-03-09T17:27:05.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd6dc004d10 con 0x7fd6ec198770 2026-03-09T17:27:05.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6dc00b750 con 0x7fd6ec198770 2026-03-09T17:27:05.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6ec19dd80 con 0x7fd6ec198770 2026-03-09T17:27:05.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.242+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6ec1013c0 con 0x7fd6ec198770 2026-03-09T17:27:05.246 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.244+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6ec066e40 con 0x7fd6ec198770 2026-03-09T17:27:05.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.244+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd6dc004330 
con 0x7fd6ec198770 2026-03-09T17:27:05.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.245+0000 7fd6f0891700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 0x7fd6d806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.245+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd6dc01f030 con 0x7fd6ec198770 2026-03-09T17:27:05.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.247+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd6dc059670 con 0x7fd6ec198770 2026-03-09T17:27:05.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.247+0000 7fd6eaffd700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 0x7fd6d806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:05.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.248+0000 7fd6eaffd700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 0x7fd6d806ec00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fd6d4006010 tx=0x7fd6d4009f90 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:05.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.361+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 
0x7fd6ec108bd0 con 0x7fd6d806c750 2026-03-09T17:27:05.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.367+0000 7fd6f0891700 1 -- 192.168.123.106:0/522936949 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2640 (secure 0 0 0) 0x7fd6ec108bd0 con 0x7fd6d806c750 2026-03-09T17:27:05.369 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:27:05.369 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (80s) 44s ago 2m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (2m) 44s ago 2m 7746k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (96s) 18s ago 96s 7944k - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (2m) 44s ago 2m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (95s) 18s ago 95s 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (79s) 44s ago 113s 85.4M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (2m) 44s ago 2m 486M - 18.2.0 dc2bc1663786 2765e8d99a9c 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (91s) 18s ago 91s 443M - 18.2.0 dc2bc1663786 e6525bf5de20 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (2m) 44s ago 2m 46.4M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (89s) 18s ago 89s 44.7M 
2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 44s ago 2m 13.6M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (92s) 18s ago 92s 13.4M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (71s) 44s ago 71s 37.9M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (60s) 44s ago 60s 37.1M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (49s) 44s ago 49s 35.8M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (39s) 18s ago 39s 40.1M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (29s) 18s ago 29s 39.8M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (19s) 18s ago 19s 11.7M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:27:05.370 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (73s) 44s ago 108s 30.2M - 2.43.0 a07b618ecd1d 9f52c04d903c 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 msgr2=0x7fd6d806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 0x7fd6d806ec00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto 
rx=0x7fd6d4006010 tx=0x7fd6d4009f90 comp rx=0 tx=0).stop 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 msgr2=0x7fd6ec19d7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7fd6dc00d900 tx=0x7fd6dc00dcc0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 shutdown_connections 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6ec0ff7d0 0x7fd6ec198230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6d806c750 0x7fd6d806ec00 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 --2- 192.168.123.106:0/522936949 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6ec198770 0x7fd6ec19d7e0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 >> 192.168.123.106:0/522936949 conn(0x7fd6ec0faa70 
msgr2=0x7fd6ec1074b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:05.372 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 shutdown_connections 2026-03-09T17:27:05.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.370+0000 7fd6f1893700 1 -- 192.168.123.106:0/522936949 wait complete. 2026-03-09T17:27:05.435 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph orch ls' 2026-03-09T17:27:05.588 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:05.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.831+0000 7fd6294b4700 1 -- 192.168.123.106:0/4006191549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624106ec0 msgr2=0x7fd6241072d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.831+0000 7fd6294b4700 1 --2- 192.168.123.106:0/4006191549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624106ec0 0x7fd6241072d0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7fd61c00b3a0 tx=0x7fd61c00b6b0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 -- 192.168.123.106:0/4006191549 shutdown_connections 2026-03-09T17:27:05.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 --2- 192.168.123.106:0/4006191549 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624107df0 0x7fd624108260 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.834 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 --2- 192.168.123.106:0/4006191549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624106ec0 0x7fd6241072d0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 -- 192.168.123.106:0/4006191549 >> 192.168.123.106:0/4006191549 conn(0x7fd624075bd0 msgr2=0x7fd624078000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 -- 192.168.123.106:0/4006191549 shutdown_connections 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.832+0000 7fd6294b4700 1 -- 192.168.123.106:0/4006191549 wait complete. 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 Processor -- start 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 -- start start 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 0x7fd62419c8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd62419cee0 con 0x7fd624107df0 2026-03-09T17:27:05.835 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd6294b4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd62419d020 con 0x7fd624106ec0 2026-03-09T17:27:05.835 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd622ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd622ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56342/0 (socket says 192.168.123.106:56342) 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd622ffd700 1 -- 192.168.123.106:0/193059594 learned_addr learned my addr 192.168.123.106:0/193059594 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.833+0000 7fd622ffd700 1 -- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 msgr2=0x7fd62419c8c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd6227fc700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 0x7fd62419c8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd622ffd700 1 --2- 
192.168.123.106:0/193059594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 0x7fd62419c8c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd622ffd700 1 -- 192.168.123.106:0/193059594 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd61c00b050 con 0x7fd624106ec0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd6227fc700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 0x7fd62419c8c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd622ffd700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fd61c009600 tx=0x7fd61c009180 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd61c00e040 con 0x7fd624106ec0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6241a1a70 con 0x7fd624106ec0 2026-03-09T17:27:05.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fd6241a1f60 con 0x7fd624106ec0 2026-03-09T17:27:05.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd61c004520 con 0x7fd624106ec0 2026-03-09T17:27:05.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.834+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd61c01da30 con 0x7fd624106ec0 2026-03-09T17:27:05.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.835+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd61c019070 con 0x7fd624106ec0 2026-03-09T17:27:05.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.836+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd610005320 con 0x7fd624106ec0 2026-03-09T17:27:05.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.836+0000 7fd60bfff700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 0x7fd60c06eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:05.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.836+0000 7fd6227fc700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 0x7fd60c06eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:05.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.836+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== 
mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd61c08cab0 con 0x7fd624106ec0 2026-03-09T17:27:05.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.836+0000 7fd6227fc700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 0x7fd60c06eba0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd618005fd0 tx=0x7fd618005ee0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:05.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.839+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd61c05b000 con 0x7fd624106ec0 2026-03-09T17:27:05.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:05 vm06 ceph-mon[57307]: from='client.14444 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:05.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:05 vm06 ceph-mon[57307]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:05.950 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.948+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fd610000bf0 con 0x7fd60c06c6f0 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.951+0000 7fd60bfff700 1 -- 192.168.123.106:0/193059594 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7fd610000bf0 con 0x7fd60c06c6f0 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-09T17:27:05.953 
INFO:teuthology.orchestra.run.vm06.stdout:alertmanager ?:9093,9094 1/1 45s ago 2m count:1 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter 2/2 45s ago 2m * 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:crash 2/2 45s ago 2m * 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:grafana ?:3000 1/1 45s ago 2m count:1 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:mgr 2/2 45s ago 2m count:2 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:mon 2/2 45s ago 2m vm06:192.168.123.106=vm06;vm09:192.168.123.109=vm09;count:2 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter ?:9100 2/2 45s ago 2m * 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:osd 6 45s ago - 2026-03-09T17:27:05.953 INFO:teuthology.orchestra.run.vm06.stdout:prometheus ?:9095 1/1 45s ago 2m count:1 2026-03-09T17:27:05.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.953+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 msgr2=0x7fd60c06eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.953+0000 7fd6294b4700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 0x7fd60c06eba0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd618005fd0 tx=0x7fd618005ee0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.953+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 msgr2=0x7fd62419c380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:05.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.953+0000 7fd6294b4700 1 --2- 192.168.123.106:0/193059594 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fd61c009600 tx=0x7fd61c009180 comp rx=0 tx=0).stop 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 shutdown_connections 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd624106ec0 0x7fd62419c380 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd60c06c6f0 0x7fd60c06eba0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 --2- 192.168.123.106:0/193059594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd624107df0 0x7fd62419c8c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 >> 192.168.123.106:0/193059594 conn(0x7fd624075bd0 msgr2=0x7fd62410b020 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 shutdown_connections 2026-03-09T17:27:05.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:05.954+0000 7fd6294b4700 1 -- 192.168.123.106:0/193059594 wait complete. 
2026-03-09T17:27:06.019 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph orch host ls' 2026-03-09T17:27:06.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:05 vm09 ceph-mon[62061]: from='client.14444 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:06.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:05 vm09 ceph-mon[62061]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:06.168 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:06.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68abd19700 1 -- 192.168.123.106:0/1429389482 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a4100fb0 msgr2=0x7f68a4103390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68abd19700 1 --2- 192.168.123.106:0/1429389482 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a4100fb0 0x7f68a4103390 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f68a0009b00 tx=0x7f68a0009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68a8ab3700 1 -- 192.168.123.106:0/1429389482 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68a0005600 con 0x7f68a4100fb0 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68abd19700 1 -- 192.168.123.106:0/1429389482 shutdown_connections 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 
7f68abd19700 1 --2- 192.168.123.106:0/1429389482 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a41038d0 0x7f68a4105cb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68abd19700 1 --2- 192.168.123.106:0/1429389482 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a4100fb0 0x7f68a4103390 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.455+0000 7f68abd19700 1 -- 192.168.123.106:0/1429389482 >> 192.168.123.106:0/1429389482 conn(0x7f68a40fa9b0 msgr2=0x7f68a40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68abd19700 1 -- 192.168.123.106:0/1429389482 shutdown_connections 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68abd19700 1 -- 192.168.123.106:0/1429389482 wait complete. 
2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68abd19700 1 Processor -- start 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68abd19700 1 -- start start 2026-03-09T17:27:06.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68abd19700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68a9ab5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.456+0000 7f68a9ab5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40564/0 (socket says 192.168.123.106:40564) 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68abd19700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a41038d0 0x7f68a4196280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68abd19700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68a41968a0 con 0x7f68a4100fb0 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68abd19700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f68a41969e0 con 0x7f68a41038d0 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a9ab5700 1 -- 192.168.123.106:0/1192217892 learned_addr learned my addr 192.168.123.106:0/1192217892 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a9ab5700 1 -- 192.168.123.106:0/1192217892 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a41038d0 msgr2=0x7f68a4196280 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a92b4700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a41038d0 0x7f68a4196280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:06.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a9ab5700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a41038d0 0x7f68a4196280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a9ab5700 1 -- 192.168.123.106:0/1192217892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6894009710 con 0x7f68a4100fb0 2026-03-09T17:27:06.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68a9ab5700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f68a0000c00 tx=0x7f68a0004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:27:06.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68a001d070 con 0x7f68a4100fb0 2026-03-09T17:27:06.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f68a00097e0 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.457+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f68a419b790 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.458+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f68a000bb40 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.458+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68a000f700 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.459+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f68a000f860 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.459+0000 7f689affd700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 0x7f6890072f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:06.465 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.459+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f68a008db70 con 0x7f68a4100fb0 2026-03-09T17:27:06.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.459+0000 7f68a92b4700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 0x7f6890072f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:06.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.460+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6888005320 con 0x7f68a4100fb0 2026-03-09T17:27:06.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.464+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f68a005c170 con 0x7f68a4100fb0 2026-03-09T17:27:06.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.464+0000 7f68a92b4700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 0x7f6890072f90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f6894009e90 tx=0x7f6894009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:06.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.576+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f6888000bf0 con 0x7f6890070ae0 2026-03-09T17:27:06.579 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.577+0000 7f689affd700 1 -- 192.168.123.106:0/1192217892 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f6888000bf0 con 0x7f6890070ae0 2026-03-09T17:27:06.579 INFO:teuthology.orchestra.run.vm06.stdout:HOST ADDR LABELS STATUS 2026-03-09T17:27:06.579 INFO:teuthology.orchestra.run.vm06.stdout:vm06 192.168.123.106 2026-03-09T17:27:06.579 INFO:teuthology.orchestra.run.vm06.stdout:vm09 192.168.123.109 2026-03-09T17:27:06.579 INFO:teuthology.orchestra.run.vm06.stdout:2 hosts in cluster 2026-03-09T17:27:06.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.579+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 msgr2=0x7f6890072f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:06.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.579+0000 7f68abd19700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 0x7f6890072f90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f6894009e90 tx=0x7f6894009450 comp rx=0 tx=0).stop 2026-03-09T17:27:06.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.579+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 msgr2=0x7f68a4195d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:06.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.579+0000 7f68abd19700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f68a0000c00 tx=0x7f68a0004970 comp rx=0 tx=0).stop 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 -- 
192.168.123.106:0/1192217892 shutdown_connections 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f6890070ae0 0x7f6890072f90 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68a4100fb0 0x7f68a4195d40 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 --2- 192.168.123.106:0/1192217892 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f68a41038d0 0x7f68a4196280 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 >> 192.168.123.106:0/1192217892 conn(0x7f68a40fa9b0 msgr2=0x7f68a40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 shutdown_connections 2026-03-09T17:27:06.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:06.580+0000 7f68abd19700 1 -- 192.168.123.106:0/1192217892 wait complete. 
2026-03-09T17:27:06.647 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph orch device ls' 2026-03-09T17:27:06.815 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:06.998 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:06 vm06 ceph-mon[57307]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:06.998 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:06 vm06 ceph-mon[57307]: from='client.24273 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:07.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 -- 192.168.123.106:0/4087168621 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 msgr2=0x7f4644103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 --2- 192.168.123.106:0/4087168621 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644103db0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f4634009b00 tx=0x7f4634009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:07.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 -- 192.168.123.106:0/4087168621 shutdown_connections 2026-03-09T17:27:07.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 --2- 192.168.123.106:0/4087168621 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644103db0 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.071 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 --2- 192.168.123.106:0/4087168621 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 -- 192.168.123.106:0/4087168621 >> 192.168.123.106:0/4087168621 conn(0x7f46440fdcf0 msgr2=0x7f4644100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:07.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.069+0000 7f4648be3700 1 -- 192.168.123.106:0/4087168621 shutdown_connections 2026-03-09T17:27:07.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.070+0000 7f4648be3700 1 -- 192.168.123.106:0/4087168621 wait complete. 2026-03-09T17:27:07.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.070+0000 7f4648be3700 1 Processor -- start 2026-03-09T17:27:07.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.070+0000 7f4648be3700 1 -- start start 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.070+0000 7f4648be3700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4648be3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4648be3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4644198b80 con 0x7f4644103960 2026-03-09T17:27:07.073 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4648be3700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4644198cc0 con 0x7f4644102760 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f464259c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40592/0 (socket says 192.168.123.106:40592) 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 -- 192.168.123.106:0/911555507 learned_addr learned my addr 192.168.123.106:0/911555507 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 -- 192.168.123.106:0/911555507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 msgr2=0x7f4644198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 --2- 192.168.123.106:0/911555507 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f4641d9b700 1 -- 192.168.123.106:0/911555507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f46340097e0 con 0x7f4644103960 2026-03-09T17:27:07.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.071+0000 7f464259c700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:27:07.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f4641d9b700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f4634009ad0 tx=0x7f46340048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:07.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f463401d070 con 0x7f4644103960 2026-03-09T17:27:07.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f463400bb40 con 0x7f4644103960 2026-03-09T17:27:07.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4634022ea0 con 
0x7f4644103960 2026-03-09T17:27:07.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f464419d710 con 0x7f4644103960 2026-03-09T17:27:07.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.072+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f464419dc00 con 0x7f4644103960 2026-03-09T17:27:07.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.074+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4644066e40 con 0x7f4644103960 2026-03-09T17:27:07.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.074+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4634022870 con 0x7f4644103960 2026-03-09T17:27:07.079 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.077+0000 7f463b7fe700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 0x7f463006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.077+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f463408cb60 con 0x7f4644103960 2026-03-09T17:27:07.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.077+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 
0x7f46340902a0 con 0x7f4644103960 2026-03-09T17:27:07.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.077+0000 7f464259c700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 0x7f463006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.078+0000 7f464259c700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 0x7f463006eb80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f462c009fd0 tx=0x7f462c009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:07.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:06 vm09 ceph-mon[62061]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:07.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:06 vm09 ceph-mon[62061]: from='client.24273 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:07.191 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.189+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f46441082b0 con 0x7f463006c6d0 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.191+0000 7f463b7fe700 1 -- 192.168.123.106:0/911555507 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1188 (secure 0 0 0) 0x7f46441082b0 con 0x7f463006c6d0 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-09T17:27:07.193 
INFO:teuthology.orchestra.run.vm06.stdout:vm06 /dev/vdb hdd DWNBRSTVMM06001 20.0G Yes 48s ago 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm06 /dev/vdc hdd DWNBRSTVMM06002 20.0G No 48s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm06 /dev/vdd hdd DWNBRSTVMM06003 20.0G No 48s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm06 /dev/vde hdd DWNBRSTVMM06004 20.0G No 48s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm09 /dev/vdb hdd DWNBRSTVMM09001 20.0G Yes 18s ago 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm09 /dev/vdc hdd DWNBRSTVMM09002 20.0G No 18s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm09 /dev/vdd hdd DWNBRSTVMM09003 20.0G No 18s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.193 INFO:teuthology.orchestra.run.vm06.stdout:vm09 /dev/vde hdd DWNBRSTVMM09004 20.0G No 18s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-09T17:27:07.195 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.193+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 msgr2=0x7f463006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.195 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.193+0000 7f4648be3700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 0x7f463006eb80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f462c009fd0 tx=0x7f462c009450 comp rx=0 tx=0).stop 2026-03-09T17:27:07.196 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 msgr2=0x7f4644198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.196 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f4634009ad0 tx=0x7f46340048c0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.196 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 shutdown_connections 2026-03-09T17:27:07.196 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4644102760 0x7f4644198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.196 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f463006c6d0 0x7f463006eb80 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.196 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.194+0000 7f4648be3700 1 --2- 192.168.123.106:0/911555507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4644103960 0x7f4644198560 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.195+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 >> 192.168.123.106:0/911555507 conn(0x7f46440fdcf0 msgr2=0x7f4644106b90 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:27:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.195+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 shutdown_connections 2026-03-09T17:27:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.195+0000 7f4648be3700 1 -- 192.168.123.106:0/911555507 wait complete. 2026-03-09T17:27:07.268 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:27:07.270 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:07.270 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-09T17:27:07.428 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.708+0000 7f4e3207e700 1 -- 192.168.123.106:0/3807118649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 msgr2=0x7f4e2c103d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.708+0000 7f4e3207e700 1 --2- 192.168.123.106:0/3807118649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c103d90 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f4e14009b50 tx=0x7f4e14009e60 comp rx=0 tx=0).stop 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 -- 192.168.123.106:0/3807118649 shutdown_connections 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 --2- 192.168.123.106:0/3807118649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c103d90 unknown :-1 
s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 --2- 192.168.123.106:0/3807118649 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 0x7f4e2c102b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 -- 192.168.123.106:0/3807118649 >> 192.168.123.106:0/3807118649 conn(0x7f4e2c0fdcf0 msgr2=0x7f4e2c100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:07.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 -- 192.168.123.106:0/3807118649 shutdown_connections 2026-03-09T17:27:07.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.709+0000 7f4e3207e700 1 -- 192.168.123.106:0/3807118649 wait complete. 2026-03-09T17:27:07.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.710+0000 7f4e3207e700 1 Processor -- start 2026-03-09T17:27:07.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.710+0000 7f4e3207e700 1 -- start start 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.710+0000 7f4e3207e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 0x7f4e2c197fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e3207e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c1984f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e23fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 
0x7f4e2c1984f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e23fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c1984f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40606/0 (socket says 192.168.123.106:40606) 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e23fff700 1 -- 192.168.123.106:0/856221663 learned_addr learned my addr 192.168.123.106:0/856221663 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:07.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e2b7fe700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 0x7f4e2c197fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.714 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e3207e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e2c198b10 con 0x7f4e2c103940 2026-03-09T17:27:07.714 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.711+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e2c198c50 con 0x7f4e2c102740 2026-03-09T17:27:07.714 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e23fff700 1 -- 192.168.123.106:0/856221663 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 msgr2=0x7f4e2c197fb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:07.714 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e23fff700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 0x7f4e2c197fb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:07.714 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e23fff700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e140097e0 con 0x7f4e2c103940 2026-03-09T17:27:07.714 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e23fff700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c1984f0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f4e1400b5c0 tx=0x7f4e14005740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e1401d070 con 0x7f4e2c103940 2026-03-09T17:27:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e14004e80 con 0x7f4e2c103940 2026-03-09T17:27:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e1400f780 con 0x7f4e2c103940 2026-03-09T17:27:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.712+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e2c19d6a0 con 0x7f4e2c103940 2026-03-09T17:27:07.716 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.713+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e2c19db90 con 0x7f4e2c103940 2026-03-09T17:27:07.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.714+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e2c066e40 con 0x7f4e2c103940 2026-03-09T17:27:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.717+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4e1400bc30 con 0x7f4e2c103940 2026-03-09T17:27:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.718+0000 7f4e297fa700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 0x7f4e0c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.718+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4e1408d420 con 0x7f4e2c103940 2026-03-09T17:27:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.718+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4e1408d8a0 con 0x7f4e2c103940 2026-03-09T17:27:07.720 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.718+0000 7f4e2b7fe700 1 --2- 192.168.123.106:0/856221663 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 0x7f4e0c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:07.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.719+0000 7f4e2b7fe700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 0x7f4e0c06ec50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4e2c1037a0 tx=0x7f4e1c007400 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:07.853 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:07 vm06 ceph-mon[57307]: from='client.14454 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:07.853 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:07 vm06 ceph-mon[57307]: from='client.14458 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:07.853 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:07 vm06 ceph-mon[57307]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:07.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:07.850+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f4e2c19df30 con 0x7f4e0c06c7a0 2026-03-09T17:27:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:07 vm09 ceph-mon[62061]: from='client.14454 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:07 vm09 ceph-mon[62061]: from='client.14458 -' entity='client.admin' 
cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:07 vm09 ceph-mon[62061]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:09.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:08 vm06 ceph-mon[57307]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:09.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:08 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-09T17:27:09.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:08 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:09.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:08 vm09 ceph-mon[62061]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:09.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:08 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-09T17:27:09.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:08 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.821+0000 7f4e297fa700 1 -- 192.168.123.106:0/856221663 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f4e2c19df30 con 
0x7f4e0c06c7a0 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 msgr2=0x7f4e0c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 0x7f4e0c06ec50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4e2c1037a0 tx=0x7f4e1c007400 comp rx=0 tx=0).stop 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 msgr2=0x7f4e2c1984f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c1984f0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f4e1400b5c0 tx=0x7f4e14005740 comp rx=0 tx=0).stop 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 shutdown_connections 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4e2c102740 0x7f4e2c197fb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4e0c06c7a0 
0x7f4e0c06ec50 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 --2- 192.168.123.106:0/856221663 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4e2c103940 0x7f4e2c1984f0 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 >> 192.168.123.106:0/856221663 conn(0x7f4e2c0fdcf0 msgr2=0x7f4e2c106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 shutdown_connections 2026-03-09T17:27:09.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:09.824+0000 7f4e3207e700 1 -- 192.168.123.106:0/856221663 wait complete. 2026-03-09T17:27:09.903 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs dump' 2026-03-09T17:27:10.078 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:10.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-09T17:27:10.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:09 vm06 ceph-mon[57307]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T17:27:10.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:09 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool create", "pool": 
"cephfs.cephfs.data"}]: dispatch 2026-03-09T17:27:10.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:09 vm06 ceph-mon[57307]: pgmap v75: 33 pgs: 3 creating+peering, 1 active+clean, 29 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:10.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:09 vm06 ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[57303]: 2026-03-09T17:27:09.794+0000 7f8723235700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:27:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-09T17:27:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:09 vm09 ceph-mon[62061]: osdmap e34: 6 total, 6 up, 6 in 2026-03-09T17:27:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:09 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-09T17:27:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:09 vm09 ceph-mon[62061]: pgmap v75: 33 pgs: 3 creating+peering, 1 active+clean, 29 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.389+0000 7fda36b95700 1 -- 192.168.123.106:0/3993951006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30103a00 msgr2=0x7fda30103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.389+0000 7fda36b95700 1 --2- 192.168.123.106:0/3993951006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30103a00 0x7fda30103e70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fda24009b00 
tx=0x7fda24009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 -- 192.168.123.106:0/3993951006 shutdown_connections 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 --2- 192.168.123.106:0/3993951006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30103a00 0x7fda30103e70 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 --2- 192.168.123.106:0/3993951006 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30102760 0x7fda30102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 -- 192.168.123.106:0/3993951006 >> 192.168.123.106:0/3993951006 conn(0x7fda300fddb0 msgr2=0x7fda301001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 -- 192.168.123.106:0/3993951006 shutdown_connections 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 -- 192.168.123.106:0/3993951006 wait complete. 
2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.390+0000 7fda36b95700 1 Processor -- start 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda36b95700 1 -- start start 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda36b95700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 0x7fda30197fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda36b95700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda36b95700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda30198a20 con 0x7fda30103a00 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda36b95700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda30198b90 con 0x7fda30198520 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:56430/0 (socket says 192.168.123.106:56430) 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 -- 192.168.123.106:0/924641684 learned_addr learned my addr 192.168.123.106:0/924641684 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda34931700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 0x7fda30197fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 -- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 msgr2=0x7fda30197fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 0x7fda30197fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda2ffff700 1 -- 192.168.123.106:0/924641684 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda240097e0 con 0x7fda30198520 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.391+0000 7fda34931700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 0x7fda30197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.392+0000 7fda2ffff700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fda24005850 tx=0x7fda24005230 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:10.394 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.392+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda2401d070 con 0x7fda30198520 2026-03-09T17:27:10.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.392+0000 7fda36b95700 1 -- 192.168.123.106:0/924641684 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda3019dad0 con 0x7fda30198520 2026-03-09T17:27:10.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.393+0000 7fda36b95700 1 -- 192.168.123.106:0/924641684 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda3019df90 con 0x7fda30198520 2026-03-09T17:27:10.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.393+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fda2400f9d0 con 0x7fda30198520 2026-03-09T17:27:10.396 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.393+0000 7fda36b95700 1 -- 192.168.123.106:0/924641684 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fda30066e40 con 0x7fda30198520 2026-03-09T17:27:10.396 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.393+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 
0 0) 0x7fda24022b40 con 0x7fda30198520 2026-03-09T17:27:10.396 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.394+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fda2402c430 con 0x7fda30198520 2026-03-09T17:27:10.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.395+0000 7fda2dffb700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 0x7fda1806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:10.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.395+0000 7fda34931700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 0x7fda1806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:10.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.395+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(36..36 src has 1..36) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fda2408df40 con 0x7fda30198520 2026-03-09T17:27:10.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.395+0000 7fda34931700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 0x7fda1806eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fda30102350 tx=0x7fda20008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:10.399 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.397+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fda2405c320 con 0x7fda30198520 
2026-03-09T17:27:10.573 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.571+0000 7fda36b95700 1 -- 192.168.123.106:0/924641684 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fda3019e270 con 0x7fda30198520 2026-03-09T17:27:10.573 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.571+0000 7fda2dffb700 1 -- 192.168.123.106:0/924641684 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7fda24027070 con 0x7fda30198520 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:e2 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:epoch 2 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:09.795388+0000 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:root 0 
2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:in 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:up {} 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 0 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:10.576 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:10.581 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 msgr2=0x7fda1806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 0x7fda1806eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fda30102350 tx=0x7fda20008040 comp rx=0 tx=0).stop 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 msgr2=0x7fda3019d590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fda24005850 tx=0x7fda24005230 comp rx=0 tx=0).stop 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 shutdown_connections 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fda1806c680 0x7fda1806eb30 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda30103a00 0x7fda30197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 --2- 192.168.123.106:0/924641684 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda30198520 0x7fda3019d590 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 >> 192.168.123.106:0/924641684 conn(0x7fda300fddb0 msgr2=0x7fda30100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 shutdown_connections 2026-03-09T17:27:10.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:10.576+0000 7fda177fe700 1 -- 192.168.123.106:0/924641684 wait complete. 2026-03-09T17:27:10.582 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 2 2026-03-09T17:27:10.773 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-09T17:27:10.776 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:10.776 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs set cephfs max_mds 1' 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: osdmap e36: 6 total, 6 up, 6 in 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: fsmap cephfs:0 2026-03-09T17:27:10.915 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: Saving service mds.cephfs spec with placement count:4 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: Deploying daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:27:10.915 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:10 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/924641684' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:10.991 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: osdmap e35: 6 total, 6 up, 6 in 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: osdmap e36: 6 total, 6 up, 6 in 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": 
"cephfs.cephfs.data"}]': finished 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: fsmap cephfs:0 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: Saving service mds.cephfs spec with placement count:4 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 
2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: Deploying daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:27:11.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:10 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/924641684' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:11.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.281+0000 7fb449418700 1 -- 192.168.123.106:0/3497109893 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 msgr2=0x7fb444103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:11.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.281+0000 7fb449418700 1 --2- 192.168.123.106:0/3497109893 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444103db0 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7fb42c009b00 tx=0x7fb42c009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:11.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 -- 192.168.123.106:0/3497109893 shutdown_connections 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 --2- 192.168.123.106:0/3497109893 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444103db0 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 --2- 192.168.123.106:0/3497109893 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 0x7fb444102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 -- 192.168.123.106:0/3497109893 >> 192.168.123.106:0/3497109893 conn(0x7fb4440fdcf0 msgr2=0x7fb444100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 -- 192.168.123.106:0/3497109893 shutdown_connections 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.282+0000 7fb449418700 1 -- 192.168.123.106:0/3497109893 wait complete. 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 Processor -- start 2026-03-09T17:27:11.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 -- start start 2026-03-09T17:27:11.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 0x7fb444078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:11.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:11.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb444075560 con 0x7fb444103960 2026-03-09T17:27:11.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.283+0000 7fb449418700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4440756d0 con 0x7fb444102760 2026-03-09T17:27:11.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 
7fb43bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:11.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb43bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40660/0 (socket says 192.168.123.106:40660) 2026-03-09T17:27:11.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb43bfff700 1 -- 192.168.123.106:0/2400077864 learned_addr learned my addr 192.168.123.106:0/2400077864 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:11.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb442ffd700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 0x7fb444078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb43bfff700 1 -- 192.168.123.106:0/2400077864 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 msgr2=0x7fb444078b00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb43bfff700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 0x7fb444078b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 
7fb43bfff700 1 -- 192.168.123.106:0/2400077864 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb42c0097e0 con 0x7fb444103960 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb43bfff700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7fb42c005230 tx=0x7fb42c004ca0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb42c01d070 con 0x7fb444103960 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb444075950 con 0x7fb444103960 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb444075e40 con 0x7fb444103960 2026-03-09T17:27:11.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb42c00bc50 con 0x7fb444103960 2026-03-09T17:27:11.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.284+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb42c00f820 con 0x7fb444103960 2026-03-09T17:27:11.289 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.286+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb42c00fae0 con 0x7fb444103960 2026-03-09T17:27:11.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.286+0000 7fb440ff9700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 0x7fb42406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:11.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.287+0000 7fb442ffd700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 0x7fb42406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:11.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.287+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb428005320 con 0x7fb444103960 2026-03-09T17:27:11.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.287+0000 7fb442ffd700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 0x7fb42406eb80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb434005fd0 tx=0x7fb434005e50 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:11.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.287+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb42c08d060 con 0x7fb444103960 2026-03-09T17:27:11.293 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.290+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb42c027080 con 0x7fb444103960 2026-03-09T17:27:11.420 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.417+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7fb428005f70 con 0x7fb444103960 2026-03-09T17:27:11.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.989+0000 7fb440ff9700 1 -- 192.168.123.106:0/2400077864 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7fb42c022cc0 con 0x7fb444103960 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 msgr2=0x7fb42406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 0x7fb42406eb80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb434005fd0 tx=0x7fb434005e50 comp rx=0 tx=0).stop 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 msgr2=0x7fb444079040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:11.997 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7fb42c005230 tx=0x7fb42c004ca0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 shutdown_connections 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb444102760 0x7fb444078b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb42406c6d0 0x7fb42406eb80 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 --2- 192.168.123.106:0/2400077864 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb444103960 0x7fb444079040 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:11.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 >> 192.168.123.106:0/2400077864 conn(0x7fb4440fdcf0 msgr2=0x7fb444106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:11.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 shutdown_connections 2026-03-09T17:27:11.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:11.992+0000 7fb449418700 1 -- 192.168.123.106:0/2400077864 
wait complete. 2026-03-09T17:27:12.066 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: osdmap e37: 6 total, 6 up, 6 in 2026-03-09T17:27:12.066 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.066 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.066 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: Deploying daemon mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 vm06 ceph-mon[57307]: pgmap v79: 65 pgs: 3 creating+peering, 22 active+clean, 40 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:12.067 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:11 
vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2400077864' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-09T17:27:12.067 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:27:12.069 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:12.069 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs set cephfs allow_standby_replay true' 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: osdmap e37: 6 total, 6 up, 6 in 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:12.145 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: Deploying daemon mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: pgmap v79: 65 pgs: 3 creating+peering, 22 active+clean, 40 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T17:27:12.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:11 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2400077864' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-09T17:27:12.266 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 -- 192.168.123.106:0/1860921114 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994071a60 msgr2=0x7fd994071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 --2- 192.168.123.106:0/1860921114 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994071a60 0x7fd994071e70 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7fd984009b00 tx=0x7fd984009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 -- 192.168.123.106:0/1860921114 shutdown_connections 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 --2- 192.168.123.106:0/1860921114 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fd994072440 0x7fd99410be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 --2- 192.168.123.106:0/1860921114 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994071a60 0x7fd994071e70 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:12.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 -- 192.168.123.106:0/1860921114 >> 192.168.123.106:0/1860921114 conn(0x7fd99406d1a0 msgr2=0x7fd99406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:12.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 -- 192.168.123.106:0/1860921114 shutdown_connections 2026-03-09T17:27:12.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.964+0000 7fd999acb700 1 -- 192.168.123.106:0/1860921114 wait complete. 
2026-03-09T17:27:12.968 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.966+0000 7fd999acb700 1 Processor -- start 2026-03-09T17:27:12.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.969+0000 7fd999acb700 1 -- start start 2026-03-09T17:27:12.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.969+0000 7fd999acb700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 0x7fd9941a4a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.969+0000 7fd999acb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 0x7fd9941a4f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.969+0000 7fd999acb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9941a5560 con 0x7fd994072440 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.969+0000 7fd999acb700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9941a56a0 con 0x7fd994071a60 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.970+0000 7fd993fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 0x7fd9941a4f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.970+0000 7fd998ac9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 0x7fd9941a4a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.970+0000 7fd998ac9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 0x7fd9941a4a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:48286/0 (socket says 192.168.123.106:48286) 2026-03-09T17:27:12.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.970+0000 7fd998ac9700 1 -- 192.168.123.106:0/2508056539 learned_addr learned my addr 192.168.123.106:0/2508056539 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:12.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.971+0000 7fd993fff700 1 -- 192.168.123.106:0/2508056539 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 msgr2=0x7fd9941a4a00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:12.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.971+0000 7fd993fff700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 0x7fd9941a4a00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:12.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.971+0000 7fd993fff700 1 -- 192.168.123.106:0/2508056539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd9840097e0 con 0x7fd994072440 2026-03-09T17:27:12.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.971+0000 7fd993fff700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 0x7fd9941a4f40 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fd98c00beb0 tx=0x7fd98c00bf90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:12.974 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.972+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd98c00ce20 con 0x7fd994072440 2026-03-09T17:27:12.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.972+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd9941aa150 con 0x7fd994072440 2026-03-09T17:27:12.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.972+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd9941aa6a0 con 0x7fd994072440 2026-03-09T17:27:12.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.973+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd98c014920 con 0x7fd994072440 2026-03-09T17:27:12.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.974+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd98c012ac0 con 0x7fd994072440 2026-03-09T17:27:12.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.974+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd98c012ce0 con 0x7fd994072440 2026-03-09T17:27:12.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.975+0000 7fd991ffb700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 0x7fd97c06ea00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:12.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.975+0000 
7fd998ac9700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 0x7fd97c06ea00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:12.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.975+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd98c006e40 con 0x7fd994072440 2026-03-09T17:27:12.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.975+0000 7fd998ac9700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 0x7fd97c06ea00 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fd98400b5c0 tx=0x7fd984005fb0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:12.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.975+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd980005320 con 0x7fd994072440 2026-03-09T17:27:12.982 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:12.979+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd98c04e720 con 0x7fd994072440 2026-03-09T17:27:13.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:13.124+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"} v 0) v1 -- 0x7fd980005f70 con 0x7fd994072440 2026-03-09T17:27:13.183 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.183 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: Deploying daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:boot 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/2400077864' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] up:boot 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: daemon mds.cephfs.vm06.vmzmbb assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: Cluster is now healthy 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: fsmap cephfs:0 2 up:standby 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:creating} 1 up:standby 2026-03-09T17:27:13.184 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:12 vm06 ceph-mon[57307]: daemon mds.cephfs.vm06.vmzmbb is now active in filesystem cephfs as rank 0 
2026-03-09T17:27:13.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:13.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:13.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: Deploying daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:boot 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/2400077864' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] up:boot 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: daemon mds.cephfs.vm06.vmzmbb assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: Cluster is now healthy 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: fsmap cephfs:0 2 up:standby 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:creating} 1 up:standby 2026-03-09T17:27:13.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:12 vm09 ceph-mon[62061]: daemon mds.cephfs.vm06.vmzmbb is now active in filesystem cephfs as rank 0 
2026-03-09T17:27:14.050 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.047+0000 7fd991ffb700 1 -- 192.168.123.106:0/2508056539 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]=0 v7) v1 ==== 121+0+0 (secure 0 0 0) 0x7fd98c019070 con 0x7fd994072440 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 msgr2=0x7fd97c06ea00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 0x7fd97c06ea00 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fd98400b5c0 tx=0x7fd984005fb0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 msgr2=0x7fd9941a4f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 0x7fd9941a4f40 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7fd98c00beb0 tx=0x7fd98c00bf90 comp rx=0 tx=0).stop 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 shutdown_connections 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 --2- 192.168.123.106:0/2508056539 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd994071a60 0x7fd9941a4a00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd97c06c550 0x7fd97c06ea00 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 --2- 192.168.123.106:0/2508056539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd994072440 0x7fd9941a4f40 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 >> 192.168.123.106:0/2508056539 conn(0x7fd99406d1a0 msgr2=0x7fd99410b4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 shutdown_connections 2026-03-09T17:27:14.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.050+0000 7fd999acb700 1 -- 192.168.123.106:0/2508056539 wait complete. 2026-03-09T17:27:14.098 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:27:14.101 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:14.101 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs set cephfs inline_data false' 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: mds.? 
[v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:active 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2508056539' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: pgmap v80: 65 pgs: 3 creating+peering, 56 active+clean, 6 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 459 B/s wr, 2 op/s 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", 
"caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:14 vm09 ceph-mon[62061]: Deploying daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:27:14.292 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:active 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/2508056539' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: pgmap v80: 65 pgs: 3 creating+peering, 56 active+clean, 6 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 459 B/s wr, 2 op/s 2026-03-09T17:27:14.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:14.378 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:14 vm06 ceph-mon[57307]: Deploying daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:27:14.675 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.672+0000 7fcdaff21700 1 -- 192.168.123.106:0/2398298024 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 msgr2=0x7fcda80721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.672+0000 7fcdaff21700 1 --2- 192.168.123.106:0/2398298024 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda80721c0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fcda4009b00 tx=0x7fcda4009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 -- 192.168.123.106:0/2398298024 shutdown_connections 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 --2- 192.168.123.106:0/2398298024 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda81081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 --2- 192.168.123.106:0/2398298024 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda80721c0 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 -- 192.168.123.106:0/2398298024 >> 192.168.123.106:0/2398298024 conn(0x7fcda806d3e0 msgr2=0x7fcda806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:14.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 -- 192.168.123.106:0/2398298024 shutdown_connections 2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.673+0000 7fcdaff21700 1 -- 192.168.123.106:0/2398298024 wait complete. 
2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 Processor -- start 2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 -- start start 2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda8116900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:14.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcda8117480 con 0x7fcda8071db0 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdaff21700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcda81175f0 con 0x7fcda8107d50 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.674+0000 7fcdad4bc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:48330/0 (socket says 192.168.123.106:48330) 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 -- 192.168.123.106:0/1739787526 learned_addr learned my addr 192.168.123.106:0/1739787526 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdadcbd700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda8116900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 -- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 msgr2=0x7fcda8116900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda8116900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 -- 192.168.123.106:0/1739787526 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcda40097e0 con 0x7fcda8107d50 2026-03-09T17:27:14.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.675+0000 7fcdad4bc700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fcd9800d8d0 tx=0x7fcd9800dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:27:14.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.681+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd98009880 con 0x7fcda8107d50 2026-03-09T17:27:14.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.681+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcda81a1810 con 0x7fcda8107d50 2026-03-09T17:27:14.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.681+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcda81a1d30 con 0x7fcda8107d50 2026-03-09T17:27:14.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.681+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcd98010460 con 0x7fcda8107d50 2026-03-09T17:27:14.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.681+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcda8118040 con 0x7fcda8107d50 2026-03-09T17:27:14.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.682+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd9800f5d0 con 0x7fcda8107d50 2026-03-09T17:27:14.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.683+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcd980105d0 con 0x7fcda8107d50 2026-03-09T17:27:14.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.683+0000 
7fcd9effd700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcd9406c7a0 0x7fcd9406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:14.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.683+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fcd9808b8c0 con 0x7fcda8107d50 2026-03-09T17:27:14.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.683+0000 7fcdadcbd700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcd9406c7a0 0x7fcd9406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:14.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.684+0000 7fcdadcbd700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcd9406c7a0 0x7fcd9406ec50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fcda400b5c0 tx=0x7fcda4005c00 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:14.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.685+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcd980560c0 con 0x7fcda8107d50 2026-03-09T17:27:14.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:14.844+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"} v 0) v1 -- 0x7fcda81a2430 con 0x7fcda8107d50 2026-03-09T17:27:15.070 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.060+0000 7fcd9effd700 1 -- 192.168.123.106:0/1739787526 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]=0 inline data disabled v9) v1 ==== 133+0+0 (secure 0 0 0) 0x7fcd980596e0 con 0x7fcda8107d50 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.062+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcd9406c7a0 msgr2=0x7fcd9406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.062+0000 7fcdaff21700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fcd9406c7a0 0x7fcd9406ec50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fcda400b5c0 tx=0x7fcda4005c00 comp rx=0 tx=0).stop 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.062+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 msgr2=0x7fcda8116e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.062+0000 7fcdaff21700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fcd9800d8d0 tx=0x7fcd9800dbe0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 shutdown_connections 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] 
conn(0x7fcd9406c7a0 0x7fcd9406ec50 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcda8071db0 0x7fcda8116900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 --2- 192.168.123.106:0/1739787526 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcda8107d50 0x7fcda8116e40 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 >> 192.168.123.106:0/1739787526 conn(0x7fcda806d3e0 msgr2=0x7fcda80707d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 shutdown_connections 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.063+0000 7fcdaff21700 1 -- 192.168.123.106:0/1739787526 wait complete. 2026-03-09T17:27:15.070 INFO:teuthology.orchestra.run.vm06.stderr:inline data disabled 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/2508056539' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: mds.? 
[v2:192.168.123.106:6828/642747667,v1:192.168.123.106:6829/642747667] up:boot 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 2 up:standby 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 1 up:standby 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/1739787526' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T17:27:15.099 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:15 vm09 ceph-mon[62061]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T17:27:15.134 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:27:15.137 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:15.137 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs dump' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2508056539' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: mds.? 
[v2:192.168.123.106:6828/642747667,v1:192.168.123.106:6829/642747667] up:boot 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 2 up:standby 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 1 up:standby 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/1739787526' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T17:27:15.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:15 vm06 ceph-mon[57307]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-09T17:27:15.352 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:15.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 -- 192.168.123.106:0/40011006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60103920 msgr2=0x7f7b60103d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 --2- 192.168.123.106:0/40011006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60103920 0x7f7b60103d70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f7b50009a60 tx=0x7f7b50009d70 comp rx=0 tx=0).stop 2026-03-09T17:27:15.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 -- 192.168.123.106:0/40011006 shutdown_connections 2026-03-09T17:27:15.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 --2- 192.168.123.106:0/40011006 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60103920 0x7f7b60103d70 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 --2- 192.168.123.106:0/40011006 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60102720 0x7f7b60102b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.679 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 -- 192.168.123.106:0/40011006 >> 192.168.123.106:0/40011006 conn(0x7f7b600fdcb0 msgr2=0x7f7b60100100 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:15.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.676+0000 7f7b67e10700 1 -- 192.168.123.106:0/40011006 shutdown_connections 2026-03-09T17:27:15.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.677+0000 7f7b67e10700 1 -- 192.168.123.106:0/40011006 wait complete. 2026-03-09T17:27:15.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.677+0000 7f7b67e10700 1 Processor -- start 2026-03-09T17:27:15.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.677+0000 7f7b67e10700 1 -- start start 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.677+0000 7f7b67e10700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b67e10700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60103920 0x7f7b60198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b67e10700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b60198b60 con 0x7f7b60103920 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b67e10700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b60198ca0 con 0x7f7b60102720 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:48350/0 (socket says 192.168.123.106:48350) 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 -- 192.168.123.106:0/3320751630 learned_addr learned my addr 192.168.123.106:0/3320751630 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b653ab700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60103920 0x7f7b60198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 -- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60103920 msgr2=0x7f7b60198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60103920 0x7f7b60198540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 -- 
192.168.123.106:0/3320751630 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b50009710 con 0x7f7b60102720 2026-03-09T17:27:15.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b65bac700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f7b5c00ea00 tx=0x7f7b5c00edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:15.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.678+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b5c00cb80 con 0x7f7b60102720 2026-03-09T17:27:15.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.679+0000 7f7b67e10700 1 -- 192.168.123.106:0/3320751630 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b6019d750 con 0x7f7b60102720 2026-03-09T17:27:15.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.679+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7b5c004500 con 0x7f7b60102720 2026-03-09T17:27:15.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.679+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b5c010430 con 0x7f7b60102720 2026-03-09T17:27:15.682 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.679+0000 7f7b67e10700 1 -- 192.168.123.106:0/3320751630 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b6019dca0 con 0x7f7b60102720 2026-03-09T17:27:15.682 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.680+0000 7f7b67e10700 1 -- 192.168.123.106:0/3320751630 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7b60066e40 con 0x7f7b60102720 2026-03-09T17:27:15.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.681+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7b5c003710 con 0x7f7b60102720 2026-03-09T17:27:15.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.681+0000 7f7b56ffd700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 0x7f7b4c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:15.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.682+0000 7f7b653ab700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 0x7f7b4c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:15.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.682+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7b5c014070 con 0x7f7b60102720 2026-03-09T17:27:15.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.682+0000 7f7b653ab700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 0x7f7b4c06eb30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f7b50000c00 tx=0x7f7b5000b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:15.686 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.683+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7b5c05a2d0 con 0x7f7b60102720 2026-03-09T17:27:15.825 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.823+0000 7f7b67e10700 1 -- 192.168.123.106:0/3320751630 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f7b6019df80 con 0x7f7b60102720 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.824+0000 7f7b56ffd700 1 -- 192.168.123.106:0/3320751630 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 9 v9) v1 ==== 75+0+1795 (secure 0 0 0) 0x7f7b5c059e60 con 0x7f7b60102720 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:e9 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:epoch 9 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:27:15.826 
INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:15.054947+0000 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:27:15.826 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:balancer 
2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 2 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{0:24303} state up:standby-replay seq 1 addr [v2:192.168.123.106:6828/642747667,v1:192.168.123.106:6829/642747667] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{-1:14476} state up:standby seq 1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:15.827 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:15.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.828+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 msgr2=0x7f7b4c06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.828+0000 7f7b54ff9700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 0x7f7b4c06eb30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f7b50000c00 tx=0x7f7b5000b540 comp 
rx=0 tx=0).stop 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.828+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 msgr2=0x7f7b60198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.828+0000 7f7b54ff9700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f7b5c00ea00 tx=0x7f7b5c00edc0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.829+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 shutdown_connections 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.829+0000 7f7b54ff9700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7b60102720 0x7f7b60198000 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.829+0000 7f7b54ff9700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7b4c06c680 0x7f7b4c06eb30 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.829+0000 7f7b54ff9700 1 --2- 192.168.123.106:0/3320751630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7b60103920 0x7f7b60198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:15.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.829+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 >> 192.168.123.106:0/3320751630 conn(0x7f7b600fdcb0 msgr2=0x7f7b60106b50 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:27:15.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.830+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 shutdown_connections 2026-03-09T17:27:15.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:15.830+0000 7f7b54ff9700 1 -- 192.168.123.106:0/3320751630 wait complete. 2026-03-09T17:27:15.833 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 9 2026-03-09T17:27:15.889 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:boot 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: pgmap v81: 65 pgs: 65 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s rd, 2.0 KiB/s wr, 4 op/s 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.091 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:16 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3320751630' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:16.102 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:16.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:boot 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: pgmap v81: 65 pgs: 65 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s rd, 2.0 KiB/s wr, 4 op/s 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:16.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:16 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/3320751630' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.458+0000 7f7ecd367700 1 -- 192.168.123.106:0/3256084350 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec80fee80 msgr2=0x7f7ec81012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.458+0000 7f7ecd367700 1 --2- 192.168.123.106:0/3256084350 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec80fee80 0x7f7ec81012a0 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7f7eb8009b00 tx=0x7f7eb8009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 -- 192.168.123.106:0/3256084350 shutdown_connections 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 --2- 192.168.123.106:0/3256084350 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec81017e0 0x7f7ec8103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 --2- 192.168.123.106:0/3256084350 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec80fee80 0x7f7ec81012a0 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 -- 192.168.123.106:0/3256084350 >> 192.168.123.106:0/3256084350 conn(0x7f7ec80faa70 msgr2=0x7f7ec80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 -- 192.168.123.106:0/3256084350 shutdown_connections 2026-03-09T17:27:16.461 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 -- 192.168.123.106:0/3256084350 wait complete. 2026-03-09T17:27:16.461 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.459+0000 7f7ecd367700 1 Processor -- start 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ecd367700 1 -- start start 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ecd367700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ecd367700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 0x7f7ec819c9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ecd367700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ec819cff0 con 0x7f7ec81017e0 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ecd367700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ec819d130 con 0x7f7ec80fee80 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ec6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ec6ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:48366/0 (socket says 192.168.123.106:48366) 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ec6ffd700 1 -- 192.168.123.106:0/2285559973 learned_addr learned my addr 192.168.123.106:0/2285559973 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ec6ffd700 1 -- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 msgr2=0x7f7ec819c9d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:16.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.460+0000 7f7ebffff700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 0x7f7ec819c9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:16.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ec6ffd700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 0x7f7ec819c9d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:16.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ec6ffd700 1 -- 192.168.123.106:0/2285559973 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7eb80097e0 con 0x7f7ec80fee80 2026-03-09T17:27:16.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ebffff700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 0x7f7ec819c9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:27:16.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ec6ffd700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f7eb800b5c0 tx=0x7f7eb8004a80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:16.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7eb801d070 con 0x7f7ec80fee80 2026-03-09T17:27:16.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7eb8004500 con 0x7f7ec80fee80 2026-03-09T17:27:16.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ecd367700 1 -- 192.168.123.106:0/2285559973 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ec81a1b80 con 0x7f7ec80fee80 2026-03-09T17:27:16.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.461+0000 7f7ecd367700 1 -- 192.168.123.106:0/2285559973 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ec81a2010 con 0x7f7ec80fee80 2026-03-09T17:27:16.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.462+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7eb800f460 con 0x7f7ec80fee80 2026-03-09T17:27:16.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.463+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 
19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7eb8004020 con 0x7f7ec80fee80 2026-03-09T17:27:16.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.463+0000 7f7ec4ff9700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 0x7f7ea806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:16.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.463+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7eb808ce70 con 0x7f7ec80fee80 2026-03-09T17:27:16.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.464+0000 7f7ebffff700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 0x7f7ea806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:16.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.464+0000 7f7ebffff700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 0x7f7ea806eb30 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f7eb0006fd0 tx=0x7f7eb0008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:16.467 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.464+0000 7f7ecd367700 1 -- 192.168.123.106:0/2285559973 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7eac005320 con 0x7f7ec80fee80 2026-03-09T17:27:16.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.467+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 
v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7eb805b080 con 0x7f7ec80fee80 2026-03-09T17:27:16.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.609+0000 7f7ecd367700 1 -- 192.168.123.106:0/2285559973 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f7eac005f70 con 0x7f7ec80fee80 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.617+0000 7f7ec4ff9700 1 -- 192.168.123.106:0/2285559973 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 11 v11) v1 ==== 94+0+3972 (secure 0 0 0) 0x7f7eb805ac10 con 0x7f7ec80fee80 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 msgr2=0x7f7ea806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 0x7f7ea806eb30 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f7eb0006fd0 tx=0x7f7eb0008040 comp rx=0 tx=0).stop 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 msgr2=0x7f7ec819c490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f7eb800b5c0 tx=0x7f7eb8004a80 comp rx=0 tx=0).stop 
2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 shutdown_connections 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7ec80fee80 0x7f7ec819c490 secure :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f7eb800b5c0 tx=0x7f7eb8004a80 comp rx=0 tx=0).stop 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7ea806c680 0x7f7ea806eb30 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 --2- 192.168.123.106:0/2285559973 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ec81017e0 0x7f7ec819c9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:16.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 >> 192.168.123.106:0/2285559973 conn(0x7f7ec80faa70 msgr2=0x7f7ec80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:16.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 shutdown_connections 2026-03-09T17:27:16.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:16.620+0000 7f7ebdffb700 1 -- 192.168.123.106:0/2285559973 wait complete. 
2026-03-09T17:27:16.625 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 11 2026-03-09T17:27:16.637 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:27:16.721 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-09T17:27:16.910 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.221+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3700460857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 msgr2=0x7f88d4103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.221+0000 7f88dc3fe700 1 --2- 192.168.123.106:0/3700460857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4103e70 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f88c4009b50 tx=0x7f88c4009e60 comp rx=0 tx=0).stop 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3700460857 shutdown_connections 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 --2- 192.168.123.106:0/3700460857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4103e70 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 --2- 192.168.123.106:0/3700460857 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 
0x7f88d4102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3700460857 >> 192.168.123.106:0/3700460857 conn(0x7f88d40fddb0 msgr2=0x7f88d41001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3700460857 shutdown_connections 2026-03-09T17:27:17.224 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.222+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3700460857 wait complete. 2026-03-09T17:27:17.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.223+0000 7f88dc3fe700 1 Processor -- start 2026-03-09T17:27:17.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88dc3fe700 1 -- start start 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88dc3fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 0x7f88d4193be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88dc3fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88d9999700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88d9999700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:53466/0 (socket says 192.168.123.106:53466) 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.224+0000 7f88d9999700 1 -- 192.168.123.106:0/3891101286 learned_addr learned my addr 192.168.123.106:0/3891101286 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88d41946f0 con 0x7f88d4103a00 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88d4194830 con 0x7f88d4102760 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88da19a700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 0x7f88d4193be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88d9999700 1 -- 192.168.123.106:0/3891101286 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 msgr2=0x7f88d4193be0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88d9999700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 0x7f88d4193be0 unknown :-1 s=AUTH_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88d9999700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f88c40097e0 con 0x7f88d4103a00 2026-03-09T17:27:17.227 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88d9999700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7f88c4005950 tx=0x7f88c4004f80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:17.229 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f88c401d070 con 0x7f88d4103a00 2026-03-09T17:27:17.229 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f88c400bb70 con 0x7f88d4103a00 2026-03-09T17:27:17.229 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f88c400f780 con 0x7f88d4103a00 2026-03-09T17:27:17.229 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.225+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f88d41aa310 con 0x7f88d4103a00 2026-03-09T17:27:17.229 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.226+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88d41aa770 con 0x7f88d4103a00 2026-03-09T17:27:17.230 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.228+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f88d4066e40 con 0x7f88d4103a00 2026-03-09T17:27:17.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.230+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f88c400f8e0 con 0x7f88d4103a00 2026-03-09T17:27:17.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.230+0000 7f88cb7fe700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 0x7f88c006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:17.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.230+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f88c408cfa0 con 0x7f88d4103a00 2026-03-09T17:27:17.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.231+0000 7f88da19a700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 0x7f88c006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:17.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.231+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f88c405c220 con 0x7f88d4103a00 
2026-03-09T17:27:17.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.231+0000 7f88da19a700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 0x7f88c006ec50 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f88d0009de0 tx=0x7f88d0009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:17.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.409+0000 7f88dc3fe700 1 -- 192.168.123.106:0/3891101286 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f88d41aab40 con 0x7f88d4103a00 2026-03-09T17:27:17.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.410+0000 7f88cb7fe700 1 -- 192.168.123.106:0/3891101286 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v11) v1 ==== 78+0+83 (secure 0 0 0) 0x7f88c405bdb0 con 0x7f88d4103a00 2026-03-09T17:27:17.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.413+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 msgr2=0x7f88c006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:17.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 0x7f88c006ec50 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f88d0009de0 tx=0x7f88d0009450 comp rx=0 tx=0).stop 2026-03-09T17:27:17.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 msgr2=0x7f88d4194120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:17.416 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7f88c4005950 tx=0x7f88c4004f80 comp rx=0 tx=0).stop 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 shutdown_connections 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f88d4102760 0x7f88d4193be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f88c006c7a0 0x7f88c006ec50 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 --2- 192.168.123.106:0/3891101286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f88d4103a00 0x7f88d4194120 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.414+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 >> 192.168.123.106:0/3891101286 conn(0x7f88d40fddb0 msgr2=0x7f88d4100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.415+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 shutdown_connections 2026-03-09T17:27:17.418 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:17.415+0000 7f88c97fa700 1 -- 192.168.123.106:0/3891101286 
wait complete. 2026-03-09T17:27:17.427 INFO:teuthology.orchestra.run.vm06.stdout:false 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] up:standby 2026-03-09T17:27:17.872 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:active 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: Dropping low affinity standby-replay daemon mds.cephfs.vm06.gzymac in favor of higher affinity standby. 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='client.? 
192.168.123.106:0/2285559973' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 1 up:standby 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s rd, 1.9 KiB/s wr, 8 op/s 2026-03-09T17:27:17.873 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:17 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3891101286' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:17.889 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] up:standby 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] up:active 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: Dropping low affinity standby-replay daemon mds.cephfs.vm06.gzymac in favor of higher affinity standby. 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/2285559973' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 1 up:standby 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s rd, 1.9 KiB/s wr, 8 op/s 2026-03-09T17:27:17.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:17 vm06 ceph-mon[57307]: from='client.? 
192.168.123.106:0/3891101286' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.502+0000 7f4c12c13700 1 -- 192.168.123.106:0/1254833015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 msgr2=0x7f4c0c107370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.502+0000 7f4c12c13700 1 --2- 192.168.123.106:0/1254833015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c107370 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f4bf8009b00 tx=0x7f4bf8009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 -- 192.168.123.106:0/1254833015 shutdown_connections 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 --2- 192.168.123.106:0/1254833015 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 0x7f4c0c1081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 --2- 192.168.123.106:0/1254833015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c107370 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 -- 192.168.123.106:0/1254833015 >> 192.168.123.106:0/1254833015 conn(0x7f4c0c075b50 msgr2=0x7f4c0c077f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:18.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 -- 192.168.123.106:0/1254833015 shutdown_connections 2026-03-09T17:27:18.506 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.503+0000 7f4c12c13700 1 -- 192.168.123.106:0/1254833015 wait complete. 2026-03-09T17:27:18.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.504+0000 7f4c12c13700 1 Processor -- start 2026-03-09T17:27:18.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.504+0000 7f4c12c13700 1 -- start start 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c12c13700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c12c13700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 0x7f4c0c19c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:53488/0 (socket says 192.168.123.106:53488) 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 -- 192.168.123.106:0/3912743888 learned_addr learned my addr 192.168.123.106:0/3912743888 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:18.507 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c12c13700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c0c19cf90 con 0x7f4c0c106f60 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c0c19d0d0 con 0x7f4c0c107d70 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c0bfff700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 0x7f4c0c19c970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 -- 192.168.123.106:0/3912743888 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 msgr2=0x7f4c0c19c970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 0x7f4c0c19c970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.505+0000 7f4c109af700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4bf80097e0 con 0x7f4c0c106f60 2026-03-09T17:27:18.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c109af700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 secure :-1 
s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f4bf800bb40 tx=0x7f4bf800bb70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4bf801d070 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4bf8022470 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4bf800f670 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c0c1a1b20 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.506+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c0c1a2010 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.507+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c0c066e40 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.507+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) 
v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4bf8022ac0 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.509+0000 7f4c09ffb700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 0x7f4bfc06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.509+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f4bf808ccf0 con 0x7f4c0c106f60 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.509+0000 7f4c0bfff700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 0x7f4bfc06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:18.513 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.510+0000 7f4c0bfff700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 0x7f4bfc06eae0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4c00009ea0 tx=0x7f4c00009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:18.514 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.512+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4bf805afb0 con 0x7f4c0c106f60 2026-03-09T17:27:18.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.688+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mds 
versions", "format": "json"} v 0) v1 -- 0x7f4c0c1a2260 con 0x7f4c0c106f60 2026-03-09T17:27:18.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.690+0000 7f4c09ffb700 1 -- 192.168.123.106:0/3912743888 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v12) v1 ==== 78+0+83 (secure 0 0 0) 0x7f4bf8027090 con 0x7f4c0c106f60 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 msgr2=0x7f4bfc06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 0x7f4bfc06eae0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f4c00009ea0 tx=0x7f4c00009450 comp rx=0 tx=0).stop 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 msgr2=0x7f4c0c19c430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f4bf800bb40 tx=0x7f4bf800bb70 comp rx=0 tx=0).stop 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 shutdown_connections 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 --2- 192.168.123.106:0/3912743888 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4bfc06c630 0x7f4bfc06eae0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c106f60 0x7f4c0c19c430 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 --2- 192.168.123.106:0/3912743888 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c0c107d70 0x7f4c0c19c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:18.694 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.692+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 >> 192.168.123.106:0/3912743888 conn(0x7f4c0c075b50 msgr2=0x7f4c0c10afa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:18.695 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.693+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 shutdown_connections 2026-03-09T17:27:18.695 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:18.693+0000 7f4c12c13700 1 -- 192.168.123.106:0/3912743888 wait complete. 2026-03-09T17:27:18.703 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: mds.? 
[v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:boot 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:18.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:18 vm06 ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:18.761 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 
2026-03-09T17:27:18.764 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 2026-03-09T17:27:18.924 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: mds.? [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:boot 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:19.049 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:18 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.179+0000 7fdcc2a58700 1 -- 192.168.123.106:0/2922228095 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc100670 msgr2=0x7fdcbc100ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.179+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/2922228095 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc100670 0x7fdcbc100ae0 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7fdcac009b00 tx=0x7fdcac009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 -- 192.168.123.106:0/2922228095 shutdown_connections 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/2922228095 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc100670 0x7fdcbc100ae0 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/2922228095 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc106640 0x7fdcbc106a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 -- 192.168.123.106:0/2922228095 >> 192.168.123.106:0/2922228095 conn(0x7fdcbc078580 msgr2=0x7fdcbc078980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:19.182 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 -- 192.168.123.106:0/2922228095 shutdown_connections 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 -- 192.168.123.106:0/2922228095 wait complete. 2026-03-09T17:27:19.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.180+0000 7fdcc2a58700 1 Processor -- start 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcc2a58700 1 -- start start 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcc2a58700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 0x7fdcbc193fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcc2a58700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcc2a58700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcbc194b50 con 0x7fdcbc106640 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcc2a58700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcbc194c90 con 0x7fdcbc100670 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcbbfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 0x7fdcbc193fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:19.183 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcbb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcbb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:53516/0 (socket says 192.168.123.106:53516) 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcbb7fe700 1 -- 192.168.123.106:0/3008487981 learned_addr learned my addr 192.168.123.106:0/3008487981 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:19.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.181+0000 7fdcbbfff700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 0x7fdcbc193fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:48442/0 (socket says 192.168.123.106:48442) 2026-03-09T17:27:19.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcbb7fe700 1 -- 192.168.123.106:0/3008487981 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 msgr2=0x7fdcbc193fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:19.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcbb7fe700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 0x7fdcbc193fc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcbb7fe700 1 -- 192.168.123.106:0/3008487981 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdcac0097e0 con 0x7fdcbc106640 2026-03-09T17:27:19.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcbb7fe700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fdcac009fd0 tx=0x7fdcac004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:19.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcac01d070 con 0x7fdcbc106640 2026-03-09T17:27:19.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdcac004030 con 0x7fdcbc106640 2026-03-09T17:27:19.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcac003de0 con 0x7fdcbc106640 2026-03-09T17:27:19.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdcbc198a80 con 0x7fdcbc106640 2026-03-09T17:27:19.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.182+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fdcbc198f70 con 0x7fdcbc106640 2026-03-09T17:27:19.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.183+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdcbc04ea50 con 0x7fdcbc106640 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.187+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdcac0041a0 con 0x7fdcbc106640 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.187+0000 7fdcb97fa700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 0x7fdca806ec20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.187+0000 7fdcbbfff700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 0x7fdca806ec20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.188+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fdcac08d190 con 0x7fdcbc106640 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.188+0000 7fdcbbfff700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 0x7fdca806ec20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fdca400ac50 tx=0x7fdca400a380 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T17:27:19.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.188+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdcac08d610 con 0x7fdcbc106640 2026-03-09T17:27:19.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.330+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fdcbc066e40 con 0x7fdcbc106640 2026-03-09T17:27:19.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.331+0000 7fdcb97fa700 1 -- 192.168.123.106:0/3008487981 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 13 v13) v1 ==== 94+0+4755 (secure 0 0 0) 0x7fdcac057990 con 0x7fdcbc106640 2026-03-09T17:27:19.335 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:19.335 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":13,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":11,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:27:16.605001+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24291},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm09.cjcawy","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.109:6825/791757990","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":791757990},{"type":"v1","addr":"192.168.123.109:6825","nonce":791757990}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24291":{"gid":24291,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.106:6827/649840868","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":649840868},{"type":"v1","addr":"192.168.123.106:6827","nonce":649840868}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 msgr2=0x7fdca806ec20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 0x7fdca806ec20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fdca400ac50 tx=0x7fdca400a380 comp rx=0 tx=0).stop 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 msgr2=0x7fdcbc194500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 --2- 
192.168.123.106:0/3008487981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fdcac009fd0 tx=0x7fdcac004990 comp rx=0 tx=0).stop 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 shutdown_connections 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcbc100670 0x7fdcbc193fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdca806c770 0x7fdca806ec20 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 --2- 192.168.123.106:0/3008487981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcbc106640 0x7fdcbc194500 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:19.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 >> 192.168.123.106:0/3008487981 conn(0x7fdcbc078580 msgr2=0x7fdcbc0fed90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:19.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.334+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 shutdown_connections 2026-03-09T17:27:19.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:19.335+0000 7fdcc2a58700 1 -- 192.168.123.106:0/3008487981 wait complete. 
2026-03-09T17:27:19.337 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:27:19.405 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 11, 'max_mds': 1, 'flags': 50} 2026-03-09T17:27:19.405 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-09T17:27:19.414 INFO:tasks.ceph_fuse:Running ceph_fuse task... 2026-03-09T17:27:19.415 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-09T17:27:19.415 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-09T17:27:19.415 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T17:27:19.415 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:19.415 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-09T17:27:19.415 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T17:27:19.415 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:19.415 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:19.415 DEBUG:teuthology.orchestra.run.vm09:> ip netns list 2026-03-09T17:27:19.434 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:19.434 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link delete ceph-brx 2026-03-09T17:27:19.507 INFO:teuthology.orchestra.run.vm09.stderr:Cannot find device "ceph-brx" 2026-03-09T17:27:19.508 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T17:27:19.508 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:19.508 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-09T17:27:19.530 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:19.530 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link delete ceph-brx 2026-03-09T17:27:19.604 INFO:teuthology.orchestra.run.vm06.stderr:Cannot find device "ceph-brx" 2026-03-09T17:27:19.606 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T17:27:19.606 INFO:tasks.ceph_fuse:Mounting ceph-fuse 
clients... 2026-03-09T17:27:19.606 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-09T17:27:19.606 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs ls 2026-03-09T17:27:19.809 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:19.849 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:19 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3912743888' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T17:27:19.849 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:19 vm06 ceph-mon[57307]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:standby 2026-03-09T17:27:19.849 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:19 vm06 ceph-mon[57307]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:19.849 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:19 vm06 ceph-mon[57307]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.0 KiB/s rd, 1.5 KiB/s wr, 7 op/s 2026-03-09T17:27:19.849 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:19 vm06 ceph-mon[57307]: from='client.? 192.168.123.106:0/3008487981' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:27:19.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:19 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3912743888' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T17:27:19.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:19 vm09 ceph-mon[62061]: mds.? 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:standby 2026-03-09T17:27:19.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:19 vm09 ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:27:19.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:19 vm09 ceph-mon[62061]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.0 KiB/s rd, 1.5 KiB/s wr, 7 op/s 2026-03-09T17:27:19.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:19 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/3008487981' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.056+0000 7f1749bef700 1 -- 192.168.123.106:0/1616711539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744101810 msgr2=0x7f1744101be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.056+0000 7f1749bef700 1 --2- 192.168.123.106:0/1616711539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744101810 0x7f1744101be0 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f1738009b80 tx=0x7f1738009e90 comp rx=0 tx=0).stop 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 -- 192.168.123.106:0/1616711539 shutdown_connections 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 --2- 192.168.123.106:0/1616711539 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744102120 0x7f174410a620 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 --2- 192.168.123.106:0/1616711539 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744101810 0x7f1744101be0 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.059 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 -- 192.168.123.106:0/1616711539 >> 192.168.123.106:0/1616711539 conn(0x7f1744076270 msgr2=0x7f1744076670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:20.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 -- 192.168.123.106:0/1616711539 shutdown_connections 2026-03-09T17:27:20.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.057+0000 7f1749bef700 1 -- 192.168.123.106:0/1616711539 wait complete. 2026-03-09T17:27:20.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 Processor -- start 2026-03-09T17:27:20.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 -- start start 2026-03-09T17:27:20.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 0x7f1744196840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1744196f20 con 0x7f1744102120 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.058+0000 7f1749bef700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f174419acb0 con 0x7f1744101810 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:48466/0 (socket says 192.168.123.106:48466) 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 -- 192.168.123.106:0/23331301 learned_addr learned my addr 192.168.123.106:0/23331301 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f1742ffd700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 0x7f1744196840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:20.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 -- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 msgr2=0x7f1744196840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:20.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 0x7f1744196840 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.059+0000 7f17437fe700 1 -- 192.168.123.106:0/23331301 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17380097e0 con 0x7f1744101810 2026-03-09T17:27:20.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1742ffd700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 0x7f1744196840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:27:20.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f17437fe700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f17380048d0 tx=0x7f1738004900 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:20.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173801d070 con 0x7f1744101810 2026-03-09T17:27:20.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1738022470 con 0x7f1744101810 2026-03-09T17:27:20.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173800f670 con 0x7f1744101810 2026-03-09T17:27:20.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 
--> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f174419af30 con 0x7f1744101810 2026-03-09T17:27:20.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.060+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f174419b420 con 0x7f1744101810 2026-03-09T17:27:20.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.062+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f173800f7d0 con 0x7f1744101810 2026-03-09T17:27:20.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.062+0000 7f1740ff9700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 0x7f172c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:20.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.062+0000 7f1742ffd700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 0x7f172c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:20.065 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.063+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1744107d20 con 0x7f1744101810 2026-03-09T17:27:20.065 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.063+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f173808dd70 con 0x7f1744101810 2026-03-09T17:27:20.065 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.063+0000 7f1742ffd700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 0x7f172c06eb30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f1744197920 tx=0x7f173400b410 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:20.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.066+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1738031030 con 0x7f1744101810 2026-03-09T17:27:20.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.208+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f1744066e40 con 0x7f1744101810 2026-03-09T17:27:20.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.208+0000 7f1740ff9700 1 -- 192.168.123.106:0/23331301 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v13) v1 ==== 53+0+83 (secure 0 0 0) 0x7f17380585a0 con 0x7f1744101810 2026-03-09T17:27:20.211 INFO:teuthology.orchestra.run.vm06.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 msgr2=0x7f172c06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 0x7f172c06eb30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto 
rx=0x7f1744197920 tx=0x7f173400b410 comp rx=0 tx=0).stop 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 msgr2=0x7f1744196300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f17380048d0 tx=0x7f1738004900 comp rx=0 tx=0).stop 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 shutdown_connections 2026-03-09T17:27:20.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1744101810 0x7f1744196300 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f172c06c680 0x7f172c06eb30 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 --2- 192.168.123.106:0/23331301 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744102120 0x7f1744196840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:20.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 >> 192.168.123.106:0/23331301 conn(0x7f1744076270 
msgr2=0x7f1744104e60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:20.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 shutdown_connections 2026-03-09T17:27:20.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20.211+0000 7f1749bef700 1 -- 192.168.123.106:0/23331301 wait complete. 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm06.local 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T17:27:20.268 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-09T17:27:20.268 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:20.268 DEBUG:teuthology.orchestra.run.vm06:> ip addr 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: inet6 ::1/128 scope host 2026-03-09T17:27:20.283 
INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: link/ether 52:55:00:00:00:06 brd ff:ff:ff:ff:ff:ff 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: altname enp0s3 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: altname ens3 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: inet 192.168.123.106/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft 3035sec preferred_lft 3035sec 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: inet6 fe80::5055:ff:fe00:6/64 scope link noprefixroute 2026-03-09T17:27:20.283 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:20.283 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add name ceph-brx type bridge 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr flush dev ceph-brx 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set ceph-brx up 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T17:27:20.283 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-09T17:27:20.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:20.438 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T17:27:20.443 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:20.443 DEBUG:teuthology.orchestra.run.vm06:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T17:27:20.521 INFO:teuthology.orchestra.run.vm06.stdout:1 2026-03-09T17:27:20.524 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:20.524 DEBUG:teuthology.orchestra.run.vm06:> ip r 2026-03-09T17:27:20.579 INFO:teuthology.orchestra.run.vm06.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.106 metric 100 2026-03-09T17:27:20.579 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.106 metric 100 2026-03-09T17:27:20.579 INFO:teuthology.orchestra.run.vm06.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T17:27:20.580 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-09T17:27:20.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:20.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
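A note on the addressing above: the bridge is assigned 192.168.159.254/20, and the subsequent `ip r` output shows the kernel-derived route 192.168.144.0/20 — consistent because 192.168.159.254 is the last usable host address of that /20, and 192.168.159.255 (the `brd` value passed to `ip addr add`) is its broadcast address. A quick check with Python's `ipaddress` module, using the values from the log:

```python
import ipaddress

# Bridge address exactly as assigned in the log: `ip addr add 192.168.159.254/20 ...`
iface = ipaddress.ip_interface("192.168.159.254/20")

# Network the kernel derives for the routing table entry on ceph-brx
print(iface.network)                      # 192.168.144.0/20

# Broadcast address, matching the `brd 192.168.159.255` argument in the log
print(iface.network.broadcast_address)    # 192.168.159.255
```

This confirms the `192.168.144.0/20 dev ceph-brx` route seen in the `ip r` output is the expected side effect of the bridge address, not a misconfiguration.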
2026-03-09T17:27:20.725 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:20.725 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-09T17:27:20.784 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:20.784 DEBUG:teuthology.orchestra.run.vm06:> ip netns list-id 2026-03-09T17:27:20.840 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:20 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/23331301' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T17:27:20.840 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:20 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:20.844 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:20.844 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-09T17:27:20.844 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T17:27:20.844 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-09T17:27:20.844 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-09T17:27:20.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:20.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:20.951 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-09T17:27:20.951 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-09T17:27:21.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:21 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:21.098 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:21.103 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:21.103 DEBUG:teuthology.orchestra.run.vm06:> set -e 2026-03-09T17:27:21.103 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set brx.0 up 2026-03-09T17:27:21.103 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T17:27:21.103 DEBUG:teuthology.orchestra.run.vm06:> ') 2026-03-09T17:27:21.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:20 vm09 ceph-mon[62061]: from='client.? 192.168.123.106:0/23331301' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T17:27:21.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:20 vm09 ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:21.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:21 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:21.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:21.209 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-09T17:27:21.209 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T17:27:21.209 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:21.266 INFO:teuthology.orchestra.run.vm06.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-09T17:27:21.266 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T17:27:21.266 DEBUG:teuthology.orchestra.run.vm06:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:21.322 DEBUG:teuthology.orchestra.run.vm06:> sudo modprobe fuse 2026-03-09T17:27:21.385 DEBUG:teuthology.orchestra.run.vm06:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/proc 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/dev 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/security 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/dev/shm 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/dev/pts 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/run 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/cgroup 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/pstore 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/bpf 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/config 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/ 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/selinux 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:21.441 
INFO:teuthology.orchestra.run.vm06.stdout:/dev/hugepages 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/dev/mqueue 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/debug 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/tracing 2026-03-09T17:27:21.441 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/fuse/connections 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/1000 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b267d18f6801bfb3a00b61ca8d160b1167110e98217cca616f4df8f13daf8c89/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/1c7aaafeed43f881fa0186b0155aa3bf7324dbb8e86de01b007001992d2c6bd1/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/0 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/df3226d9fe667b6f3ebeac1150f7111a815bfeac813bdb9aa59bc59547d94b62/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/4d473fb360e05346c9ba28bbc81ff7e938b00f1c114c2529ae8de53d5d555486/merged 2026-03-09T17:27:21.442 
INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/f49f2f70efd721a4faa932395c3b30f45174f398ff6962bf4879806e95364f43/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/58505384a5fdb3a3b9b80e88fc806544404335a48837c073d8c0fba11299a4f0/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b72a03774248a9fa14bee7d27f80291ecb7ec801a48abcd001434a1ed8bef2e5/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b1d83eb0d1469b29bf4ee8c817cb13d8e028c813ae4c03d9586e41931941a56d/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/d387c4812b583d4001ae041d14eb89c7404e5417886ba307287e3785fcd1a3ba/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/60ba3b89dbcf855b7722615025c14f090a96fad4b200d4e0b11a9fe651b0453f/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/8e479eda10c1aec49eb87b1b0c24c1a14d2667dd01d10bbe87826fdedd977942/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b0d5b07df01f1ce02830bbbfa4b7e7135b0bf1b59e90873380a8a2af68cfaec1/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/f432a6679bbbcbf7afc1bf69efe8a51877a91b54be88ad3817ddff386ebc8191/merged 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T17:27:21.442 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:21.442 DEBUG:teuthology.orchestra.run.vm06:> ls 
/sys/fs/fuse/connections 2026-03-09T17:27:21.497 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-09T17:27:21.497 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-09T17:27:21.538 DEBUG:teuthology.orchestra.run.vm06:> sudo modprobe fuse 2026-03-09T17:27:21.564 DEBUG:teuthology.orchestra.run.vm06:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T17:27:21.610 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm06.stderr:2026-03-09T17:27:21.605+0000 7ff117618480 -1 init, newargv = 0x562fb238c5e0 newargc=15 2026-03-09T17:27:21.610 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm06.stderr:ceph-fuse[98178]: starting ceph client 2026-03-09T17:27:21.616 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm06.stderr:ceph-fuse[98178]: starting fuse 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/proc 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/dev 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/security 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/dev/shm 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/dev/pts 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/cgroup 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/pstore 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/bpf 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/config 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/ 2026-03-09T17:27:21.634 
INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/selinux 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/dev/hugepages 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/dev/mqueue 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/debug 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/kernel/tracing 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/sys/fs/fuse/connections 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/1000 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b267d18f6801bfb3a00b61ca8d160b1167110e98217cca616f4df8f13daf8c89/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/1c7aaafeed43f881fa0186b0155aa3bf7324dbb8e86de01b007001992d2c6bd1/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/user/0 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/df3226d9fe667b6f3ebeac1150f7111a815bfeac813bdb9aa59bc59547d94b62/merged 
2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/4d473fb360e05346c9ba28bbc81ff7e938b00f1c114c2529ae8de53d5d555486/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/f49f2f70efd721a4faa932395c3b30f45174f398ff6962bf4879806e95364f43/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/58505384a5fdb3a3b9b80e88fc806544404335a48837c073d8c0fba11299a4f0/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b72a03774248a9fa14bee7d27f80291ecb7ec801a48abcd001434a1ed8bef2e5/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b1d83eb0d1469b29bf4ee8c817cb13d8e028c813ae4c03d9586e41931941a56d/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/d387c4812b583d4001ae041d14eb89c7404e5417886ba307287e3785fcd1a3ba/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/60ba3b89dbcf855b7722615025c14f090a96fad4b200d4e0b11a9fe651b0453f/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/8e479eda10c1aec49eb87b1b0c24c1a14d2667dd01d10bbe87826fdedd977942/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/b0d5b07df01f1ce02830bbbfa4b7e7135b0bf1b59e90873380a8a2af68cfaec1/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/var/lib/containers/storage/overlay/f432a6679bbbcbf7afc1bf69efe8a51877a91b54be88ad3817ddff386ebc8191/merged 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T17:27:21.634 
INFO:teuthology.orchestra.run.vm06.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T17:27:21.634 INFO:teuthology.orchestra.run.vm06.stdout:/home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:21.635 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:21.635 DEBUG:teuthology.orchestra.run.vm06:> ls /sys/fs/fuse/connections 2026-03-09T17:27:21.694 INFO:teuthology.orchestra.run.vm06.stdout:79 2026-03-09T17:27:21.694 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [79] 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> sudo stdin-killer -- python3 -c ' 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> import glob 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> import re 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> import os 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> import subprocess 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> def _find_admin_socket(client_name): 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> files = glob.glob(asok_path) 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> # Given a non-glob path, it better be there 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> if "*" not in asok_path: 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> assert(len(files) == 1) 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> return files[0] 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> for f in files: 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> pid = 
re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> contents = proc_f.read() 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> if mountpoint in contents: 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> return f 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> print(_find_admin_socket("client.0")) 2026-03-09T17:27:21.695 DEBUG:teuthology.orchestra.run.vm06:> ' 2026-03-09T17:27:21.798 INFO:teuthology.orchestra.run.vm06.stdout:/var/run/ceph/ceph-client.0.98178.asok 2026-03-09T17:27:21.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:21 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:21.804 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.98178.asok 2026-03-09T17:27:21.804 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:21.805 DEBUG:teuthology.orchestra.run.vm06:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.98178.asok status 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "metadata": { 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "entity_id": "0", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "hostname": "vm06.local", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "pid": "98178", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "root": "/" 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "dentry_count": 0, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "dentry_pinned_count": 0, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "id": 14520, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "inst": { 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "name": { 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "type": "client", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "num": 14520 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:27:21.915 
INFO:teuthology.orchestra.run.vm06.stdout: "addr": { 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "type": "v1", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "addr": "192.168.144.1:0", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "nonce": 677291695 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "addr": { 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "type": "v1", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "addr": "192.168.144.1:0", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "nonce": 677291695 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "inst_str": "client.14520 192.168.144.1:0/677291695", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "addr_str": "192.168.144.1:0/677291695", 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "inode_count": 1, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "mds_epoch": 11, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "osd_epoch": 37, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "osd_epoch_barrier": 0, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "blocklisted": false, 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout: "fs_name": "cephfs" 2026-03-09T17:27:21.915 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:27:21.921 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-09T17:27:21.921 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs ls 2026-03-09T17:27:22.074 
INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:22.104 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:21 vm06.local ceph-mon[57307]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.9 KiB/s rd, 1.3 KiB/s wr, 6 op/s 2026-03-09T17:27:22.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:21 vm09 ceph-mon[62061]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.9 KiB/s rd, 1.3 KiB/s wr, 6 op/s 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.364+0000 7f58ef332700 1 -- 192.168.123.106:0/1438457815 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8102780 msgr2=0x7f58e8102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.364+0000 7f58ef332700 1 --2- 192.168.123.106:0/1438457815 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8102780 0x7f58e8102bf0 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f58e4009b00 tx=0x7f58e4009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.365+0000 7f58ef332700 1 -- 192.168.123.106:0/1438457815 shutdown_connections 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.365+0000 7f58ef332700 1 --2- 192.168.123.106:0/1438457815 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8102780 0x7f58e8102bf0 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.365+0000 7f58ef332700 1 --2- 192.168.123.106:0/1438457815 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f58e8108780 0x7f58e8108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.365+0000 7f58ef332700 1 -- 192.168.123.106:0/1438457815 >> 192.168.123.106:0/1438457815 conn(0x7f58e80fe280 msgr2=0x7f58e8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.366+0000 7f58ef332700 1 -- 192.168.123.106:0/1438457815 shutdown_connections 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.366+0000 7f58ef332700 1 -- 192.168.123.106:0/1438457815 wait complete. 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.366+0000 7f58ef332700 1 Processor -- start 2026-03-09T17:27:22.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.366+0000 7f58ef332700 1 -- start start 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ef332700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f58e8102780 0x7f58e8198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ef332700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ef332700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58e8198fb0 con 0x7f58e8108780 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ef332700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58e819ccf0 con 0x7f58e8102780 2026-03-09T17:27:22.369 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:53538/0 (socket says 192.168.123.106:53538) 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 -- 192.168.123.106:0/74929512 learned_addr learned my addr 192.168.123.106:0/74929512 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 -- 192.168.123.106:0/74929512 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f58e8102780 msgr2=0x7f58e8198390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f58e8102780 0x7f58e8198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 -- 192.168.123.106:0/74929512 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58e40097e0 con 0x7f58e8108780 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.367+0000 7f58ec8cd700 1 --2- 
192.168.123.106:0/74929512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f58e4009ad0 tx=0x7f58e40052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58e401d070 con 0x7f58e8108780 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f58e400bc50 con 0x7f58e8108780 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58e400f790 con 0x7f58e8108780 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58e819cf70 con 0x7f58e8108780 2026-03-09T17:27:22.370 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58e819d460 con 0x7f58e8108780 2026-03-09T17:27:22.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.368+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f58e804ea50 con 0x7f58e8108780 2026-03-09T17:27:22.374 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.370+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f58e4022470 con 0x7f58e8108780 2026-03-09T17:27:22.374 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.370+0000 7f58de7fc700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 0x7f58d406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:22.374 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.370+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f58e408d0c0 con 0x7f58e8108780 2026-03-09T17:27:22.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.373+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f58e40578c0 con 0x7f58e8108780 2026-03-09T17:27:22.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.373+0000 7f58ed0ce700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 0x7f58d406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:22.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.374+0000 7f58ed0ce700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 0x7f58d406ec50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f58e81038c0 tx=0x7f58d8005ca0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:22.499 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.497+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f58e8066e40 con 0x7f58e8108780 2026-03-09T17:27:22.500 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.498+0000 7f58de7fc700 1 -- 192.168.123.106:0/74929512 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v13) v1 ==== 53+0+83 (secure 0 0 0) 0x7f58e405aee0 con 0x7f58e8108780 2026-03-09T17:27:22.500 INFO:teuthology.orchestra.run.vm06.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T17:27:22.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 msgr2=0x7f58d406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:22.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 0x7f58d406ec50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f58e81038c0 tx=0x7f58d8005ca0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 msgr2=0x7f58e81988d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:22.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f58e4009ad0 tx=0x7f58e40052e0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.503 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 shutdown_connections 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f58e8102780 0x7f58e8198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f58d406c7a0 0x7f58d406ec50 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 --2- 192.168.123.106:0/74929512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58e8108780 0x7f58e81988d0 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.500+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 >> 192.168.123.106:0/74929512 conn(0x7f58e80fe280 msgr2=0x7f58e80ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.501+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 shutdown_connections 2026-03-09T17:27:22.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:22.501+0000 7f58ef332700 1 -- 192.168.123.106:0/74929512 wait complete. 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:Mounting Ceph FS. 
Following are details of mount; remember "None" represents Python type None - 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm09.local 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T17:27:22.555 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-09T17:27:22.555 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:22.555 DEBUG:teuthology.orchestra.run.vm09:> ip addr 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: inet6 ::1/128 scope host 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: link/ether 52:55:00:00:00:09 brd ff:ff:ff:ff:ff:ff 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: altname enp0s3 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: altname ens3 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: inet 192.168.123.109/24 brd 
192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: valid_lft 2966sec preferred_lft 2966sec 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: inet6 fe80::5055:ff:fe00:9/64 scope link noprefixroute 2026-03-09T17:27:22.573 INFO:teuthology.orchestra.run.vm09.stdout: valid_lft forever preferred_lft forever 2026-03-09T17:27:22.573 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> set -e 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link add name ceph-brx type bridge 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> sudo ip addr flush dev ceph-brx 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link set ceph-brx up 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T17:27:22.573 DEBUG:teuthology.orchestra.run.vm09:> ') 2026-03-09T17:27:22.650 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:22 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:22.727 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:22 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:22.730 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:22.730 DEBUG:teuthology.orchestra.run.vm09:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T17:27:22.805 INFO:teuthology.orchestra.run.vm09.stdout:1 2026-03-09T17:27:22.806 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:22.806 DEBUG:teuthology.orchestra.run.vm09:> ip r 2026-03-09T17:27:22.866 INFO:teuthology.orchestra.run.vm09.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.109 metric 100 2026-03-09T17:27:22.866 INFO:teuthology.orchestra.run.vm09.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.109 metric 100 2026-03-09T17:27:22.866 INFO:teuthology.orchestra.run.vm09.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> set -e 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T17:27:22.866 DEBUG:teuthology.orchestra.run.vm09:> ') 2026-03-09T17:27:22.945 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:22 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:23.012 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:23.016 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:23.017 DEBUG:teuthology.orchestra.run.vm09:> ip netns list 2026-03-09T17:27:23.074 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:23.074 DEBUG:teuthology.orchestra.run.vm09:> ip netns list-id 2026-03-09T17:27:23.130 DEBUG:teuthology.orchestra.run.vm09:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:23.130 DEBUG:teuthology.orchestra.run.vm09:> set -e 2026-03-09T17:27:23.130 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T17:27:23.130 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-09T17:27:23.130 DEBUG:teuthology.orchestra.run.vm09:> ') 2026-03-09T17:27:23.205 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:23.214 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:23 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/74929512' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T17:27:23.229 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:23.234 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> set -e 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-09T17:27:23.234 DEBUG:teuthology.orchestra.run.vm09:> ') 2026-03-09T17:27:23.309 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:23.380 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T17:27:23.383 DEBUG:teuthology.orchestra.run.vm09:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T17:27:23.383 DEBUG:teuthology.orchestra.run.vm09:> set -e 2026-03-09T17:27:23.383 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link set brx.0 up 2026-03-09T17:27:23.383 DEBUG:teuthology.orchestra.run.vm09:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T17:27:23.384 DEBUG:teuthology.orchestra.run.vm09:> ') 2026-03-09T17:27:23.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:22 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/74929512' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T17:27:23.461 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T17:27:23.490 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:23 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T17:27:23.496 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-09T17:27:23.496 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T17:27:23.496 DEBUG:teuthology.orchestra.run.vm09:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:23.551 INFO:teuthology.orchestra.run.vm09.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-09T17:27:23.552 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T17:27:23.552 DEBUG:teuthology.orchestra.run.vm09:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:23.608 DEBUG:teuthology.orchestra.run.vm09:> sudo modprobe fuse 2026-03-09T17:27:23.678 DEBUG:teuthology.orchestra.run.vm09:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/proc 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/dev 2026-03-09T17:27:23.735 
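Editor's note: the preceding blocks create a per-mount network namespace, plumb a veth pair into it, and attach the host end (`brx.0`) to the `ceph-brx` bridge. The netns name is evidently derived from the mountpoint path; a sketch of the naming rule and the command sequence (function names are illustrative, not teuthology's API):

```python
def netns_name(mountpoint: str) -> str:
    """Derive the namespace name from the mountpoint, as seen in the log:
    '/' characters become '-' and a 'ceph-ns-' prefix is added."""
    return "ceph-ns-" + mountpoint.replace("/", "-")

def veth_setup_commands(ns: str, addr_cidr: str, brd: str, gateway: str) -> list[str]:
    """Assemble the logged veth/netns plumbing commands. Illustrative
    reconstruction of the sequence, not a call into teuthology."""
    in_ns = f"sudo ip netns exec {ns}"
    return [
        # veth pair: veth0 inside the namespace, brx.0 on the host
        f"sudo ip link add veth0 netns {ns} type veth peer name brx.0",
        f"{in_ns} ip addr add {addr_cidr} brd {brd} dev veth0",
        f"{in_ns} ip link set veth0 up",
        f"{in_ns} ip link set lo up",
        # default route points at the bridge address on the host side
        f"{in_ns} ip route add default via {gateway}",
    ]
```

For `/home/ubuntu/cephtest/mnt.1` the naming rule yields `ceph-ns--home-ubuntu-cephtest-mnt.1` (note the double dash from the leading `/`), matching the namespace seen throughout this log.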
INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/security 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/dev/shm 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/dev/pts 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/cgroup 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/pstore 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/bpf 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/config 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/ 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/selinux 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/dev/hugepages 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/dev/mqueue 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/debug 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/tracing 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/fuse/connections 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/user/1000 2026-03-09T17:27:23.735 INFO:teuthology.orchestra.run.vm09.stdout:/run/user/0 2026-03-09T17:27:23.735 
INFO:teuthology.orchestra.run.vm09.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/48b7831cfc7d6befcd5aed40afa604029e28a1963c3a78bccc21515df5dd805c/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/c5762fcdded035df59c93056eaf8ed1b6cf7ae6e08ae872b9883fdfa6d25f8bf/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/e2bb06bd67ee320e41778eee2c0ae13285098c566050b75f384b6aca3f0df19d/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/9784ba4f8cae638eb84cde28fa91cfc1fe4d5453b279f82d075bf4de7be01ca5/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/f13de4b01e40bd2ab1b86e8c37dc1f2ddba7169bb1440f3655c981da47d0110e/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b74482ab6faec77dd2e07d4986bbdb83b6248103fc7d9af6883ec4d1cee55e20/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/017a888ac88339e2eeb023c099ed443e4258064e301637f3866a7757049a7d94/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/57ca62d2215bd7c21eab5a8844dd1564b9d256effbee12835536894b53941129/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b434a54dd81bf29f6f826dcc46e17086e9d1bc15481c4210b630f7ce198c817b/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b3041d1736a74d04425e0dd58c998304b20e83abb8c0009c92fc06c3da0a9ff7/merged 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/run/netns 
2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run.vm09.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T17:27:23.736 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:23.736 DEBUG:teuthology.orchestra.run.vm09:> ls /sys/fs/fuse/connections 2026-03-09T17:27:23.791 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-09T17:27:23.791 DEBUG:teuthology.orchestra.run.vm09:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-09T17:27:23.833 DEBUG:teuthology.orchestra.run.vm09:> sudo modprobe fuse 2026-03-09T17:27:23.859 DEBUG:teuthology.orchestra.run.vm09:> cat /proc/self/mounts | awk '{print $2}' 2026-03-09T17:27:23.906 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm09.stderr:2026-03-09T17:27:23.904+0000 7f3c5d642480 -1 init, newargv = 0x5601f95a6af0 newargc=15 2026-03-09T17:27:23.907 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm09.stderr:ceph-fuse[84361]: starting ceph client 2026-03-09T17:27:23.916 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm09.stderr:ceph-fuse[84361]: starting fuse 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/proc 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/sys 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/dev 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/security 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/dev/shm 2026-03-09T17:27:23.929 INFO:teuthology.orchestra.run.vm09.stdout:/dev/pts 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run 2026-03-09T17:27:23.930 
INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/cgroup 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/pstore 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/bpf 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/config 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/ 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/nfs/rpc_pipefs 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/selinux 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/dev/hugepages 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/dev/mqueue 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/debug 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/kernel/tracing 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-sysctl.service 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/sys/fs/fuse/connections 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-sysusers.service 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/user/1000 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/user/0 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/proc/sys/fs/binfmt_misc 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay 2026-03-09T17:27:23.930 
INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/48b7831cfc7d6befcd5aed40afa604029e28a1963c3a78bccc21515df5dd805c/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/c5762fcdded035df59c93056eaf8ed1b6cf7ae6e08ae872b9883fdfa6d25f8bf/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/e2bb06bd67ee320e41778eee2c0ae13285098c566050b75f384b6aca3f0df19d/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/9784ba4f8cae638eb84cde28fa91cfc1fe4d5453b279f82d075bf4de7be01ca5/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/f13de4b01e40bd2ab1b86e8c37dc1f2ddba7169bb1440f3655c981da47d0110e/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b74482ab6faec77dd2e07d4986bbdb83b6248103fc7d9af6883ec4d1cee55e20/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/017a888ac88339e2eeb023c099ed443e4258064e301637f3866a7757049a7d94/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/57ca62d2215bd7c21eab5a8844dd1564b9d256effbee12835536894b53941129/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b434a54dd81bf29f6f826dcc46e17086e9d1bc15481c4210b630f7ce198c817b/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/var/lib/containers/storage/overlay/b3041d1736a74d04425e0dd58c998304b20e83abb8c0009c92fc06c3da0a9ff7/merged 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/netns 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T17:27:23.930 
INFO:teuthology.orchestra.run.vm09.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run.vm09.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:23.930 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:23.930 DEBUG:teuthology.orchestra.run.vm09:> ls /sys/fs/fuse/connections 2026-03-09T17:27:23.992 INFO:teuthology.orchestra.run.vm09.stdout:90 2026-03-09T17:27:23.992 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> sudo stdin-killer -- python3 -c ' 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> import glob 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> import re 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> import os 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> import subprocess 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> def _find_admin_socket(client_name): 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> files = glob.glob(asok_path) 2026-03-09T17:27:23.992 DEBUG:teuthology.orchestra.run.vm09:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> # Given a non-glob path, it better be there 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> if "*" not in asok_path: 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> assert(len(files) == 1) 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> return files[0] 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> for f in files: 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> pid = 
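Editor's note: the harness records "Pre-mount connections: []" before starting ceph-fuse and "Post-mount connections: [90]" after, identifying the fuse connection created by this mount as the set difference of two `/sys/fs/fuse/connections` listings. A sketch of that diff logic (the function name is an assumption for illustration):

```python
def new_fuse_connections(pre: list[str], post: list[str]) -> list[int]:
    """Return connection ids present after a mount but not before,
    given two listings of /sys/fs/fuse/connections. Mirrors the
    pre/post comparison visible in the log; not teuthology's own code."""
    return sorted(int(c) for c in set(post) - set(pre))
```

In this run the pre-mount listing is empty and the post-mount listing contains `90`, so the diff attributes fuse connection 90 to the new ceph-fuse mount.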
re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> contents = proc_f.read() 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> if mountpoint in contents: 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> return f 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> print(_find_admin_socket("client.1")) 2026-03-09T17:27:23.993 DEBUG:teuthology.orchestra.run.vm09:> ' 2026-03-09T17:27:24.108 INFO:teuthology.orchestra.run.vm09.stdout:/var/run/ceph/ceph-client.1.84361.asok 2026-03-09T17:27:24.110 INFO:teuthology.orchestra.run.vm09.stderr:2026-03-09T17:27:24 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
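Editor's note: the inline Python script above globs for `/var/run/ceph/ceph-client.1.*.asok`, extracts the pid from each candidate, and keeps the socket whose owning process has the mountpoint on its command line. A self-contained restatement of that selection logic, with the `/proc` read abstracted behind a callable so it can be exercised without live processes (the signature is an illustrative refactoring, not teuthology's API):

```python
import re

def find_admin_socket(candidates, mountpoint, cmdline_for_pid):
    """Pick the admin socket belonging to a live process whose command
    line mentions `mountpoint`. `candidates` is a list of asok paths;
    `cmdline_for_pid` maps a pid string to its /proc/<pid>/cmdline
    contents (or None if the process is gone)."""
    for path in candidates:
        m = re.match(r".*\.(\d+)\.asok$", path)
        if not m:
            continue  # not a pid-suffixed socket name
        cmdline = cmdline_for_pid(m.group(1))
        if cmdline and mountpoint in cmdline:
            return path
    raise RuntimeError("client admin socket not found")
```

With the values from this run (pid 84361, mountpoint `/home/ubuntu/cephtest/mnt.1`) this selects `/var/run/ceph/ceph-client.1.84361.asok`, matching the result logged below.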
2026-03-09T17:27:24.116 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.84361.asok 2026-03-09T17:27:24.116 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:24.116 DEBUG:teuthology.orchestra.run.vm09:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.84361.asok status 2026-03-09T17:27:24.116 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:23 vm09.local ceph-mon[62061]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-09T17:27:24.116 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:23 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout:{ 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "metadata": { 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "entity_id": "1", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "hostname": "vm09.local", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "pid": "84361", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "root": "/" 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: }, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "dentry_count": 0, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "dentry_pinned_count": 0, 2026-03-09T17:27:24.224 
INFO:teuthology.orchestra.run.vm09.stdout: "id": 24343, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "inst": { 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "name": { 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "type": "client", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "num": 24343 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: }, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "addr": { 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "type": "v1", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "addr": "192.168.144.1:0", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "nonce": 3798307677 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: } 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: }, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "addr": { 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "type": "v1", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "addr": "192.168.144.1:0", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "nonce": 3798307677 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: }, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "inst_str": "client.24343 192.168.144.1:0/3798307677", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "addr_str": "192.168.144.1:0/3798307677", 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "inode_count": 1, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "mds_epoch": 11, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "osd_epoch": 37, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "osd_epoch_barrier": 0, 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "blocklisted": false, 
2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout: "fs_name": "cephfs" 2026-03-09T17:27:24.224 INFO:teuthology.orchestra.run.vm09.stdout:} 2026-03-09T17:27:24.231 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:24.231 DEBUG:teuthology.orchestra.run.vm06:> stat --file-system '--printf=%T 2026-03-09T17:27:24.231 DEBUG:teuthology.orchestra.run.vm06:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.235 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:23 vm06.local ceph-mon[57307]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-09T17:27:24.235 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:23 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:24.258 INFO:teuthology.orchestra.run.vm06.stdout:fuseblk 2026-03-09T17:27:24.258 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.258 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:24.258 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.333 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:24.333 DEBUG:teuthology.orchestra.run.vm09:> stat --file-system '--printf=%T 2026-03-09T17:27:24.333 DEBUG:teuthology.orchestra.run.vm09:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.349 INFO:teuthology.orchestra.run.vm09.stdout:fuseblk 2026-03-09T17:27:24.349 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.350 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T17:27:24.350 DEBUG:teuthology.orchestra.run.vm09:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.418 INFO:teuthology.run_tasks:Running task print... 
2026-03-09T17:27:24.421 INFO:teuthology.task.print:**** done client 2026-03-09T17:27:24.421 INFO:teuthology.run_tasks:Running task parallel... 2026-03-09T17:27:24.424 INFO:teuthology.task.parallel:starting parallel... 2026-03-09T17:27:24.424 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T17:27:24.424 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T17:27:24.424 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:24.424 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs false || true' 2026-03-09T17:27:24.425 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-09T17:27:24.425 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-09T17:27:24.426 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T17:27:24.426 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-09T17:27:24.426 INFO:tasks.workunit:timeout=3h 2026-03-09T17:27:24.426 INFO:tasks.workunit:cleanup=True 2026-03-09T17:27:24.426 DEBUG:teuthology.orchestra.run.vm06:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Device: 4fh/79d Inode: 1 Links: 2 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 17:27:12.009570901 +0000 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 17:27:24.329920958 +0000 2026-03-09T17:27:24.451 INFO:teuthology.orchestra.run.vm06.stdout: Birth: - 2026-03-09T17:27:24.451 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-09T17:27:24.452 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-09T17:27:24.529 DEBUG:teuthology.orchestra.run.vm09:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-09T17:27:24.551 
INFO:teuthology.orchestra.run.vm09.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout:Modify: 2026-03-09 17:27:24.522137120 +0000 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout:Change: 2026-03-09 17:27:24.522137120 +0000 2026-03-09T17:27:24.551 INFO:teuthology.orchestra.run.vm09.stdout: Birth: - 2026-03-09T17:27:24.551 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-09T17:27:24.551 DEBUG:teuthology.orchestra.run.vm09:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-09T17:27:24.596 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:24.630 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T17:27:24.630 DEBUG:teuthology.orchestra.run.vm09:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T17:27:24.663 INFO:tasks.workunit.client.0.vm06.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-09T17:27:24.690 INFO:tasks.workunit.client.1.vm09.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 
2026-03-09T17:27:24.888 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.885+0000 7f720d825700 1 -- 192.168.123.106:0/3308315396 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 msgr2=0x7f72080ff9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.885+0000 7f720d825700 1 --2- 192.168.123.106:0/3308315396 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f72080ff9d0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f71f8009a60 tx=0x7f71f8009d70 comp rx=0 tx=0).stop 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.886+0000 7f720d825700 1 -- 192.168.123.106:0/3308315396 shutdown_connections 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.886+0000 7f720d825700 1 --2- 192.168.123.106:0/3308315396 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f72080ff9d0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.886+0000 7f720d825700 1 --2- 192.168.123.106:0/3308315396 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f72080fec00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.886+0000 7f720d825700 1 -- 192.168.123.106:0/3308315396 >> 192.168.123.106:0/3308315396 conn(0x7f72080fa140 msgr2=0x7f72080fc590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.887+0000 7f720d825700 1 -- 192.168.123.106:0/3308315396 shutdown_connections 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.887+0000 7f720d825700 1 -- 192.168.123.106:0/3308315396 
wait complete. 2026-03-09T17:27:24.889 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.887+0000 7f720d825700 1 Processor -- start 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f720d825700 1 -- start start 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f720d825700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f720819c3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f720d825700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f720d825700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f720819cf10 con 0x7f72080fe7f0 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f720d825700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f720819d050 con 0x7f72080ff560 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f72067fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f72067fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:35438/0 (socket says 192.168.123.106:35438) 2026-03-09T17:27:24.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f72067fc700 1 -- 192.168.123.106:0/332059007 learned_addr learned my addr 192.168.123.106:0/332059007 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:24.891 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.888+0000 7f7206ffd700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f720819c3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:24.891 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f72067fc700 1 -- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 msgr2=0x7f720819c3b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f72067fc700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f720819c3b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f72067fc700 1 -- 192.168.123.106:0/332059007 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f71f00097e0 con 0x7f72080ff560 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f7206ffd700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f720819c3b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f72067fc700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f71f80096a0 tx=0x7f71f800f880 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.889+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71f801d070 con 0x7f72080ff560 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.890+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f71f800fe60 con 0x7f72080ff560 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.890+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71f80178a0 con 0x7f72080ff560 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.892+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f71f8009710 con 0x7f72080ff560 2026-03-09T17:27:24.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.893+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72081a1dd0 con 0x7f72080ff560 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.893+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f720810f810 con 0x7f72080ff560 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.894+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f71f8021c10 con 0x7f72080ff560 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.894+0000 7f720c823700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 0x7f71f406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.894+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f71f8047020 con 0x7f72080ff560 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.895+0000 7f7206ffd700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 0x7f71f406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:24.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.895+0000 7f7206ffd700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 0x7f71f406eb30 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f71f0005fd0 tx=0x7f71f0009500 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:24.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:24.896+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f71f805b4c0 con 
0x7f72080ff560 2026-03-09T17:27:25.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.004+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7f720810fa30 con 0x7f72080ff560 2026-03-09T17:27:25.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.014+0000 7f720c823700 1 -- 192.168.123.106:0/332059007 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v16) v1 ==== 126+0+0 (secure 0 0 0) 0x7f71f805b050 con 0x7f72080ff560 2026-03-09T17:27:25.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.018+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 msgr2=0x7f71f406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.018+0000 7f720d825700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 0x7f71f406eb30 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f71f0005fd0 tx=0x7f71f0009500 comp rx=0 tx=0).stop 2026-03-09T17:27:25.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.018+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 msgr2=0x7f720819c8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.018+0000 7f720d825700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f71f80096a0 tx=0x7f71f800f880 comp rx=0 tx=0).stop 2026-03-09T17:27:25.022
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 shutdown_connections 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f71f406c680 0x7f71f406eb30 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72080fe7f0 0x7f720819c3b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 --2- 192.168.123.106:0/332059007 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72080ff560 0x7f720819c8f0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 >> 192.168.123.106:0/332059007 conn(0x7f72080fa140 msgr2=0x7f7208102790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 shutdown_connections 2026-03-09T17:27:25.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.020+0000 7f720d825700 1 -- 192.168.123.106:0/332059007 wait complete. 2026-03-09T17:27:25.072 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-09T17:27:25.072 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:25.072 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-09T17:27:25.287 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.573+0000 7f9ec21ef700 1 -- 192.168.123.106:0/2015743720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc102780 msgr2=0x7f9ebc102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.573+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/2015743720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc102780 0x7f9ebc102b90 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f9ea4009b00 tx=0x7f9ea4009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 -- 192.168.123.106:0/2015743720 shutdown_connections 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/2015743720 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc103980 0x7f9ebc103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/2015743720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc102780 0x7f9ebc102b90 unknown :-1 
s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 -- 192.168.123.106:0/2015743720 >> 192.168.123.106:0/2015743720 conn(0x7f9ebc0fdd50 msgr2=0x7f9ebc100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 -- 192.168.123.106:0/2015743720 shutdown_connections 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 -- 192.168.123.106:0/2015743720 wait complete. 2026-03-09T17:27:25.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.575+0000 7f9ec21ef700 1 Processor -- start 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ec21ef700 1 -- start start 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ec21ef700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ec21ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 0x7f9ebc079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ec21ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ebc0755f0 con 0x7f9ebc103980 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ec21ef700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ebc075760 con 0x7f9ebc102780 2026-03-09T17:27:25.578 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ebb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ebb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35456/0 (socket says 192.168.123.106:35456) 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ebb7fe700 1 -- 192.168.123.106:0/981747112 learned_addr learned my addr 192.168.123.106:0/981747112 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:25.578 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.576+0000 7f9ebaffd700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 0x7f9ebc079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:25.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.577+0000 7f9ebb7fe700 1 -- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 msgr2=0x7f9ebc079040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.577+0000 7f9ebb7fe700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 0x7f9ebc079040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.579 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.577+0000 7f9ebb7fe700 1 -- 192.168.123.106:0/981747112 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9ea40097e0 con 0x7f9ebc102780 2026-03-09T17:27:25.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.577+0000 7f9ebaffd700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 0x7f9ebc079040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:27:25.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.577+0000 7f9ebb7fe700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9ea4004a30 tx=0x7f9ea4004b10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:25.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.578+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ea401d070 con 0x7f9ebc102780 2026-03-09T17:27:25.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.578+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9ebc0759e0 con 0x7f9ebc102780 2026-03-09T17:27:25.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.578+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9ebc075ed0 con 0x7f9ebc102780 2026-03-09T17:27:25.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.578+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 2 ==== 
config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9ea400bcd0 con 0x7f9ebc102780 2026-03-09T17:27:25.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.578+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ea40217e0 con 0x7f9ebc102780 2026-03-09T17:27:25.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.579+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9ea402b430 con 0x7f9ebc102780 2026-03-09T17:27:25.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.579+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9ebc066e40 con 0x7f9ebc102780 2026-03-09T17:27:25.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.580+0000 7f9eb8ff9700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 0x7f9ea806eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:25.582 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.580+0000 7f9ebaffd700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 0x7f9ea806eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:25.583 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.580+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9ea408cd40 con 0x7f9ebc102780 2026-03-09T17:27:25.583 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.581+0000 7f9ebaffd700 1 --2- 
192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 0x7f9ea806eae0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f9eac009a20 tx=0x7f9eac008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:25.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.583+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9ea405c0a0 con 0x7f9ebc102780 2026-03-09T17:27:25.699 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.696+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f9ebc0761b0 con 0x7f9ebc102780 2026-03-09T17:27:25.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.698+0000 7f9eb8ff9700 1 -- 192.168.123.106:0/981747112 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v16) v1 ==== 155+0+0 (secure 0 0 0) 0x7f9ea405bc30 con 0x7f9ebc102780 2026-03-09T17:27:25.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.700+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 msgr2=0x7f9ea806eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.700+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 0x7f9ea806eae0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f9eac009a20 tx=0x7f9eac008040 comp rx=0 tx=0).stop 2026-03-09T17:27:25.702
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.700+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 msgr2=0x7f9ebc078b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:25.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.700+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9ea4004a30 tx=0x7f9ea4004b10 comp rx=0 tx=0).stop 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 shutdown_connections 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9ebc102780 0x7f9ebc078b00 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f9ea806c630 0x7f9ea806eae0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 --2- 192.168.123.106:0/981747112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9ebc103980 0x7f9ebc079040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 >> 192.168.123.106:0/981747112 conn(0x7f9ebc0fdd50 msgr2=0x7f9ebc106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 shutdown_connections 2026-03-09T17:27:25.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:25.701+0000 7f9ec21ef700 1 -- 192.168.123.106:0/981747112 wait complete. 2026-03-09T17:27:25.768 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T17:27:25.938 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:26.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.206+0000 7f7a43037700 1 -- 192.168.123.106:0/1054096316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c103950 msgr2=0x7f7a3c105d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.206+0000 7f7a43037700 1 --2- 192.168.123.106:0/1054096316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c103950 0x7f7a3c105d30 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f7a28009b00 tx=0x7f7a28009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 -- 192.168.123.106:0/1054096316 shutdown_connections 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 --2- 192.168.123.106:0/1054096316 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c103950 0x7f7a3c105d30 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.210 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 --2- 192.168.123.106:0/1054096316 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c101030 0x7f7a3c103410 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 -- 192.168.123.106:0/1054096316 >> 192.168.123.106:0/1054096316 conn(0x7f7a3c0fa9b0 msgr2=0x7f7a3c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 -- 192.168.123.106:0/1054096316 shutdown_connections 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.207+0000 7f7a43037700 1 -- 192.168.123.106:0/1054096316 wait complete. 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.208+0000 7f7a43037700 1 Processor -- start 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.208+0000 7f7a43037700 1 -- start start 2026-03-09T17:27:26.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.208+0000 7f7a43037700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a40dd3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a40dd3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:49520/0 (socket says 192.168.123.106:49520) 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a43037700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c103950 0x7f7a3c196300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a43037700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a3c196920 con 0x7f7a3c101030 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a43037700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a3c196a60 con 0x7f7a3c103950 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a40dd3700 1 -- 192.168.123.106:0/4074427279 learned_addr learned my addr 192.168.123.106:0/4074427279 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:26.211 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a3bfff700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c103950 0x7f7a3c196300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a40dd3700 1 -- 192.168.123.106:0/4074427279 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c103950 msgr2=0x7f7a3c196300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.209+0000 7f7a40dd3700 1 --2- 192.168.123.106:0/4074427279 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c103950 0x7f7a3c196300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a40dd3700 1 -- 192.168.123.106:0/4074427279 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7a280097e0 con 0x7f7a3c101030 2026-03-09T17:27:26.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a40dd3700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f7a3000ba70 tx=0x7f7a3000bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:26.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a3000c700 con 0x7f7a3c101030 2026-03-09T17:27:26.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7a3000cd40 con 0x7f7a3c101030 2026-03-09T17:27:26.213 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a30012340 con 0x7f7a3c101030 2026-03-09T17:27:26.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7a3c0fee80 con 0x7f7a3c101030 2026-03-09T17:27:26.214 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.210+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7a3c0ff450 con 0x7f7a3c101030 2026-03-09T17:27:26.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.212+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7a300124e0 con 0x7f7a3c101030 2026-03-09T17:27:26.214 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.212+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7a3c100ea0 con 0x7f7a3c101030 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: from='client.? ' entity='client.admin' 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:26.217 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:26 vm06.local ceph-mon[57307]: pgmap v86: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB 
used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.1 KiB/s wr, 5 op/s 2026-03-09T17:27:26.217 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.215+0000 7f7a39ffb700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 0x7f7a2c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.217 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.215+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7a3008b990 con 0x7f7a3c101030 2026-03-09T17:27:26.219 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.217+0000 7f7a3bfff700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 0x7f7a2c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.218+0000 7f7a3bfff700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 0x7f7a2c06eb30 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f7a28005f50 tx=0x7f7a28005ec0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:26.220 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.218+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7a3004e720 con 0x7f7a3c101030 2026-03-09T17:27:26.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.326+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config 
set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f7a3c066e40 con 0x7f7a3c101030 2026-03-09T17:27:26.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.326+0000 7f7a39ffb700 1 -- 192.168.123.106:0/4074427279 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v16) v1 ==== 163+0+0 (secure 0 0 0) 0x7f7a30059c50 con 0x7f7a3c101030 2026-03-09T17:27:26.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 msgr2=0x7f7a2c06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 0x7f7a2c06eb30 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f7a28005f50 tx=0x7f7a28005ec0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 msgr2=0x7f7a3c195dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f7a3000ba70 tx=0x7f7a3000bd80 comp rx=0 tx=0).stop 2026-03-09T17:27:26.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 shutdown_connections 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000
7f7a43037700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7a2c06c680 0x7f7a2c06eb30 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7a3c101030 0x7f7a3c195dc0 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 --2- 192.168.123.106:0/4074427279 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7a3c103950 0x7f7a3c196300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.329+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 >> 192.168.123.106:0/4074427279 conn(0x7f7a3c0fa9b0 msgr2=0x7f7a3c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.330+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 shutdown_connections 2026-03-09T17:27:26.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.330+0000 7f7a43037700 1 -- 192.168.123.106:0/4074427279 wait complete. 2026-03-09T17:27:26.393 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: from='client.? 
' entity='client.admin' 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:26 vm09.local ceph-mon[62061]: pgmap v86: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.1 KiB/s wr, 5 op/s 2026-03-09T17:27:26.562 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:26.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.838+0000 7f8e42d0c700 1 -- 192.168.123.106:0/310356598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 msgr2=0x7f8e3c1028e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.838+0000 7f8e42d0c700 1 --2- 192.168.123.106:0/310356598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1028e0 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f8e30009b00 tx=0x7f8e30009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:26.841 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.839+0000 7f8e42d0c700 1 -- 192.168.123.106:0/310356598 shutdown_connections 2026-03-09T17:27:26.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.839+0000 7f8e42d0c700 1 --2- 192.168.123.106:0/310356598 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c103b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.839+0000 7f8e42d0c700 1 --2- 192.168.123.106:0/310356598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1028e0 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.841 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.839+0000 7f8e42d0c700 1 -- 192.168.123.106:0/310356598 >> 192.168.123.106:0/310356598 conn(0x7f8e3c0fda80 msgr2=0x7f8e3c0ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:26.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.839+0000 7f8e42d0c700 1 -- 192.168.123.106:0/310356598 shutdown_connections 2026-03-09T17:27:26.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 -- 192.168.123.106:0/310356598 wait complete. 
2026-03-09T17:27:26.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 Processor -- start 2026-03-09T17:27:26.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 -- start start 2026-03-09T17:27:26.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1939b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c193ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e3c1944c0 con 0x7f8e3c1024d0 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.841+0000 7f8e42d0c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e3c194600 con 0x7f8e3c1036d0 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41509700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c193ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41509700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c193ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:35496/0 (socket says 192.168.123.106:35496) 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41509700 1 -- 192.168.123.106:0/284532442 learned_addr learned my addr 192.168.123.106:0/284532442 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41d0a700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1939b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41d0a700 1 -- 192.168.123.106:0/284532442 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 msgr2=0x7f8e3c193ef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41d0a700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c193ef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41d0a700 1 -- 192.168.123.106:0/284532442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e300097e0 con 0x7f8e3c1024d0 2026-03-09T17:27:26.844 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e41d0a700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1939b0 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f8e30000c00 tx=0x7f8e30004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:27:26.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e3001d070 con 0x7f8e3c1024d0 2026-03-09T17:27:26.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e42d0c700 1 -- 192.168.123.106:0/284532442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e3c1aa0e0 con 0x7f8e3c1024d0 2026-03-09T17:27:26.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.842+0000 7f8e42d0c700 1 -- 192.168.123.106:0/284532442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e3c1aa540 con 0x7f8e3c1024d0 2026-03-09T17:27:26.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.843+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8e3000bc50 con 0x7f8e3c1024d0 2026-03-09T17:27:26.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.843+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e3000f670 con 0x7f8e3c1024d0 2026-03-09T17:27:26.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.849+0000 7f8e42d0c700 1 -- 192.168.123.106:0/284532442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e20005320 con 0x7f8e3c1024d0 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.854+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8e30022470 con 0x7f8e3c1024d0 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.854+0000 
7f8e2effd700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 0x7f8e2806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.854+0000 7f8e41509700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 0x7f8e2806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.854+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f8e3008cea0 con 0x7f8e3c1024d0 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.855+0000 7f8e41509700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 0x7f8e2806eb30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f8e38005950 tx=0x7f8e3800b500 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:26.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.854+0000 7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8e30091490 con 0x7f8e3c1024d0 2026-03-09T17:27:26.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.973+0000 7f8e42d0c700 1 -- 192.168.123.106:0/284532442 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f8e20005f70 con 0x7f8e3c1024d0 2026-03-09T17:27:26.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.973+0000 
7f8e2effd700 1 -- 192.168.123.106:0/284532442 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v16) v1 ==== 135+0+0 (secure 0 0 0) 0x7f8e3005b160 con 0x7f8e3c1024d0 2026-03-09T17:27:26.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 msgr2=0x7f8e2806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 0x7f8e2806eb30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f8e38005950 tx=0x7f8e3800b500 comp rx=0 tx=0).stop 2026-03-09T17:27:26.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 msgr2=0x7f8e3c1939b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:26.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1939b0 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f8e30000c00 tx=0x7f8e30004930 comp rx=0 tx=0).stop 2026-03-09T17:27:26.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 shutdown_connections 2026-03-09T17:27:26.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.976+0000 7f8e2cff9700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f8e2806c680 0x7f8e2806eb30 secure :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f8e38005950 tx=0x7f8e3800b500 comp rx=0
tx=0).stop 2026-03-09T17:27:26.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.977+0000 7f8e2cff9700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8e3c1024d0 0x7f8e3c1939b0 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.977+0000 7f8e2cff9700 1 --2- 192.168.123.106:0/284532442 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8e3c1036d0 0x7f8e3c193ef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:26.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.977+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 >> 192.168.123.106:0/284532442 conn(0x7f8e3c0fda80 msgr2=0x7f8e3c106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:26.984 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.982+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 shutdown_connections 2026-03-09T17:27:26.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:26.984+0000 7f8e2cff9700 1 -- 192.168.123.106:0/284532442 wait complete. 
2026-03-09T17:27:27.063 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-09T17:27:27.257 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:27.558 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 -- 192.168.123.106:0/724975088 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4107d50 msgr2=0x7f61f41081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 --2- 192.168.123.106:0/724975088 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4107d50 0x7f61f41081c0 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f61e8009b00 tx=0x7f61e8009e10 comp rx=0 tx=0).stop 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 -- 192.168.123.106:0/724975088 shutdown_connections 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 --2- 192.168.123.106:0/724975088 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4107d50 0x7f61f41081c0 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 --2- 192.168.123.106:0/724975088 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4071db0 0x7f61f40721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.559 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.555+0000 7f61f98f7700 1 -- 192.168.123.106:0/724975088 >> 192.168.123.106:0/724975088 conn(0x7f61f406d3e0 msgr2=0x7f61f406f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 -- 192.168.123.106:0/724975088 shutdown_connections 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 -- 192.168.123.106:0/724975088 wait complete. 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 Processor -- start 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 -- start start 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4107d50 0x7f61f4116ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61f4117500 con 0x7f61f4071db0 2026-03-09T17:27:27.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f98f7700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61f41a1670 con 0x7f61f4107d50 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f2ffd700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:49548/0 (socket says 192.168.123.106:49548) 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.556+0000 7f61f2ffd700 1 -- 192.168.123.106:0/3473625106 learned_addr learned my addr 192.168.123.106:0/3473625106 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.557+0000 7f61f27fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4107d50 0x7f61f4116ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.557+0000 7f61f2ffd700 1 -- 192.168.123.106:0/3473625106 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4107d50 msgr2=0x7f61f4116ec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.557+0000 7f61f2ffd700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4107d50 0x7f61f4116ec0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.557+0000 7f61f2ffd700 1 -- 
192.168.123.106:0/3473625106 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61e80097e0 con 0x7f61f4071db0 2026-03-09T17:27:27.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.557+0000 7f61f2ffd700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f61e400d8d0 tx=0x7f61e400dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:27.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.558+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61e4009940 con 0x7f61f4071db0 2026-03-09T17:27:27.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.559+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f61e4010460 con 0x7f61f4071db0 2026-03-09T17:27:27.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.559+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61e400f5d0 con 0x7f61f4071db0 2026-03-09T17:27:27.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.560+0000 7f61f98f7700 1 -- 192.168.123.106:0/3473625106 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61f41a1950 con 0x7f61f4071db0 2026-03-09T17:27:27.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.560+0000 7f61f98f7700 1 -- 192.168.123.106:0/3473625106 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f61f41a1e70 con 0x7f61f4071db0 2026-03-09T17:27:27.568 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.562+0000 7f61f98f7700 1 -- 192.168.123.106:0/3473625106 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f61f404ea50 con 0x7f61f4071db0 2026-03-09T17:27:27.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.565+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f61e40105d0 con 0x7f61f4071db0 2026-03-09T17:27:27.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.565+0000 7f61f88f5700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 0x7f61dc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:27.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.565+0000 7f61f27fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 0x7f61dc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:27.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.566+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f61e408b500 con 0x7f61f4071db0 2026-03-09T17:27:27.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.566+0000 7f61f27fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 0x7f61dc06ec50 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f61f4117a90 tx=0x7f61e800b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:27.569 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.566+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f61e408da90 con 0x7f61f4071db0 2026-03-09T17:27:27.713 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.708+0000 7f61f98f7700 1 -- 192.168.123.106:0/3473625106 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f61f410c6a0 con 0x7f61dc06c7a0 2026-03-09T17:27:27.717 INFO:teuthology.orchestra.run.vm06.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:27:27.717 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.715+0000 7f61f88f5700 1 -- 192.168.123.106:0/3473625106 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f61f410c6a0 con 0x7f61dc06c7a0 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 msgr2=0x7f61dc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 0x7f61dc06ec50 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f61f4117a90 tx=0x7f61e800b540 comp rx=0 tx=0).stop 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 
msgr2=0x7f61f4116960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f61e400d8d0 tx=0x7f61e400dc90 comp rx=0 tx=0).stop 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 shutdown_connections 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f61dc06c7a0 0x7f61dc06ec50 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f61f4071db0 0x7f61f4116960 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 --2- 192.168.123.106:0/3473625106 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f61f4107d50 0x7f61f4116ec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 >> 192.168.123.106:0/3473625106 conn(0x7f61f406d3e0 msgr2=0x7f61f410af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:27.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 shutdown_connections 2026-03-09T17:27:27.721 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:27.719+0000 7f61da7fc700 1 -- 192.168.123.106:0/3473625106 wait complete. 2026-03-09T17:27:27.797 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T17:27:27.797 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:27:27.797 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-09T17:27:28.067 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:27:28.381 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:28 vm06.local ceph-mon[57307]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 767 B/s wr, 5 op/s 2026-03-09T17:27:28.381 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:28 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:27:28.381 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:28 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:28.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 -- 192.168.123.106:0/1371780020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c072360 msgr2=0x7f978c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.588 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 --2- 192.168.123.106:0/1371780020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c072360 0x7f978c0770e0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7f9784009230 tx=0x7f9784009260 comp rx=0 tx=0).stop 2026-03-09T17:27:28.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 -- 192.168.123.106:0/1371780020 shutdown_connections 2026-03-09T17:27:28.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 --2- 192.168.123.106:0/1371780020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c072360 0x7f978c0770e0 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 --2- 192.168.123.106:0/1371780020 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 0x7f978c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.585+0000 7f97937a8700 1 -- 192.168.123.106:0/1371780020 >> 192.168.123.106:0/1371780020 conn(0x7f978c06d1a0 msgr2=0x7f978c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.586+0000 7f97937a8700 1 -- 192.168.123.106:0/1371780020 shutdown_connections 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.586+0000 7f97937a8700 1 -- 192.168.123.106:0/1371780020 wait complete. 
2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.586+0000 7f97937a8700 1 Processor -- start 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.586+0000 7f97937a8700 1 -- start start 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.587+0000 7f97937a8700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 0x7f978c0804e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.587+0000 7f97937a8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 0x7f978c080e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.587+0000 7f97937a8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f978c12dd80 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.587+0000 7f97937a8700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f978c12dec0 con 0x7f978c071980 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9791544700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 0x7f978c0804e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 0x7f978c080e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 0x7f978c080e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:49568/0 (socket says 192.168.123.106:49568) 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 -- 192.168.123.106:0/3706542096 learned_addr learned my addr 192.168.123.106:0/3706542096 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 -- 192.168.123.106:0/3706542096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 msgr2=0x7f978c0804e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 0x7f978c0804e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 -- 192.168.123.106:0/3706542096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9784008ee0 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.588+0000 7f9790d43700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 0x7f978c080e90 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f978400bf90 tx=0x7f978400fea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:28.606 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.594+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9784007aa0 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.594+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f978c12e140 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.594+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f978c12e630 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.596+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f978c04ea50 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.600+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9784007c00 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.600+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9784005820 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.601+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f978402e460 con 0x7f978c080a20 2026-03-09T17:27:28.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.602+0000 7f97827fc700 1 --2- 
192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 0x7f977806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.602+0000 7f9791544700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 0x7f977806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:28.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.603+0000 7f9791544700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 0x7f977806ed20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9788009960 tx=0x7f9788008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:28.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.603+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f978408e090 con 0x7f978c080a20 2026-03-09T17:27:28.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.603+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f978408e4c0 con 0x7f978c080a20 2026-03-09T17:27:28.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:28 vm09.local ceph-mon[62061]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 767 B/s wr, 5 op/s 2026-03-09T17:27:28.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:28 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
2026-03-09T17:27:28.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:28 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:27:28.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.789+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f978c075da0 con 0x7f977806c870 2026-03-09T17:27:28.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.792+0000 7f97827fc700 1 -- 192.168.123.106:0/3706542096 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+266 (secure 0 0 0) 0x7f978c075da0 con 0x7f977806c870 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 msgr2=0x7f977806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 0x7f977806ed20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9788009960 tx=0x7f9788008040 comp rx=0 tx=0).stop 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 msgr2=0x7f978c080e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 
0x7f978c080e90 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f978400bf90 tx=0x7f978400fea0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 shutdown_connections 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f978c071980 0x7f978c0804e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f977806c870 0x7f977806ed20 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 --2- 192.168.123.106:0/3706542096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f978c080a20 0x7f978c080e90 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 >> 192.168.123.106:0/3706542096 conn(0x7f978c06d1a0 msgr2=0x7f978c075690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 shutdown_connections 2026-03-09T17:27:28.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.799+0000 7f97937a8700 1 -- 192.168.123.106:0/3706542096 wait complete. 
2026-03-09T17:27:28.816 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3111104825 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8072330 msgr2=0x7fa9d80770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3111104825 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8072330 0x7fa9d80770b0 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7fa9d000b3a0 tx=0x7fa9d000b6b0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3111104825 shutdown_connections 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3111104825 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8072330 0x7fa9d80770b0 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3111104825 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d8071950 0x7fa9d8071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.905 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.902+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3111104825 >> 192.168.123.106:0/3111104825 conn(0x7fa9d806d1a0 msgr2=0x7fa9d806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:28.906 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3111104825 shutdown_connections 2026-03-09T17:27:28.907 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3111104825 wait complete. 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 Processor -- start 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 -- start start 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8071950 0x7fa9d81313a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa9d8131de0 con 0x7fa9d8071950 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.903+0000 7fa9ddfff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa9d8131f20 con 0x7fa9d81318e0 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35562/0 (socket says 192.168.123.106:35562) 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 -- 192.168.123.106:0/3027590093 learned_addr learned my addr 192.168.123.106:0/3027590093 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9dcffd700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8071950 0x7fa9d81313a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 -- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8071950 msgr2=0x7fa9d81313a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8071950 0x7fa9d81313a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 -- 192.168.123.106:0/3027590093 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa9d000b050 con 0x7fa9d81318e0 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.904+0000 7fa9d7fff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto 
rx=0x7fa9d000bd40 tx=0x7fa9d00095a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:28.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.905+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa9d000e050 con 0x7fa9d81318e0 2026-03-09T17:27:28.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.905+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa9d807faa0 con 0x7fa9d81318e0 2026-03-09T17:27:28.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.905+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa9d807ff40 con 0x7fa9d81318e0 2026-03-09T17:27:28.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.906+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa9d0003e80 con 0x7fa9d81318e0 2026-03-09T17:27:28.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.906+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa9d001bb20 con 0x7fa9d81318e0 2026-03-09T17:27:28.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.907+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa9d0019040 con 0x7fa9d81318e0 2026-03-09T17:27:28.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.907+0000 7fa9d5ffb700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa9c006c6e0 0x7fa9c006eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:28.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.907+0000 7fa9dcffd700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa9c006c6e0 0x7fa9c006eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:28.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.908+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fa9d0029030 con 0x7fa9d81318e0 2026-03-09T17:27:28.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.908+0000 7fa9dcffd700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa9c006c6e0 0x7fa9c006eb90 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fa9c8006fd0 tx=0x7fa9c8009380 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:28.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.908+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa9c4005320 con 0x7fa9d81318e0 2026-03-09T17:27:28.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:28.911+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa9d005bb90 con 0x7fa9d81318e0 2026-03-09T17:27:29.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.056+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}) v1 -- 0x7fa9c4000bf0 con 0x7fa9c006c6e0 2026-03-09T17:27:29.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.058+0000 7fa9d5ffb700 1 -- 192.168.123.106:0/3027590093 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fa9c4000bf0 con 0x7fa9c006c6e0 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa9c006c6e0 msgr2=0x7fa9c006eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa9c006c6e0 0x7fa9c006eb90 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fa9c8006fd0 tx=0x7fa9c8009380 comp rx=0 tx=0).stop 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 msgr2=0x7fa9d807f560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fa9d000bd40 tx=0x7fa9d00095a0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 shutdown_connections 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] 
conn(0x7fa9c006c6e0 0x7fa9c006eb90 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa9d8071950 0x7fa9d81313a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 --2- 192.168.123.106:0/3027590093 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa9d81318e0 0x7fa9d807f560 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.064 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 >> 192.168.123.106:0/3027590093 conn(0x7fa9d806d1a0 msgr2=0x7fa9d80764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:29.065 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 shutdown_connections 2026-03-09T17:27:29.065 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.062+0000 7fa9ddfff700 1 -- 192.168.123.106:0/3027590093 wait complete. 
2026-03-09T17:27:29.181 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 -- 192.168.123.106:0/1007862328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c071db0 msgr2=0x7f360c0721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 --2- 192.168.123.106:0/1007862328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c071db0 0x7f360c0721c0 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f35fc009b00 tx=0x7f35fc009e10 comp rx=0 tx=0).stop
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 -- 192.168.123.106:0/1007862328 shutdown_connections
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 --2- 192.168.123.106:0/1007862328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 --2- 192.168.123.106:0/1007862328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c071db0 0x7f360c0721c0 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 -- 192.168.123.106:0/1007862328 >> 192.168.123.106:0/1007862328 conn(0x7f360c06d3e0 msgr2=0x7f360c06f830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 -- 192.168.123.106:0/1007862328 shutdown_connections
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.179+0000 7f36127f7700 1 -- 192.168.123.106:0/1007862328 wait complete.
2026-03-09T17:27:29.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 Processor -- start
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 -- start start
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c1a5120 0x7f360c1aa190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f360c1a5620 con 0x7f360c1a5120
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f36127f7700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f360c1a5790 con 0x7f360c107d50
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c1a5120 0x7f360c1aa190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.183 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35592/0 (socket says 192.168.123.106:35592)
2026-03-09T17:27:29.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 -- 192.168.123.106:0/2836978247 learned_addr learned my addr 192.168.123.106:0/2836978247 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:29.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 -- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c1a5120 msgr2=0x7f360c1aa190 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c1a5120 0x7f360c1aa190 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.180+0000 7f360bfff700 1 -- 192.168.123.106:0/2836978247 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35fc0097e0 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.181+0000 7f360bfff700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f35fc00f690 tx=0x7f35fc00f6c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.182+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35fc01c070 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.182+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f360c10f480 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.182+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f360c10f940 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.182+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f35fc0052d0 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.182+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35fc017400 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.184+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f35fc003680 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.185+0000 7f36097fa700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 0x7f35f406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.185+0000 7f360b7fe700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 0x7f35f406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.185+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f35fc08c8e0 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.183+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f35f8005320 con 0x7f360c107d50
2026-03-09T17:27:29.188 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.186+0000 7f360b7fe700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 0x7f35f406eb80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3604003eb0 tx=0x7f360400b040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:29.193 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.189+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f35fc05aaf0 con 0x7f360c107d50
2026-03-09T17:27:29.360 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:29 vm06.local ceph-mon[57307]: from='client.14548 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:27:29.360 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:29 vm06.local ceph-mon[57307]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:27:29.360 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:29 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:27:29.360 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:29 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:27:29.360 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:29 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:27:29.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.351+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f35f8000bf0 con 0x7f35f406c6d0
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (104s) 10s ago 2m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (2m) 10s ago 2m 8078k - 18.2.0 dc2bc1663786 518b33d98521
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (2m) 11s ago 2m 8136k - 18.2.0 dc2bc1663786 4486b60e6311
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (2m) 10s ago 2m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (119s) 11s ago 119s 7419k - 18.2.0 dc2bc1663786 78af352f0367
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (103s) 10s ago 2m 84.7M - 9.4.7 954c08fa6188 d808369f1a53
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (16s) 10s ago 16s 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (18s) 10s ago 18s 19.5M - 18.2.0 dc2bc1663786 4c8e86b2b8cd
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (17s) 11s ago 17s 14.7M - 18.2.0 dc2bc1663786 aa1f0430b448
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (15s) 11s ago 15s 13.4M - 18.2.0 dc2bc1663786 8dc8a0159213
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (3m) 10s ago 3m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (115s) 11s ago 115s 443M - 18.2.0 dc2bc1663786 e6525bf5de20
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (3m) 10s ago 3m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (113s) 11s ago 113s 47.4M 2048M 18.2.0 dc2bc1663786 4c30d1217de3
2026-03-09T17:27:29.364 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 10s ago 2m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (116s) 11s ago 116s 14.9M - 1.5.0 0da6a335fe13 364ad5f4aa86
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (95s) 10s ago 95s 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (84s) 10s ago 84s 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (73s) 10s ago 73s 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (63s) 11s ago 63s 43.8M 4096M 18.2.0 dc2bc1663786 48a594500ef1
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (53s) 11s ago 53s 43.8M 4096M 18.2.0 dc2bc1663786 a47c39052541
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (43s) 11s ago 43s 42.5M 4096M 18.2.0 dc2bc1663786 89f436540a49
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (97s) 10s ago 2m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c
2026-03-09T17:27:29.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.359+0000 7f36097fa700 1 -- 192.168.123.106:0/2836978247 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f35f8000bf0 con 0x7f35f406c6d0
2026-03-09T17:27:29.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 msgr2=0x7f35f406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 0x7f35f406eb80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3604003eb0 tx=0x7f360400b040 comp rx=0 tx=0).stop
2026-03-09T17:27:29.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 msgr2=0x7f360c1a4be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f35fc00f690 tx=0x7f35fc00f6c0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 shutdown_connections
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f360c107d50 0x7f360c1a4be0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f35f406c6d0 0x7f35f406eb80 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 --2- 192.168.123.106:0/2836978247 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f360c1a5120 0x7f360c1aa190 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 >> 192.168.123.106:0/2836978247 conn(0x7f360c06d3e0 msgr2=0x7f360c070610 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.366+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 shutdown_connections
2026-03-09T17:27:29.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.367+0000 7f36127f7700 1 -- 192.168.123.106:0/2836978247 wait complete.
2026-03-09T17:27:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:29 vm09.local ceph-mon[62061]: from='client.14548 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:27:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:29 vm09.local ceph-mon[62061]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T17:27:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:29 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:27:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:29 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:27:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:29 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.450+0000 7f1095a90700 1 -- 192.168.123.106:0/3777468804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090072360 msgr2=0x7f10900770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.450+0000 7f1095a90700 1 --2- 192.168.123.106:0/3777468804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090072360 0x7f10900770e0 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f108800b780 tx=0x7f108800ba90 comp rx=0 tx=0).stop
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 -- 192.168.123.106:0/3777468804 shutdown_connections
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 --2- 192.168.123.106:0/3777468804 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090072360 0x7f10900770e0 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 --2- 192.168.123.106:0/3777468804 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090071980 0x7f1090071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 -- 192.168.123.106:0/3777468804 >> 192.168.123.106:0/3777468804 conn(0x7f109006d1a0 msgr2=0x7f109006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 -- 192.168.123.106:0/3777468804 shutdown_connections
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 -- 192.168.123.106:0/3777468804 wait complete.
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 Processor -- start
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.453+0000 7f1095a90700 1 -- start start
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f1095a90700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f1095a90700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090131890 0x7f109007f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f1095a90700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1090131d90 con 0x7f1090071980
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f1095a90700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1090131ed0 con 0x7f1090131890
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:49608/0 (socket says 192.168.123.106:49608)
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 -- 192.168.123.106:0/846166519 learned_addr learned my addr 192.168.123.106:0/846166519 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108effd700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090131890 0x7f109007f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 -- 192.168.123.106:0/846166519 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090131890 msgr2=0x7f109007f520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090131890 0x7f109007f520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.454+0000 7f108f7fe700 1 -- 192.168.123.106:0/846166519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f108800b050 con 0x7f1090071980
2026-03-09T17:27:29.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f108f7fe700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f108000b770 tx=0x7f108000ba80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:29.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1080010840 con 0x7f1090071980
2026-03-09T17:27:29.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1080010e80 con 0x7f1090071980
2026-03-09T17:27:29.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f108000d590 con 0x7f1090071980
2026-03-09T17:27:29.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f1095a90700 1 -- 192.168.123.106:0/846166519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f109007fa60 con 0x7f1090071980
2026-03-09T17:27:29.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.455+0000 7f1095a90700 1 -- 192.168.123.106:0/846166519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f109007ff60 con 0x7f1090071980
2026-03-09T17:27:29.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.457+0000 7f1095a90700 1 -- 192.168.123.106:0/846166519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f109012b500 con 0x7f1090071980
2026-03-09T17:27:29.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.457+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f108000f3e0 con 0x7f1090071980
2026-03-09T17:27:29.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.458+0000 7f108cff9700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 0x7f107806eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.458+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f108008b180 con 0x7f1090071980
2026-03-09T17:27:29.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.461+0000 7f108effd700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 0x7f107806eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.461+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1080059410 con 0x7f1090071980
2026-03-09T17:27:29.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.470+0000 7f108effd700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 0x7f107806eb80 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f1088000f80 tx=0x7f10880119b0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:29.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.680+0000 7f1095a90700 1 -- 192.168.123.106:0/846166519 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f109004ea50 con 0x7f1090071980
2026-03-09T17:27:29.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.681+0000 7f108cff9700 1 -- 192.168.123.106:0/846166519 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f1080058fa0 con 0x7f1090071980
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "mon": {
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": {
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "osd": {
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "mds": {
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "overall": {
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout: }
2026-03-09T17:27:29.685 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:27:29.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 msgr2=0x7f107806eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 0x7f107806eb80 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f1088000f80 tx=0x7f10880119b0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 msgr2=0x7f1090131350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f108000b770 tx=0x7f108000ba80 comp rx=0 tx=0).stop
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 shutdown_connections
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f107806c6d0 0x7f107806eb80 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1090071980 0x7f1090131350 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 --2- 192.168.123.106:0/846166519 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1090131890 0x7f109007f520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 >> 192.168.123.106:0/846166519 conn(0x7f109006d1a0 msgr2=0x7f1090076420 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:29.690 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.686+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 shutdown_connections
2026-03-09T17:27:29.691 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.688+0000 7f10767fc700 1 -- 192.168.123.106:0/846166519 wait complete.
2026-03-09T17:27:29.801 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 -- 192.168.123.106:0/2454241112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc072330 msgr2=0x7f72bc0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.801 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 --2- 192.168.123.106:0/2454241112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc072330 0x7f72bc0770b0 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f72b400b780 tx=0x7f72b400ba90 comp rx=0 tx=0).stop
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 -- 192.168.123.106:0/2454241112 shutdown_connections
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 --2- 192.168.123.106:0/2454241112 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc072330 0x7f72bc0770b0 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 --2- 192.168.123.106:0/2454241112 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc071950 0x7f72bc071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 -- 192.168.123.106:0/2454241112 >> 192.168.123.106:0/2454241112 conn(0x7f72bc06d1a0 msgr2=0x7f72bc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 -- 192.168.123.106:0/2454241112 shutdown_connections
2026-03-09T17:27:29.802 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.799+0000 7f72c3472700 1 -- 192.168.123.106:0/2454241112 wait complete.
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 Processor -- start
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 -- start start
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc071950 0x7f72bc0824e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72bc1b2a90 con 0x7f72bc071950
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c3472700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72bc1b2bd0 con 0x7f72bc082a20
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c2470700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc071950 0x7f72bc0824e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35644/0 (socket says 192.168.123.106:35644)
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 -- 192.168.123.106:0/3957762537 learned_addr learned my addr 192.168.123.106:0/3957762537 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 -- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc071950 msgr2=0x7f72bc0824e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc071950 0x7f72bc0824e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.800+0000 7f72c1c6f700 1 -- 192.168.123.106:0/3957762537 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72b400b050 con 0x7f72bc082a20
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.801+0000 7f72c1c6f700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f72b400b600 tx=0x7f72b4009de0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:27:29.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.801+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72b401ecb0 con 0x7f72bc082a20
2026-03-09T17:27:29.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.801+0000 7f72c3472700 1 -- 192.168.123.106:0/3957762537 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72bc1b2d10 con 0x7f72bc082a20
2026-03-09T17:27:29.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.801+0000 7f72c3472700 1 -- 192.168.123.106:0/3957762537 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72bc1b3180 con 0x7f72bc082a20
2026-03-09T17:27:29.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.802+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f72b401ee10 con 0x7f72bc082a20
2026-03-09T17:27:29.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.802+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <==
mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72b40198d0 con 0x7f72bc082a20 2026-03-09T17:27:29.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.803+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f72b4019a30 con 0x7f72bc082a20 2026-03-09T17:27:29.806 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.804+0000 7f72b37fe700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 0x7f72a806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:29.806 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.804+0000 7f72c2470700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 0x7f72a806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:29.806 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.804+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f72b4092360 con 0x7f72bc082a20 2026-03-09T17:27:29.806 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.804+0000 7f72c2470700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 0x7f72a806ec50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f72b8009b60 tx=0x7f72b8008040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:29.807 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.804+0000 7f72c3472700 1 -- 192.168.123.106:0/3957762537 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f72a0005320 con 0x7f72bc082a20 2026-03-09T17:27:29.810 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.808+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f72b40605f0 con 0x7f72bc082a20 2026-03-09T17:27:29.964 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.961+0000 7f72c3472700 1 -- 192.168.123.106:0/3957762537 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f72a0006200 con 0x7f72bc082a20 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:29.966 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:27:29.967 
INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:27:29.967 
INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:27:29.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.963+0000 7f72b37fe700 1 -- 192.168.123.106:0/3957762537 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7f72b4060180 con 0x7f72bc082a20 2026-03-09T17:27:29.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.967+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 msgr2=0x7f72a806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:29.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.967+0000 7f72b17fa700 1 --2- 192.168.123.106:0/3957762537 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 0x7f72a806ec50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f72b8009b60 tx=0x7f72b8008040 comp rx=0 tx=0).stop 2026-03-09T17:27:29.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.967+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 msgr2=0x7f72bc082e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:29.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.967+0000 7f72b17fa700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f72b400b600 tx=0x7f72b4009de0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.968+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 shutdown_connections 2026-03-09T17:27:29.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.968+0000 7f72b17fa700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f72a806c7a0 0x7f72a806ec50 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.968+0000 7f72b17fa700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f72bc071950 0x7f72bc0824e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.968+0000 7f72b17fa700 1 --2- 192.168.123.106:0/3957762537 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f72bc082a20 0x7f72bc082e90 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:29.972 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.968+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 >> 192.168.123.106:0/3957762537 conn(0x7f72bc06d1a0 msgr2=0x7f72bc0763e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:29.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.970+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 shutdown_connections 2026-03-09T17:27:29.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:29.970+0000 7f72b17fa700 1 -- 192.168.123.106:0/3957762537 wait complete. 2026-03-09T17:27:29.973 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:27:30.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.119+0000 7fdfc152b700 1 -- 192.168.123.106:0/3581454424 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc108990 msgr2=0x7fdfbc071fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.122 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.119+0000 7fdfc152b700 1 --2- 192.168.123.106:0/3581454424 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc108990 0x7fdfbc071fe0 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7fdfb0009b50 tx=0x7fdfb0009e60 comp rx=0 tx=0).stop 2026-03-09T17:27:30.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.122+0000 7fdfc152b700 1 -- 192.168.123.106:0/3581454424 shutdown_connections 2026-03-09T17:27:30.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.122+0000 7fdfc152b700 1 --2- 192.168.123.106:0/3581454424 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc108990 0x7fdfbc071fe0 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.122+0000 7fdfc152b700 1 --2- 192.168.123.106:0/3581454424 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc1083c0 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.122+0000 7fdfc152b700 1 -- 192.168.123.106:0/3581454424 >> 192.168.123.106:0/3581454424 conn(0x7fdfbc06d3e0 msgr2=0x7fdfbc06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.124+0000 7fdfc152b700 1 -- 192.168.123.106:0/3581454424 shutdown_connections 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.124+0000 7fdfc152b700 1 -- 192.168.123.106:0/3581454424 wait complete. 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.125+0000 7fdfc152b700 1 Processor -- start 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfc152b700 1 -- start start 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfc152b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc10a9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfc152b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 0x7fdfbc109020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfc152b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdfbc109560 con 0x7fdfbc107fb0 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfc152b700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdfbc1096a0 con 0x7fdfbc108990 2026-03-09T17:27:30.128 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfbaffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc10a9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfbaffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc10a9d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:49650/0 (socket says 192.168.123.106:49650) 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfbaffd700 1 -- 192.168.123.106:0/4189581371 learned_addr learned my addr 192.168.123.106:0/4189581371 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:30.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.126+0000 7fdfba7fc700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 0x7fdfbc109020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.127+0000 7fdfba7fc700 1 -- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 msgr2=0x7fdfbc10a9d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.127+0000 7fdfba7fc700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc10a9d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.129 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.127+0000 7fdfba7fc700 1 -- 192.168.123.106:0/4189581371 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdfb00097e0 con 0x7fdfbc108990 2026-03-09T17:27:30.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.127+0000 7fdfba7fc700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 0x7fdfbc109020 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdfb0000c00 tx=0x7fdfb0005250 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:30.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.129+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdfb001d070 con 0x7fdfbc108990 2026-03-09T17:27:30.134 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.129+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdfbc109920 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.129+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdfbc109e10 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.130+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdfb000bc50 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.130+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdfb000f670 con 
0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.131+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdfb000f890 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.132+0000 7fdfa3fff700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 0x7fdfa406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.132+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fdfb008de10 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.132+0000 7fdfa1ffb700 1 -- 192.168.123.106:0/4189581371 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdfbc066e40 con 0x7fdfbc108990 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.132+0000 7fdfbaffd700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 0x7fdfa406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.135 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.133+0000 7fdfbaffd700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 0x7fdfa406ec50 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fdfac005950 tx=0x7fdfac009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:30.151 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.135+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdfb005c210 con 0x7fdfbc108990 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/846166519' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:27:30.282 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:30 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/3957762537' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:30.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.310+0000 7fdfa1ffb700 1 -- 192.168.123.106:0/4189581371 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdfbc061ca0 con 0x7fdfa406c7a0 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.316+0000 7fdfa3fff700 1 -- 192.168.123.106:0/4189581371 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fdfbc061ca0 con 0x7fdfa406c7a0 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "", 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:27:30.318 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.322+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 msgr2=0x7fdfa406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.325 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.322+0000 7fdfc152b700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 0x7fdfa406ec50 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fdfac005950 tx=0x7fdfac009450 comp rx=0 tx=0).stop 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.322+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 msgr2=0x7fdfbc109020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.322+0000 7fdfc152b700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 0x7fdfbc109020 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdfb0000c00 tx=0x7fdfb0005250 comp rx=0 tx=0).stop 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.323+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 shutdown_connections 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.323+0000 7fdfc152b700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdfa406c7a0 0x7fdfa406ec50 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.323+0000 7fdfc152b700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdfbc107fb0 0x7fdfbc10a9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.323+0000 7fdfc152b700 1 --2- 192.168.123.106:0/4189581371 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdfbc108990 0x7fdfbc109020 
unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.323+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 >> 192.168.123.106:0/4189581371 conn(0x7fdfbc06d3e0 msgr2=0x7fdfbc10d290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:30.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.326+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 shutdown_connections 2026-03-09T17:27:30.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.326+0000 7fdfc152b700 1 -- 192.168.123.106:0/4189581371 wait complete. 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.418+0000 7f4937fff700 1 -- 192.168.123.106:0/3167411705 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938072440 msgr2=0x7f493810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.418+0000 7f4937fff700 1 --2- 192.168.123.106:0/3167411705 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938072440 0x7f493810be90 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7f493001c320 tx=0x7f493001c630 comp rx=0 tx=0).stop 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 -- 192.168.123.106:0/3167411705 shutdown_connections 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 --2- 192.168.123.106:0/3167411705 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938072440 0x7f493810be90 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 --2- 192.168.123.106:0/3167411705 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938071a60 
0x7f4938071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 -- 192.168.123.106:0/3167411705 >> 192.168.123.106:0/3167411705 conn(0x7f493806d1a0 msgr2=0x7f493806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 -- 192.168.123.106:0/3167411705 shutdown_connections 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.419+0000 7f4937fff700 1 -- 192.168.123.106:0/3167411705 wait complete. 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 Processor -- start 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 -- start start 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 0x7f493819c250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 0x7f493819c790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f493819cdb0 con 0x7f4938071a60 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4937fff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f493819cef0 con 0x7f4938072440 2026-03-09T17:27:30.424 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f4936ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 0x7f493819c250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f49367fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 0x7f493819c790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f49367fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 0x7f493819c790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35672/0 (socket says 192.168.123.106:35672) 2026-03-09T17:27:30.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.420+0000 7f49367fc700 1 -- 192.168.123.106:0/1174168266 learned_addr learned my addr 192.168.123.106:0/1174168266 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4936ffd700 1 -- 192.168.123.106:0/1174168266 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 msgr2=0x7f493819c790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4936ffd700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 0x7f493819c790 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.425 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4936ffd700 1 -- 192.168.123.106:0/1174168266 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f493001c060 con 0x7f4938071a60 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4936ffd700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 0x7f493819c250 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f4938072f50 tx=0x7f492800bb10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f492800d610 con 0x7f4938071a60 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4937fff700 1 -- 192.168.123.106:0/1174168266 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49381c39b0 con 0x7f4938071a60 2026-03-09T17:27:30.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.421+0000 7f4937fff700 1 -- 192.168.123.106:0/1174168266 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49381c3f80 con 0x7f4938071a60 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.423+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f492800dc50 con 0x7f4938071a60 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.423+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4928017400 con 
0x7f4938071a60 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.423+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4928017620 con 0x7f4938071a60 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.423+0000 7f491ffff700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 0x7f492006ec70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.424+0000 7f49367fc700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 0x7f492006ec70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:27:30.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.424+0000 7f4937fff700 1 -- 192.168.123.106:0/1174168266 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4924005320 con 0x7f4938071a60 2026-03-09T17:27:30.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.424+0000 7f49367fc700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 0x7f492006ec70 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f493001cab0 tx=0x7f493000b040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:27:30.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.424+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f492808c2d0 con 0x7f4938071a60 2026-03-09T17:27:30.432 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.429+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f492805a560 con 0x7f4938071a60 2026-03-09T17:27:30.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.583+0000 7f4937fff700 1 -- 192.168.123.106:0/1174168266 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f4924005190 con 0x7f4938071a60 2026-03-09T17:27:30.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.583+0000 7f491ffff700 1 -- 192.168.123.106:0/1174168266 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f492805a0f0 con 0x7f4938071a60 2026-03-09T17:27:30.586 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:27:30.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.588+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 msgr2=0x7f492006ec70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.588+0000 7f491dffb700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 0x7f492006ec70 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f493001cab0 tx=0x7f493000b040 comp rx=0 tx=0).stop 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 msgr2=0x7f493819c250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:27:30.591 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 0x7f493819c250 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f4938072f50 tx=0x7f492800bb10 comp rx=0 tx=0).stop 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 shutdown_connections 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f492006c7c0 0x7f492006ec70 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4938071a60 0x7f493819c250 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 --2- 192.168.123.106:0/1174168266 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4938072440 0x7f493819c790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 >> 192.168.123.106:0/1174168266 conn(0x7f493806d1a0 msgr2=0x7f493810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 shutdown_connections 2026-03-09T17:27:30.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:27:30.589+0000 7f491dffb700 1 -- 192.168.123.106:0/1174168266 
wait complete. 2026-03-09T17:27:30.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T17:27:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: from='client.24363 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/846166519' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:27:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:30 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/3957762537' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:27:31.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:31 vm06.local ceph-mon[57307]: from='client.24367 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:31.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:31 vm06.local ceph-mon[57307]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:31.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:31 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/1174168266' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:27:31.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:31 vm09.local ceph-mon[62061]: from='client.24367 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:31.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:31 vm09.local ceph-mon[62061]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:27:31.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:31 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/1174168266' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:27:32.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:32 vm06.local ceph-mon[57307]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-09T17:27:32.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:32 vm09.local ceph-mon[62061]: pgmap v89: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s 2026-03-09T17:27:34.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:34 vm06.local ceph-mon[57307]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.1 KiB/s wr, 3 op/s 2026-03-09T17:27:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:34 vm09.local ceph-mon[62061]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 1.1 KiB/s wr, 3 op/s 2026-03-09T17:27:36.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:35 vm06.local ceph-mon[57307]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 1.1 KiB/s wr, 2 op/s 2026-03-09T17:27:36.144 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:35 vm09.local ceph-mon[62061]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 1.1 KiB/s wr, 2 op/s 2026-03-09T17:27:37.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:37 vm06.local ceph-mon[57307]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 1.1 KiB/s wr, 3 op/s 2026-03-09T17:27:37.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:37 vm09.local ceph-mon[62061]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 1.1 KiB/s wr, 3 op/s 2026-03-09T17:27:38.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:38 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:38.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:38 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:40.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:39 vm09.local ceph-mon[62061]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:40.151 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:39 vm06.local ceph-mon[57307]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:41.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:41 vm06.local ceph-mon[57307]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:41 vm09.local ceph-mon[62061]: pgmap v94: 65 pgs: 65 active+clean; 460 
KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 2 op/s 2026-03-09T17:27:44.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:44 vm06.local ceph-mon[57307]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-09T17:27:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:44 vm09.local ceph-mon[62061]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s 2026-03-09T17:27:46.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:45 vm06.local ceph-mon[57307]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 85 B/s wr, 1 op/s 2026-03-09T17:27:46.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:45 vm09.local ceph-mon[62061]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 85 B/s wr, 1 op/s 2026-03-09T17:27:48.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:48 vm06.local ceph-mon[57307]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:27:48.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:48 vm09.local ceph-mon[62061]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:27:50.554 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:50 vm06.local ceph-mon[57307]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:50.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:50 vm09.local ceph-mon[62061]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:52.829 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:52 vm09.local ceph-mon[62061]: pgmap 
v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:52.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:52 vm06.local ceph-mon[57307]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:53 vm06.local ceph-mon[57307]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:27:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:53 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:53 vm09.local ceph-mon[62061]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:27:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:53 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:27:56.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:55 vm06.local ceph-mon[57307]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:56.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:55 vm09.local ceph-mon[62061]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:27:58.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:27:58 vm06.local ceph-mon[57307]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:27:58.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:27:58 vm09.local 
ceph-mon[62061]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:00.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:00 vm06.local ceph-mon[57307]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:00 vm09.local ceph-mon[62061]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 -- 192.168.123.106:0/2107393833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c4071980 msgr2=0x7f31c4071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 --2- 192.168.123.106:0/2107393833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c4071980 0x7f31c4071d90 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f31b4009b00 tx=0x7f31b4009e10 comp rx=0 tx=0).stop 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 -- 192.168.123.106:0/2107393833 shutdown_connections 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 --2- 192.168.123.106:0/2107393833 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31c4072360 0x7f31c40770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 --2- 192.168.123.106:0/2107393833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c4071980 0x7f31c4071d90 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:28:00.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.736+0000 7f31c9879700 1 -- 192.168.123.106:0/2107393833 >> 192.168.123.106:0/2107393833 conn(0x7f31c406d1a0 msgr2=0x7f31c406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:00.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 -- 192.168.123.106:0/2107393833 shutdown_connections 2026-03-09T17:28:00.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 -- 192.168.123.106:0/2107393833 wait complete. 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 Processor -- start 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 -- start start 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31c4072360 0x7f31c41b6010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 0x7f31c407f550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31c41b69c0 con 0x7f31c41b6550 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c9879700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31c41b6b30 con 0x7f31c4072360 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.737+0000 7f31c27fc700 1 
--2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 0x7f31c407f550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 0x7f31c407f550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:38526/0 (socket says 192.168.123.106:38526) 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 -- 192.168.123.106:0/1379026200 learned_addr learned my addr 192.168.123.106:0/1379026200 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 -- 192.168.123.106:0/1379026200 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31c4072360 msgr2=0x7f31c41b6010 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31c4072360 0x7f31c41b6010 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 -- 192.168.123.106:0/1379026200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31b40097e0 con 0x7f31c41b6550 2026-03-09T17:28:00.742 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.738+0000 7f31c27fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 
0x7f31c407f550 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f31bc00bf40 tx=0x7f31bc00bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.740+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f31bc00cb40 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.740+0000 7f31c9879700 1 -- 192.168.123.106:0/1379026200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f31c407fa90 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.742+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f31bc00cca0 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.742+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f31bc012740 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.742+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f31bc0129c0 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.743+0000 7f31c8877700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 0x7f31ac06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.743+0000 7f31c2ffd700 1 --2- 192.168.123.106:0/1379026200 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 0x7f31ac06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.743+0000 7f31c9879700 1 -- 192.168.123.106:0/1379026200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31c407ff80 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.744+0000 7f31c2ffd700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 0x7f31ac06ec50 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f31b4005850 tx=0x7f31b4000bc0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.744+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f31bc005e40 con 0x7f31c41b6550 2026-03-09T17:28:00.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.744+0000 7f31c9879700 1 -- 192.168.123.106:0/1379026200 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f31b0005320 con 0x7f31c41b6550 2026-03-09T17:28:00.750 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.747+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f31bc019050 con 0x7f31c41b6550 2026-03-09T17:28:00.893 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.889+0000 7f31c9879700 1 -- 192.168.123.106:0/1379026200 --> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f31b0000bf0 con 0x7f31ac06c7a0 2026-03-09T17:28:00.893 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.890+0000 7f31c8877700 1 -- 192.168.123.106:0/1379026200 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f31b0000bf0 con 0x7f31ac06c7a0 2026-03-09T17:28:00.896 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 msgr2=0x7f31ac06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 0x7f31ac06ec50 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f31b4005850 tx=0x7f31b4000bc0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 msgr2=0x7f31c407f550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 0x7f31c407f550 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f31bc00bf40 tx=0x7f31bc00bf70 comp rx=0 tx=0).stop 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 shutdown_connections 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 
7f31aa7fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31c4072360 0x7f31c41b6010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f31ac06c7a0 0x7f31ac06ec50 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 --2- 192.168.123.106:0/1379026200 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31c41b6550 0x7f31c407f550 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 >> 192.168.123.106:0/1379026200 conn(0x7f31c406d1a0 msgr2=0x7f31c40763a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 shutdown_connections 2026-03-09T17:28:00.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.894+0000 7f31aa7fc700 1 -- 192.168.123.106:0/1379026200 wait complete. 
2026-03-09T17:28:00.910 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.998+0000 7f4a5814b700 1 -- 192.168.123.106:0/630227128 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50071a90 msgr2=0x7f4a50071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:00.998+0000 7f4a5814b700 1 --2- 192.168.123.106:0/630227128 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50071a90 0x7f4a50071ea0 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f4a4c009b00 tx=0x7f4a4c009e10 comp rx=0 tx=0).stop 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.000+0000 7f4a5814b700 1 -- 192.168.123.106:0/630227128 shutdown_connections 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.000+0000 7f4a5814b700 1 --2- 192.168.123.106:0/630227128 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4a50072470 0x7f4a5010beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.000+0000 7f4a5814b700 1 --2- 192.168.123.106:0/630227128 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50071a90 0x7f4a50071ea0 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.000+0000 7f4a5814b700 1 -- 192.168.123.106:0/630227128 >> 192.168.123.106:0/630227128 conn(0x7f4a5006d1a0 msgr2=0x7f4a5006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 -- 192.168.123.106:0/630227128 shutdown_connections 2026-03-09T17:28:01.006 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 -- 192.168.123.106:0/630227128 wait complete. 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 Processor -- start 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 -- start start 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4a50072470 0x7f4a50116aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a501174e0 con 0x7f4a50116fe0 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.001+0000 7f4a5814b700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a50117650 con 0x7f4a50072470 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:38544/0 (socket says 192.168.123.106:38544) 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 -- 192.168.123.106:0/2857149337 learned_addr learned my addr 192.168.123.106:0/2857149337 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 -- 192.168.123.106:0/2857149337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4a50072470 msgr2=0x7f4a50116aa0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4a50072470 0x7f4a50116aa0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.002+0000 7f4a556e6700 1 -- 192.168.123.106:0/2857149337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a4c0097e0 con 0x7f4a50116fe0 2026-03-09T17:28:01.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.003+0000 7f4a556e6700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f4a4000d900 tx=0x7f4a4000dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.003+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a400098e0 con 
0x7f4a50116fe0 2026-03-09T17:28:01.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.005+0000 7f4a5814b700 1 -- 192.168.123.106:0/2857149337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a501b2d90 con 0x7f4a50116fe0 2026-03-09T17:28:01.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.005+0000 7f4a5814b700 1 -- 192.168.123.106:0/2857149337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a501b32b0 con 0x7f4a50116fe0 2026-03-09T17:28:01.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.006+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a40010460 con 0x7f4a50116fe0 2026-03-09T17:28:01.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.006+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a4000f5d0 con 0x7f4a50116fe0 2026-03-09T17:28:01.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.006+0000 7f4a5814b700 1 -- 192.168.123.106:0/2857149337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a34005320 con 0x7f4a50116fe0 2026-03-09T17:28:01.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.009+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4a40009c50 con 0x7f4a50116fe0 2026-03-09T17:28:01.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.009+0000 7f4a46ffd700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 0x7f4a3c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.012 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.010+0000 7f4a55ee7700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 0x7f4a3c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.010+0000 7f4a55ee7700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 0x7f4a3c06eb30 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f4a4c009ad0 tx=0x7f4a4c005bc0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.013 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.011+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f4a40059cf0 con 0x7f4a50116fe0 2026-03-09T17:28:01.061 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.011+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4a40059830 con 0x7f4a50116fe0 2026-03-09T17:28:01.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.228+0000 7f4a5814b700 1 -- 192.168.123.106:0/2857149337 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4a34000bf0 con 0x7f4a3c06c680 2026-03-09T17:28:01.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.230+0000 7f4a46ffd700 1 -- 192.168.123.106:0/2857149337 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f4a34000bf0 con 0x7f4a3c06c680 2026-03-09T17:28:01.235 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.232+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 msgr2=0x7f4a3c06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.232+0000 7f4a44ff9700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 0x7f4a3c06eb30 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f4a4c009ad0 tx=0x7f4a4c005bc0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.232+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 msgr2=0x7f4a501b27f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.232+0000 7f4a44ff9700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f4a4000d900 tx=0x7f4a4000dcc0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 shutdown_connections 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4a50072470 0x7f4a50116aa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f4a3c06c680 0x7f4a3c06eb30 unknown :-1 s=CLOSED 
pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 --2- 192.168.123.106:0/2857149337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a50116fe0 0x7f4a501b27f0 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 >> 192.168.123.106:0/2857149337 conn(0x7f4a5006d1a0 msgr2=0x7f4a500705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 shutdown_connections 2026-03-09T17:28:01.236 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.233+0000 7f4a44ff9700 1 -- 192.168.123.106:0/2857149337 wait complete. 2026-03-09T17:28:01.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.324+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2992876900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 msgr2=0x7fa7c410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.324+0000 7fa7c9b7e700 1 --2- 192.168.123.106:0/2992876900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c410be90 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7fa7b4009b00 tx=0x7fa7b4009e10 comp rx=0 tx=0).stop 2026-03-09T17:28:01.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.327+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2992876900 shutdown_connections 2026-03-09T17:28:01.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.327+0000 7fa7c9b7e700 1 --2- 192.168.123.106:0/2992876900 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c410be90 
unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.327+0000 7fa7c9b7e700 1 --2- 192.168.123.106:0/2992876900 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7c4071a60 0x7fa7c4071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.327+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2992876900 >> 192.168.123.106:0/2992876900 conn(0x7fa7c406d1a0 msgr2=0x7fa7c406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.328+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2992876900 shutdown_connections 2026-03-09T17:28:01.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.328+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2992876900 wait complete. 2026-03-09T17:28:01.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.328+0000 7fa7c9b7e700 1 Processor -- start 2026-03-09T17:28:01.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.328+0000 7fa7c9b7e700 1 -- start start 2026-03-09T17:28:01.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c9b7e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7c4071a60 0x7fa7c4116a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c9b7e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c9b7e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7fa7c4117570 con 0x7fa7c4072440 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c9b7e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7c41b2770 con 0x7fa7c4071a60 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:38564/0 (socket says 192.168.123.106:38564) 2026-03-09T17:28:01.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 -- 192.168.123.106:0/2105591422 learned_addr learned my addr 192.168.123.106:0/2105591422 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:01.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 -- 192.168.123.106:0/2105591422 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7c4071a60 msgr2=0x7fa7c4116a10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:28:01.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7c4071a60 0x7fa7c4116a10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 -- 
192.168.123.106:0/2105591422 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7b40097e0 con 0x7fa7c4072440 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.329+0000 7fa7c3fff700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7fa7b4009ad0 tx=0x7fa7b40049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.330+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7b401d070 con 0x7fa7c4072440 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.330+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa7b400bc50 con 0x7fa7c4072440 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.330+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7b400f790 con 0x7fa7c4072440 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.331+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2105591422 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa7c41b2910 con 0x7fa7c4072440 2026-03-09T17:28:01.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.331+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2105591422 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa7c41b2d80 con 0x7fa7c4072440 2026-03-09T17:28:01.336 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.333+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa7b4022ae0 con 0x7fa7c4072440 2026-03-09T17:28:01.336 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.333+0000 7fa7c1ffb700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 0x7fa7ac06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.334+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fa7b400bdc0 con 0x7fa7c4072440 2026-03-09T17:28:01.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.334+0000 7fa7c8b7c700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 0x7fa7ac06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.334+0000 7fa7c8b7c700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 0x7fa7ac06eb80 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fa7bc0060b0 tx=0x7fa7bc006040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.337 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.334+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2105591422 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa7c4110c20 con 0x7fa7c4072440 2026-03-09T17:28:01.341 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.337+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa7b4092050 con 0x7fa7c4072440 2026-03-09T17:28:01.497 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.494+0000 7fa7c9b7e700 1 -- 192.168.123.106:0/2105591422 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa7c4061190 con 0x7fa7ac06c6d0 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (2m) 42s ago 3m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (3m) 42s ago 3m 8078k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (2m) 43s ago 2m 8136k - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (3m) 42s ago 3m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (2m) 43s ago 2m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (2m) 42s ago 2m 84.7M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (48s) 42s ago 48s 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (50s) 42s ago 50s 19.5M - 18.2.0 dc2bc1663786 
4c8e86b2b8cd 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (49s) 43s ago 49s 14.7M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (47s) 43s ago 47s 13.4M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (3m) 42s ago 3m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (2m) 43s ago 2m 443M - 18.2.0 dc2bc1663786 e6525bf5de20 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (3m) 42s ago 3m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (2m) 43s ago 2m 47.4M 2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 42s ago 3m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (2m) 43s ago 2m 14.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (2m) 42s ago 2m 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (116s) 42s ago 116s 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (105s) 42s ago 105s 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (95s) 43s ago 95s 43.8M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (85s) 43s ago 85s 43.8M 4096M 
18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (75s) 43s ago 75s 42.5M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:28:01.507 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (2m) 42s ago 2m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c 2026-03-09T17:28:01.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.503+0000 7fa7c1ffb700 1 -- 192.168.123.106:0/2105591422 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fa7c4061190 con 0x7fa7ac06c6d0 2026-03-09T17:28:01.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.507+0000 7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 msgr2=0x7fa7ac06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.507+0000 7fa7ab7fe700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 0x7fa7ac06eb80 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fa7bc0060b0 tx=0x7fa7bc006040 comp rx=0 tx=0).stop 2026-03-09T17:28:01.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.507+0000 7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 msgr2=0x7fa7c4116f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.507+0000 7fa7ab7fe700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7fa7b4009ad0 tx=0x7fa7b40049e0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 
7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 shutdown_connections 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7c4071a60 0x7fa7c4116a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa7ac06c6d0 0x7fa7ac06eb80 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 --2- 192.168.123.106:0/2105591422 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7c4072440 0x7fa7c4116f50 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 >> 192.168.123.106:0/2105591422 conn(0x7fa7c406d1a0 msgr2=0x7fa7c410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 shutdown_connections 2026-03-09T17:28:01.511 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.508+0000 7fa7ab7fe700 1 -- 192.168.123.106:0/2105591422 wait complete. 
2026-03-09T17:28:01.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.597+0000 7fc224d69700 1 -- 192.168.123.106:0/2703698324 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220071950 msgr2=0x7fc220071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.602 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.597+0000 7fc224d69700 1 --2- 192.168.123.106:0/2703698324 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220071950 0x7fc220071d60 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7fc210009f10 tx=0x7fc210009b80 comp rx=0 tx=0).stop 2026-03-09T17:28:01.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.600+0000 7fc224d69700 1 -- 192.168.123.106:0/2703698324 shutdown_connections 2026-03-09T17:28:01.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.600+0000 7fc224d69700 1 --2- 192.168.123.106:0/2703698324 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc220072330 0x7fc2200770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.603 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.600+0000 7fc224d69700 1 --2- 192.168.123.106:0/2703698324 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220071950 0x7fc220071d60 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.600+0000 7fc224d69700 1 -- 192.168.123.106:0/2703698324 >> 192.168.123.106:0/2703698324 conn(0x7fc22006d1a0 msgr2=0x7fc22006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.601+0000 7fc224d69700 1 -- 192.168.123.106:0/2703698324 shutdown_connections 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.601+0000 7fc224d69700 1 -- 192.168.123.106:0/2703698324 
wait complete. 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.601+0000 7fc224d69700 1 Processor -- start 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.601+0000 7fc224d69700 1 -- start start 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc224d69700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc220072330 0x7fc2201312d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc224d69700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc224d69700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc220131d10 con 0x7fc220131810 2026-03-09T17:28:01.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc224d69700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc220131e80 con 0x7fc220072330 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:38586/0 (socket says 192.168.123.106:38586) 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 -- 192.168.123.106:0/3678598042 learned_addr learned my addr 192.168.123.106:0/3678598042 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 -- 192.168.123.106:0/3678598042 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc220072330 msgr2=0x7fc2201312d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc220072330 0x7fc2201312d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.602+0000 7fc21effd700 1 -- 192.168.123.106:0/3678598042 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc210009710 con 0x7fc220131810 2026-03-09T17:28:01.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.603+0000 7fc21effd700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7fc21800e3c0 tx=0x7fc21800e780 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.604+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc21800c170 con 0x7fc220131810 2026-03-09T17:28:01.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.605+0000 
7fc224d69700 1 -- 192.168.123.106:0/3678598042 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc22007f9d0 con 0x7fc220131810 2026-03-09T17:28:01.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.605+0000 7fc224d69700 1 -- 192.168.123.106:0/3678598042 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc22007ff20 con 0x7fc220131810 2026-03-09T17:28:01.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.606+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc21800f040 con 0x7fc220131810 2026-03-09T17:28:01.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.606+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc218014720 con 0x7fc220131810 2026-03-09T17:28:01.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.607+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc218014880 con 0x7fc220131810 2026-03-09T17:28:01.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.608+0000 7fc21cff9700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 0x7fc20806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.608+0000 7fc21f7fe700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 0x7fc20806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.611 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.609+0000 7fc21f7fe700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 0x7fc20806ec50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fc210005c50 tx=0x7fc210005b60 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.609+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fc21808cbc0 con 0x7fc220131810 2026-03-09T17:28:01.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.610+0000 7fc224d69700 1 -- 192.168.123.106:0/3678598042 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc20c005320 con 0x7fc220131810 2026-03-09T17:28:01.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.616+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc2180573c0 con 0x7fc220131810 2026-03-09T17:28:01.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.810+0000 7fc224d69700 1 -- 192.168.123.106:0/3678598042 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc20c006200 con 0x7fc220131810 2026-03-09T17:28:01.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.811+0000 7fc21cff9700 1 -- 192.168.123.106:0/3678598042 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fc21805a9e0 con 0x7fc220131810 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:28:01.816 
INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:28:01.816 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:28:01.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 msgr2=0x7fc20806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 --2- 192.168.123.106:0/3678598042 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 0x7fc20806ec50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fc210005c50 tx=0x7fc210005b60 comp rx=0 tx=0).stop 2026-03-09T17:28:01.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 msgr2=0x7fc22007f430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7fc21800e3c0 tx=0x7fc21800e780 comp rx=0 tx=0).stop 2026-03-09T17:28:01.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 shutdown_connections 2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc220072330 0x7fc2201312d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc20806c7a0 0x7fc20806ec50 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 --2- 192.168.123.106:0/3678598042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc220131810 0x7fc22007f430 secure :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7fc21800e3c0 tx=0x7fc21800e780 comp rx=0 tx=0).stop 
2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.815+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 >> 192.168.123.106:0/3678598042 conn(0x7fc22006d1a0 msgr2=0x7fc220070570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.816+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 shutdown_connections 2026-03-09T17:28:01.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.816+0000 7fc2067fc700 1 -- 192.168.123.106:0/3678598042 wait complete. 2026-03-09T17:28:01.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.922+0000 7effc445e700 1 -- 192.168.123.106:0/2867157599 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc071980 msgr2=0x7effbc071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.922+0000 7effc445e700 1 --2- 192.168.123.106:0/2867157599 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc071980 0x7effbc071d90 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7effb8009b00 tx=0x7effb8009e10 comp rx=0 tx=0).stop 2026-03-09T17:28:01.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 -- 192.168.123.106:0/2867157599 shutdown_connections 2026-03-09T17:28:01.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 --2- 192.168.123.106:0/2867157599 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc072360 0x7effbc0770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 --2- 192.168.123.106:0/2867157599 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc071980 0x7effbc071d90 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T17:28:01.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 -- 192.168.123.106:0/2867157599 >> 192.168.123.106:0/2867157599 conn(0x7effbc06d1a0 msgr2=0x7effbc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:01.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 -- 192.168.123.106:0/2867157599 shutdown_connections 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.923+0000 7effc445e700 1 -- 192.168.123.106:0/2867157599 wait complete. 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 Processor -- start 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 -- start start 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc072360 0x7effbc082500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effbc1b2a90 con 0x7effbc072360 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc445e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effbc1b2bd0 con 0x7effbc082a40 2026-03-09T17:28:01.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc19f9700 
1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc19f9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47376/0 (socket says 192.168.123.106:47376) 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc19f9700 1 -- 192.168.123.106:0/2763054946 learned_addr learned my addr 192.168.123.106:0/2763054946 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.924+0000 7effc21fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc072360 0x7effbc082500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.925+0000 7effc19f9700 1 -- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc072360 msgr2=0x7effbc082500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.925+0000 7effc19f9700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc072360 0x7effbc082500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.925+0000 7effc19f9700 1 -- 
192.168.123.106:0/2763054946 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7effb80097e0 con 0x7effbc082a40 2026-03-09T17:28:01.928 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.925+0000 7effc19f9700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7effb400ee40 tx=0x7effb400c620 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.926+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effb400e050 con 0x7effbc082a40 2026-03-09T17:28:01.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.926+0000 7effc445e700 1 -- 192.168.123.106:0/2763054946 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7effbc1b2d10 con 0x7effbc082a40 2026-03-09T17:28:01.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.926+0000 7effc445e700 1 -- 192.168.123.106:0/2763054946 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7effbc1b3260 con 0x7effbc082a40 2026-03-09T17:28:01.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.927+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7effb400f040 con 0x7effbc082a40 2026-03-09T17:28:01.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.927+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effb4013ea0 con 0x7effbc082a40 2026-03-09T17:28:01.931 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.928+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7effb4019070 con 0x7effbc082a40 2026-03-09T17:28:01.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.929+0000 7effb37fe700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 0x7effa806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:01.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.930+0000 7effc21fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 0x7effa806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:01.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.930+0000 7effc21fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 0x7effa806ec50 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7effb8000c00 tx=0x7effb8019040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:01.933 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.930+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7effb4090700 con 0x7effbc082a40 2026-03-09T17:28:01.933 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.931+0000 7effc445e700 1 -- 192.168.123.106:0/2763054946 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7effa0005320 con 0x7effbc082a40 2026-03-09T17:28:01.940 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:01.935+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7effb405e990 con 0x7effbc082a40 2026-03-09T17:28:02.081 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.079+0000 7effc445e700 1 -- 192.168.123.106:0/2763054946 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7effa0006200 con 0x7effbc082a40 2026-03-09T17:28:02.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.080+0000 7effb37fe700 1 -- 192.168.123.106:0/2763054946 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7effb405e520 con 0x7effbc082a40 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:28:02.085 
INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:balancer 
2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:02.085 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:02.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 msgr2=0x7effa806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 0x7effa806ec50 secure :-1 s=READY pgs=123 cs=0 l=1 
rev1=1 crypto rx=0x7effb8000c00 tx=0x7effb8019040 comp rx=0 tx=0).stop 2026-03-09T17:28:02.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 msgr2=0x7effbc082eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7effb400ee40 tx=0x7effb400c620 comp rx=0 tx=0).stop 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 shutdown_connections 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7effa806c7a0 0x7effa806ec50 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effbc072360 0x7effbc082500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 --2- 192.168.123.106:0/2763054946 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effbc082a40 0x7effbc082eb0 secure :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7effb400ee40 tx=0x7effb400c620 comp rx=0 tx=0).stop 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.088+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 >> 
192.168.123.106:0/2763054946 conn(0x7effbc06d1a0 msgr2=0x7effbc0705c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.089+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 shutdown_connections 2026-03-09T17:28:02.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.089+0000 7effb17fa700 1 -- 192.168.123.106:0/2763054946 wait complete. 2026-03-09T17:28:02.095 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.191+0000 7fc113009700 1 -- 192.168.123.106:0/1434789208 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c072330 msgr2=0x7fc10c0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.191+0000 7fc113009700 1 --2- 192.168.123.106:0/1434789208 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c072330 0x7fc10c0770b0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fc10400a390 tx=0x7fc10400a6a0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.192+0000 7fc113009700 1 -- 192.168.123.106:0/1434789208 shutdown_connections 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.192+0000 7fc113009700 1 --2- 192.168.123.106:0/1434789208 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c072330 0x7fc10c0770b0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.192+0000 7fc113009700 1 --2- 192.168.123.106:0/1434789208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc10c071950 0x7fc10c071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.200 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.192+0000 7fc113009700 1 -- 192.168.123.106:0/1434789208 >> 192.168.123.106:0/1434789208 conn(0x7fc10c06d1a0 msgr2=0x7fc10c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.192+0000 7fc113009700 1 -- 192.168.123.106:0/1434789208 shutdown_connections 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 -- 192.168.123.106:0/1434789208 wait complete. 2026-03-09T17:28:02.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 Processor -- start 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 -- start start 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 0x7fc10c1b6060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc10c1b65a0 0x7fc10c07f480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc10c1b6aa0 con 0x7fc10c1b65a0 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.196+0000 7fc113009700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc10c1b6c10 con 0x7fc10c071950 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 0x7fc10c1b6060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 0x7fc10c1b6060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47398/0 (socket says 192.168.123.106:47398) 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 -- 192.168.123.106:0/79867554 learned_addr learned my addr 192.168.123.106:0/79867554 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 -- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc10c1b65a0 msgr2=0x7fc10c07f480 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc10c1b65a0 0x7fc10c07f480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 -- 192.168.123.106:0/79867554 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc10400a040 con 0x7fc10c071950 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc112007700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 
0x7fc10c1b6060 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fc10800daa0 tx=0x7fc10800de60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc108012070 con 0x7fc10c071950 2026-03-09T17:28:02.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.197+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc10800cb40 con 0x7fc10c071950 2026-03-09T17:28:02.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.198+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc1080155a0 con 0x7fc10c071950 2026-03-09T17:28:02.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.199+0000 7fc113009700 1 -- 192.168.123.106:0/79867554 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc10c07fa20 con 0x7fc10c071950 2026-03-09T17:28:02.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.199+0000 7fc113009700 1 -- 192.168.123.106:0/79867554 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc10c07fea0 con 0x7fc10c071950 2026-03-09T17:28:02.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.199+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0f40052f0 con 0x7fc10c071950 2026-03-09T17:28:02.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.200+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 4 ==== 
mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc10800ccb0 con 0x7fc10c071950 2026-03-09T17:28:02.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.200+0000 7fc1037fe700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 0x7fc0f806e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.200+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fc10808b680 con 0x7fc10c071950 2026-03-09T17:28:02.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.201+0000 7fc111806700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 0x7fc0f806e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:02.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.201+0000 7fc111806700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 0x7fc0f806e980 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc104009750 tx=0x7fc1040096c0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:02.222 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.202+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc108055e80 con 0x7fc10c071950 2026-03-09T17:28:02.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.354+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: 
{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc0f4000bc0 con 0x7fc0f806c4d0 2026-03-09T17:28:02.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.356+0000 7fc1037fe700 1 -- 192.168.123.106:0/79867554 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fc0f4000bc0 con 0x7fc0f806c4d0 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "", 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:28:02.360 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:28:02.363 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.360+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 msgr2=0x7fc0f806e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.363 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.360+0000 7fc1017fa700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 0x7fc0f806e980 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc104009750 tx=0x7fc1040096c0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.363 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.361+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 msgr2=0x7fc10c1b6060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.363 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.361+0000 7fc1017fa700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 0x7fc10c1b6060 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fc10800daa0 tx=0x7fc10800de60 comp rx=0 tx=0).stop 2026-03-09T17:28:02.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.362+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 shutdown_connections 2026-03-09T17:28:02.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.362+0000 7fc1017fa700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc10c071950 0x7fc10c1b6060 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.362+0000 7fc1017fa700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc0f806c4d0 0x7fc0f806e980 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.362+0000 7fc1017fa700 1 --2- 192.168.123.106:0/79867554 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc10c1b65a0 0x7fc10c07f480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.364 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.362+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 >> 192.168.123.106:0/79867554 conn(0x7fc10c06d1a0 msgr2=0x7fc10c076440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:02.365 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.363+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 shutdown_connections 2026-03-09T17:28:02.365 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.363+0000 7fc1017fa700 1 -- 192.168.123.106:0/79867554 wait complete. 2026-03-09T17:28:02.452 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:02 vm06.local ceph-mon[57307]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:02.452 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:02 vm06.local ceph-mon[57307]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:02.452 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:02 vm06.local ceph-mon[57307]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:02.452 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:02 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/3678598042' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:28:02.452 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:02 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/2763054946' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:28:02.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.446+0000 7ffb5a962700 1 -- 192.168.123.106:0/2573108936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54071980 msgr2=0x7ffb54071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.446+0000 7ffb5a962700 1 --2- 192.168.123.106:0/2573108936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54071980 0x7ffb54071d90 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7ffb44008790 tx=0x7ffb44008aa0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- 192.168.123.106:0/2573108936 shutdown_connections 2026-03-09T17:28:02.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 --2- 192.168.123.106:0/2573108936 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54072360 0x7ffb540770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 --2- 192.168.123.106:0/2573108936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54071980 0x7ffb54071d90 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- 192.168.123.106:0/2573108936 >> 192.168.123.106:0/2573108936 conn(0x7ffb5406d1a0 msgr2=0x7ffb5406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- 192.168.123.106:0/2573108936 shutdown_connections 2026-03-09T17:28:02.453 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- 192.168.123.106:0/2573108936 wait complete. 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 Processor -- start 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- start start 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54072360 0x7ffb54080300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb54081cb0 con 0x7ffb54072360 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.447+0000 7ffb5a962700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb5412dd80 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47418/0 (socket says 192.168.123.106:47418) 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 -- 192.168.123.106:0/2781247522 learned_addr learned my addr 192.168.123.106:0/2781247522 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 -- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54072360 msgr2=0x7ffb54080300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54072360 0x7ffb54080300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 -- 192.168.123.106:0/2781247522 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffb44008440 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb537fe700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7ffb4c00d670 tx=0x7ffb4c0086d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb4c0175a0 con 
0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb5a962700 1 -- 192.168.123.106:0/2781247522 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffb5412e000 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.448+0000 7ffb5a962700 1 -- 192.168.123.106:0/2781247522 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffb5412e550 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.449+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffb4c00d810 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.449+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb4c016b40 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.450+0000 7ffb5a962700 1 -- 192.168.123.106:0/2781247522 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffb40005320 con 0x7ffb54080840 2026-03-09T17:28:02.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.451+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7ffb4c016ca0 con 0x7ffb54080840 2026-03-09T17:28:02.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.451+0000 7ffb517fa700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 0x7ffb3c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:02.454 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.451+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7ffb4c03d020 con 0x7ffb54080840 2026-03-09T17:28:02.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.451+0000 7ffb53fff700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 0x7ffb3c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:02.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.452+0000 7ffb53fff700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 0x7ffb3c06eb80 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7ffb44008790 tx=0x7ffb4400b320 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:02.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.453+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ffb4c059f60 con 0x7ffb54080840 2026-03-09T17:28:02.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.606+0000 7ffb5a962700 1 -- 192.168.123.106:0/2781247522 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ffb40005190 con 0x7ffb54080840 2026-03-09T17:28:02.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.607+0000 7ffb517fa700 1 -- 192.168.123.106:0/2781247522 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ffb4c059af0 con 0x7ffb54080840 2026-03-09T17:28:02.610 
INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 msgr2=0x7ffb3c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 0x7ffb3c06eb80 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7ffb44008790 tx=0x7ffb4400b320 comp rx=0 tx=0).stop 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 msgr2=0x7ffb54080cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7ffb4c00d670 tx=0x7ffb4c0086d0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 shutdown_connections 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7ffb3c06c6d0 0x7ffb3c06eb80 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 --2- 192.168.123.106:0/2781247522 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb54072360 0x7ffb54080300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 --2- 192.168.123.106:0/2781247522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ffb54080840 0x7ffb54080cb0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:02.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 >> 192.168.123.106:0/2781247522 conn(0x7ffb5406d1a0 msgr2=0x7ffb54070300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:02.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 shutdown_connections 2026-03-09T17:28:02.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:02.610+0000 7ffb3affd700 1 -- 192.168.123.106:0/2781247522 wait complete. 2026-03-09T17:28:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:02 vm09.local ceph-mon[62061]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:02.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:02 vm09.local ceph-mon[62061]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:02.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:02 vm09.local ceph-mon[62061]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:02.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:02 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/3678598042' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:28:02.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:02 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/2763054946' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:28:03.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:03 vm06.local ceph-mon[57307]: from='client.14588 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:03 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/2781247522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:28:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:03 vm09.local ceph-mon[62061]: from='client.14588 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:03 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/2781247522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:28:04.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:04 vm06.local ceph-mon[57307]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:04.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:04 vm06.local ceph-mon[57307]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:04.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:04 vm09.local ceph-mon[62061]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:04.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:04 vm09.local ceph-mon[62061]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:06.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:06 vm06.local ceph-mon[57307]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:06.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:06 vm09.local ceph-mon[62061]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:08.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:08 vm06.local ceph-mon[57307]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:08.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:08 vm09.local ceph-mon[62061]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:09.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:09 vm06.local 
ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:09 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:10 vm06.local ceph-mon[57307]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:10 vm09.local ceph-mon[62061]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:11.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:11 vm06.local ceph-mon[57307]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:11.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:11 vm09.local ceph-mon[62061]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:14.601 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:14 vm06.local ceph-mon[57307]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:14.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:14 vm09.local ceph-mon[62061]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:16 vm06.local ceph-mon[57307]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:16.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:16 
vm09.local ceph-mon[62061]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:18.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:18 vm06.local ceph-mon[57307]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:18.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:18 vm09.local ceph-mon[62061]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:20.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:20 vm06.local ceph-mon[57307]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:20.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:20 vm09.local ceph-mon[62061]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:22.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:22 vm06.local ceph-mon[57307]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:22 vm09.local ceph-mon[62061]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:23.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:23 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:23.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:23 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 17:28:24 vm06.local ceph-mon[57307]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:24 vm09.local ceph-mon[62061]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:28:25.608 INFO:tasks.workunit.client.0.vm06.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T17:28:25.608 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: git switch -c <new-branch-name> 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:Or undo this operation with: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: git switch - 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr: 2026-03-09T17:28:25.609 INFO:tasks.workunit.client.0.vm06.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T17:28:25.615 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-09T17:28:25.635 INFO:tasks.workunit.client.0.vm06.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T17:28:25.638 INFO:tasks.workunit.client.0.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T17:28:25.638 INFO:tasks.workunit.client.0.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T17:28:25.729 INFO:tasks.workunit.client.0.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T17:28:25.766 INFO:tasks.workunit.client.0.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T17:28:25.797 INFO:tasks.workunit.client.0.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-09T17:28:25.799
INFO:tasks.workunit.client.0.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T17:28:25.799 INFO:tasks.workunit.client.0.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T17:28:25.833 INFO:tasks.workunit.client.0.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-09T17:28:25.836 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:28:25.836 DEBUG:teuthology.orchestra.run.vm06:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-09T17:28:25.896 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-09T17:28:25.897 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-09T17:28:25.897 DEBUG:teuthology.orchestra.run.vm06:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-09T17:28:25.968 INFO:tasks.workunit.client.0.vm06.stderr:+ mkdir -p fsstress 2026-03-09T17:28:25.970 INFO:tasks.workunit.client.0.vm06.stderr:+ pushd fsstress 2026-03-09T17:28:25.971 INFO:tasks.workunit.client.0.vm06.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T17:28:25.972 INFO:tasks.workunit.client.0.vm06.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T17:28:26.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:26 vm06.local ceph-mon[57307]: pgmap v116: 
65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:26.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:26 vm09.local ceph-mon[62061]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:28:27.471 INFO:tasks.workunit.client.0.vm06.stderr:+ tar xzf ltp-full.tgz 2026-03-09T17:28:28.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:28 vm06.local ceph-mon[57307]: pgmap v117: 65 pgs: 65 active+clean; 471 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.8 KiB/s rd, 1.1 KiB/s wr, 2 op/s 2026-03-09T17:28:28.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:28 vm09.local ceph-mon[62061]: pgmap v117: 65 pgs: 65 active+clean; 471 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.8 KiB/s rd, 1.1 KiB/s wr, 2 op/s 2026-03-09T17:28:29.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:29 vm06.local ceph-mon[57307]: pgmap v118: 65 pgs: 65 active+clean; 5.0 MiB data, 170 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 394 KiB/s wr, 12 op/s 2026-03-09T17:28:29.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:29 vm09.local ceph-mon[62061]: pgmap v118: 65 pgs: 65 active+clean; 5.0 MiB data, 170 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 394 KiB/s wr, 12 op/s 2026-03-09T17:28:31.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:31 vm06.local ceph-mon[57307]: pgmap v119: 65 pgs: 65 active+clean; 6.0 MiB data, 186 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 500 KiB/s wr, 39 op/s 2026-03-09T17:28:31.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:31 vm09.local ceph-mon[62061]: pgmap v119: 65 pgs: 65 active+clean; 6.0 MiB data, 186 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s rd, 500 KiB/s wr, 39 op/s 2026-03-09T17:28:32.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.730+0000 7f2da4062700 1 -- 192.168.123.106:0/240628850 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1080e0 msgr2=0x7f2d9c1084f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:32.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.730+0000 7f2da4062700 1 --2- 192.168.123.106:0/240628850 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1080e0 0x7f2d9c1084f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f2d9400d3e0 tx=0x7f2d9400d6f0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 -- 192.168.123.106:0/240628850 shutdown_connections 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 --2- 192.168.123.106:0/240628850 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c071dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 --2- 192.168.123.106:0/240628850 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1080e0 0x7f2d9c1084f0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 -- 192.168.123.106:0/240628850 >> 192.168.123.106:0/240628850 conn(0x7f2d9c06d3e0 msgr2=0x7f2d9c06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 -- 192.168.123.106:0/240628850 shutdown_connections 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.731+0000 7f2da4062700 1 -- 192.168.123.106:0/240628850 wait complete. 
2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 Processor -- start 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 -- start start 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c1b71f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:32.734 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d9c1b7c30 con 0x7f2d9c071960 2026-03-09T17:28:32.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.732+0000 7f2da4062700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d9c1b7da0 con 0x7f2d9c1b7730 2026-03-09T17:28:32.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.733+0000 7f2da15fd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:32.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.733+0000 7f2da15fd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:51122/0 (socket says 192.168.123.106:51122) 2026-03-09T17:28:32.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.733+0000 7f2da15fd700 1 -- 192.168.123.106:0/3571609058 learned_addr learned my addr 192.168.123.106:0/3571609058 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.733+0000 7f2da1dfe700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c1b71f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da15fd700 1 -- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 msgr2=0x7f2d9c1b71f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da15fd700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c1b71f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da15fd700 1 -- 192.168.123.106:0/3571609058 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d9400d090 con 0x7f2d9c1b7730 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da15fd700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f2d98008cf0 tx=0x7f2d9800e410 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da1dfe700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c1b71f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:28:32.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d9800f040 con 0x7f2d9c1b7730 2026-03-09T17:28:32.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d9c07e9d0 con 0x7f2d9c1b7730 2026-03-09T17:28:32.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.734+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d9c07eec0 con 0x7f2d9c1b7730 2026-03-09T17:28:32.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.735+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d98008e90 con 0x7f2d9c1b7730 2026-03-09T17:28:32.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.735+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d98016440 con 0x7f2d9c1b7730 2026-03-09T17:28:32.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.736+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f2d980165a0 con 0x7f2d9c1b7730 2026-03-09T17:28:32.738 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.736+0000 7f2d92ffd700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 0x7f2d8806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:32.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.736+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f2d9808cb90 con 0x7f2d9c1b7730 2026-03-09T17:28:32.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.737+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d80005320 con 0x7f2d9c1b7730 2026-03-09T17:28:32.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.738+0000 7f2da1dfe700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 0x7f2d8806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:32.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.738+0000 7f2da1dfe700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 0x7f2d8806ec50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f2d94000f80 tx=0x7f2d9400da40 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:32.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.741+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2d98057390 con 0x7f2d9c1b7730 
2026-03-09T17:28:32.969 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.967+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2d80000bf0 con 0x7f2d8806c7a0 2026-03-09T17:28:32.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.969+0000 7f2d92ffd700 1 -- 192.168.123.106:0/3571609058 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f2d80000bf0 con 0x7f2d8806c7a0 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 msgr2=0x7f2d8806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 0x7f2d8806ec50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f2d94000f80 tx=0x7f2d9400da40 comp rx=0 tx=0).stop 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 msgr2=0x7f2d9c07e490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f2d98008cf0 tx=0x7f2d9800e410 comp rx=0 tx=0).stop 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 -- 
192.168.123.106:0/3571609058 shutdown_connections 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f2d8806c7a0 0x7f2d8806ec50 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d9c071960 0x7f2d9c1b71f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 --2- 192.168.123.106:0/3571609058 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d9c1b7730 0x7f2d9c07e490 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.974+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 >> 192.168.123.106:0/3571609058 conn(0x7f2d9c06d3e0 msgr2=0x7f2d9c0705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.975+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 shutdown_connections 2026-03-09T17:28:32.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:32.975+0000 7f2da4062700 1 -- 192.168.123.106:0/3571609058 wait complete. 
2026-03-09T17:28:33.002 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:28:33.118 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 -- 192.168.123.106:0/2243383053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e0071950 msgr2=0x7f20e0071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 --2- 192.168.123.106:0/2243383053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e0071950 0x7f20e0071d60 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f20d0009ab0 tx=0x7f20d0009dc0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 -- 192.168.123.106:0/2243383053 shutdown_connections 2026-03-09T17:28:33.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 --2- 192.168.123.106:0/2243383053 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f20e0072330 0x7f20e00770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 --2- 192.168.123.106:0/2243383053 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e0071950 0x7f20e0071d60 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.114+0000 7f20e53db700 1 -- 192.168.123.106:0/2243383053 >> 192.168.123.106:0/2243383053 conn(0x7f20e006d1a0 msgr2=0x7f20e006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.115+0000 7f20e53db700 1 -- 192.168.123.106:0/2243383053 shutdown_connections 2026-03-09T17:28:33.147 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.115+0000 7f20e53db700 1 -- 192.168.123.106:0/2243383053 wait complete. 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.115+0000 7f20e53db700 1 Processor -- start 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20e53db700 1 -- start start 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20e53db700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f20e0072330 0x7f20e00824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20e53db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20e53db700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20e0083e50 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20e53db700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f20e01b2a90 con 0x7f20e0072330 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20df7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51908/0 (socket says 192.168.123.106:51908) 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20df7fe700 1 -- 192.168.123.106:0/2651846964 learned_addr learned my addr 192.168.123.106:0/2651846964 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20dffff700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f20e0072330 0x7f20e00824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20df7fe700 1 -- 192.168.123.106:0/2651846964 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f20e0072330 msgr2=0x7f20e00824a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.116+0000 7f20df7fe700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f20e0072330 0x7f20e00824a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.117+0000 7f20df7fe700 1 -- 192.168.123.106:0/2651846964 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20d0009710 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.117+0000 7f20df7fe700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto 
rx=0x7f20d800c7a0 tx=0x7f20d800cb60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.117+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20d8012030 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.117+0000 7f20e53db700 1 -- 192.168.123.106:0/2651846964 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f20e01b2cf0 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.117+0000 7f20e53db700 1 -- 192.168.123.106:0/2651846964 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f20e01b31f0 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.122+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f20d80075c0 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.122+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f20d80119b0 con 0x7f20e00829e0 2026-03-09T17:28:33.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.122+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f20d8011c20 con 0x7f20e00829e0 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.122+0000 7f20e53db700 1 -- 192.168.123.106:0/2651846964 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f20e004ea50 con 0x7f20e00829e0 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.127+0000 7f20dd7fa700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 0x7f20c806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.127+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f20d8090ac0 con 0x7f20e00829e0 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.127+0000 7f20dffff700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 0x7f20c806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.128+0000 7f20dffff700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 0x7f20c806ed20 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f20d000b5c0 tx=0x7f20d0011040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:33.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.129+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f20d805ed50 con 0x7f20e00829e0 2026-03-09T17:28:33.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.299+0000 7f20e53db700 1 -- 192.168.123.106:0/2651846964 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7f20e0076900 con 0x7f20c806c870 2026-03-09T17:28:33.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.302+0000 7f20dd7fa700 1 -- 192.168.123.106:0/2651846964 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f20e0076900 con 0x7f20c806c870 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.307+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 msgr2=0x7f20c806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.307+0000 7f20c6ffd700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 0x7f20c806ed20 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f20d000b5c0 tx=0x7f20d0011040 comp rx=0 tx=0).stop 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.307+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 msgr2=0x7f20e0082e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.307+0000 7f20c6ffd700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7f20d800c7a0 tx=0x7f20d800cb60 comp rx=0 tx=0).stop 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 shutdown_connections 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f20e0072330 0x7f20e00824a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f20c806c870 0x7f20c806ed20 secure :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f20d000b5c0 tx=0x7f20d0011040 comp rx=0 tx=0).stop 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 --2- 192.168.123.106:0/2651846964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f20e00829e0 0x7f20e0082e50 unknown :-1 s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 >> 192.168.123.106:0/2651846964 conn(0x7f20e006d1a0 msgr2=0x7f20e0070640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 shutdown_connections 2026-03-09T17:28:33.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.308+0000 7f20c6ffd700 1 -- 192.168.123.106:0/2651846964 wait complete. 
2026-03-09T17:28:33.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.442+0000 7fd6d0f03700 1 -- 192.168.123.106:0/1564939636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 msgr2=0x7fd6cc10beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.442+0000 7fd6d0f03700 1 --2- 192.168.123.106:0/1564939636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc10beb0 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7fd6c400b3a0 tx=0x7fd6c400b6b0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.443+0000 7fd6d0f03700 1 -- 192.168.123.106:0/1564939636 shutdown_connections 2026-03-09T17:28:33.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.443+0000 7fd6d0f03700 1 --2- 192.168.123.106:0/1564939636 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc10beb0 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.443+0000 7fd6d0f03700 1 --2- 192.168.123.106:0/1564939636 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.443+0000 7fd6d0f03700 1 -- 192.168.123.106:0/1564939636 >> 192.168.123.106:0/1564939636 conn(0x7fd6cc06d1a0 msgr2=0x7fd6cc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:33.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.444+0000 7fd6d0f03700 1 -- 192.168.123.106:0/1564939636 shutdown_connections 2026-03-09T17:28:33.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.444+0000 7fd6d0f03700 1 -- 192.168.123.106:0/1564939636 
wait complete. 2026-03-09T17:28:33.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 Processor -- start 2026-03-09T17:28:33.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 -- start start 2026-03-09T17:28:33.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc1a4c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.447 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc1a51d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6cc1a57d0 con 0x7fd6cc072470 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.445+0000 7fd6d0f03700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6cc1a5940 con 0x7fd6cc071a90 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6ca59c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc1a4c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6ca59c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc1a4c90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:51158/0 (socket says 192.168.123.106:51158) 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6ca59c700 1 -- 192.168.123.106:0/3310299201 learned_addr learned my addr 192.168.123.106:0/3310299201 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6c9d9b700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc1a51d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6c9d9b700 1 -- 192.168.123.106:0/3310299201 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 msgr2=0x7fd6cc1a4c90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6c9d9b700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc1a4c90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.446+0000 7fd6c9d9b700 1 -- 192.168.123.106:0/3310299201 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6c400b050 con 0x7fd6cc072470 2026-03-09T17:28:33.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.447+0000 7fd6c9d9b700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc1a51d0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7fd6cc107dc0 tx=0x7fd6c4003c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T17:28:33.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.447+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6c400e070 con 0x7fd6cc072470 2026-03-09T17:28:33.450 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.447+0000 7fd6d0f03700 1 -- 192.168.123.106:0/3310299201 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6cc10f5c0 con 0x7fd6cc072470 2026-03-09T17:28:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.447+0000 7fd6d0f03700 1 -- 192.168.123.106:0/3310299201 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6cc10fb10 con 0x7fd6cc072470 2026-03-09T17:28:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.449+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd6c4007b10 con 0x7fd6cc072470 2026-03-09T17:28:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.449+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6c401bd40 con 0x7fd6cc072470 2026-03-09T17:28:33.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.450+0000 7fd6d0f03700 1 -- 192.168.123.106:0/3310299201 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6ac005320 con 0x7fd6cc072470 2026-03-09T17:28:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.452+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd6c4019040 con 0x7fd6cc072470 2026-03-09T17:28:33.454 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.452+0000 7fd6bb7fe700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 0x7fd6b406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.452+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd6c4029030 con 0x7fd6cc072470 2026-03-09T17:28:33.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.452+0000 7fd6ca59c700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 0x7fd6b406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.454+0000 7fd6ca59c700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 0x7fd6b406eb80 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd6cc1a5b70 tx=0x7fd6bc009450 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:33.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.454+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd6c40576c0 con 0x7fd6cc072470 2026-03-09T17:28:33.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.671+0000 7fd6d0f03700 1 -- 192.168.123.106:0/3310299201 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd6ac000bf0 con 0x7fd6b406c6d0 
2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (2m) 75s ago 3m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (3m) 75s ago 3m 8078k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (3m) 75s ago 3m 8136k - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (3m) 75s ago 3m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (3m) 75s ago 3m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (2m) 75s ago 3m 84.7M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (80s) 75s ago 80s 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (82s) 75s ago 82s 19.5M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (81s) 75s ago 81s 14.7M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (79s) 75s ago 79s 13.4M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (4m) 75s ago 4m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running 
(2m) 75s ago 2m 443M - 18.2.0 dc2bc1663786 e6525bf5de20 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (4m) 75s ago 4m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:28:33.684 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (2m) 75s ago 2m 47.4M 2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 75s ago 3m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (3m) 75s ago 3m 14.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (2m) 75s ago 2m 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (2m) 75s ago 2m 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (2m) 75s ago 2m 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (2m) 75s ago 2m 43.8M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (117s) 75s ago 117s 43.8M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (107s) 75s ago 107s 42.5M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (2m) 75s ago 3m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.677+0000 7fd6bb7fe700 1 -- 192.168.123.106:0/3310299201 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fd6ac000bf0 con 
0x7fd6b406c6d0 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 msgr2=0x7fd6b406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 0x7fd6b406eb80 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd6cc1a5b70 tx=0x7fd6bc009450 comp rx=0 tx=0).stop 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 msgr2=0x7fd6cc1a51d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc1a51d0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7fd6cc107dc0 tx=0x7fd6c4003c30 comp rx=0 tx=0).stop 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 shutdown_connections 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd6cc071a90 0x7fd6cc1a4c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fd6b406c6d0 
0x7fd6b406eb80 secure :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd6cc1a5b70 tx=0x7fd6bc009450 comp rx=0 tx=0).stop 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 --2- 192.168.123.106:0/3310299201 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6cc072470 0x7fd6cc1a51d0 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 >> 192.168.123.106:0/3310299201 conn(0x7fd6cc06d1a0 msgr2=0x7fd6cc10b5d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 shutdown_connections 2026-03-09T17:28:33.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.680+0000 7fd6b97fa700 1 -- 192.168.123.106:0/3310299201 wait complete. 
2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 -- 192.168.123.106:0/761559614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0071a90 msgr2=0x7f62a0071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 --2- 192.168.123.106:0/761559614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0071a90 0x7f62a0071ea0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f629800b3a0 tx=0x7f629800b6b0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 -- 192.168.123.106:0/761559614 shutdown_connections 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 --2- 192.168.123.106:0/761559614 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0072470 0x7f62a010beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 --2- 192.168.123.106:0/761559614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0071a90 0x7f62a0071ea0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 -- 192.168.123.106:0/761559614 >> 192.168.123.106:0/761559614 conn(0x7f62a006d1a0 msgr2=0x7f62a006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 -- 192.168.123.106:0/761559614 shutdown_connections 2026-03-09T17:28:33.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.817+0000 7f62a7ace700 1 -- 192.168.123.106:0/761559614 wait 
complete. 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 Processor -- start 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 -- start start 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0072470 0x7f62a0116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f62a0117490 con 0x7f62a0072470 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a7ace700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f62a0117600 con 0x7f62a0116f90 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a5069700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a5069700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I 
am v2:192.168.123.106:51170/0 (socket says 192.168.123.106:51170) 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.818+0000 7f62a5069700 1 -- 192.168.123.106:0/3617062438 learned_addr learned my addr 192.168.123.106:0/3617062438 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.819+0000 7f62a5069700 1 -- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0072470 msgr2=0x7f62a0116a50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.819+0000 7f62a5069700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0072470 0x7f62a0116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.819+0000 7f62a5069700 1 -- 192.168.123.106:0/3617062438 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f629800b050 con 0x7f62a0116f90 2026-03-09T17:28:33.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.819+0000 7f62a5069700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f629000b6e0 tx=0x7f629000baa0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:33.825 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.821+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f629000f800 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.821+0000 7f6296ffd700 1 
-- 192.168.123.106:0/3617062438 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f629000fe40 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.821+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f629000d5f0 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.822+0000 7f62a7ace700 1 -- 192.168.123.106:0/3617062438 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f62a01b2dc0 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.822+0000 7f62a7ace700 1 -- 192.168.123.106:0/3617062438 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f62a01b32e0 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.823+0000 7f62a7ace700 1 -- 192.168.123.106:0/3617062438 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f62a0110c20 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.825+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f629001e030 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.825+0000 7f6296ffd700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 0x7f628c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.825+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 
v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f629008b600 con 0x7f62a0116f90 2026-03-09T17:28:33.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.827+0000 7f62a586a700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 0x7f628c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:33.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.827+0000 7f62a586a700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 0x7f628c06eb80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f62980062a0 tx=0x7f6298006210 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:33.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:33.831+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6290059890 con 0x7f62a0116f90 2026-03-09T17:28:34.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:33 vm06.local ceph-mon[57307]: from='client.24399 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:34.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:33 vm06.local ceph-mon[57307]: pgmap v120: 65 pgs: 65 active+clean; 18 MiB data, 228 MiB used, 120 GiB / 120 GiB avail; 258 KiB/s rd, 1.5 MiB/s wr, 111 op/s 2026-03-09T17:28:34.093 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:33 vm06.local ceph-mon[57307]: from='client.14606 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout:{ 
2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.087+0000 7f62a7ace700 1 -- 192.168.123.106:0/3617062438 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f62a004eab0 con 0x7f62a0116f90 2026-03-09T17:28:34.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.088+0000 7f6296ffd700 1 -- 192.168.123.106:0/3617062438 <== mon.1 
v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f6290059420 con 0x7f62a0116f90 2026-03-09T17:28:34.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.092+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 msgr2=0x7f628c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.092+0000 7f6294f79700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 0x7f628c06eb80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f62980062a0 tx=0x7f6298006210 comp rx=0 tx=0).stop 2026-03-09T17:28:34.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.092+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 msgr2=0x7f62a01b2820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.092+0000 7f6294f79700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f629000b6e0 tx=0x7f629000baa0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 shutdown_connections 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f628c06c6d0 0x7f628c06eb80 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.096 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f62a0072470 0x7f62a0116a50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 --2- 192.168.123.106:0/3617062438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f62a0116f90 0x7f62a01b2820 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 >> 192.168.123.106:0/3617062438 conn(0x7f62a006d1a0 msgr2=0x7f62a010b260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.093+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 shutdown_connections 2026-03-09T17:28:34.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.094+0000 7f6294f79700 1 -- 192.168.123.106:0/3617062438 wait complete. 
2026-03-09T17:28:34.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:33 vm09.local ceph-mon[62061]: from='client.24399 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:34.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:33 vm09.local ceph-mon[62061]: pgmap v120: 65 pgs: 65 active+clean; 18 MiB data, 228 MiB used, 120 GiB / 120 GiB avail; 258 KiB/s rd, 1.5 MiB/s wr, 111 op/s 2026-03-09T17:28:34.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:33 vm09.local ceph-mon[62061]: from='client.14606 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 -- 192.168.123.106:0/357378870 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c072360 msgr2=0x7f781c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 --2- 192.168.123.106:0/357378870 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c072360 0x7f781c0770e0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f781400d3f0 tx=0x7f781400d700 comp rx=0 tx=0).stop 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 -- 192.168.123.106:0/357378870 shutdown_connections 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 --2- 192.168.123.106:0/357378870 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c072360 0x7f781c0770e0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 --2- 192.168.123.106:0/357378870 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.274+0000 7f7823126700 1 -- 192.168.123.106:0/357378870 >> 192.168.123.106:0/357378870 conn(0x7f781c06d1a0 msgr2=0x7f781c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.275+0000 7f7823126700 1 -- 192.168.123.106:0/357378870 shutdown_connections 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.275+0000 7f7823126700 1 -- 192.168.123.106:0/357378870 wait complete. 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 Processor -- start 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 -- start start 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c1313e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 0x7f781c07f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f781c131e20 con 0x7f781c071980 2026-03-09T17:28:34.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.276+0000 7f7823126700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7f781c131f90 con 0x7f781c131920 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.278+0000 7f781bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 0x7f781c07f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.279+0000 7f7820ec2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c1313e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.279+0000 7f7820ec2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c1313e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51956/0 (socket says 192.168.123.106:51956) 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.279+0000 7f7820ec2700 1 -- 192.168.123.106:0/177767284 learned_addr learned my addr 192.168.123.106:0/177767284 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.280+0000 7f781bfff700 1 -- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 msgr2=0x7f781c1313e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.280+0000 7f781bfff700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c1313e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:28:34.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.280+0000 7f781bfff700 1 -- 192.168.123.106:0/177767284 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7814007ed0 con 0x7f781c131920 2026-03-09T17:28:34.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.280+0000 7f781bfff700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 0x7f781c07f580 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f781c072ff0 tx=0x7f7814003ce0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:34.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.281+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f781401c070 con 0x7f781c131920 2026-03-09T17:28:34.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.281+0000 7f7823126700 1 -- 192.168.123.106:0/177767284 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f781c07fac0 con 0x7f781c131920 2026-03-09T17:28:34.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.281+0000 7f7823126700 1 -- 192.168.123.106:0/177767284 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f781c07ff80 con 0x7f781c131920 2026-03-09T17:28:34.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.282+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f781400deb0 con 0x7f781c131920 2026-03-09T17:28:34.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.282+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f7814017bf0 con 0x7f781c131920 2026-03-09T17:28:34.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.283+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f78140043c0 con 0x7f781c131920 2026-03-09T17:28:34.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.283+0000 7f7823126700 1 -- 192.168.123.106:0/177767284 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f781c12b500 con 0x7f781c131920 2026-03-09T17:28:34.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.286+0000 7f7819ffb700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 0x7f780406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.289 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.287+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7814013070 con 0x7f781c131920 2026-03-09T17:28:34.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.288+0000 7f7820ec2700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 0x7f780406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.289+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7814058a70 con 0x7f781c131920 2026-03-09T17:28:34.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.289+0000 7f7820ec2700 1 --2- 
192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 0x7f780406ec50 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f781c07b4f0 tx=0x7f780c004080 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:34.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.546+0000 7f7823126700 1 -- 192.168.123.106:0/177767284 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f781c02d0b0 con 0x7f781c131920 2026-03-09T17:28:34.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.547+0000 7f7819ffb700 1 -- 192.168.123.106:0/177767284 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7f781c02d0b0 con 0x7f781c131920 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:28:34.550 
INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:28:34.550 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:balancer 
2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:34.551 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:28:34.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.551+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 msgr2=0x7f780406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.551+0000 7f78037fe700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 0x7f780406ec50 secure :-1 s=READY pgs=130 cs=0 l=1 
rev1=1 crypto rx=0x7f781c07b4f0 tx=0x7f780c004080 comp rx=0 tx=0).stop 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.551+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 msgr2=0x7f781c07f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.551+0000 7f78037fe700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 0x7f781c07f580 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f781c072ff0 tx=0x7f7814003ce0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.552+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 shutdown_connections 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.552+0000 7f78037fe700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f780406c7a0 0x7f780406ec50 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.552+0000 7f78037fe700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f781c071980 0x7f781c1313e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.552+0000 7f78037fe700 1 --2- 192.168.123.106:0/177767284 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f781c131920 0x7f781c07f580 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.552+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 >> 192.168.123.106:0/177767284 
conn(0x7f781c06d1a0 msgr2=0x7f781c0764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:34.555 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.553+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 shutdown_connections 2026-03-09T17:28:34.555 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.553+0000 7f78037fe700 1 -- 192.168.123.106:0/177767284 wait complete. 2026-03-09T17:28:34.559 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:28:34.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.693+0000 7fe770dde700 1 -- 192.168.123.106:0/3189700111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 msgr2=0x7fe76c0ff5e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.693+0000 7fe770dde700 1 --2- 192.168.123.106:0/3189700111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c0ff5e0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7fe75c009b00 tx=0x7fe75c009e10 comp rx=0 tx=0).stop 2026-03-09T17:28:34.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 -- 192.168.123.106:0/3189700111 shutdown_connections 2026-03-09T17:28:34.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 --2- 192.168.123.106:0/3189700111 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c100820 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 --2- 192.168.123.106:0/3189700111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c0ff5e0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.697 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 -- 192.168.123.106:0/3189700111 >> 192.168.123.106:0/3189700111 conn(0x7fe76c0fa760 msgr2=0x7fe76c0fcbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 -- 192.168.123.106:0/3189700111 shutdown_connections 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.694+0000 7fe770dde700 1 -- 192.168.123.106:0/3189700111 wait complete. 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 Processor -- start 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 -- start start 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c072880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe76c070990 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.695+0000 7fe770dde700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe76c070ad0 con 0x7fe76c1003d0 2026-03-09T17:28:34.698 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.696+0000 7fe76a59c700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.698 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.696+0000 7fe76a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51966/0 (socket says 192.168.123.106:51966) 2026-03-09T17:28:34.698 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.696+0000 7fe76a59c700 1 -- 192.168.123.106:0/3888818615 learned_addr learned my addr 192.168.123.106:0/3888818615 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:34.699 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.696+0000 7fe761bff700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c072880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.699 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.697+0000 7fe76a59c700 1 -- 192.168.123.106:0/3888818615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 msgr2=0x7fe76c072880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.699 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.697+0000 7fe76a59c700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c072880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.699 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.697+0000 7fe76a59c700 1 -- 
192.168.123.106:0/3888818615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe75c0097e0 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.697+0000 7fe761bff700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c072880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:28:34.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.698+0000 7fe76a59c700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fe75c0052d0 tx=0x7fe75c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:34.700 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.698+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe75c01d070 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.698+0000 7fe770dde700 1 -- 192.168.123.106:0/3888818615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe76c070d50 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.698+0000 7fe770dde700 1 -- 192.168.123.106:0/3888818615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe76c071240 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.701+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe75c00bdf0 con 0x7fe76c0ff1d0 
2026-03-09T17:28:34.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.701+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe75c021c30 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.702+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe75c02b430 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.702+0000 7fe763fff700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 0x7fe75806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:34.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.703+0000 7fe770dde700 1 -- 192.168.123.106:0/3888818615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe74c005320 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.703+0000 7fe761bff700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 0x7fe75806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:34.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.703+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe75c08e070 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.705+0000 7fe761bff700 1 --2- 192.168.123.106:0/3888818615 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 0x7fe75806ed20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fe754005950 tx=0x7fe75400a300 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:34.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.707+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe75c05c300 con 0x7fe76c0ff1d0 2026-03-09T17:28:34.859 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.856+0000 7fe770dde700 1 -- 192.168.123.106:0/3888818615 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe74c000bf0 con 0x7fe75806c870 2026-03-09T17:28:34.861 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.859+0000 7fe763fff700 1 -- 192.168.123.106:0/3888818615 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fe74c000bf0 con 0x7fe75806c870 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "", 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T17:28:34.862 
INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:28:34.862 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:28:34.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 msgr2=0x7fe75806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 0x7fe75806ed20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fe754005950 tx=0x7fe75400a300 comp rx=0 tx=0).stop 2026-03-09T17:28:34.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 msgr2=0x7fe76c072340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fe75c0052d0 tx=0x7fe75c00bac0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 shutdown_connections 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe75806c870 0x7fe75806ed20 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.867+0000 7fe7613fe700 
1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe76c0ff1d0 0x7fe76c072340 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.868+0000 7fe7613fe700 1 --2- 192.168.123.106:0/3888818615 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe76c1003d0 0x7fe76c072880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:34.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.868+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 >> 192.168.123.106:0/3888818615 conn(0x7fe76c0fa760 msgr2=0x7fe76c068d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:34.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.870+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 shutdown_connections 2026-03-09T17:28:34.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.870+0000 7fe7613fe700 1 -- 192.168.123.106:0/3888818615 wait complete. 2026-03-09T17:28:34.878 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:34 vm09.local ceph-mon[62061]: from='client.14610 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:34.878 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:34 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/3617062438' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:28:34.878 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:34 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/177767284' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:28:34.964 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- 192.168.123.106:0/1862600020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748072440 msgr2=0x7fe74810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:35.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 --2- 192.168.123.106:0/1862600020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748072440 0x7fe74810be90 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fe74000b3a0 tx=0x7fe74000b6b0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- 192.168.123.106:0/1862600020 shutdown_connections 2026-03-09T17:28:35.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 --2- 192.168.123.106:0/1862600020 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748072440 0x7fe74810be90 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 --2- 192.168.123.106:0/1862600020 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748071a60 0x7fe748071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- 192.168.123.106:0/1862600020 >> 192.168.123.106:0/1862600020 conn(0x7fe74806d1a0 msgr2=0x7fe74806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- 192.168.123.106:0/1862600020 shutdown_connections 2026-03-09T17:28:35.259 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- 192.168.123.106:0/1862600020 wait complete. 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 Processor -- start 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- start start 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748116f90 0x7fe7481b2800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe748117490 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.961+0000 7fe74ecfe700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe748117600 con 0x7fe748116f90 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:51980/0 (socket says 192.168.123.106:51980) 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 -- 192.168.123.106:0/4221436096 learned_addr learned my addr 192.168.123.106:0/4221436096 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74d4fb700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748116f90 0x7fe7481b2800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 -- 192.168.123.106:0/4221436096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748116f90 msgr2=0x7fe7481b2800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748116f90 0x7fe7481b2800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.962+0000 7fe74dcfc700 1 -- 192.168.123.106:0/4221436096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe74000b050 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.963+0000 7fe74dcfc700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto 
rx=0x7fe74400b6e0 tx=0x7fe74400baa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.963+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe74400f800 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.963+0000 7fe74ecfe700 1 -- 192.168.123.106:0/4221436096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7481b2da0 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.963+0000 7fe74ecfe700 1 -- 192.168.123.106:0/4221436096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7481b3260 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.964+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe74400fe40 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.964+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe74400d5f0 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.966+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe74400d750 con 0x7fe748071a60 2026-03-09T17:28:35.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.966+0000 7fe73effd700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 0x7fe73406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.967+0000 7fe74d4fb700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 0x7fe73406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.967+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe74408c630 con 0x7fe748071a60 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.967+0000 7fe74ecfe700 1 -- 192.168.123.106:0/4221436096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe72c005320 con 0x7fe748071a60 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.970+0000 7fe74d4fb700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 0x7fe73406ec50 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fe740007f20 tx=0x7fe7400061f0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:34.971+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe74405a8c0 con 0x7fe748071a60 2026-03-09T17:28:35.260 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:34 vm06.local ceph-mon[57307]: from='client.14610 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:35.260 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:34 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/3617062438' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:28:35.260 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:34 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/177767284' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.148+0000 7fe74ecfe700 1 -- 192.168.123.106:0/4221436096 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe72c005190 con 0x7fe748071a60 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.149+0000 7fe73effd700 1 -- 192.168.123.106:0/4221436096 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fe74405a450 con 0x7fe748071a60 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.153+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 msgr2=0x7fe73406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.153+0000 7fe73cff9700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 0x7fe73406ec50 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fe740007f20 tx=0x7fe7400061f0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.153+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 msgr2=0x7fe748116a50 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.153+0000 7fe73cff9700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fe74400b6e0 tx=0x7fe74400baa0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 shutdown_connections 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fe73406c7a0 0x7fe73406ec50 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe748071a60 0x7fe748116a50 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 --2- 192.168.123.106:0/4221436096 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe748116f90 0x7fe7481b2800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 >> 192.168.123.106:0/4221436096 conn(0x7fe74806d1a0 msgr2=0x7fe748070610 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:28:35.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 shutdown_connections 2026-03-09T17:28:35.260 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:28:35.154+0000 7fe73cff9700 1 -- 192.168.123.106:0/4221436096 wait complete. 2026-03-09T17:28:36.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:35 vm06.local ceph-mon[57307]: from='client.14620 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:36.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:35 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/4221436096' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:28:36.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:35 vm06.local ceph-mon[57307]: pgmap v121: 65 pgs: 65 active+clean; 20 MiB data, 246 MiB used, 120 GiB / 120 GiB avail; 257 KiB/s rd, 1.7 MiB/s wr, 156 op/s 2026-03-09T17:28:36.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:35 vm09.local ceph-mon[62061]: from='client.14620 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:28:36.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:35 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/4221436096' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:28:36.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:35 vm09.local ceph-mon[62061]: pgmap v121: 65 pgs: 65 active+clean; 20 MiB data, 246 MiB used, 120 GiB / 120 GiB avail; 257 KiB/s rd, 1.7 MiB/s wr, 156 op/s 2026-03-09T17:28:38.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:37 vm06.local ceph-mon[57307]: pgmap v122: 65 pgs: 65 active+clean; 26 MiB data, 280 MiB used, 120 GiB / 120 GiB avail; 439 KiB/s rd, 2.2 MiB/s wr, 234 op/s 2026-03-09T17:28:38.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:37 vm09.local ceph-mon[62061]: pgmap v122: 65 pgs: 65 active+clean; 26 MiB data, 280 MiB used, 120 GiB / 120 GiB avail; 439 KiB/s rd, 2.2 MiB/s wr, 234 op/s 2026-03-09T17:28:39.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:38 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:39.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:38 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:40.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:40 vm06.local ceph-mon[57307]: pgmap v123: 65 pgs: 65 active+clean; 28 MiB data, 290 MiB used, 120 GiB / 120 GiB avail; 446 KiB/s rd, 2.4 MiB/s wr, 255 op/s 2026-03-09T17:28:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:40 vm09.local ceph-mon[62061]: pgmap v123: 65 pgs: 65 active+clean; 28 MiB data, 290 MiB used, 120 GiB / 120 GiB avail; 446 KiB/s rd, 2.4 MiB/s wr, 255 op/s 2026-03-09T17:28:41.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:41 vm06.local ceph-mon[57307]: pgmap v124: 65 pgs: 65 active+clean; 36 MiB data, 313 MiB used, 120 GiB / 120 GiB avail; 788 KiB/s rd, 2.7 MiB/s wr, 284 op/s 
2026-03-09T17:28:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:41 vm09.local ceph-mon[62061]: pgmap v124: 65 pgs: 65 active+clean; 36 MiB data, 313 MiB used, 120 GiB / 120 GiB avail; 788 KiB/s rd, 2.7 MiB/s wr, 284 op/s 2026-03-09T17:28:42.483 INFO:tasks.workunit.client.1.vm09.stderr:Updating files: 80% (11207/13941) Updating files: 81% (11293/13941) Updating files: 82% (11432/13941) Updating files: 83% (11572/13941) Updating files: 84% (11711/13941) Updating files: 85% (11850/13941) Updating files: 86% (11990/13941) Updating files: 87% (12129/13941) Updating files: 88% (12269/13941) Updating files: 89% (12408/13941) Updating files: 90% (12547/13941) Updating files: 91% (12687/13941) Updating files: 92% (12826/13941) Updating files: 93% (12966/13941) Updating files: 94% (13105/13941) Updating files: 95% (13244/13941) Updating files: 96% (13384/13941) Updating files: 97% (13523/13941) Updating files: 98% (13663/13941) Updating files: 99% (13802/13941) Updating files: 100% (13941/13941) Updating files: 100% (13941/13941), done. 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:state without impacting any branches by switching back to a branch. 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: git switch -c <new-branch-name> 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:Or undo this operation with: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: git switch - 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr: 2026-03-09T17:28:43.380 INFO:tasks.workunit.client.1.vm09.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-09T17:28:43.386 DEBUG:teuthology.orchestra.run.vm09:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-09T17:28:43.444 INFO:tasks.workunit.client.1.vm09.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-09T17:28:43.447 INFO:tasks.workunit.client.1.vm09.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T17:28:43.447 INFO:tasks.workunit.client.1.vm09.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-09T17:28:43.502 INFO:tasks.workunit.client.1.vm09.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-09T17:28:43.550 INFO:tasks.workunit.client.1.vm09.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-09T17:28:43.594 INFO:tasks.workunit.client.1.vm09.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-09T17:28:43.599 
INFO:tasks.workunit.client.1.vm09.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T17:28:43.636 INFO:tasks.workunit.client.1.vm09.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-09T17:28:43.636 INFO:tasks.workunit.client.1.vm09.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-09T17:28:43.640 DEBUG:teuthology.orchestra.run.vm09:> set -ex 2026-03-09T17:28:43.640 DEBUG:teuthology.orchestra.run.vm09:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-09T17:28:43.703 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-09T17:28:43.704 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-09T17:28:43.704 DEBUG:teuthology.orchestra.run.vm09:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-09T17:28:43.803 INFO:tasks.workunit.client.1.vm09.stderr:+ mkdir -p fsstress 2026-03-09T17:28:43.810 INFO:tasks.workunit.client.1.vm09.stderr:+ pushd fsstress 2026-03-09T17:28:43.812 INFO:tasks.workunit.client.1.vm09.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-09T17:28:43.812 INFO:tasks.workunit.client.1.vm09.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-09T17:28:44.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:43 vm06.local ceph-mon[57307]: pgmap v125: 
65 pgs: 65 active+clean; 45 MiB data, 433 MiB used, 120 GiB / 120 GiB avail; 1.0 MiB/s rd, 3.4 MiB/s wr, 330 op/s 2026-03-09T17:28:44.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:43 vm09.local ceph-mon[62061]: pgmap v125: 65 pgs: 65 active+clean; 45 MiB data, 433 MiB used, 120 GiB / 120 GiB avail; 1.0 MiB/s rd, 3.4 MiB/s wr, 330 op/s 2026-03-09T17:28:46.038 INFO:tasks.workunit.client.1.vm09.stderr:+ tar xzf ltp-full.tgz 2026-03-09T17:28:46.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:45 vm06.local ceph-mon[57307]: pgmap v126: 65 pgs: 65 active+clean; 50 MiB data, 485 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.8 MiB/s wr, 309 op/s 2026-03-09T17:28:46.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:45 vm09.local ceph-mon[62061]: pgmap v126: 65 pgs: 65 active+clean; 50 MiB data, 485 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.8 MiB/s wr, 309 op/s 2026-03-09T17:28:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:47 vm06.local ceph-mon[57307]: pgmap v127: 65 pgs: 65 active+clean; 61 MiB data, 536 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.6 MiB/s wr, 336 op/s 2026-03-09T17:28:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:47 vm09.local ceph-mon[62061]: pgmap v127: 65 pgs: 65 active+clean; 61 MiB data, 536 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.6 MiB/s wr, 336 op/s 2026-03-09T17:28:50.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:49 vm06.local ceph-mon[57307]: pgmap v128: 65 pgs: 65 active+clean; 61 MiB data, 579 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 278 op/s 2026-03-09T17:28:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:49 vm09.local ceph-mon[62061]: pgmap v128: 65 pgs: 65 active+clean; 61 MiB data, 579 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 278 op/s 2026-03-09T17:28:52.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:51 vm06.local ceph-mon[57307]: pgmap v129: 65 pgs: 65 
active+clean; 72 MiB data, 640 MiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.9 MiB/s wr, 296 op/s 2026-03-09T17:28:52.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:51 vm09.local ceph-mon[62061]: pgmap v129: 65 pgs: 65 active+clean; 72 MiB data, 640 MiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.9 MiB/s wr, 296 op/s 2026-03-09T17:28:53.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:53 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:53.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:53 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:28:54.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:54 vm06.local ceph-mon[57307]: pgmap v130: 65 pgs: 65 active+clean; 79 MiB data, 708 MiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.8 MiB/s wr, 337 op/s 2026-03-09T17:28:54.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:54 vm09.local ceph-mon[62061]: pgmap v130: 65 pgs: 65 active+clean; 79 MiB data, 708 MiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.8 MiB/s wr, 337 op/s 2026-03-09T17:28:55.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:55 vm06.local ceph-mon[57307]: pgmap v131: 65 pgs: 65 active+clean; 81 MiB data, 723 MiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.2 MiB/s wr, 303 op/s 2026-03-09T17:28:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:55 vm09.local ceph-mon[62061]: pgmap v131: 65 pgs: 65 active+clean; 81 MiB data, 723 MiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.2 MiB/s wr, 303 op/s 2026-03-09T17:28:58.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:28:57 vm06.local ceph-mon[57307]: pgmap v132: 65 pgs: 65 active+clean; 90 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.5 MiB/s wr, 328 op/s 
2026-03-09T17:28:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:28:57 vm09.local ceph-mon[62061]: pgmap v132: 65 pgs: 65 active+clean; 90 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.5 MiB/s wr, 328 op/s 2026-03-09T17:29:00.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:00 vm06.local ceph-mon[57307]: pgmap v133: 65 pgs: 65 active+clean; 91 MiB data, 767 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.7 MiB/s wr, 279 op/s 2026-03-09T17:29:00.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:00 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:00.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:00 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:29:00.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:00 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:00 vm09.local ceph-mon[62061]: pgmap v133: 65 pgs: 65 active+clean; 91 MiB data, 767 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.7 MiB/s wr, 279 op/s 2026-03-09T17:29:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:00 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:00 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:29:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:00 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:01 vm06.local ceph-mon[57307]: Upgrade: Target is version 
19.2.3-678-ge911bdeb (unknown) 2026-03-09T17:29:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:01 vm06.local ceph-mon[57307]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T17:29:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:01 vm06.local ceph-mon[57307]: Upgrade: Need to upgrade myself (mgr.vm06.pbgzei) 2026-03-09T17:29:01.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:01 vm09.local ceph-mon[62061]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-09T17:29:01.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:01 vm09.local ceph-mon[62061]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-09T17:29:01.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:01 vm09.local ceph-mon[62061]: Upgrade: Need to upgrade myself (mgr.vm06.pbgzei) 2026-03-09T17:29:03.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:02 vm06.local ceph-mon[57307]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm09 2026-03-09T17:29:03.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:02 vm06.local ceph-mon[57307]: pgmap v134: 65 pgs: 65 active+clean; 94 MiB data, 824 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.9 MiB/s wr, 298 op/s 2026-03-09T17:29:03.215 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:02 vm09.local ceph-mon[62061]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm09 2026-03-09T17:29:03.215 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:02 
vm09.local ceph-mon[62061]: pgmap v134: 65 pgs: 65 active+clean; 94 MiB data, 824 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.9 MiB/s wr, 298 op/s 2026-03-09T17:29:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:03 vm06.local ceph-mon[57307]: pgmap v135: 65 pgs: 65 active+clean; 102 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.7 MiB/s wr, 341 op/s 2026-03-09T17:29:04.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:03 vm09.local ceph-mon[62061]: pgmap v135: 65 pgs: 65 active+clean; 102 MiB data, 938 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.7 MiB/s wr, 341 op/s 2026-03-09T17:29:05.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.283+0000 7fa1a5f50700 1 -- 192.168.123.106:0/180647586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 msgr2=0x7fa1980a4720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.283+0000 7fa1a5f50700 1 --2- 192.168.123.106:0/180647586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980a4720 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7fa194009a60 tx=0x7fa194009d70 comp rx=0 tx=0).stop 2026-03-09T17:29:05.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 -- 192.168.123.106:0/180647586 shutdown_connections 2026-03-09T17:29:05.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 --2- 192.168.123.106:0/180647586 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980a58c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 --2- 192.168.123.106:0/180647586 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980a4720 unknown :-1 s=CLOSED 
pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 -- 192.168.123.106:0/180647586 >> 192.168.123.106:0/180647586 conn(0x7fa19809f7e0 msgr2=0x7fa1980a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:05.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 -- 192.168.123.106:0/180647586 shutdown_connections 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.285+0000 7fa1a5f50700 1 -- 192.168.123.106:0/180647586 wait complete. 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 Processor -- start 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 -- start start 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980b36f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1980b4250 con 0x7fa1980a4310 2026-03-09T17:29:05.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.286+0000 7fa1a5f50700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1980b4390 con 0x7fa1980a5450 2026-03-09T17:29:05.290 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.287+0000 7fa19ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.287+0000 7fa19ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56560/0 (socket says 192.168.123.106:56560) 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.287+0000 7fa19ffff700 1 -- 192.168.123.106:0/1337532444 learned_addr learned my addr 192.168.123.106:0/1337532444 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa1a4f4e700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980b36f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa19ffff700 1 -- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 msgr2=0x7fa1980b36f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa19ffff700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980b36f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.290 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa19ffff700 1 -- 192.168.123.106:0/1337532444 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa194009710 con 0x7fa1980a5450 2026-03-09T17:29:05.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa19ffff700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa1a0066720 tx=0x7fa1a00729a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.288+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa1a00683f0 con 0x7fa1980a5450 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.289+0000 7fa1a5f50700 1 -- 192.168.123.106:0/1337532444 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa198150480 con 0x7fa1980a5450 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.289+0000 7fa1a5f50700 1 -- 192.168.123.106:0/1337532444 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1981509a0 con 0x7fa1980a5450 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.289+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa1a0067a90 con 0x7fa1980a5450 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.289+0000 7fa1a5f50700 1 -- 192.168.123.106:0/1337532444 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 
-- 0x7fa198004f80 con 0x7fa1980a5450 2026-03-09T17:29:05.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.289+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa1a0074c30 con 0x7fa1980a5450 2026-03-09T17:29:05.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.291+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa1a00744f0 con 0x7fa1980a5450 2026-03-09T17:29:05.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.291+0000 7fa19dffb700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 0x7fa19006ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.291+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fa1a00ee990 con 0x7fa1980a5450 2026-03-09T17:29:05.293 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.291+0000 7fa1a4f4e700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 0x7fa19006ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.294 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.292+0000 7fa1a4f4e700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 0x7fa19006ec00 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fa1940096e0 tx=0x7fa194009670 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:05.303 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.301+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa1a00bcc20 con 0x7fa1980a5450 2026-03-09T17:29:05.501 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.499+0000 7fa1a5f50700 1 -- 192.168.123.106:0/1337532444 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa1980a9da0 con 0x7fa19006c750 2026-03-09T17:29:05.511 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.509+0000 7fa19dffb700 1 -- 192.168.123.106:0/1337532444 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fa1980a9da0 con 0x7fa19006c750 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.512+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 msgr2=0x7fa19006ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 0x7fa19006ec00 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fa1940096e0 tx=0x7fa194009670 comp rx=0 tx=0).stop 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 msgr2=0x7fa1980b3c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 --2- 192.168.123.106:0/1337532444 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa1a0066720 tx=0x7fa1a00729a0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 shutdown_connections 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fa19006c750 0x7fa19006ec00 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1980a4310 0x7fa1980b36f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 --2- 192.168.123.106:0/1337532444 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa1980a5450 0x7fa1980b3c30 secure :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa1a0066720 tx=0x7fa1a00729a0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 >> 192.168.123.106:0/1337532444 conn(0x7fa19809f7e0 msgr2=0x7fa1980a8680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 shutdown_connections 2026-03-09T17:29:05.515 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.513+0000 7fa18b7fe700 1 -- 192.168.123.106:0/1337532444 wait complete. 
2026-03-09T17:29:05.530 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:29:05.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.638+0000 7fbbe1859700 1 -- 192.168.123.106:0/1561794403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc103670 msgr2=0x7fbbdc105a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.638+0000 7fbbe1859700 1 --2- 192.168.123.106:0/1561794403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc103670 0x7fbbdc105a50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fbbc8009a60 tx=0x7fbbc8009d70 comp rx=0 tx=0).stop 2026-03-09T17:29:05.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 -- 192.168.123.106:0/1561794403 shutdown_connections 2026-03-09T17:29:05.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 --2- 192.168.123.106:0/1561794403 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc103670 0x7fbbdc105a50 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 --2- 192.168.123.106:0/1561794403 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc100d50 0x7fbbdc103130 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 -- 192.168.123.106:0/1561794403 >> 192.168.123.106:0/1561794403 conn(0x7fbbdc0fa750 msgr2=0x7fbbdc0fcba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:05.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 -- 192.168.123.106:0/1561794403 shutdown_connections 2026-03-09T17:29:05.643 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.641+0000 7fbbe1859700 1 -- 192.168.123.106:0/1561794403 wait complete. 2026-03-09T17:29:05.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.642+0000 7fbbe1859700 1 Processor -- start 2026-03-09T17:29:05.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.642+0000 7fbbe1859700 1 -- start start 2026-03-09T17:29:05.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.643+0000 7fbbe1859700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 0x7fbbdc19c0f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.643+0000 7fbbe1859700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc103670 0x7fbbdc19c630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.643+0000 7fbbe1859700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbdc19cc50 con 0x7fbbdc103670 2026-03-09T17:29:05.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.643+0000 7fbbe1859700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbdc19cd90 con 0x7fbbdc100d50 2026-03-09T17:29:05.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.644+0000 7fbbe0857700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 0x7fbbdc19c0f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.644+0000 7fbbe0857700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 0x7fbbdc19c0f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56578/0 (socket says 192.168.123.106:56578) 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.644+0000 7fbbe0857700 1 -- 192.168.123.106:0/3371557768 learned_addr learned my addr 192.168.123.106:0/3371557768 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.644+0000 7fbbdbfff700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc103670 0x7fbbdc19c630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbe0857700 1 -- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc103670 msgr2=0x7fbbdc19c630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbe0857700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc103670 0x7fbbdc19c630 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbe0857700 1 -- 192.168.123.106:0/3371557768 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbc8009710 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbe0857700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 0x7fbbdc19c0f0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto 
rx=0x7fbbd000eab0 tx=0x7fbbd000edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbd000cb80 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbbd000cce0 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.645+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbdc1a1840 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.646+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbdc1a1d30 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.647+0000 7fbbc77fe700 1 -- 192.168.123.106:0/3371557768 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbbdc04ea50 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.647+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbd0010b30 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.648+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 
0x7fbbd0010430 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.648+0000 7fbbd9ffb700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 0x7fbbcc06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.649+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fbbd0014070 con 0x7fbbdc100d50 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.649+0000 7fbbdbfff700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 0x7fbbcc06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.649+0000 7fbbdbfff700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 0x7fbbcc06eae0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fbbc80038c0 tx=0x7fbbc800b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:05.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.651+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbbd005aa50 con 0x7fbbdc100d50 2026-03-09T17:29:05.863 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.859+0000 7fbbc77fe700 1 -- 192.168.123.106:0/3371557768 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fbbdc061190 con 0x7fbbcc06c630 2026-03-09T17:29:05.866 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.864+0000 7fbbd9ffb700 1 -- 192.168.123.106:0/3371557768 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fbbdc061190 con 0x7fbbcc06c630 2026-03-09T17:29:05.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 msgr2=0x7fbbcc06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 0x7fbbcc06eae0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fbbc80038c0 tx=0x7fbbc800b540 comp rx=0 tx=0).stop 2026-03-09T17:29:05.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 msgr2=0x7fbbdc19c0f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fbbdc100d50 0x7fbbdc19c0f0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fbbd000eab0 tx=0x7fbbd000edc0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.869 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 shutdown_connections 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fbbdc100d50 0x7fbbdc19c0f0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fbbcc06c630 0x7fbbcc06eae0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 --2- 192.168.123.106:0/3371557768 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbdc103670 0x7fbbdc19c630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 >> 192.168.123.106:0/3371557768 conn(0x7fbbdc0fa750 msgr2=0x7fbbdc0fcba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.867+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 shutdown_connections 2026-03-09T17:29:05.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.868+0000 7fbbe1859700 1 -- 192.168.123.106:0/3371557768 wait complete. 
2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.990+0000 7febeecab700 1 -- 192.168.123.106:0/2544619421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8071a90 msgr2=0x7febe8071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.990+0000 7febeecab700 1 --2- 192.168.123.106:0/2544619421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8071a90 0x7febe8071ea0 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7febd8009b00 tx=0x7febd8009e10 comp rx=0 tx=0).stop 2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.991+0000 7febeecab700 1 -- 192.168.123.106:0/2544619421 shutdown_connections 2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.991+0000 7febeecab700 1 --2- 192.168.123.106:0/2544619421 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8072470 0x7febe810beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.991+0000 7febeecab700 1 --2- 192.168.123.106:0/2544619421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8071a90 0x7febe8071ea0 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.991+0000 7febeecab700 1 -- 192.168.123.106:0/2544619421 >> 192.168.123.106:0/2544619421 conn(0x7febe806d1a0 msgr2=0x7febe806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:05.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.992+0000 7febeecab700 1 -- 192.168.123.106:0/2544619421 shutdown_connections 2026-03-09T17:29:05.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.992+0000 7febeecab700 1 -- 192.168.123.106:0/2544619421 
wait complete. 2026-03-09T17:29:05.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.992+0000 7febeecab700 1 Processor -- start 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeecab700 1 -- start start 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeecab700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeecab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8116fd0 0x7febe81b27b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeecab700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febe8117500 con 0x7febe8072470 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeecab700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febe8117670 con 0x7febe8116fd0 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeca47700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeca47700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:34324/0 (socket says 192.168.123.106:34324) 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febeca47700 1 -- 192.168.123.106:0/1911194719 learned_addr learned my addr 192.168.123.106:0/1911194719 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:05.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.993+0000 7febe7fff700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8116fd0 0x7febe81b27b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:05.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.996+0000 7febeca47700 1 -- 192.168.123.106:0/1911194719 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8116fd0 msgr2=0x7febe81b27b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:05.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.996+0000 7febeca47700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8116fd0 0x7febe81b27b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:05.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.996+0000 7febeca47700 1 -- 192.168.123.106:0/1911194719 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7febd80097e0 con 0x7febe8072470 2026-03-09T17:29:05.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.996+0000 7febeca47700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7febd800f690 tx=0x7febd8009770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T17:29:05.999 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.996+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febd801c070 con 0x7febe8072470 2026-03-09T17:29:05.999 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.997+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7febe81b2cf0 con 0x7febe8072470 2026-03-09T17:29:05.999 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.997+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7febe81b3190 con 0x7febe8072470 2026-03-09T17:29:05.999 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.997+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7febd8003680 con 0x7febe8072470 2026-03-09T17:29:05.999 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.997+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7febe804ea50 con 0x7febe8072470 2026-03-09T17:29:06.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:05.998+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febd8021610 con 0x7febe8072470 2026-03-09T17:29:06.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.002+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7febd800f9d0 con 0x7febe8072470 2026-03-09T17:29:06.004 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.002+0000 7febe5ffb700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 0x7febd006e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.004 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.002+0000 7febe7fff700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 0x7febd006e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.005 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.003+0000 7febe7fff700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 0x7febd006e980 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7febe8107dc0 tx=0x7febe000b040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.005 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.003+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7febd805b4a0 con 0x7febe8072470
2026-03-09T17:29:06.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.004+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7febd805b270 con 0x7febe8072470
2026-03-09T17:29:06.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.149+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7febe810b6e0 con 0x7febd006c4d0
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.155+0000 7febe5ffb700 1 -- 192.168.123.106:0/1911194719 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7febe810b6e0 con 0x7febd006c4d0
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (3m) 107s ago 4m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (4m) 107s ago 4m 8078k - 18.2.0 dc2bc1663786 518b33d98521
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (3m) 108s ago 3m 8136k - 18.2.0 dc2bc1663786 4486b60e6311
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (4m) 107s ago 4m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (3m) 108s ago 3m 7419k - 18.2.0 dc2bc1663786 78af352f0367
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (3m) 107s ago 3m 84.7M - 9.4.7 954c08fa6188 d808369f1a53
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (112s) 107s ago 112s 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640
2026-03-09T17:29:06.157 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (115s) 107s ago 115s 19.5M - 18.2.0 dc2bc1663786 4c8e86b2b8cd
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (114s) 108s ago 114s 14.7M - 18.2.0 dc2bc1663786 aa1f0430b448
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (111s) 108s ago 111s 13.4M - 18.2.0 dc2bc1663786 8dc8a0159213
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (4m) 107s ago 4m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (3m) 108s ago 3m 443M - 18.2.0 dc2bc1663786 e6525bf5de20
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (4m) 107s ago 4m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (3m) 108s ago 3m 47.4M 2048M 18.2.0 dc2bc1663786 4c30d1217de3
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 107s ago 4m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (3m) 108s ago 3m 14.9M - 1.5.0 0da6a335fe13 364ad5f4aa86
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (3m) 107s ago 3m 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (3m) 107s ago 3m 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (2m) 107s ago 2m 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (2m) 108s ago 2m 43.8M 4096M 18.2.0 dc2bc1663786 48a594500ef1
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (2m) 108s ago 2m 43.8M 4096M 18.2.0 dc2bc1663786 a47c39052541
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (2m) 108s ago 2m 42.5M 4096M 18.2.0 dc2bc1663786 89f436540a49
2026-03-09T17:29:06.158 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (3m) 107s ago 3m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c
2026-03-09T17:29:06.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 msgr2=0x7febd006e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 0x7febd006e980 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7febe8107dc0 tx=0x7febe000b040 comp rx=0 tx=0).stop
2026-03-09T17:29:06.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 msgr2=0x7febe8116a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7febd800f690 tx=0x7febd8009770 comp rx=0 tx=0).stop
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 shutdown_connections
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7febd006c4d0 0x7febd006e980 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febe8072470 0x7febe8116a90 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 --2- 192.168.123.106:0/1911194719 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febe8116fd0 0x7febe81b27b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 >> 192.168.123.106:0/1911194719 conn(0x7febe806d1a0 msgr2=0x7febe80705f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 shutdown_connections
2026-03-09T17:29:06.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.158+0000 7febeecab700 1 -- 192.168.123.106:0/1911194719 wait complete.
2026-03-09T17:29:06.231 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 -- 192.168.123.106:0/451811757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb884071980 msgr2=0x7fb884071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 --2- 192.168.123.106:0/451811757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb884071980 0x7fb884071d90 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7fb874007780 tx=0x7fb87400c050 comp rx=0 tx=0).stop
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 -- 192.168.123.106:0/451811757 shutdown_connections
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 --2- 192.168.123.106:0/451811757 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb8840770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 --2- 192.168.123.106:0/451811757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb884071980 0x7fb884071d90 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 -- 192.168.123.106:0/451811757 >> 192.168.123.106:0/451811757 conn(0x7fb88406d1a0 msgr2=0x7fb88406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 -- 192.168.123.106:0/451811757 shutdown_connections
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.229+0000 7fb88a1c4700 1 -- 192.168.123.106:0/451811757 wait complete.
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 Processor -- start
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 -- start start
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb884131380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.232 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 0x7fb88407f550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb884131dc0 con 0x7fb8841318c0
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb88a1c4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb884131f30 con 0x7fb884072360
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb882ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 0x7fb88407f550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb882ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 0x7fb88407f550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:34354/0 (socket says 192.168.123.106:34354)
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb882ffd700 1 -- 192.168.123.106:0/2025919157 learned_addr learned my addr 192.168.123.106:0/2025919157 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb8837fe700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb884131380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb8837fe700 1 -- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 msgr2=0x7fb88407f550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb8837fe700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 0x7fb88407f550 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.230+0000 7fb8837fe700 1 -- 192.168.123.106:0/2025919157 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb874007430 con 0x7fb884072360
2026-03-09T17:29:06.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.231+0000 7fb8837fe700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb884131380 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fb874000c00 tx=0x7fb87400a390 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.399+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb87400f040 con 0x7fb884072360
2026-03-09T17:29:06.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.399+0000 7fb88a1c4700 1 -- 192.168.123.106:0/2025919157 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb88407fa90 con 0x7fb884072360
2026-03-09T17:29:06.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.399+0000 7fb88a1c4700 1 -- 192.168.123.106:0/2025919157 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb88407ffb0 con 0x7fb884072360
2026-03-09T17:29:06.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.400+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb8740072b0 con 0x7fb884072360
2026-03-09T17:29:06.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.400+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb874003890 con 0x7fb884072360
2026-03-09T17:29:06.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.401+0000 7fb88a1c4700 1 -- 192.168.123.106:0/2025919157 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb870005320 con 0x7fb884072360
2026-03-09T17:29:06.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.404+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb874004100 con 0x7fb884072360
2026-03-09T17:29:06.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.404+0000 7fb880ff9700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 0x7fb86c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.404+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb87408b6f0 con 0x7fb884072360
2026-03-09T17:29:06.407 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.405+0000 7fb882ffd700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 0x7fb86c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.407 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.405+0000 7fb882ffd700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 0x7fb86c06eb80 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fb87c009510 tx=0x7fb87c00b040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.406+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb874059980 con 0x7fb884072360
2026-03-09T17:29:06.583 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.580+0000 7fb88a1c4700 1 -- 192.168.123.106:0/2025919157 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb870005cc0 con 0x7fb884072360
2026-03-09T17:29:06.583 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.581+0000 7fb880ff9700 1 -- 192.168.123.106:0/2025919157 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fb874059510 con 0x7fb884072360
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "mon": {
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": {
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "osd": {
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "mds": {
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "overall": {
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout: }
2026-03-09T17:29:06.584 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 msgr2=0x7fb86c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 0x7fb86c06eb80 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fb87c009510 tx=0x7fb87c00b040 comp rx=0 tx=0).stop
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 msgr2=0x7fb884131380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb884131380 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fb874000c00 tx=0x7fb87400a390 comp rx=0 tx=0).stop
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 shutdown_connections
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb884072360 0x7fb884131380 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb86c06c6d0 0x7fb86c06eb80 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 --2- 192.168.123.106:0/2025919157 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb8841318c0 0x7fb88407f550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 >> 192.168.123.106:0/2025919157 conn(0x7fb88406d1a0 msgr2=0x7fb884070600 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 shutdown_connections
2026-03-09T17:29:06.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.584+0000 7fb86a7fc700 1 -- 192.168.123.106:0/2025919157 wait complete.
2026-03-09T17:29:06.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 -- 192.168.123.106:0/2831604754 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414071980 msgr2=0x7fb414071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 --2- 192.168.123.106:0/2831604754 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414071980 0x7fb414071d90 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb404007780 tx=0x7fb40400c050 comp rx=0 tx=0).stop
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 -- 192.168.123.106:0/2831604754 shutdown_connections
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 --2- 192.168.123.106:0/2831604754 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb414072360 0x7fb4140770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 --2- 192.168.123.106:0/2831604754 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414071980 0x7fb414071d90 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 -- 192.168.123.106:0/2831604754 >> 192.168.123.106:0/2831604754 conn(0x7fb41406d1a0 msgr2=0x7fb41406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.674 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:06 vm06.local ceph-mon[57307]: pgmap v136: 65 pgs: 65 active+clean; 111 MiB data, 989 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.8 MiB/s wr, 326 op/s
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 -- 192.168.123.106:0/2831604754 shutdown_connections
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.671+0000 7fb419dc2700 1 -- 192.168.123.106:0/2831604754 wait complete.
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 Processor -- start
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 -- start start
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb414131840 0x7fb41407f550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb414131cb0 con 0x7fb414131840
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb419dc2700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb414131e20 con 0x7fb414072360
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56648/0 (socket says 192.168.123.106:56648)
2026-03-09T17:29:06.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 -- 192.168.123.106:0/351111991 learned_addr learned my addr 192.168.123.106:0/351111991 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:29:06.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 -- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb414131840 msgr2=0x7fb41407f550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb414131840 0x7fb41407f550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.672+0000 7fb4137fe700 1 -- 192.168.123.106:0/351111991 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb404007430 con 0x7fb414072360
2026-03-09T17:29:06.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.673+0000 7fb4137fe700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fb404007fd0 tx=0x7fb40400da70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.673+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb40400f040 con 0x7fb414072360
2026-03-09T17:29:06.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.673+0000 7fb419dc2700 1 -- 192.168.123.106:0/351111991 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb41407fa90 con 0x7fb414072360
2026-03-09T17:29:06.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.673+0000 7fb419dc2700 1 -- 192.168.123.106:0/351111991 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb41407ff80 con 0x7fb414072360
2026-03-09T17:29:06.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.673+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb404003680 con 0x7fb414072360
2026-03-09T17:29:06.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.674+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb404008250 con 0x7fb414072360
2026-03-09T17:29:06.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.675+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb40401a070 con 0x7fb414072360
2026-03-09T17:29:06.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.675+0000 7fb410ff9700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 0x7fb3fc06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.675+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb40408c4b0 con 0x7fb414072360
2026-03-09T17:29:06.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.675+0000 7fb419dc2700 1 -- 192.168.123.106:0/351111991 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb400005320 con 0x7fb414072360
2026-03-09T17:29:06.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.676+0000 7fb412ffd700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 0x7fb3fc06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.676+0000 7fb412ffd700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 0x7fb3fc06eb80 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fb414072ff0 tx=0x7fb40c009250 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.681 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.679+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb404056c30 con 0x7fb414072360
2026-03-09T17:29:06.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.868+0000 7fb419dc2700 1 -- 192.168.123.106:0/351111991 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb400006200 con 0x7fb414072360
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.869+0000 7fb410ff9700 1 -- 192.168.123.106:0/351111991 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7fb404018070 con 0x7fb414072360
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:e13
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1)
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:root 0
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {}
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:in 0
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291}
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:failed
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:damaged
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:stopped
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3]
2026-03-09T17:29:06.872 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:balancer
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons:
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:29:06.873 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 msgr2=0x7fb3fc06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 0x7fb3fc06eb80 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fb414072ff0 tx=0x7fb40c009250 comp rx=0 tx=0).stop
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 msgr2=0x7fb414131300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fb404007fd0 tx=0x7fb40400da70 comp rx=0 tx=0).stop
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 shutdown_connections
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb414072360 0x7fb414131300 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb3fc06c6d0 0x7fb3fc06eb80 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 --2- 192.168.123.106:0/351111991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb414131840 0x7fb41407f550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 >> 192.168.123.106:0/351111991 conn(0x7fb41406d1a0 msgr2=0x7fb414076460 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 shutdown_connections
2026-03-09T17:29:06.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.872+0000 7fb3fa7fc700 1 -- 192.168.123.106:0/351111991 wait complete.
2026-03-09T17:29:06.876 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13
2026-03-09T17:29:06.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:06 vm09.local ceph-mon[62061]: pgmap v136: 65 pgs: 65 active+clean; 111 MiB data, 989 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.8 MiB/s wr, 326 op/s
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 -- 192.168.123.106:0/3326576967 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618072360 msgr2=0x7fc6180770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 --2- 192.168.123.106:0/3326576967 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618072360 0x7fc6180770e0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc610009230 tx=0x7fc610009260 comp rx=0 tx=0).stop
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 -- 192.168.123.106:0/3326576967 shutdown_connections
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 --2- 192.168.123.106:0/3326576967 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618072360 0x7fc6180770e0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 --2- 192.168.123.106:0/3326576967 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc618071980 0x7fc618071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 -- 192.168.123.106:0/3326576967 >> 192.168.123.106:0/3326576967 conn(0x7fc61806d1a0 msgr2=0x7fc61806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 -- 192.168.123.106:0/3326576967 shutdown_connections
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.970+0000 7fc61fa75700 1 -- 192.168.123.106:0/3326576967 wait complete.
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 Processor -- start
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 -- start start
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc618071980 0x7fc6180824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6181b2a90 con 0x7fc618071980
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.971+0000 7fc61fa75700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6181b2bd0 con 0x7fc618082a10
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56664/0 (socket says 192.168.123.106:56664)
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 -- 192.168.123.106:0/1861709974 learned_addr learned my addr 192.168.123.106:0/1861709974 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 -- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc618071980 msgr2=0x7fc6180824d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc618071980 0x7fc6180824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 -- 192.168.123.106:0/1861709974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc610008ee0 con 0x7fc618082a10
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61d010700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fc610004740 tx=0x7fc610004820 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc61000cee0 con 0x7fc618082a10
2026-03-09T17:29:06.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61fa75700 1 -- 192.168.123.106:0/1861709974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6181b2d10 con 0x7fc618082a10
2026-03-09T17:29:06.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.972+0000 7fc61fa75700 1 -- 192.168.123.106:0/1861709974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6181b3180 con 0x7fc618082a10
2026-03-09T17:29:06.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.973+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc61001dcf0 con 0x7fc618082a10
2026-03-09T17:29:06.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.973+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc61000e950 con 0x7fc618082a10
2026-03-09T17:29:06.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.974+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc6100085f0 con 0x7fc618082a10
2026-03-09T17:29:06.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.974+0000 7fc60effd700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 0x7fc60406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:06.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.975+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fc610012070 con 0x7fc618082a10
2026-03-09T17:29:06.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.975+0000 7fc61fa75700 1 -- 192.168.123.106:0/1861709974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc5fc005320 con 0x7fc618082a10
2026-03-09T17:29:06.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.975+0000 7fc61d811700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 0x7fc60406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:06.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.975+0000 7fc61d811700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 0x7fc60406eb80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fc61400b3c0 tx=0x7fc61400d040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:06.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:06.978+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc61005fb90 con 0x7fc618082a10
2026-03-09T17:29:07.116 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.114+0000 7fc61fa75700 1 -- 192.168.123.106:0/1861709974 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc5fc000bf0 con 0x7fc60406c6d0
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.115+0000 7fc60effd700 1 -- 192.168.123.106:0/1861709974 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fc5fc000bf0 con 0x7fc60406c6d0
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "in_progress": true,
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "which": "Upgrading all daemon types on all hosts",
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "services_complete": [],
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "progress": "0/23 daemons upgraded",
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm09",
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:    "is_paused": false
2026-03-09T17:29:07.117 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 msgr2=0x7fc60406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 0x7fc60406eb80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fc61400b3c0 tx=0x7fc61400d040 comp rx=0 tx=0).stop
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 msgr2=0x7fc618082e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fc610004740 tx=0x7fc610004820 comp rx=0 tx=0).stop
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 shutdown_connections
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fc60406c6d0 0x7fc60406eb80 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc618071980 0x7fc6180824d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 --2- 192.168.123.106:0/1861709974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc618082a10 0x7fc618082e80 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 >> 192.168.123.106:0/1861709974 conn(0x7fc61806d1a0 msgr2=0x7fc618076450 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 shutdown_connections
2026-03-09T17:29:07.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.119+0000 7fc60cf39700 1 -- 192.168.123.106:0/1861709974 wait complete.
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 -- 192.168.123.106:0/4265671196 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a5410 msgr2=0x7fdb940a5880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 --2- 192.168.123.106:0/4265671196 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a5410 0x7fdb940a5880 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009b00 tx=0x7fdb8c009e10 comp rx=0 tx=0).stop
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 -- 192.168.123.106:0/4265671196 shutdown_connections
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 --2- 192.168.123.106:0/4265671196 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a5410 0x7fdb940a5880 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 --2- 192.168.123.106:0/4265671196 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb940a42d0 0x7fdb940a46e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 -- 192.168.123.106:0/4265671196 >> 192.168.123.106:0/4265671196 conn(0x7fdb9409f7a0 msgr2=0x7fdb940a1bf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 -- 192.168.123.106:0/4265671196 shutdown_connections
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.197+0000 7fdb9bfff700 1 -- 192.168.123.106:0/4265671196 wait complete.
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 Processor -- start
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 -- start start
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb940b3a30 0x7fdb941500e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:07.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb940b3ea0 con 0x7fdb940b3a30
2026-03-09T17:29:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.198+0000 7fdb9bfff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb940b3fe0 con 0x7fdb940a42d0
2026-03-09T17:29:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.199+0000 7fdb9affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.199+0000 7fdb9affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:56692/0 (socket says 192.168.123.106:56692)
2026-03-09T17:29:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.199+0000 7fdb9affd700 1 -- 192.168.123.106:0/734369791 learned_addr learned my addr 192.168.123.106:0/734369791 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:29:07.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.200+0000 7fdb9affd700 1 -- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb940b3a30 msgr2=0x7fdb941500e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.200+0000 7fdb9affd700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb940b3a30 0x7fdb941500e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.200+0000 7fdb9affd700 1 -- 192.168.123.106:0/734369791 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb8c0097e0 con 0x7fdb940a42d0
2026-03-09T17:29:07.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.200+0000 7fdb9affd700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fdb9c0503f0 tx=0x7fdb9c071670 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:07.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.200+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb9c069280 con 0x7fdb940a42d0
2026-03-09T17:29:07.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.201+0000 7fdb9bfff700 1 -- 192.168.123.106:0/734369791 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb94150620 con 0x7fdb940a42d0
2026-03-09T17:29:07.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.201+0000 7fdb9bfff700 1 -- 192.168.123.106:0/734369791 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb94150ae0 con 0x7fdb940a42d0
2026-03-09T17:29:07.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.201+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb9c0693e0 con 0x7fdb940a42d0
2026-03-09T17:29:07.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.201+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb9c074a10 con 0x7fdb940a42d0
2026-03-09T17:29:07.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.202+0000 7fdb9bfff700 1 -- 192.168.123.106:0/734369791 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb88005320 con 0x7fdb940a42d0
2026-03-09T17:29:07.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.202+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdb9c074b70 con 0x7fdb940a42d0
2026-03-09T17:29:07.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.203+0000 7fdb83fff700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 0x7fdb8406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:29:07.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.203+0000 7fdb9a7fc700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 0x7fdb8406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:29:07.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.203+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fdb9c0eddc0 con 0x7fdb940a42d0
2026-03-09T17:29:07.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.203+0000 7fdb9a7fc700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 0x7fdb8406ec50 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009ad0 tx=0x7fdb8c009f90 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:29:07.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.205+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdb9c0bc050 con 0x7fdb940a42d0
2026-03-09T17:29:07.396 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.393+0000 7fdb9bfff700 1 -- 192.168.123.106:0/734369791 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fdb88005190 con 0x7fdb940a42d0
2026-03-09T17:29:07.397 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.395+0000 7fdb83fff700 1 -- 192.168.123.106:0/734369791 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fdb9c0bbbe0 con 0x7fdb940a42d0
2026-03-09T17:29:07.397 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 msgr2=0x7fdb8406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 0x7fdb8406ec50 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009ad0 tx=0x7fdb8c009f90 comp rx=0 tx=0).stop
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 msgr2=0x7fdb940b34f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fdb9c0503f0 tx=0x7fdb9c071670 comp rx=0 tx=0).stop
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 shutdown_connections
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdb940a42d0 0x7fdb940b34f0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fdb8406c7a0 0x7fdb8406ec50 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 --2- 192.168.123.106:0/734369791 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb940b3a30 0x7fdb941500e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 >> 192.168.123.106:0/734369791 conn(0x7fdb9409f7a0 msgr2=0x7fdb940a19e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 shutdown_connections
2026-03-09T17:29:07.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:07.399+0000 7fdb81ffb700 1 -- 192.168.123.106:0/734369791 wait complete.
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.24421 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.24425 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.14636 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/2025919157' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/351111991' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.24441 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: pgmap v137: 65 pgs: 65 active+clean; 127 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.1 MiB/s wr, 390 op/s
2026-03-09T17:29:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:07 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/734369791' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.24421 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.24425 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.14636 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/2025919157' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/351111991' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.24441 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: pgmap v137: 65 pgs: 65 active+clean; 127 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.1 MiB/s wr, 390 op/s
2026-03-09T17:29:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:07 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/734369791' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T17:29:09.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:08 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:29:09.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:08 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:29:10.088 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:09 vm06.local ceph-mon[57307]: pgmap v138: 65 pgs: 65 active+clean; 133 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.8 MiB/s wr, 343 op/s
2026-03-09T17:29:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:09 vm09.local ceph-mon[62061]: pgmap v138: 65 pgs: 65 active+clean; 133 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.8 MiB/s wr, 343 op/s
2026-03-09T17:29:12.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:12 vm06.local ceph-mon[57307]: pgmap v139: 65 pgs: 65 active+clean; 136 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.9 MiB/s wr, 359 op/s
2026-03-09T17:29:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:12 vm09.local ceph-mon[62061]: pgmap v139: 65 pgs: 65 active+clean; 136 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 3.9 MiB/s wr, 359 op/s
2026-03-09T17:29:14.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:14 vm06.local ceph-mon[57307]: pgmap v140: 65 pgs: 65 active+clean; 146 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 4.6 MiB/s wr, 412 op/s
2026-03-09T17:29:14.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:14 vm09.local ceph-mon[62061]: pgmap v140: 65 pgs: 65 active+clean; 146 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 4.6 MiB/s wr, 412
op/s 2026-03-09T17:29:15.687 INFO:tasks.workunit.client.0.vm06.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-09T17:29:15.691 INFO:tasks.workunit.client.0.vm06.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T17:29:15.691 INFO:tasks.workunit.client.0.vm06.stderr:+ make 2026-03-09T17:29:16.456 INFO:tasks.workunit.client.0.vm06.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-09T17:29:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:16 vm06.local ceph-mon[57307]: pgmap v141: 65 pgs: 65 active+clean; 151 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 4.3 MiB/s wr, 385 op/s 2026-03-09T17:29:16.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:16 vm09.local ceph-mon[62061]: pgmap v141: 65 pgs: 65 active+clean; 151 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.6 MiB/s rd, 4.3 MiB/s wr, 385 op/s 2026-03-09T17:29:16.828 INFO:tasks.workunit.client.0.vm06.stderr:++ readlink -f fsstress 2026-03-09T17:29:16.833 INFO:tasks.workunit.client.0.vm06.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T17:29:16.833 INFO:tasks.workunit.client.0.vm06.stderr:+ popd 2026-03-09T17:29:16.835 INFO:tasks.workunit.client.0.vm06.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T17:29:16.835 INFO:tasks.workunit.client.0.vm06.stderr:+ popd 2026-03-09T17:29:16.836 INFO:tasks.workunit.client.0.vm06.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-09T17:29:16.836 INFO:tasks.workunit.client.0.vm06.stderr:++ mktemp -d -p . 
2026-03-09T17:29:16.841 INFO:tasks.workunit.client.0.vm06.stderr:+ T=./tmp.lxoFKUeY46 2026-03-09T17:29:16.841 INFO:tasks.workunit.client.0.vm06.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.lxoFKUeY46 -l 1 -n 1000 -p 10 -v 2026-03-09T17:29:16.849 INFO:tasks.workunit.client.0.vm06.stdout:seed = 1772351892 2026-03-09T17:29:16.854 INFO:tasks.workunit.client.0.vm06.stdout:0/0: chown . 529255 1 2026-03-09T17:29:16.854 INFO:tasks.workunit.client.0.vm06.stdout:0/1: dread - no filename 2026-03-09T17:29:16.854 INFO:tasks.workunit.client.0.vm06.stdout:0/2: write - no filename 2026-03-09T17:29:16.854 INFO:tasks.workunit.client.0.vm06.stdout:0/3: write - no filename 2026-03-09T17:29:16.857 INFO:tasks.workunit.client.0.vm06.stdout:1/0: write - no filename 2026-03-09T17:29:16.858 INFO:tasks.workunit.client.0.vm06.stdout:4/0: fdatasync - no filename 2026-03-09T17:29:16.859 INFO:tasks.workunit.client.0.vm06.stdout:0/4: symlink l0 0 2026-03-09T17:29:16.859 INFO:tasks.workunit.client.0.vm06.stdout:0/5: dread - no filename 2026-03-09T17:29:16.862 INFO:tasks.workunit.client.0.vm06.stdout:1/1: mknod c0 0 2026-03-09T17:29:16.863 INFO:tasks.workunit.client.0.vm06.stdout:0/6: creat f1 x:0 0 0 2026-03-09T17:29:16.864 INFO:tasks.workunit.client.0.vm06.stdout:4/1: creat f0 x:0 0 0 2026-03-09T17:29:16.864 INFO:tasks.workunit.client.0.vm06.stdout:0/7: chown l0 31 1 2026-03-09T17:29:16.864 INFO:tasks.workunit.client.0.vm06.stdout:4/2: stat f0 0 2026-03-09T17:29:16.866 INFO:tasks.workunit.client.0.vm06.stdout:4/3: write f0 [679005,77021] 0 2026-03-09T17:29:16.866 INFO:tasks.workunit.client.0.vm06.stdout:4/4: stat f0 0 2026-03-09T17:29:16.870 INFO:tasks.workunit.client.0.vm06.stdout:1/2: creat f1 x:0 0 0 2026-03-09T17:29:16.870 INFO:tasks.workunit.client.0.vm06.stdout:1/3: rmdir - no directory 2026-03-09T17:29:16.870 INFO:tasks.workunit.client.0.vm06.stdout:1/4: dread - f1 zero size 2026-03-09T17:29:16.871 
INFO:tasks.workunit.client.0.vm06.stdout:5/0: creat f0 x:0 0 0 2026-03-09T17:29:16.871 INFO:tasks.workunit.client.0.vm06.stdout:1/5: stat f1 0 2026-03-09T17:29:16.874 INFO:tasks.workunit.client.0.vm06.stdout:0/8: dwrite f1 [0,4194304] 0 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:4/5: rename f0 to f1 0 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:7/0: truncate - no filename 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:7/1: dwrite - no filename 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:7/2: stat - no entries 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:7/3: truncate - no filename 2026-03-09T17:29:16.881 INFO:tasks.workunit.client.0.vm06.stdout:7/4: dread - no filename 2026-03-09T17:29:16.885 INFO:tasks.workunit.client.0.vm06.stdout:0/9: mknod c2 0 2026-03-09T17:29:16.885 INFO:tasks.workunit.client.0.vm06.stdout:5/1: creat f1 x:0 0 0 2026-03-09T17:29:16.888 INFO:tasks.workunit.client.0.vm06.stdout:9/0: dwrite - no filename 2026-03-09T17:29:16.901 INFO:tasks.workunit.client.0.vm06.stdout:0/10: dwrite f1 [0,4194304] 0 2026-03-09T17:29:16.901 INFO:tasks.workunit.client.0.vm06.stdout:6/0: symlink l0 0 2026-03-09T17:29:16.901 INFO:tasks.workunit.client.0.vm06.stdout:0/11: rmdir - no directory 2026-03-09T17:29:16.901 INFO:tasks.workunit.client.0.vm06.stdout:6/1: read - no filename 2026-03-09T17:29:16.909 INFO:tasks.workunit.client.0.vm06.stdout:6/2: chown l0 3 1 2026-03-09T17:29:16.914 INFO:tasks.workunit.client.0.vm06.stdout:5/2: mknod c2 0 2026-03-09T17:29:16.915 INFO:tasks.workunit.client.0.vm06.stdout:5/3: read - f0 zero size 2026-03-09T17:29:16.917 INFO:tasks.workunit.client.0.vm06.stdout:7/5: creat f0 x:0 0 0 2026-03-09T17:29:16.919 INFO:tasks.workunit.client.0.vm06.stdout:8/0: read - no filename 2026-03-09T17:29:16.919 INFO:tasks.workunit.client.0.vm06.stdout:8/1: write - no filename 2026-03-09T17:29:16.919 INFO:tasks.workunit.client.0.vm06.stdout:8/2: stat - no 
entries 2026-03-09T17:29:16.922 INFO:tasks.workunit.client.0.vm06.stdout:1/6: link c0 c2 0 2026-03-09T17:29:16.927 INFO:tasks.workunit.client.0.vm06.stdout:1/7: dread - f1 zero size 2026-03-09T17:29:16.933 INFO:tasks.workunit.client.0.vm06.stdout:0/12: mknod c3 0 2026-03-09T17:29:16.942 INFO:tasks.workunit.client.0.vm06.stdout:6/3: rename l0 to l1 0 2026-03-09T17:29:16.943 INFO:tasks.workunit.client.0.vm06.stdout:5/4: creat f3 x:0 0 0 2026-03-09T17:29:16.943 INFO:tasks.workunit.client.0.vm06.stdout:5/5: write f1 [865866,17478] 0 2026-03-09T17:29:16.944 INFO:tasks.workunit.client.0.vm06.stdout:5/6: dread - f0 zero size 2026-03-09T17:29:16.947 INFO:tasks.workunit.client.0.vm06.stdout:7/6: creat f1 x:0 0 0 2026-03-09T17:29:16.947 INFO:tasks.workunit.client.0.vm06.stdout:7/7: truncate f0 696410 0 2026-03-09T17:29:16.947 INFO:tasks.workunit.client.0.vm06.stdout:7/8: rmdir - no directory 2026-03-09T17:29:16.950 INFO:tasks.workunit.client.0.vm06.stdout:7/9: dread f0 [0,4194304] 0 2026-03-09T17:29:16.953 INFO:tasks.workunit.client.0.vm06.stdout:5/7: dwrite f1 [0,4194304] 0 2026-03-09T17:29:16.974 INFO:tasks.workunit.client.0.vm06.stdout:9/1: creat f0 x:0 0 0 2026-03-09T17:29:16.974 INFO:tasks.workunit.client.0.vm06.stdout:9/2: read - f0 zero size 2026-03-09T17:29:16.974 INFO:tasks.workunit.client.0.vm06.stdout:9/3: dread - f0 zero size 2026-03-09T17:29:16.978 INFO:tasks.workunit.client.0.vm06.stdout:6/4: creat f2 x:0 0 0 2026-03-09T17:29:16.978 INFO:tasks.workunit.client.0.vm06.stdout:6/5: dread - f2 zero size 2026-03-09T17:29:16.979 INFO:tasks.workunit.client.0.vm06.stdout:6/6: write f2 [772626,64083] 0 2026-03-09T17:29:16.983 INFO:tasks.workunit.client.0.vm06.stdout:7/10: creat f2 x:0 0 0 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/0: chown . 
30020 1 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/1: stat - no entries 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/2: write - no filename 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/3: rename - no filename 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/4: write - no filename 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/5: write - no filename 2026-03-09T17:29:16.986 INFO:tasks.workunit.client.0.vm06.stdout:2/6: truncate - no filename 2026-03-09T17:29:16.987 INFO:tasks.workunit.client.0.vm06.stdout:5/8: mkdir d4 0 2026-03-09T17:29:16.990 INFO:tasks.workunit.client.0.vm06.stdout:8/3: creat f0 x:0 0 0 2026-03-09T17:29:16.990 INFO:tasks.workunit.client.0.vm06.stdout:8/4: chown f0 1 1 2026-03-09T17:29:16.992 INFO:tasks.workunit.client.0.vm06.stdout:9/4: unlink f0 0 2026-03-09T17:29:16.992 INFO:tasks.workunit.client.0.vm06.stdout:9/5: chown . 62 1 2026-03-09T17:29:17.001 INFO:tasks.workunit.client.0.vm06.stdout:1/8: chown c0 20202 1 2026-03-09T17:29:17.007 INFO:tasks.workunit.client.0.vm06.stdout:6/7: mknod c3 0 2026-03-09T17:29:17.008 INFO:tasks.workunit.client.0.vm06.stdout:8/5: symlink l1 0 2026-03-09T17:29:17.008 INFO:tasks.workunit.client.0.vm06.stdout:9/6: mknod c1 0 2026-03-09T17:29:17.011 INFO:tasks.workunit.client.0.vm06.stdout:1/9: dwrite f1 [0,4194304] 0 2026-03-09T17:29:17.011 INFO:tasks.workunit.client.0.vm06.stdout:9/7: rmdir - no directory 2026-03-09T17:29:17.012 INFO:tasks.workunit.client.0.vm06.stdout:9/8: dwrite - no filename 2026-03-09T17:29:17.012 INFO:tasks.workunit.client.0.vm06.stdout:9/9: dread - no filename 2026-03-09T17:29:17.012 INFO:tasks.workunit.client.0.vm06.stdout:0/13: write f1 [5179807,73611] 0 2026-03-09T17:29:17.013 INFO:tasks.workunit.client.0.vm06.stdout:8/6: truncate f0 629798 0 2026-03-09T17:29:17.017 INFO:tasks.workunit.client.0.vm06.stdout:0/14: write f1 [839137,65446] 0 2026-03-09T17:29:17.021 
INFO:tasks.workunit.client.0.vm06.stdout:5/9: creat d4/f5 x:0 0 0 2026-03-09T17:29:17.021 INFO:tasks.workunit.client.0.vm06.stdout:5/10: rename d4 to d4/d6 22 2026-03-09T17:29:17.021 INFO:tasks.workunit.client.0.vm06.stdout:2/7: getdents . 0 2026-03-09T17:29:17.021 INFO:tasks.workunit.client.0.vm06.stdout:2/8: truncate - no filename 2026-03-09T17:29:17.023 INFO:tasks.workunit.client.0.vm06.stdout:6/8: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.033 INFO:tasks.workunit.client.0.vm06.stdout:1/10: creat f3 x:0 0 0 2026-03-09T17:29:17.033 INFO:tasks.workunit.client.0.vm06.stdout:1/11: dread - f3 zero size 2026-03-09T17:29:17.040 INFO:tasks.workunit.client.0.vm06.stdout:5/11: creat d4/f7 x:0 0 0 2026-03-09T17:29:17.042 INFO:tasks.workunit.client.0.vm06.stdout:7/11: fsync f0 0 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:7/12: stat f2 0 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:5/12: dwrite d4/f5 [0,4194304] 0 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:5/13: chown f1 1046090110 1 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:5/14: truncate d4/f7 829439 0 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:6/9: mknod c4 0 2026-03-09T17:29:17.052 INFO:tasks.workunit.client.0.vm06.stdout:1/12: creat f4 x:0 0 0 2026-03-09T17:29:17.071 INFO:tasks.workunit.client.0.vm06.stdout:2/9: symlink l0 0 2026-03-09T17:29:17.071 INFO:tasks.workunit.client.0.vm06.stdout:2/10: dread - no filename 2026-03-09T17:29:17.071 INFO:tasks.workunit.client.0.vm06.stdout:7/13: mknod c3 0 2026-03-09T17:29:17.071 INFO:tasks.workunit.client.0.vm06.stdout:5/15: mknod d4/c8 0 2026-03-09T17:29:17.075 INFO:tasks.workunit.client.0.vm06.stdout:1/13: symlink l5 0 2026-03-09T17:29:17.075 INFO:tasks.workunit.client.0.vm06.stdout:1/14: write f4 [419457,17619] 0 2026-03-09T17:29:17.075 INFO:tasks.workunit.client.0.vm06.stdout:1/15: rmdir - no directory 2026-03-09T17:29:17.078 
INFO:tasks.workunit.client.0.vm06.stdout:7/14: creat f4 x:0 0 0 2026-03-09T17:29:17.078 INFO:tasks.workunit.client.0.vm06.stdout:7/15: readlink - no filename 2026-03-09T17:29:17.081 INFO:tasks.workunit.client.0.vm06.stdout:5/16: mkdir d4/d9 0 2026-03-09T17:29:17.083 INFO:tasks.workunit.client.0.vm06.stdout:5/17: dread f1 [0,4194304] 0 2026-03-09T17:29:17.091 INFO:tasks.workunit.client.0.vm06.stdout:5/18: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.091 INFO:tasks.workunit.client.0.vm06.stdout:5/19: dread d4/f5 [0,4194304] 0 2026-03-09T17:29:17.093 INFO:tasks.workunit.client.0.vm06.stdout:1/16: creat f6 x:0 0 0 2026-03-09T17:29:17.093 INFO:tasks.workunit.client.0.vm06.stdout:7/16: mkdir d5 0 2026-03-09T17:29:17.093 INFO:tasks.workunit.client.0.vm06.stdout:7/17: read - f1 zero size 2026-03-09T17:29:17.101 INFO:tasks.workunit.client.0.vm06.stdout:7/18: dwrite f1 [0,4194304] 0 2026-03-09T17:29:17.105 INFO:tasks.workunit.client.0.vm06.stdout:5/20: creat d4/fa x:0 0 0 2026-03-09T17:29:17.124 INFO:tasks.workunit.client.0.vm06.stdout:5/21: write f3 [490386,88309] 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:1/17: creat f7 x:0 0 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:5/22: write d4/f5 [3120959,123825] 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:1/18: creat f8 x:0 0 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:7/19: mknod d5/c6 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:1/19: dwrite f1 [0,4194304] 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:1/20: write f7 [256132,55928] 0 2026-03-09T17:29:17.125 INFO:tasks.workunit.client.0.vm06.stdout:5/23: rename d4/fa to d4/fb 0 2026-03-09T17:29:17.133 INFO:tasks.workunit.client.0.vm06.stdout:1/21: mknod c9 0 2026-03-09T17:29:17.137 INFO:tasks.workunit.client.0.vm06.stdout:5/24: dwrite d4/f7 [0,4194304] 0 2026-03-09T17:29:17.158 INFO:tasks.workunit.client.0.vm06.stdout:7/20: dwrite f2 [0,4194304] 0 
2026-03-09T17:29:17.158 INFO:tasks.workunit.client.0.vm06.stdout:7/21: stat f1 0 2026-03-09T17:29:17.158 INFO:tasks.workunit.client.0.vm06.stdout:7/22: write f2 [136096,16107] 0 2026-03-09T17:29:17.158 INFO:tasks.workunit.client.0.vm06.stdout:7/23: dread f2 [0,4194304] 0 2026-03-09T17:29:17.242 INFO:tasks.workunit.client.0.vm06.stdout:1/22: creat fa x:0 0 0 2026-03-09T17:29:17.245 INFO:tasks.workunit.client.0.vm06.stdout:5/25: write d4/fb [347013,44523] 0 2026-03-09T17:29:17.246 INFO:tasks.workunit.client.0.vm06.stdout:7/24: mkdir d5/d7 0 2026-03-09T17:29:17.248 INFO:tasks.workunit.client.0.vm06.stdout:1/23: rename l5 to lb 0 2026-03-09T17:29:17.261 INFO:tasks.workunit.client.0.vm06.stdout:7/25: getdents d5/d7 0 2026-03-09T17:29:17.261 INFO:tasks.workunit.client.0.vm06.stdout:7/26: readlink - no filename 2026-03-09T17:29:17.263 INFO:tasks.workunit.client.0.vm06.stdout:7/27: link f4 d5/f8 0 2026-03-09T17:29:17.264 INFO:tasks.workunit.client.0.vm06.stdout:7/28: symlink d5/l9 0 2026-03-09T17:29:17.269 INFO:tasks.workunit.client.0.vm06.stdout:7/29: dwrite f4 [0,4194304] 0 2026-03-09T17:29:17.272 INFO:tasks.workunit.client.0.vm06.stdout:7/30: creat d5/d7/fa x:0 0 0 2026-03-09T17:29:17.273 INFO:tasks.workunit.client.0.vm06.stdout:7/31: rename d5/l9 to d5/d7/lb 0 2026-03-09T17:29:17.275 INFO:tasks.workunit.client.0.vm06.stdout:7/32: dread f4 [0,4194304] 0 2026-03-09T17:29:17.275 INFO:tasks.workunit.client.0.vm06.stdout:7/33: write f0 [841617,15892] 0 2026-03-09T17:29:17.281 INFO:tasks.workunit.client.0.vm06.stdout:7/34: dwrite d5/f8 [0,4194304] 0 2026-03-09T17:29:17.284 INFO:tasks.workunit.client.0.vm06.stdout:7/35: unlink c3 0 2026-03-09T17:29:17.287 INFO:tasks.workunit.client.0.vm06.stdout:7/36: dread d5/f8 [0,4194304] 0 2026-03-09T17:29:17.326 INFO:tasks.workunit.client.0.vm06.stdout:7/37: mknod d5/cc 0 2026-03-09T17:29:17.342 INFO:tasks.workunit.client.0.vm06.stdout:6/10: fdatasync f2 0 2026-03-09T17:29:17.363 INFO:tasks.workunit.client.0.vm06.stdout:6/11: readlink l1 
0 2026-03-09T17:29:17.363 INFO:tasks.workunit.client.0.vm06.stdout:6/12: rename c4 to c5 0 2026-03-09T17:29:17.363 INFO:tasks.workunit.client.0.vm06.stdout:6/13: mkdir d6 0 2026-03-09T17:29:17.363 INFO:tasks.workunit.client.0.vm06.stdout:6/14: symlink d6/l7 0 2026-03-09T17:29:17.569 INFO:tasks.workunit.client.0.vm06.stdout:0/15: fdatasync f1 0 2026-03-09T17:29:17.569 INFO:tasks.workunit.client.0.vm06.stdout:0/16: rmdir - no directory 2026-03-09T17:29:17.571 INFO:tasks.workunit.client.0.vm06.stdout:0/17: symlink l4 0 2026-03-09T17:29:17.571 INFO:tasks.workunit.client.0.vm06.stdout:0/18: rmdir - no directory 2026-03-09T17:29:17.573 INFO:tasks.workunit.client.0.vm06.stdout:0/19: creat f5 x:0 0 0 2026-03-09T17:29:17.574 INFO:tasks.workunit.client.0.vm06.stdout:0/20: write f5 [758177,77758] 0 2026-03-09T17:29:17.577 INFO:tasks.workunit.client.0.vm06.stdout:1/24: dread f4 [0,4194304] 0 2026-03-09T17:29:17.578 INFO:tasks.workunit.client.0.vm06.stdout:0/21: dwrite f1 [0,4194304] 0 2026-03-09T17:29:17.578 INFO:tasks.workunit.client.0.vm06.stdout:0/22: chown c2 46 1 2026-03-09T17:29:17.596 INFO:tasks.workunit.client.0.vm06.stdout:1/25: mknod cc 0 2026-03-09T17:29:17.596 INFO:tasks.workunit.client.0.vm06.stdout:1/26: chown f1 1 1 2026-03-09T17:29:17.596 INFO:tasks.workunit.client.0.vm06.stdout:1/27: chown f6 31432 1 2026-03-09T17:29:17.597 INFO:tasks.workunit.client.0.vm06.stdout:0/23: creat f6 x:0 0 0 2026-03-09T17:29:17.598 INFO:tasks.workunit.client.0.vm06.stdout:0/24: truncate f6 214552 0 2026-03-09T17:29:17.599 INFO:tasks.workunit.client.0.vm06.stdout:2/11: getdents . 
0 2026-03-09T17:29:17.607 INFO:tasks.workunit.client.0.vm06.stdout:0/25: mkdir d7 0 2026-03-09T17:29:17.607 INFO:tasks.workunit.client.0.vm06.stdout:1/28: dwrite f4 [0,4194304] 0 2026-03-09T17:29:17.607 INFO:tasks.workunit.client.0.vm06.stdout:1/29: write f8 [240538,46808] 0 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:4/6: sync 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:9/10: sync 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:5/26: sync 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:5/27: stat d4/c8 0 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:9/11: chown c1 32575481 1 2026-03-09T17:29:17.612 INFO:tasks.workunit.client.0.vm06.stdout:9/12: fdatasync - no filename 2026-03-09T17:29:17.613 INFO:tasks.workunit.client.0.vm06.stdout:3/0: sync 2026-03-09T17:29:17.613 INFO:tasks.workunit.client.0.vm06.stdout:3/1: rename - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:3/2: rename - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:3/3: write - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:3/4: dread - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:8/7: sync 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/13: stat c1 0 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/14: fdatasync - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/15: dwrite - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/16: dwrite - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/17: dread - no filename 2026-03-09T17:29:17.614 INFO:tasks.workunit.client.0.vm06.stdout:9/18: write - no filename 2026-03-09T17:29:17.616 INFO:tasks.workunit.client.0.vm06.stdout:5/28: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.627 INFO:tasks.workunit.client.0.vm06.stdout:7/38: 
fdatasync f1 0 2026-03-09T17:29:17.631 INFO:tasks.workunit.client.0.vm06.stdout:8/8: dread f0 [0,4194304] 0 2026-03-09T17:29:17.635 INFO:tasks.workunit.client.0.vm06.stdout:8/9: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.639 INFO:tasks.workunit.client.0.vm06.stdout:8/10: dread f0 [0,4194304] 0 2026-03-09T17:29:17.642 INFO:tasks.workunit.client.0.vm06.stdout:1/30: creat fd x:0 0 0 2026-03-09T17:29:17.645 INFO:tasks.workunit.client.0.vm06.stdout:4/7: write f1 [211197,94451] 0 2026-03-09T17:29:17.645 INFO:tasks.workunit.client.0.vm06.stdout:4/8: readlink - no filename 2026-03-09T17:29:17.649 INFO:tasks.workunit.client.0.vm06.stdout:1/31: dread f4 [0,4194304] 0 2026-03-09T17:29:17.652 INFO:tasks.workunit.client.0.vm06.stdout:9/19: unlink c1 0 2026-03-09T17:29:17.655 INFO:tasks.workunit.client.0.vm06.stdout:8/11: mknod c2 0 2026-03-09T17:29:17.657 INFO:tasks.workunit.client.0.vm06.stdout:0/26: creat d7/f8 x:0 0 0 2026-03-09T17:29:17.660 INFO:tasks.workunit.client.0.vm06.stdout:0/27: dwrite f1 [4194304,4194304] 0 2026-03-09T17:29:17.670 INFO:tasks.workunit.client.0.vm06.stdout:3/5: creat f0 x:0 0 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/20: symlink l2 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/21: dread - no filename 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:5/29: getdents d4/d9 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/22: readlink l2 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:2/12: link l0 l1 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/23: stat l2 0 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/24: dwrite - no filename 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/25: dread - no filename 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/26: read - no filename 2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/27: write - no filename 
2026-03-09T17:29:17.671 INFO:tasks.workunit.client.0.vm06.stdout:9/28: dread - no filename 2026-03-09T17:29:17.672 INFO:tasks.workunit.client.0.vm06.stdout:5/30: write d4/fb [944871,93673] 0 2026-03-09T17:29:17.672 INFO:tasks.workunit.client.0.vm06.stdout:5/31: chown d4/d9 1500442766 1 2026-03-09T17:29:17.672 INFO:tasks.workunit.client.0.vm06.stdout:8/12: mknod c3 0 2026-03-09T17:29:17.688 INFO:tasks.workunit.client.0.vm06.stdout:0/28: symlink d7/l9 0 2026-03-09T17:29:17.692 INFO:tasks.workunit.client.0.vm06.stdout:3/6: mknod c1 0 2026-03-09T17:29:17.693 INFO:tasks.workunit.client.0.vm06.stdout:3/7: write f0 [406204,38291] 0 2026-03-09T17:29:17.701 INFO:tasks.workunit.client.0.vm06.stdout:9/29: mkdir d3 0 2026-03-09T17:29:17.701 INFO:tasks.workunit.client.0.vm06.stdout:8/13: creat f4 x:0 0 0 2026-03-09T17:29:17.707 INFO:tasks.workunit.client.0.vm06.stdout:8/14: dwrite f4 [0,4194304] 0 2026-03-09T17:29:17.721 INFO:tasks.workunit.client.0.vm06.stdout:5/32: rename f1 to d4/d9/fc 0 2026-03-09T17:29:17.722 INFO:tasks.workunit.client.0.vm06.stdout:5/33: truncate d4/f5 4543473 0 2026-03-09T17:29:17.725 INFO:tasks.workunit.client.0.vm06.stdout:0/29: symlink d7/la 0 2026-03-09T17:29:17.730 INFO:tasks.workunit.client.0.vm06.stdout:3/8: creat f2 x:0 0 0 2026-03-09T17:29:17.739 INFO:tasks.workunit.client.0.vm06.stdout:3/9: readlink - no filename 2026-03-09T17:29:17.739 INFO:tasks.workunit.client.0.vm06.stdout:3/10: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.740 INFO:tasks.workunit.client.0.vm06.stdout:8/15: symlink l5 0 2026-03-09T17:29:17.743 INFO:tasks.workunit.client.0.vm06.stdout:0/30: creat d7/fb x:0 0 0 2026-03-09T17:29:17.747 INFO:tasks.workunit.client.0.vm06.stdout:2/13: creat f2 x:0 0 0 2026-03-09T17:29:17.751 INFO:tasks.workunit.client.0.vm06.stdout:2/14: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:2/15: chown f2 11105 1 2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:9/30: getdents d3 0 
2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:9/31: truncate - no filename 2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:9/32: stat l2 0 2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:9/33: fsync - no filename 2026-03-09T17:29:17.753 INFO:tasks.workunit.client.0.vm06.stdout:9/34: dwrite - no filename 2026-03-09T17:29:17.754 INFO:tasks.workunit.client.0.vm06.stdout:2/16: read f2 [2903216,88884] 0 2026-03-09T17:29:17.755 INFO:tasks.workunit.client.0.vm06.stdout:2/17: write f2 [1948416,101214] 0 2026-03-09T17:29:17.755 INFO:tasks.workunit.client.0.vm06.stdout:2/18: rmdir - no directory 2026-03-09T17:29:17.756 INFO:tasks.workunit.client.0.vm06.stdout:2/19: write f2 [1015683,32998] 0 2026-03-09T17:29:17.764 INFO:tasks.workunit.client.0.vm06.stdout:3/11: sync 2026-03-09T17:29:17.765 INFO:tasks.workunit.client.0.vm06.stdout:3/12: stat f0 0 2026-03-09T17:29:17.768 INFO:tasks.workunit.client.0.vm06.stdout:3/13: truncate f2 123946 0 2026-03-09T17:29:17.768 INFO:tasks.workunit.client.0.vm06.stdout:3/14: chown c1 16252171 1 2026-03-09T17:29:17.769 INFO:tasks.workunit.client.0.vm06.stdout:2/20: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.769 INFO:tasks.workunit.client.0.vm06.stdout:2/21: rmdir - no directory 2026-03-09T17:29:17.772 INFO:tasks.workunit.client.0.vm06.stdout:3/15: write f2 [223471,120547] 0 2026-03-09T17:29:17.776 INFO:tasks.workunit.client.0.vm06.stdout:8/16: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.776 INFO:tasks.workunit.client.0.vm06.stdout:8/17: read f4 [3136396,116420] 0 2026-03-09T17:29:17.777 INFO:tasks.workunit.client.0.vm06.stdout:4/9: dread f1 [0,4194304] 0 2026-03-09T17:29:17.797 INFO:tasks.workunit.client.0.vm06.stdout:8/18: mknod c6 0 2026-03-09T17:29:17.805 INFO:tasks.workunit.client.0.vm06.stdout:4/10: write f1 [1191960,54142] 0 2026-03-09T17:29:17.811 INFO:tasks.workunit.client.0.vm06.stdout:2/22: mkdir d3 0 2026-03-09T17:29:17.813 INFO:tasks.workunit.client.0.vm06.stdout:8/19: rename f4 
to f7 0 2026-03-09T17:29:17.818 INFO:tasks.workunit.client.0.vm06.stdout:4/11: sync 2026-03-09T17:29:17.822 INFO:tasks.workunit.client.0.vm06.stdout:9/35: link l2 d3/l4 0 2026-03-09T17:29:17.827 INFO:tasks.workunit.client.0.vm06.stdout:8/20: creat f8 x:0 0 0 2026-03-09T17:29:17.830 INFO:tasks.workunit.client.0.vm06.stdout:2/23: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.830 INFO:tasks.workunit.client.0.vm06.stdout:2/24: stat f2 0 2026-03-09T17:29:17.834 INFO:tasks.workunit.client.0.vm06.stdout:3/16: link c1 c3 0 2026-03-09T17:29:17.837 INFO:tasks.workunit.client.0.vm06.stdout:3/17: write f2 [1161417,110498] 0 2026-03-09T17:29:17.849 INFO:tasks.workunit.client.0.vm06.stdout:2/25: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.855 INFO:tasks.workunit.client.0.vm06.stdout:3/18: rename f2 to f4 0 2026-03-09T17:29:17.873 INFO:tasks.workunit.client.0.vm06.stdout:7/39: write d5/f8 [5135603,91169] 0 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:7/40: truncate f0 1449940 0 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:3/19: dwrite f0 [0,4194304] 0 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:2/26: dwrite f2 [0,4194304] 0 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:9/36: rename l2 to d3/l5 0 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:9/37: write - no filename 2026-03-09T17:29:17.874 INFO:tasks.workunit.client.0.vm06.stdout:7/41: mkdir d5/dd 0 2026-03-09T17:29:17.875 INFO:tasks.workunit.client.0.vm06.stdout:7/42: chown d5/d7/fa 25405 1 2026-03-09T17:29:17.875 INFO:tasks.workunit.client.0.vm06.stdout:3/20: write f4 [1788181,126742] 0 2026-03-09T17:29:17.877 INFO:tasks.workunit.client.0.vm06.stdout:9/38: mknod d3/c6 0 2026-03-09T17:29:17.877 INFO:tasks.workunit.client.0.vm06.stdout:9/39: stat d3 0 2026-03-09T17:29:17.882 INFO:tasks.workunit.client.0.vm06.stdout:7/43: unlink f1 0 2026-03-09T17:29:17.885 INFO:tasks.workunit.client.0.vm06.stdout:7/44: truncate d5/d7/fa 575649 0 
2026-03-09T17:29:17.885 INFO:tasks.workunit.client.0.vm06.stdout:2/27: mkdir d3/d4 0
2026-03-09T17:29:17.892 INFO:tasks.workunit.client.0.vm06.stdout:2/28: symlink d3/l5 0
2026-03-09T17:29:17.894 INFO:tasks.workunit.client.0.vm06.stdout:3/21: unlink c1 0
2026-03-09T17:29:17.898 INFO:tasks.workunit.client.0.vm06.stdout:3/22: symlink l5 0
2026-03-09T17:29:17.899 INFO:tasks.workunit.client.0.vm06.stdout:2/29: creat d3/d4/f6 x:0 0 0
2026-03-09T17:29:17.903 INFO:tasks.workunit.client.0.vm06.stdout:3/23: rename l5 to l6 0
2026-03-09T17:29:17.907 INFO:tasks.workunit.client.0.vm06.stdout:2/30: mknod d3/d4/c7 0
2026-03-09T17:29:17.915 INFO:tasks.workunit.client.0.vm06.stdout:2/31: rename d3/d4/c7 to d3/c8 0
2026-03-09T17:29:17.915 INFO:tasks.workunit.client.0.vm06.stdout:2/32: dread f2 [0,4194304] 0
2026-03-09T17:29:17.921 INFO:tasks.workunit.client.0.vm06.stdout:8/21: sync
2026-03-09T17:29:17.921 INFO:tasks.workunit.client.0.vm06.stdout:9/40: sync
2026-03-09T17:29:17.921 INFO:tasks.workunit.client.0.vm06.stdout:9/41: write - no filename
2026-03-09T17:29:17.921 INFO:tasks.workunit.client.0.vm06.stdout:7/45: sync
2026-03-09T17:29:17.921 INFO:tasks.workunit.client.0.vm06.stdout:8/22: chown l5 1150646 1
2026-03-09T17:29:17.928 INFO:tasks.workunit.client.0.vm06.stdout:8/23: creat f9 x:0 0 0
2026-03-09T17:29:17.931 INFO:tasks.workunit.client.0.vm06.stdout:7/46: symlink d5/dd/le 0
2026-03-09T17:29:17.935 INFO:tasks.workunit.client.0.vm06.stdout:7/47: fsync f0 0
2026-03-09T17:29:17.935 INFO:tasks.workunit.client.0.vm06.stdout:7/48: chown d5/cc 255 1
2026-03-09T17:29:17.935 INFO:tasks.workunit.client.0.vm06.stdout:7/49: fsync d5/d7/fa 0
2026-03-09T17:29:17.935 INFO:tasks.workunit.client.0.vm06.stdout:8/24: dwrite f7 [0,4194304] 0
2026-03-09T17:29:17.935 INFO:tasks.workunit.client.0.vm06.stdout:7/50: fdatasync f0 0
2026-03-09T17:29:17.941 INFO:tasks.workunit.client.0.vm06.stdout:7/51: dwrite f4 [0,4194304] 0
2026-03-09T17:29:17.942 INFO:tasks.workunit.client.0.vm06.stdout:8/25: sync
2026-03-09T17:29:17.942 INFO:tasks.workunit.client.0.vm06.stdout:7/52: stat d5/cc 0
2026-03-09T17:29:17.942 INFO:tasks.workunit.client.0.vm06.stdout:8/26: dread - f8 zero size
2026-03-09T17:29:17.952 INFO:tasks.workunit.client.0.vm06.stdout:8/27: link f8 fa 0
2026-03-09T17:29:17.953 INFO:tasks.workunit.client.0.vm06.stdout:4/12: fsync f1 0
2026-03-09T17:29:17.953 INFO:tasks.workunit.client.0.vm06.stdout:4/13: write f1 [1327317,57184] 0
2026-03-09T17:29:17.954 INFO:tasks.workunit.client.0.vm06.stdout:4/14: chown f1 1 1
2026-03-09T17:29:17.958 INFO:tasks.workunit.client.0.vm06.stdout:8/28: mknod cb 0
2026-03-09T17:29:17.958 INFO:tasks.workunit.client.0.vm06.stdout:8/29: chown f0 3 1
2026-03-09T17:29:17.964 INFO:tasks.workunit.client.0.vm06.stdout:4/15: rename f1 to f2 0
2026-03-09T17:29:17.968 INFO:tasks.workunit.client.0.vm06.stdout:4/16: creat f3 x:0 0 0
2026-03-09T17:29:17.974 INFO:tasks.workunit.client.0.vm06.stdout:4/17: stat f2 0
2026-03-09T17:29:17.974 INFO:tasks.workunit.client.0.vm06.stdout:4/18: mknod c4 0
2026-03-09T17:29:17.974 INFO:tasks.workunit.client.0.vm06.stdout:4/19: chown f2 1845878585 1
2026-03-09T17:29:17.974 INFO:tasks.workunit.client.0.vm06.stdout:4/20: creat f5 x:0 0 0
2026-03-09T17:29:17.976 INFO:tasks.workunit.client.0.vm06.stdout:4/21: dwrite f3 [0,4194304] 0
2026-03-09T17:29:17.977 INFO:tasks.workunit.client.0.vm06.stdout:4/22: chown c4 58937 1
2026-03-09T17:29:17.985 INFO:tasks.workunit.client.0.vm06.stdout:4/23: rename f5 to f6 0
2026-03-09T17:29:17.988 INFO:tasks.workunit.client.0.vm06.stdout:4/24: link f2 f7 0
2026-03-09T17:29:17.992 INFO:tasks.workunit.client.0.vm06.stdout:4/25: dwrite f7 [0,4194304] 0
2026-03-09T17:29:17.994 INFO:tasks.workunit.client.0.vm06.stdout:4/26: stat c4 0
2026-03-09T17:29:18.004 INFO:tasks.workunit.client.0.vm06.stdout:4/27: sync
2026-03-09T17:29:18.008 INFO:tasks.workunit.client.0.vm06.stdout:6/15: unlink c5 0
2026-03-09T17:29:18.008 INFO:tasks.workunit.client.0.vm06.stdout:6/16: write f2 [30091,29365] 0
2026-03-09T17:29:18.009 INFO:tasks.workunit.client.0.vm06.stdout:6/17: write f2 [659819,58718] 0
2026-03-09T17:29:18.013 INFO:tasks.workunit.client.0.vm06.stdout:6/18: dwrite f2 [0,4194304] 0
2026-03-09T17:29:18.013 INFO:tasks.workunit.client.0.vm06.stdout:6/19: truncate f2 4982986 0
2026-03-09T17:29:18.029 INFO:tasks.workunit.client.0.vm06.stdout:6/20: mknod d6/c8 0
2026-03-09T17:29:18.030 INFO:tasks.workunit.client.0.vm06.stdout:6/21: mknod d6/c9 0
2026-03-09T17:29:18.030 INFO:tasks.workunit.client.0.vm06.stdout:6/22: write f2 [5986943,125406] 0
2026-03-09T17:29:18.033 INFO:tasks.workunit.client.0.vm06.stdout:6/23: dread f2 [0,4194304] 0
2026-03-09T17:29:18.034 INFO:tasks.workunit.client.0.vm06.stdout:6/24: stat d6/c9 0
2026-03-09T17:29:18.095 INFO:tasks.workunit.client.0.vm06.stdout:1/32: getdents . 0
2026-03-09T17:29:18.097 INFO:tasks.workunit.client.0.vm06.stdout:1/33: unlink lb 0
2026-03-09T17:29:18.099 INFO:tasks.workunit.client.0.vm06.stdout:1/34: dread f8 [0,4194304] 0
2026-03-09T17:29:18.100 INFO:tasks.workunit.client.0.vm06.stdout:1/35: truncate fd 571355 0
2026-03-09T17:29:18.102 INFO:tasks.workunit.client.0.vm06.stdout:1/36: mkdir de 0
2026-03-09T17:29:18.106 INFO:tasks.workunit.client.0.vm06.stdout:1/37: dwrite fd [0,4194304] 0
2026-03-09T17:29:18.108 INFO:tasks.workunit.client.0.vm06.stdout:1/38: write f7 [771014,104132] 0
2026-03-09T17:29:18.128 INFO:tasks.workunit.client.0.vm06.stdout:7/53: fsync f4 0
2026-03-09T17:29:18.128 INFO:tasks.workunit.client.0.vm06.stdout:7/54: write f4 [3984179,29304] 0
2026-03-09T17:29:18.130 INFO:tasks.workunit.client.0.vm06.stdout:7/55: read d5/d7/fa [30885,20898] 0
2026-03-09T17:29:18.130 INFO:tasks.workunit.client.0.vm06.stdout:7/56: write d5/f8 [4433880,44144] 0
2026-03-09T17:29:18.131 INFO:tasks.workunit.client.0.vm06.stdout:7/57: read d5/d7/fa [253059,126846] 0
2026-03-09T17:29:18.131 INFO:tasks.workunit.client.0.vm06.stdout:7/58: chown f0 712 1
2026-03-09T17:29:18.132 INFO:tasks.workunit.client.0.vm06.stdout:7/59: write d5/f8 [5738329,120994] 0
2026-03-09T17:29:18.140 INFO:tasks.workunit.client.0.vm06.stdout:8/30: write f0 [4592311,17013] 0
2026-03-09T17:29:18.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:17 vm09.local ceph-mon[62061]: pgmap v142: 65 pgs: 65 active+clean; 162 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.5 MiB/s wr, 433 op/s
2026-03-09T17:29:18.145 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:17 vm06.local ceph-mon[57307]: pgmap v142: 65 pgs: 65 active+clean; 162 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.5 MiB/s wr, 433 op/s
2026-03-09T17:29:18.145 INFO:tasks.workunit.client.0.vm06.stdout:1/39: fdatasync f4 0
2026-03-09T17:29:18.149 INFO:tasks.workunit.client.0.vm06.stdout:7/60: chown d5/d7/lb 15902 1
2026-03-09T17:29:18.171 INFO:tasks.workunit.client.0.vm06.stdout:5/34: getdents d4/d9 0
2026-03-09T17:29:18.173 INFO:tasks.workunit.client.0.vm06.stdout:5/35: dread f0 [0,4194304] 0
2026-03-09T17:29:18.180 INFO:tasks.workunit.client.0.vm06.stdout:6/25: dread f2 [4194304,4194304] 0
2026-03-09T17:29:18.182 INFO:tasks.workunit.client.0.vm06.stdout:6/26: dread f2 [0,4194304] 0
2026-03-09T17:29:18.202 INFO:tasks.workunit.client.0.vm06.stdout:0/31: getdents d7 0
2026-03-09T17:29:18.222 INFO:tasks.workunit.client.0.vm06.stdout:9/42: getdents d3 0
2026-03-09T17:29:18.222 INFO:tasks.workunit.client.0.vm06.stdout:9/43: truncate - no filename
2026-03-09T17:29:18.222 INFO:tasks.workunit.client.0.vm06.stdout:9/44: dread - no filename
2026-03-09T17:29:18.227 INFO:tasks.workunit.client.0.vm06.stdout:2/33: fsync d3/d4/f6 0
2026-03-09T17:29:18.231 INFO:tasks.workunit.client.0.vm06.stdout:3/24: truncate f0 1352198 0
2026-03-09T17:29:18.245 INFO:tasks.workunit.client.0.vm06.stdout:4/28: write f3 [5003263,23697] 0
2026-03-09T17:29:18.248 INFO:tasks.workunit.client.0.vm06.stdout:4/29: dwrite f2 [0,4194304] 0
2026-03-09T17:29:18.388 INFO:tasks.workunit.client.0.vm06.stdout:8/31: mknod cc 0
2026-03-09T17:29:18.393 INFO:tasks.workunit.client.0.vm06.stdout:5/36: unlink d4/d9/fc 0
2026-03-09T17:29:18.399 INFO:tasks.workunit.client.0.vm06.stdout:6/27: dwrite f2 [0,4194304] 0
2026-03-09T17:29:18.404 INFO:tasks.workunit.client.0.vm06.stdout:6/28: dread f2 [0,4194304] 0
2026-03-09T17:29:18.408 INFO:tasks.workunit.client.0.vm06.stdout:2/34: mknod d3/c9 0
2026-03-09T17:29:18.415 INFO:tasks.workunit.client.0.vm06.stdout:3/25: readlink l6 0
2026-03-09T17:29:18.417 INFO:tasks.workunit.client.0.vm06.stdout:4/30: symlink l8 0
2026-03-09T17:29:18.418 INFO:tasks.workunit.client.0.vm06.stdout:8/32: creat fd x:0 0 0
2026-03-09T17:29:18.428 INFO:tasks.workunit.client.0.vm06.stdout:5/37: rename f3 to d4/fd 0
2026-03-09T17:29:18.428 INFO:tasks.workunit.client.0.vm06.stdout:0/32: getdents d7 0
2026-03-09T17:29:18.429 INFO:tasks.workunit.client.0.vm06.stdout:0/33: chown c2 15446350 1
2026-03-09T17:29:18.435 INFO:tasks.workunit.client.0.vm06.stdout:6/29: creat d6/fa x:0 0 0
2026-03-09T17:29:18.437 INFO:tasks.workunit.client.0.vm06.stdout:2/35: mkdir d3/d4/da 0
2026-03-09T17:29:18.438 INFO:tasks.workunit.client.0.vm06.stdout:2/36: chown d3/l5 1722117009 1
2026-03-09T17:29:18.452 INFO:tasks.workunit.client.0.vm06.stdout:4/31: symlink l9 0
2026-03-09T17:29:18.455 INFO:tasks.workunit.client.0.vm06.stdout:8/33: rename f8 to fe 0
2026-03-09T17:29:18.456 INFO:tasks.workunit.client.0.vm06.stdout:1/40: rmdir de 0
2026-03-09T17:29:18.459 INFO:tasks.workunit.client.0.vm06.stdout:7/61: truncate f0 888603 0
2026-03-09T17:29:18.463 INFO:tasks.workunit.client.0.vm06.stdout:1/41: dwrite f7 [0,4194304] 0
2026-03-09T17:29:18.467 INFO:tasks.workunit.client.0.vm06.stdout:5/38: write f0 [1366497,94982] 0
2026-03-09T17:29:18.469 INFO:tasks.workunit.client.0.vm06.stdout:5/39: dread d4/f7 [0,4194304] 0
2026-03-09T17:29:18.470 INFO:tasks.workunit.client.0.vm06.stdout:5/40: write d4/fb [2080229,47549] 0
2026-03-09T17:29:18.476 INFO:tasks.workunit.client.0.vm06.stdout:6/30: rename d6/fa to d6/fb 0
2026-03-09T17:29:18.503 INFO:tasks.workunit.client.0.vm06.stdout:4/32: creat fa x:0 0 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:8/34: mkdir df 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:8/35: dread f7 [0,4194304] 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:8/36: truncate fa 12176 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:0/34: getdents d7 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:0/35: dwrite f1 [0,4194304] 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:0/36: fsync f6 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:0/37: write f5 [1319268,43262] 0
2026-03-09T17:29:18.504 INFO:tasks.workunit.client.0.vm06.stdout:6/31: write f2 [4378514,7380] 0
2026-03-09T17:29:18.508 INFO:tasks.workunit.client.0.vm06.stdout:6/32: dwrite f2 [4194304,4194304] 0
2026-03-09T17:29:18.509 INFO:tasks.workunit.client.0.vm06.stdout:6/33: dread - d6/fb zero size
2026-03-09T17:29:18.511 INFO:tasks.workunit.client.0.vm06.stdout:9/45: link d3/l5 d3/l7 0
2026-03-09T17:29:18.511 INFO:tasks.workunit.client.0.vm06.stdout:9/46: write - no filename
2026-03-09T17:29:18.520 INFO:tasks.workunit.client.0.vm06.stdout:2/37: sync
2026-03-09T17:29:18.521 INFO:tasks.workunit.client.0.vm06.stdout:8/37: unlink c6 0
2026-03-09T17:29:18.523 INFO:tasks.workunit.client.0.vm06.stdout:1/42: rename c2 to cf 0
2026-03-09T17:29:18.529 INFO:tasks.workunit.client.0.vm06.stdout:5/41: link d4/f5 d4/d9/fe 0
2026-03-09T17:29:18.534 INFO:tasks.workunit.client.0.vm06.stdout:5/42: chown c2 70468206 1
2026-03-09T17:29:18.534 INFO:tasks.workunit.client.0.vm06.stdout:0/38: mknod d7/cc 0
2026-03-09T17:29:18.537 INFO:tasks.workunit.client.0.vm06.stdout:0/39: dwrite f6 [0,4194304] 0
2026-03-09T17:29:18.547 INFO:tasks.workunit.client.0.vm06.stdout:9/47: mknod d3/c8 0
2026-03-09T17:29:18.548 INFO:tasks.workunit.client.0.vm06.stdout:9/48: dwrite - no filename
2026-03-09T17:29:18.554 INFO:tasks.workunit.client.0.vm06.stdout:2/38: symlink d3/d4/lb 0
2026-03-09T17:29:18.555 INFO:tasks.workunit.client.0.vm06.stdout:8/38: unlink l5 0
2026-03-09T17:29:18.561 INFO:tasks.workunit.client.0.vm06.stdout:1/43: rename f6 to f10 0
2026-03-09T17:29:18.564 INFO:tasks.workunit.client.0.vm06.stdout:0/40: rename c2 to d7/cd 0
2026-03-09T17:29:18.567 INFO:tasks.workunit.client.0.vm06.stdout:0/41: write d7/f8 [961878,69627] 0
2026-03-09T17:29:18.567 INFO:tasks.workunit.client.0.vm06.stdout:0/42: stat f1 0
2026-03-09T17:29:18.567 INFO:tasks.workunit.client.0.vm06.stdout:9/49: symlink d3/l9 0
2026-03-09T17:29:18.568 INFO:tasks.workunit.client.0.vm06.stdout:2/39: creat d3/d4/fc x:0 0 0
2026-03-09T17:29:18.570 INFO:tasks.workunit.client.0.vm06.stdout:8/39: write f7 [4988396,58439] 0
2026-03-09T17:29:18.571 INFO:tasks.workunit.client.0.vm06.stdout:7/62: getdents d5/d7 0
2026-03-09T17:29:18.571 INFO:tasks.workunit.client.0.vm06.stdout:7/63: truncate f2 4999015 0
2026-03-09T17:29:18.573 INFO:tasks.workunit.client.0.vm06.stdout:1/44: mkdir d11 0
2026-03-09T17:29:18.573 INFO:tasks.workunit.client.0.vm06.stdout:1/45: write f4 [236096,29560] 0
2026-03-09T17:29:18.579 INFO:tasks.workunit.client.0.vm06.stdout:6/34: fdatasync f2 0
2026-03-09T17:29:18.579 INFO:tasks.workunit.client.0.vm06.stdout:6/35: fsync f2 0
2026-03-09T17:29:18.582 INFO:tasks.workunit.client.0.vm06.stdout:6/36: dwrite f2 [4194304,4194304] 0
2026-03-09T17:29:18.584 INFO:tasks.workunit.client.0.vm06.stdout:6/37: truncate d6/fb 872152 0
2026-03-09T17:29:18.588 INFO:tasks.workunit.client.0.vm06.stdout:0/43: creat d7/fe x:0 0 0
2026-03-09T17:29:18.590 INFO:tasks.workunit.client.0.vm06.stdout:9/50: creat d3/fa x:0 0 0
2026-03-09T17:29:18.602 INFO:tasks.workunit.client.0.vm06.stdout:7/64: unlink d5/dd/le 0
2026-03-09T17:29:18.604 INFO:tasks.workunit.client.0.vm06.stdout:5/43: link d4/c8 d4/d9/cf 0
2026-03-09T17:29:18.607 INFO:tasks.workunit.client.0.vm06.stdout:0/44: rename f1 to d7/ff 0
2026-03-09T17:29:18.608 INFO:tasks.workunit.client.0.vm06.stdout:9/51: creat d3/fb x:0 0 0
2026-03-09T17:29:18.608 INFO:tasks.workunit.client.0.vm06.stdout:9/52: truncate d3/fb 106312 0
2026-03-09T17:29:18.614 INFO:tasks.workunit.client.0.vm06.stdout:1/46: getdents d11 0
2026-03-09T17:29:18.622 INFO:tasks.workunit.client.0.vm06.stdout:6/38: sync
2026-03-09T17:29:18.622 INFO:tasks.workunit.client.0.vm06.stdout:9/53: sync
2026-03-09T17:29:18.623 INFO:tasks.workunit.client.0.vm06.stdout:9/54: write d3/fb [702213,40611] 0
2026-03-09T17:29:18.626 INFO:tasks.workunit.client.0.vm06.stdout:6/39: dwrite f2 [0,4194304] 0
2026-03-09T17:29:18.644 INFO:tasks.workunit.client.0.vm06.stdout:9/55: mknod d3/cc 0
2026-03-09T17:29:18.648 INFO:tasks.workunit.client.0.vm06.stdout:9/56: dread d3/fb [0,4194304] 0
2026-03-09T17:29:18.648 INFO:tasks.workunit.client.0.vm06.stdout:9/57: truncate d3/fa 957387 0
2026-03-09T17:29:18.648 INFO:tasks.workunit.client.0.vm06.stdout:9/58: chown d3 184952 1
2026-03-09T17:29:18.651 INFO:tasks.workunit.client.0.vm06.stdout:1/47: getdents d11 0
2026-03-09T17:29:18.655 INFO:tasks.workunit.client.0.vm06.stdout:6/40: fdatasync f2 0
2026-03-09T17:29:18.658 INFO:tasks.workunit.client.0.vm06.stdout:9/59: readlink d3/l5 0
2026-03-09T17:29:18.659 INFO:tasks.workunit.client.0.vm06.stdout:9/60: stat d3/fb 0
2026-03-09T17:29:18.662 INFO:tasks.workunit.client.0.vm06.stdout:6/41: unlink d6/c9 0
2026-03-09T17:29:18.664 INFO:tasks.workunit.client.0.vm06.stdout:6/42: dread d6/fb [0,4194304] 0
2026-03-09T17:29:18.671 INFO:tasks.workunit.client.0.vm06.stdout:9/61: mknod d3/cd 0
2026-03-09T17:29:18.677 INFO:tasks.workunit.client.0.vm06.stdout:1/48: link c9 d11/c12 0
2026-03-09T17:29:18.677 INFO:tasks.workunit.client.0.vm06.stdout:1/49: write f3 [290344,36328] 0
2026-03-09T17:29:18.677 INFO:tasks.workunit.client.0.vm06.stdout:1/50: fdatasync f4 0
2026-03-09T17:29:18.680 INFO:tasks.workunit.client.0.vm06.stdout:9/62: sync
2026-03-09T17:29:18.688 INFO:tasks.workunit.client.0.vm06.stdout:1/51: rename f4 to d11/f13 0
2026-03-09T17:29:18.691 INFO:tasks.workunit.client.0.vm06.stdout:9/63: unlink d3/fa 0
2026-03-09T17:29:18.699 INFO:tasks.workunit.client.0.vm06.stdout:1/52: chown d11/c12 337880423 1
2026-03-09T17:29:18.703 INFO:tasks.workunit.client.0.vm06.stdout:1/53: unlink f3 0
2026-03-09T17:29:18.706 INFO:tasks.workunit.client.0.vm06.stdout:1/54: rmdir d11 39
2026-03-09T17:29:18.760 INFO:tasks.workunit.client.0.vm06.stdout:4/33: dwrite f2 [4194304,4194304] 0
2026-03-09T17:29:18.760 INFO:tasks.workunit.client.0.vm06.stdout:4/34: truncate f6 81065 0
2026-03-09T17:29:18.760 INFO:tasks.workunit.client.0.vm06.stdout:4/35: chown f6 3 1
2026-03-09T17:29:18.763 INFO:tasks.workunit.client.0.vm06.stdout:4/36: mkdir db 0
2026-03-09T17:29:18.765 INFO:tasks.workunit.client.0.vm06.stdout:4/37: dread f3 [4194304,4194304] 0
2026-03-09T17:29:18.769 INFO:tasks.workunit.client.0.vm06.stdout:4/38: dwrite f3 [0,4194304] 0
2026-03-09T17:29:18.771 INFO:tasks.workunit.client.0.vm06.stdout:4/39: write fa [48236,104494] 0
2026-03-09T17:29:18.776 INFO:tasks.workunit.client.0.vm06.stdout:4/40: dwrite fa [0,4194304] 0
2026-03-09T17:29:18.805 INFO:tasks.workunit.client.0.vm06.stdout:4/41: creat db/fc x:0 0 0
2026-03-09T17:29:18.807 INFO:tasks.workunit.client.0.vm06.stdout:4/42: mknod db/cd 0
2026-03-09T17:29:18.807 INFO:tasks.workunit.client.0.vm06.stdout:3/26: truncate f4 345813 0
2026-03-09T17:29:18.810 INFO:tasks.workunit.client.0.vm06.stdout:3/27: chown c3 1 1
2026-03-09T17:29:18.819 INFO:tasks.workunit.client.0.vm06.stdout:3/28: dwrite f0 [0,4194304] 0
2026-03-09T17:29:18.822 INFO:tasks.workunit.client.0.vm06.stdout:3/29: dread f0 [0,4194304] 0
2026-03-09T17:29:18.826 INFO:tasks.workunit.client.0.vm06.stdout:3/30: dread f0 [0,4194304] 0
2026-03-09T17:29:18.829 INFO:tasks.workunit.client.0.vm06.stdout:3/31: dread f0 [0,4194304] 0
2026-03-09T17:29:18.829 INFO:tasks.workunit.client.0.vm06.stdout:3/32: write f0 [3129026,113157] 0
2026-03-09T17:29:18.829 INFO:tasks.workunit.client.0.vm06.stdout:3/33: stat l6 0
2026-03-09T17:29:18.831 INFO:tasks.workunit.client.0.vm06.stdout:3/34: write f0 [4584093,91342] 0
2026-03-09T17:29:18.838 INFO:tasks.workunit.client.0.vm06.stdout:2/40: getdents d3/d4 0
2026-03-09T17:29:18.839 INFO:tasks.workunit.client.0.vm06.stdout:2/41: chown d3/d4/fc 0 1
2026-03-09T17:29:18.843 INFO:tasks.workunit.client.0.vm06.stdout:2/42: dwrite d3/d4/f6 [0,4194304] 0
2026-03-09T17:29:18.856 INFO:tasks.workunit.client.0.vm06.stdout:3/35: sync
2026-03-09T17:29:18.860 INFO:tasks.workunit.client.0.vm06.stdout:2/43: dwrite f2 [4194304,4194304] 0
2026-03-09T17:29:18.872 INFO:tasks.workunit.client.0.vm06.stdout:1/55: mkdir d11/d14 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:1/56: chown fa 297641029 1
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:8/40: write fe [701982,53224] 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:3/36: dread f0 [0,4194304] 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:3/37: stat f0 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:1/57: dwrite fa [0,4194304] 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:8/41: dread f0 [0,4194304] 0
2026-03-09T17:29:18.873 INFO:tasks.workunit.client.0.vm06.stdout:4/43: fsync f3 0
2026-03-09T17:29:18.874 INFO:tasks.workunit.client.0.vm06.stdout:4/44: read f3 [1516662,29850] 0
2026-03-09T17:29:18.874 INFO:tasks.workunit.client.0.vm06.stdout:4/45: readlink l9 0
2026-03-09T17:29:18.878 INFO:tasks.workunit.client.0.vm06.stdout:4/46: dread f7 [0,4194304] 0
2026-03-09T17:29:18.883 INFO:tasks.workunit.client.0.vm06.stdout:2/44: mknod d3/cd 0
2026-03-09T17:29:18.885 INFO:tasks.workunit.client.0.vm06.stdout:2/45: dread f2 [0,4194304] 0
2026-03-09T17:29:18.891 INFO:tasks.workunit.client.0.vm06.stdout:4/47: creat db/fe x:0 0 0
2026-03-09T17:29:18.895 INFO:tasks.workunit.client.0.vm06.stdout:2/46: creat d3/d4/fe x:0 0 0
2026-03-09T17:29:18.900 INFO:tasks.workunit.client.0.vm06.stdout:2/47: dwrite d3/d4/f6 [0,4194304] 0
2026-03-09T17:29:18.905 INFO:tasks.workunit.client.0.vm06.stdout:8/42: mknod df/c10 0
2026-03-09T17:29:18.906 INFO:tasks.workunit.client.0.vm06.stdout:0/45: fsync d7/ff 0
2026-03-09T17:29:18.906 INFO:tasks.workunit.client.0.vm06.stdout:8/43: dread - f9 zero size
2026-03-09T17:29:18.906 INFO:tasks.workunit.client.0.vm06.stdout:8/44: read - f9 zero size
2026-03-09T17:29:18.908 INFO:tasks.workunit.client.0.vm06.stdout:8/45: dwrite f9 [0,4194304] 0
2026-03-09T17:29:18.918 INFO:tasks.workunit.client.0.vm06.stdout:3/38: creat f7 x:0 0 0
2026-03-09T17:29:18.922 INFO:tasks.workunit.client.0.vm06.stdout:0/46: creat d7/f10 x:0 0 0
2026-03-09T17:29:18.923 INFO:tasks.workunit.client.0.vm06.stdout:0/47: read d7/f8 [507262,15682] 0
2026-03-09T17:29:18.932 INFO:tasks.workunit.client.0.vm06.stdout:5/44: truncate f0 272215 0
2026-03-09T17:29:18.933 INFO:tasks.workunit.client.0.vm06.stdout:8/46: unlink df/c10 0
2026-03-09T17:29:18.933 INFO:tasks.workunit.client.0.vm06.stdout:8/47: readlink l1 0
2026-03-09T17:29:18.937 INFO:tasks.workunit.client.0.vm06.stdout:8/48: dwrite fe [0,4194304] 0
2026-03-09T17:29:18.940 INFO:tasks.workunit.client.0.vm06.stdout:8/49: write f9 [1085996,120278] 0
2026-03-09T17:29:18.941 INFO:tasks.workunit.client.0.vm06.stdout:6/43: truncate f2 5273458 0
2026-03-09T17:29:18.941 INFO:tasks.workunit.client.0.vm06.stdout:0/48: mkdir d7/d11 0
2026-03-09T17:29:18.942 INFO:tasks.workunit.client.0.vm06.stdout:6/44: dread d6/fb [0,4194304] 0
2026-03-09T17:29:18.948 INFO:tasks.workunit.client.0.vm06.stdout:5/45: truncate d4/f7 851839 0
2026-03-09T17:29:18.950 INFO:tasks.workunit.client.0.vm06.stdout:9/64: truncate d3/fb 617512 0
2026-03-09T17:29:18.963 INFO:tasks.workunit.client.0.vm06.stdout:1/58: fsync d11/f13 0
2026-03-09T17:29:18.963 INFO:tasks.workunit.client.0.vm06.stdout:1/59: write d11/f13 [3590370,119871] 0
2026-03-09T17:29:18.965 INFO:tasks.workunit.client.0.vm06.stdout:0/49: dread d7/f8 [0,4194304] 0
2026-03-09T17:29:18.966 INFO:tasks.workunit.client.0.vm06.stdout:0/50: dread d7/f8 [0,4194304] 0
2026-03-09T17:29:18.966 INFO:tasks.workunit.client.0.vm06.stdout:0/51: stat d7/ff 0
2026-03-09T17:29:18.966 INFO:tasks.workunit.client.0.vm06.stdout:2/48: fsync d3/d4/f6 0
2026-03-09T17:29:18.967 INFO:tasks.workunit.client.0.vm06.stdout:6/45: symlink d6/lc 0
2026-03-09T17:29:18.970 INFO:tasks.workunit.client.0.vm06.stdout:5/46: creat d4/d9/f10 x:0 0 0
2026-03-09T17:29:18.971 INFO:tasks.workunit.client.0.vm06.stdout:9/65: mknod d3/ce 0
2026-03-09T17:29:18.975 INFO:tasks.workunit.client.0.vm06.stdout:0/52: creat d7/f12 x:0 0 0
2026-03-09T17:29:18.980 INFO:tasks.workunit.client.0.vm06.stdout:2/49: mknod d3/d4/cf 0
2026-03-09T17:29:18.982 INFO:tasks.workunit.client.0.vm06.stdout:6/46: dwrite d6/fb [0,4194304] 0
2026-03-09T17:29:18.984 INFO:tasks.workunit.client.0.vm06.stdout:8/50: fsync fe 0
2026-03-09T17:29:18.990 INFO:tasks.workunit.client.0.vm06.stdout:5/47: creat d4/f11 x:0 0 0
2026-03-09T17:29:18.992 INFO:tasks.workunit.client.0.vm06.stdout:1/60: mknod d11/d14/c15 0
2026-03-09T17:29:18.992 INFO:tasks.workunit.client.0.vm06.stdout:1/61: chown d11 0 1
2026-03-09T17:29:18.993 INFO:tasks.workunit.client.0.vm06.stdout:7/65: write f0 [1293954,37466] 0
2026-03-09T17:29:18.997 INFO:tasks.workunit.client.0.vm06.stdout:7/66: dwrite f0 [0,4194304] 0
2026-03-09T17:29:19.001 INFO:tasks.workunit.client.0.vm06.stdout:1/62: dwrite d11/f13 [0,4194304] 0
2026-03-09T17:29:19.018 INFO:tasks.workunit.client.0.vm06.stdout:0/53: unlink l0 0
2026-03-09T17:29:19.018 INFO:tasks.workunit.client.0.vm06.stdout:2/50: creat d3/f10 x:0 0 0
2026-03-09T17:29:19.034 INFO:tasks.workunit.client.0.vm06.stdout:9/66: fdatasync d3/fb 0
2026-03-09T17:29:19.036 INFO:tasks.workunit.client.0.vm06.stdout:4/48: fsync db/fe 0
2026-03-09T17:29:19.039 INFO:tasks.workunit.client.0.vm06.stdout:1/63: symlink d11/d14/l16 0
2026-03-09T17:29:19.040 INFO:tasks.workunit.client.0.vm06.stdout:7/67: write f2 [5022790,117129] 0
2026-03-09T17:29:19.041 INFO:tasks.workunit.client.0.vm06.stdout:7/68: write d5/d7/fa [1194969,20573] 0
2026-03-09T17:29:19.047 INFO:tasks.workunit.client.0.vm06.stdout:0/54: dwrite d7/f8 [0,4194304] 0
2026-03-09T17:29:19.050 INFO:tasks.workunit.client.0.vm06.stdout:2/51: creat d3/d4/f11 x:0 0 0
2026-03-09T17:29:19.055 INFO:tasks.workunit.client.0.vm06.stdout:3/39: dread f4 [0,4194304] 0
2026-03-09T17:29:19.055 INFO:tasks.workunit.client.0.vm06.stdout:9/67: mknod d3/cf 0
2026-03-09T17:29:19.058 INFO:tasks.workunit.client.0.vm06.stdout:4/49: rmdir db 39
2026-03-09T17:29:19.060 INFO:tasks.workunit.client.0.vm06.stdout:7/69: fdatasync f2 0
2026-03-09T17:29:19.065 INFO:tasks.workunit.client.0.vm06.stdout:7/70: dwrite d5/f8 [4194304,4194304] 0
2026-03-09T17:29:19.074 INFO:tasks.workunit.client.0.vm06.stdout:0/55: rename f6 to d7/d11/f13 0
2026-03-09T17:29:19.075 INFO:tasks.workunit.client.0.vm06.stdout:0/56: dread - d7/fe zero size
2026-03-09T17:29:19.075 INFO:tasks.workunit.client.0.vm06.stdout:0/57: fdatasync d7/f10 0
2026-03-09T17:29:19.080 INFO:tasks.workunit.client.0.vm06.stdout:6/47: dwrite f2 [4194304,4194304] 0
2026-03-09T17:29:19.092 INFO:tasks.workunit.client.0.vm06.stdout:3/40: dwrite f0 [0,4194304] 0
2026-03-09T17:29:19.092 INFO:tasks.workunit.client.0.vm06.stdout:3/41: chown l6 27205147 1
2026-03-09T17:29:19.095 INFO:tasks.workunit.client.0.vm06.stdout:9/68: mknod d3/c10 0
2026-03-09T17:29:19.098 INFO:tasks.workunit.client.0.vm06.stdout:3/42: dread f0 [0,4194304] 0
2026-03-09T17:29:19.118 INFO:tasks.workunit.client.0.vm06.stdout:7/71: rename f4 to d5/dd/ff 0
2026-03-09T17:29:19.122 INFO:tasks.workunit.client.0.vm06.stdout:0/58: creat d7/f14 x:0 0 0
2026-03-09T17:29:19.122 INFO:tasks.workunit.client.0.vm06.stdout:0/59: write d7/f12 [1014378,106987] 0
2026-03-09T17:29:19.130 INFO:tasks.workunit.client.0.vm06.stdout:6/48: rename d6/l7 to d6/ld 0
2026-03-09T17:29:19.132 INFO:tasks.workunit.client.0.vm06.stdout:9/69: chown d3/l5 1982 1
2026-03-09T17:29:19.133 INFO:tasks.workunit.client.0.vm06.stdout:4/50: mkdir db/df 0
2026-03-09T17:29:19.133 INFO:tasks.workunit.client.0.vm06.stdout:4/51: rename db to db/df/d10 22
2026-03-09T17:29:19.134 INFO:tasks.workunit.client.0.vm06.stdout:4/52: read f2 [565292,120234] 0
2026-03-09T17:29:19.134 INFO:tasks.workunit.client.0.vm06.stdout:4/53: truncate f6 675885 0
2026-03-09T17:29:19.137 INFO:tasks.workunit.client.0.vm06.stdout:8/51: rmdir df 0
2026-03-09T17:29:19.137 INFO:tasks.workunit.client.0.vm06.stdout:8/52: stat fd 0
2026-03-09T17:29:19.138 INFO:tasks.workunit.client.0.vm06.stdout:7/72: fdatasync d5/f8 0
2026-03-09T17:29:19.138 INFO:tasks.workunit.client.0.vm06.stdout:0/60: mknod d7/d11/c15 0
2026-03-09T17:29:19.139 INFO:tasks.workunit.client.0.vm06.stdout:0/61: write f5 [421917,1930] 0
2026-03-09T17:29:19.155 INFO:tasks.workunit.client.0.vm06.stdout:2/52: truncate d3/d4/f6 2826419 0
2026-03-09T17:29:19.157 INFO:tasks.workunit.client.0.vm06.stdout:9/70: read d3/fb [224329,17488] 0
2026-03-09T17:29:19.158 INFO:tasks.workunit.client.0.vm06.stdout:3/43: symlink l8 0
2026-03-09T17:29:19.158 INFO:tasks.workunit.client.0.vm06.stdout:1/64: getdents d11/d14 0
2026-03-09T17:29:19.162 INFO:tasks.workunit.client.0.vm06.stdout:4/54: mknod db/c11 0
2026-03-09T17:29:19.163 INFO:tasks.workunit.client.0.vm06.stdout:8/53: creat f11 x:0 0 0
2026-03-09T17:29:19.164 INFO:tasks.workunit.client.0.vm06.stdout:7/73: rmdir d5 39
2026-03-09T17:29:19.168 INFO:tasks.workunit.client.0.vm06.stdout:5/48: dwrite d4/f7 [0,4194304] 0
2026-03-09T17:29:19.173 INFO:tasks.workunit.client.0.vm06.stdout:5/49: dread d4/f5 [0,4194304] 0
2026-03-09T17:29:19.176 INFO:tasks.workunit.client.0.vm06.stdout:5/50: dread d4/f5 [0,4194304] 0
2026-03-09T17:29:19.177 INFO:tasks.workunit.client.0.vm06.stdout:5/51: read d4/d9/fe [1573555,119823] 0
2026-03-09T17:29:19.177 INFO:tasks.workunit.client.0.vm06.stdout:5/52: chown d4/d9/f10 64358 1
2026-03-09T17:29:19.182 INFO:tasks.workunit.client.0.vm06.stdout:0/62: unlink c3 0
2026-03-09T17:29:19.182 INFO:tasks.workunit.client.0.vm06.stdout:6/49: readlink d6/ld 0
2026-03-09T17:29:19.182 INFO:tasks.workunit.client.0.vm06.stdout:0/63: write d7/f8 [2978782,54330] 0
2026-03-09T17:29:19.183 INFO:tasks.workunit.client.0.vm06.stdout:6/50: read d6/fb [2368866,54336] 0
2026-03-09T17:29:19.183 INFO:tasks.workunit.client.0.vm06.stdout:6/51: write d6/fb [478163,14283] 0
2026-03-09T17:29:19.184 INFO:tasks.workunit.client.0.vm06.stdout:6/52: write d6/fb [1640741,71208] 0
2026-03-09T17:29:19.184 INFO:tasks.workunit.client.0.vm06.stdout:6/53: chown f2 16 1
2026-03-09T17:29:19.187 INFO:tasks.workunit.client.0.vm06.stdout:0/64: dread d7/d11/f13 [0,4194304] 0
2026-03-09T17:29:19.389 INFO:tasks.workunit.client.0.vm06.stdout:2/53: unlink d3/d4/fc 0
2026-03-09T17:29:19.390 INFO:tasks.workunit.client.0.vm06.stdout:9/71: unlink d3/l5 0
2026-03-09T17:29:19.392 INFO:tasks.workunit.client.0.vm06.stdout:1/65: rename fd to d11/d14/f17 0
2026-03-09T17:29:19.393 INFO:tasks.workunit.client.0.vm06.stdout:8/54: creat f12 x:0 0 0
2026-03-09T17:29:19.393 INFO:tasks.workunit.client.0.vm06.stdout:4/55: write f7 [5718206,83660] 0
2026-03-09T17:29:19.397 INFO:tasks.workunit.client.0.vm06.stdout:4/56: dwrite f2 [0,4194304] 0
2026-03-09T17:29:19.402 INFO:tasks.workunit.client.0.vm06.stdout:5/53: symlink d4/l12 0
2026-03-09T17:29:19.409 INFO:tasks.workunit.client.0.vm06.stdout:0/65: mknod d7/c16 0
2026-03-09T17:29:19.410 INFO:tasks.workunit.client.0.vm06.stdout:0/66: truncate d7/f14 614208 0
2026-03-09T17:29:19.411 INFO:tasks.workunit.client.0.vm06.stdout:2/54: mkdir d3/d4/d12 0
2026-03-09T17:29:19.412 INFO:tasks.workunit.client.0.vm06.stdout:9/72: mkdir d3/d11 0
2026-03-09T17:29:19.413 INFO:tasks.workunit.client.0.vm06.stdout:9/73: chown d3/c6 4665 1
2026-03-09T17:29:19.415 INFO:tasks.workunit.client.0.vm06.stdout:0/67: dwrite f5 [0,4194304] 0
2026-03-09T17:29:19.428 INFO:tasks.workunit.client.0.vm06.stdout:1/66: write f10 [844476,126616] 0
2026-03-09T17:29:19.428 INFO:tasks.workunit.client.0.vm06.stdout:8/55: creat f13 x:0 0 0
2026-03-09T17:29:19.428 INFO:tasks.workunit.client.0.vm06.stdout:8/56: stat f12 0
2026-03-09T17:29:19.428 INFO:tasks.workunit.client.0.vm06.stdout:7/74: creat d5/f10 x:0 0 0
2026-03-09T17:29:19.429 INFO:tasks.workunit.client.0.vm06.stdout:5/54: truncate d4/f5 2333305 0
2026-03-09T17:29:19.429 INFO:tasks.workunit.client.0.vm06.stdout:8/57: dwrite fe [0,4194304] 0
2026-03-09T17:29:19.429 INFO:tasks.workunit.client.0.vm06.stdout:8/58: stat cb 0
2026-03-09T17:29:19.437 INFO:tasks.workunit.client.0.vm06.stdout:2/55: chown l1 1 1
2026-03-09T17:29:19.457 INFO:tasks.workunit.client.0.vm06.stdout:3/44: rename l8 to l9 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:0/68: fsync d7/d11/f13 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:1/67: creat d11/f18 x:0 0 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:3/45: dwrite f7 [0,4194304] 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:7/75: creat d5/d7/f11 x:0 0 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:1/68: dread f7 [0,4194304] 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:7/76: dread d5/dd/ff [0,4194304] 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:7/77: read d5/dd/ff [973865,92604] 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:7/78: dread f0 [0,4194304] 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:4/57: symlink db/df/l12 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:5/55: symlink d4/d9/l13 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:8/59: mknod c14 0
2026-03-09T17:29:19.458 INFO:tasks.workunit.client.0.vm06.stdout:0/69: mkdir d7/d17 0
2026-03-09T17:29:19.461 INFO:tasks.workunit.client.0.vm06.stdout:5/56: chown f0 8 1
2026-03-09T17:29:19.468 INFO:tasks.workunit.client.0.vm06.stdout:3/46: mknod ca 0
2026-03-09T17:29:19.472 INFO:tasks.workunit.client.0.vm06.stdout:1/69: unlink cf 0
2026-03-09T17:29:19.472 INFO:tasks.workunit.client.0.vm06.stdout:4/58: link f3 db/f13 0
2026-03-09T17:29:19.472 INFO:tasks.workunit.client.0.vm06.stdout:4/59: chown db/df 1 1
2026-03-09T17:29:19.472 INFO:tasks.workunit.client.0.vm06.stdout:5/57: creat d4/d9/f14 x:0 0 0
2026-03-09T17:29:19.472 INFO:tasks.workunit.client.0.vm06.stdout:4/60: chown db/df/l12 75826212 1
2026-03-09T17:29:19.473 INFO:tasks.workunit.client.0.vm06.stdout:1/70: creat d11/f19 x:0 0 0
2026-03-09T17:29:19.476 INFO:tasks.workunit.client.0.vm06.stdout:1/71: symlink d11/l1a 0
2026-03-09T17:29:19.477 INFO:tasks.workunit.client.0.vm06.stdout:1/72: write d11/f18 [980826,81942] 0
2026-03-09T17:29:19.477 INFO:tasks.workunit.client.0.vm06.stdout:1/73: dread - d11/f19 zero size
2026-03-09T17:29:19.480 INFO:tasks.workunit.client.0.vm06.stdout:4/61: dwrite f3 [0,4194304] 0
2026-03-09T17:29:19.488 INFO:tasks.workunit.client.0.vm06.stdout:1/74: unlink c9 0
2026-03-09T17:29:19.488 INFO:tasks.workunit.client.0.vm06.stdout:1/75: readlink d11/l1a 0
2026-03-09T17:29:19.488 INFO:tasks.workunit.client.0.vm06.stdout:4/62: dwrite f7 [8388608,4194304] 0
2026-03-09T17:29:19.492 INFO:tasks.workunit.client.0.vm06.stdout:1/76: dread fa [0,4194304] 0
2026-03-09T17:29:19.496 INFO:tasks.workunit.client.0.vm06.stdout:4/63: dread f6 [0,4194304] 0
2026-03-09T17:29:19.501 INFO:tasks.workunit.client.0.vm06.stdout:1/77: mkdir d11/d14/d1b 0
2026-03-09T17:29:19.503 INFO:tasks.workunit.client.0.vm06.stdout:1/78: truncate f8 180157 0
2026-03-09T17:29:19.505 INFO:tasks.workunit.client.0.vm06.stdout:1/79: rmdir d11/d14 39
2026-03-09T17:29:19.505 INFO:tasks.workunit.client.0.vm06.stdout:1/80: chown f1 2241378 1
2026-03-09T17:29:19.520 INFO:tasks.workunit.client.0.vm06.stdout:1/81: rename d11/d14/d1b to d11/d14/d1c 0
2026-03-09T17:29:19.521 INFO:tasks.workunit.client.0.vm06.stdout:1/82: read d11/f13 [3043335,14447] 0
2026-03-09T17:29:19.521 INFO:tasks.workunit.client.0.vm06.stdout:1/83: dread - d11/f19 zero size
2026-03-09T17:29:19.539 INFO:tasks.workunit.client.0.vm06.stdout:1/84: dwrite d11/d14/f17 [0,4194304] 0
2026-03-09T17:29:19.549 INFO:tasks.workunit.client.0.vm06.stdout:1/85: unlink fa 0
2026-03-09T17:29:19.549 INFO:tasks.workunit.client.0.vm06.stdout:1/86: stat cc 0
2026-03-09T17:29:19.551 INFO:tasks.workunit.client.0.vm06.stdout:1/87: fsync f7 0
2026-03-09T17:29:19.556 INFO:tasks.workunit.client.0.vm06.stdout:1/88: readlink d11/l1a 0
2026-03-09T17:29:19.556 INFO:tasks.workunit.client.0.vm06.stdout:1/89: rmdir d11/d14 39
2026-03-09T17:29:19.560 INFO:tasks.workunit.client.0.vm06.stdout:1/90: mkdir d11/d14/d1d 0
2026-03-09T17:29:19.562 INFO:tasks.workunit.client.0.vm06.stdout:1/91: mkdir d11/d14/d1d/d1e 0
2026-03-09T17:29:19.562 INFO:tasks.workunit.client.0.vm06.stdout:1/92: write d11/d14/f17 [3489117,125957] 0
2026-03-09T17:29:19.608 INFO:tasks.workunit.client.0.vm06.stdout:4/64: sync
2026-03-09T17:29:19.608 INFO:tasks.workunit.client.0.vm06.stdout:4/65: readlink l8 0
2026-03-09T17:29:19.623 INFO:tasks.workunit.client.0.vm06.stdout:4/66: rename f2 to db/df/f14 0
2026-03-09T17:29:19.623 INFO:tasks.workunit.client.0.vm06.stdout:4/67: truncate db/fc 604712 0
2026-03-09T17:29:19.624 INFO:tasks.workunit.client.0.vm06.stdout:4/68: dread - db/fe zero size
2026-03-09T17:29:19.626 INFO:tasks.workunit.client.0.vm06.stdout:4/69: creat db/f15 x:0 0 0
2026-03-09T17:29:19.630 INFO:tasks.workunit.client.0.vm06.stdout:4/70: mknod db/df/c16 0
2026-03-09T17:29:19.632 INFO:tasks.workunit.client.0.vm06.stdout:4/71: link f7 db/f17 0
2026-03-09T17:29:19.637 INFO:tasks.workunit.client.0.vm06.stdout:4/72: dwrite db/f17 [0,4194304] 0
2026-03-09T17:29:19.639 INFO:tasks.workunit.client.0.vm06.stdout:3/47: fsync f7 0
2026-03-09T17:29:19.642 INFO:tasks.workunit.client.0.vm06.stdout:4/73: dwrite fa [0,4194304] 0
2026-03-09T17:29:19.643 INFO:tasks.workunit.client.0.vm06.stdout:4/74: readlink l9 0
2026-03-09T17:29:19.649 INFO:tasks.workunit.client.0.vm06.stdout:4/75: write db/f15 [73594,11627] 0
2026-03-09T17:29:19.659 INFO:tasks.workunit.client.0.vm06.stdout:4/76: creat db/df/f18 x:0 0 0
2026-03-09T17:29:19.677 INFO:tasks.workunit.client.0.vm06.stdout:3/48: dwrite f0 [0,4194304] 0
2026-03-09T17:29:19.677 INFO:tasks.workunit.client.0.vm06.stdout:4/77: rename db/df/l12 to db/l19 0
2026-03-09T17:29:19.677 INFO:tasks.workunit.client.0.vm06.stdout:4/78: unlink l8 0
2026-03-09T17:29:19.677 INFO:tasks.workunit.client.0.vm06.stdout:4/79: readlink l9 0
2026-03-09T17:29:19.679 INFO:tasks.workunit.client.0.vm06.stdout:3/49: dwrite f4 [0,4194304] 0
2026-03-09T17:29:19.688 INFO:tasks.workunit.client.0.vm06.stdout:3/50: symlink lb 0
2026-03-09T17:29:19.694 INFO:tasks.workunit.client.0.vm06.stdout:3/51: link f0 fc 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/52: write f0 [2185169,92843] 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/53: write f7 [1190317,107428] 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/54: dread f0 [0,4194304] 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/55: write f7 [3355408,66929] 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/56: dread f4 [0,4194304] 0
2026-03-09T17:29:19.707 INFO:tasks.workunit.client.0.vm06.stdout:3/57: mkdir dd 0
2026-03-09T17:29:19.752 INFO:tasks.workunit.client.0.vm06.stdout:1/93: fdatasync d11/d14/f17 0
2026-03-09T17:29:19.753 INFO:tasks.workunit.client.0.vm06.stdout:1/94: write d11/f18 [1520283,26794] 0
2026-03-09T17:29:19.758 INFO:tasks.workunit.client.0.vm06.stdout:1/95: dread f10 [0,4194304] 0
2026-03-09T17:29:19.762 INFO:tasks.workunit.client.0.vm06.stdout:1/96: dwrite f7 [0,4194304] 0
2026-03-09T17:29:19.802 INFO:tasks.workunit.client.0.vm06.stdout:9/74: rmdir d3 39
2026-03-09T17:29:19.808 INFO:tasks.workunit.client.0.vm06.stdout:7/79: getdents d5 0
2026-03-09T17:29:19.808 INFO:tasks.workunit.client.0.vm06.stdout:7/80: chown d5/c6 53333625 1
2026-03-09T17:29:19.809 INFO:tasks.workunit.client.0.vm06.stdout:9/75: creat d3/d11/f12 x:0 0 0
2026-03-09T17:29:19.812 INFO:tasks.workunit.client.0.vm06.stdout:7/81: dread f0 [0,4194304] 0
2026-03-09T17:29:19.818 INFO:tasks.workunit.client.0.vm06.stdout:1/97: sync
2026-03-09T17:29:19.820 INFO:tasks.workunit.client.0.vm06.stdout:6/54: truncate d6/fb 364626 0
2026-03-09T17:29:19.824 INFO:tasks.workunit.client.0.vm06.stdout:3/58: fdatasync f7 0
2026-03-09T17:29:19.827 INFO:tasks.workunit.client.0.vm06.stdout:7/82: mkdir d5/d12 0
2026-03-09T17:29:19.828 INFO:tasks.workunit.client.0.vm06.stdout:1/98: mkdir d11/d14/d1c/d1f 0
2026-03-09T17:29:19.829 INFO:tasks.workunit.client.0.vm06.stdout:1/99: readlink d11/l1a 0
2026-03-09T17:29:19.830 INFO:tasks.workunit.client.0.vm06.stdout:1/100: dread f10 [0,4194304] 0
2026-03-09T17:29:19.831 INFO:tasks.workunit.client.0.vm06.stdout:1/101: rename d11 to d11/d14/d1d/d1e/d20 22
2026-03-09T17:29:19.831 INFO:tasks.workunit.client.0.vm06.stdout:1/102: chown f1 267161862 1
2026-03-09T17:29:19.832 INFO:tasks.workunit.client.0.vm06.stdout:1/103: dread f10 [0,4194304] 0
2026-03-09T17:29:19.832 INFO:tasks.workunit.client.0.vm06.stdout:1/104: write d11/f18 [2182299,73308] 0
2026-03-09T17:29:19.833 INFO:tasks.workunit.client.0.vm06.stdout:1/105: chown f7 11398 1
2026-03-09T17:29:19.848 INFO:tasks.workunit.client.0.vm06.stdout:0/70: getdents d7 0
2026-03-09T17:29:19.848 INFO:tasks.workunit.client.0.vm06.stdout:3/59: link lb dd/le 0
2026-03-09T17:29:19.853 INFO:tasks.workunit.client.0.vm06.stdout:0/71: rename d7/l9 to d7/d17/l18 0
2026-03-09T17:29:19.854 INFO:tasks.workunit.client.0.vm06.stdout:0/72: truncate d7/f10 330798 0
2026-03-09T17:29:19.854 INFO:tasks.workunit.client.0.vm06.stdout:0/73: fsync d7/d11/f13
0 2026-03-09T17:29:19.856 INFO:tasks.workunit.client.0.vm06.stdout:0/74: dread d7/f14 [0,4194304] 0 2026-03-09T17:29:19.858 INFO:tasks.workunit.client.0.vm06.stdout:3/60: symlink dd/lf 0 2026-03-09T17:29:19.865 INFO:tasks.workunit.client.0.vm06.stdout:2/56: dwrite d3/d4/f6 [0,4194304] 0 2026-03-09T17:29:19.872 INFO:tasks.workunit.client.0.vm06.stdout:2/57: dwrite d3/d4/f6 [0,4194304] 0 2026-03-09T17:29:19.882 INFO:tasks.workunit.client.0.vm06.stdout:0/75: mkdir d7/d11/d19 0 2026-03-09T17:29:19.886 INFO:tasks.workunit.client.0.vm06.stdout:3/61: fsync f4 0 2026-03-09T17:29:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:19 vm06.local ceph-mon[57307]: pgmap v143: 65 pgs: 65 active+clean; 201 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 7.2 MiB/s wr, 349 op/s 2026-03-09T17:29:19.905 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:19 vm09.local ceph-mon[62061]: pgmap v143: 65 pgs: 65 active+clean; 201 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 7.2 MiB/s wr, 349 op/s 2026-03-09T17:29:19.905 INFO:tasks.workunit.client.0.vm06.stdout:8/60: truncate f7 3966652 0 2026-03-09T17:29:19.905 INFO:tasks.workunit.client.0.vm06.stdout:8/61: write f13 [835919,128501] 0 2026-03-09T17:29:19.905 INFO:tasks.workunit.client.0.vm06.stdout:8/62: mkdir d15 0 2026-03-09T17:29:19.905 INFO:tasks.workunit.client.0.vm06.stdout:8/63: unlink f9 0 2026-03-09T17:29:19.913 INFO:tasks.workunit.client.0.vm06.stdout:2/58: sync 2026-03-09T17:29:19.913 INFO:tasks.workunit.client.0.vm06.stdout:1/106: dread f8 [0,4194304] 0 2026-03-09T17:29:19.921 INFO:tasks.workunit.client.0.vm06.stdout:8/64: mkdir d15/d16 0 2026-03-09T17:29:19.922 INFO:tasks.workunit.client.0.vm06.stdout:8/65: chown d15/d16 96958977 1 2026-03-09T17:29:19.930 INFO:tasks.workunit.client.0.vm06.stdout:8/66: creat d15/d16/f17 x:0 0 0 2026-03-09T17:29:19.939 INFO:tasks.workunit.client.0.vm06.stdout:4/80: chown db/l19 49 1 2026-03-09T17:29:19.940 
INFO:tasks.workunit.client.0.vm06.stdout:4/81: mknod db/df/c1a 0 2026-03-09T17:29:19.942 INFO:tasks.workunit.client.0.vm06.stdout:4/82: mknod db/c1b 0 2026-03-09T17:29:19.943 INFO:tasks.workunit.client.0.vm06.stdout:4/83: write fa [4324818,19761] 0 2026-03-09T17:29:19.951 INFO:tasks.workunit.client.0.vm06.stdout:4/84: mknod db/c1c 0 2026-03-09T17:29:19.955 INFO:tasks.workunit.client.0.vm06.stdout:4/85: dread fa [0,4194304] 0 2026-03-09T17:29:19.960 INFO:tasks.workunit.client.0.vm06.stdout:4/86: mkdir db/d1d 0 2026-03-09T17:29:19.964 INFO:tasks.workunit.client.0.vm06.stdout:4/87: link db/df/c16 db/d1d/c1e 0 2026-03-09T17:29:19.969 INFO:tasks.workunit.client.0.vm06.stdout:4/88: dread fa [0,4194304] 0 2026-03-09T17:29:19.970 INFO:tasks.workunit.client.0.vm06.stdout:4/89: dread f7 [8388608,4194304] 0 2026-03-09T17:29:19.971 INFO:tasks.workunit.client.0.vm06.stdout:4/90: fsync db/fe 0 2026-03-09T17:29:19.977 INFO:tasks.workunit.client.0.vm06.stdout:9/76: readlink d3/l9 0 2026-03-09T17:29:19.977 INFO:tasks.workunit.client.0.vm06.stdout:5/58: write d4/f5 [2468963,61693] 0 2026-03-09T17:29:19.978 INFO:tasks.workunit.client.0.vm06.stdout:5/59: write d4/f7 [4493275,99673] 0 2026-03-09T17:29:19.981 INFO:tasks.workunit.client.0.vm06.stdout:4/91: dwrite fa [0,4194304] 0 2026-03-09T17:29:19.990 INFO:tasks.workunit.client.0.vm06.stdout:5/60: dread d4/fb [0,4194304] 0 2026-03-09T17:29:19.991 INFO:tasks.workunit.client.0.vm06.stdout:5/61: dread f0 [0,4194304] 0 2026-03-09T17:29:19.991 INFO:tasks.workunit.client.0.vm06.stdout:5/62: chown d4/d9/fe 920 1 2026-03-09T17:29:19.996 INFO:tasks.workunit.client.0.vm06.stdout:6/55: dread d6/fb [0,4194304] 0 2026-03-09T17:29:19.996 INFO:tasks.workunit.client.0.vm06.stdout:9/77: symlink d3/d11/l13 0 2026-03-09T17:29:19.998 INFO:tasks.workunit.client.0.vm06.stdout:6/56: dread f2 [4194304,4194304] 0 2026-03-09T17:29:19.999 INFO:tasks.workunit.client.0.vm06.stdout:6/57: chown l1 9277 1 2026-03-09T17:29:19.999 
INFO:tasks.workunit.client.0.vm06.stdout:6/58: dread d6/fb [0,4194304] 0 2026-03-09T17:29:20.001 INFO:tasks.workunit.client.0.vm06.stdout:7/83: getdents d5 0 2026-03-09T17:29:20.001 INFO:tasks.workunit.client.0.vm06.stdout:7/84: write d5/f10 [1041970,113005] 0 2026-03-09T17:29:20.002 INFO:tasks.workunit.client.0.vm06.stdout:7/85: stat d5/d7/lb 0 2026-03-09T17:29:20.007 INFO:tasks.workunit.client.0.vm06.stdout:4/92: dwrite f7 [4194304,4194304] 0 2026-03-09T17:29:20.007 INFO:tasks.workunit.client.0.vm06.stdout:4/93: stat l9 0 2026-03-09T17:29:20.010 INFO:tasks.workunit.client.0.vm06.stdout:9/78: creat d3/d11/f14 x:0 0 0 2026-03-09T17:29:20.010 INFO:tasks.workunit.client.0.vm06.stdout:3/62: getdents dd 0 2026-03-09T17:29:20.012 INFO:tasks.workunit.client.0.vm06.stdout:7/86: truncate f0 718163 0 2026-03-09T17:29:20.020 INFO:tasks.workunit.client.0.vm06.stdout:6/59: sync 2026-03-09T17:29:20.023 INFO:tasks.workunit.client.0.vm06.stdout:0/76: rename d7/d17/l18 to d7/d11/d19/l1a 0 2026-03-09T17:29:20.026 INFO:tasks.workunit.client.0.vm06.stdout:0/77: dwrite d7/f12 [0,4194304] 0 2026-03-09T17:29:20.037 INFO:tasks.workunit.client.0.vm06.stdout:4/94: creat db/d1d/f1f x:0 0 0 2026-03-09T17:29:20.042 INFO:tasks.workunit.client.0.vm06.stdout:8/67: dwrite f0 [0,4194304] 0 2026-03-09T17:29:20.044 INFO:tasks.workunit.client.0.vm06.stdout:6/60: creat d6/fe x:0 0 0 2026-03-09T17:29:20.048 INFO:tasks.workunit.client.0.vm06.stdout:9/79: mkdir d3/d15 0 2026-03-09T17:29:20.053 INFO:tasks.workunit.client.0.vm06.stdout:0/78: sync 2026-03-09T17:29:20.063 INFO:tasks.workunit.client.0.vm06.stdout:7/87: symlink d5/d12/l13 0 2026-03-09T17:29:20.063 INFO:tasks.workunit.client.0.vm06.stdout:7/88: chown d5/d7/lb 1 1 2026-03-09T17:29:20.065 INFO:tasks.workunit.client.0.vm06.stdout:5/63: getdents d4 0 2026-03-09T17:29:20.066 INFO:tasks.workunit.client.0.vm06.stdout:1/107: dwrite d11/f18 [0,4194304] 0 2026-03-09T17:29:20.070 INFO:tasks.workunit.client.0.vm06.stdout:7/89: dwrite d5/f8 [0,4194304] 0 
2026-03-09T17:29:20.075 INFO:tasks.workunit.client.0.vm06.stdout:4/95: creat db/d1d/f20 x:0 0 0 2026-03-09T17:29:20.077 INFO:tasks.workunit.client.0.vm06.stdout:8/68: mknod d15/c18 0 2026-03-09T17:29:20.080 INFO:tasks.workunit.client.0.vm06.stdout:6/61: write f2 [4103029,23613] 0 2026-03-09T17:29:20.081 INFO:tasks.workunit.client.0.vm06.stdout:6/62: dread d6/fb [0,4194304] 0 2026-03-09T17:29:20.088 INFO:tasks.workunit.client.0.vm06.stdout:2/59: rename d3/c8 to d3/d4/c13 0 2026-03-09T17:29:20.089 INFO:tasks.workunit.client.0.vm06.stdout:3/63: truncate f7 2924791 0 2026-03-09T17:29:20.090 INFO:tasks.workunit.client.0.vm06.stdout:5/64: creat d4/f15 x:0 0 0 2026-03-09T17:29:20.093 INFO:tasks.workunit.client.0.vm06.stdout:7/90: unlink d5/d7/f11 0 2026-03-09T17:29:20.094 INFO:tasks.workunit.client.0.vm06.stdout:7/91: readlink d5/d7/lb 0 2026-03-09T17:29:20.094 INFO:tasks.workunit.client.0.vm06.stdout:4/96: mkdir db/d1d/d21 0 2026-03-09T17:29:20.096 INFO:tasks.workunit.client.0.vm06.stdout:8/69: unlink c14 0 2026-03-09T17:29:20.096 INFO:tasks.workunit.client.0.vm06.stdout:8/70: write f0 [169306,69849] 0 2026-03-09T17:29:20.097 INFO:tasks.workunit.client.0.vm06.stdout:6/63: creat d6/ff x:0 0 0 2026-03-09T17:29:20.098 INFO:tasks.workunit.client.0.vm06.stdout:0/79: rename d7/cc to d7/d11/c1b 0 2026-03-09T17:29:20.099 INFO:tasks.workunit.client.0.vm06.stdout:9/80: mkdir d3/d15/d16 0 2026-03-09T17:29:20.100 INFO:tasks.workunit.client.0.vm06.stdout:3/64: rmdir dd 39 2026-03-09T17:29:20.102 INFO:tasks.workunit.client.0.vm06.stdout:5/65: write d4/f7 [998794,18726] 0 2026-03-09T17:29:20.104 INFO:tasks.workunit.client.0.vm06.stdout:7/92: creat d5/d7/f14 x:0 0 0 2026-03-09T17:29:20.105 INFO:tasks.workunit.client.0.vm06.stdout:4/97: creat db/d1d/f22 x:0 0 0 2026-03-09T17:29:20.106 INFO:tasks.workunit.client.0.vm06.stdout:8/71: mkdir d15/d16/d19 0 2026-03-09T17:29:20.108 INFO:tasks.workunit.client.0.vm06.stdout:0/80: creat d7/d11/f1c x:0 0 0 2026-03-09T17:29:20.110 
INFO:tasks.workunit.client.0.vm06.stdout:2/60: symlink d3/d4/d12/l14 0 2026-03-09T17:29:20.112 INFO:tasks.workunit.client.0.vm06.stdout:3/65: chown dd/lf 0 1 2026-03-09T17:29:20.112 INFO:tasks.workunit.client.0.vm06.stdout:0/81: dwrite d7/f8 [4194304,4194304] 0 2026-03-09T17:29:20.115 INFO:tasks.workunit.client.0.vm06.stdout:1/108: creat d11/d14/d1c/d1f/f21 x:0 0 0 2026-03-09T17:29:20.116 INFO:tasks.workunit.client.0.vm06.stdout:5/66: write d4/f11 [997230,36352] 0 2026-03-09T17:29:20.121 INFO:tasks.workunit.client.0.vm06.stdout:0/82: dwrite d7/d11/f13 [0,4194304] 0 2026-03-09T17:29:20.122 INFO:tasks.workunit.client.0.vm06.stdout:9/81: sync 2026-03-09T17:29:20.122 INFO:tasks.workunit.client.0.vm06.stdout:3/66: sync 2026-03-09T17:29:20.123 INFO:tasks.workunit.client.0.vm06.stdout:9/82: truncate d3/d11/f12 233693 0 2026-03-09T17:29:20.133 INFO:tasks.workunit.client.0.vm06.stdout:4/98: unlink db/d1d/f20 0 2026-03-09T17:29:20.134 INFO:tasks.workunit.client.0.vm06.stdout:8/72: mkdir d15/d16/d1a 0 2026-03-09T17:29:20.142 INFO:tasks.workunit.client.0.vm06.stdout:8/73: fdatasync f12 0 2026-03-09T17:29:20.142 INFO:tasks.workunit.client.0.vm06.stdout:8/74: truncate fd 353761 0 2026-03-09T17:29:20.142 INFO:tasks.workunit.client.0.vm06.stdout:8/75: fsync f12 0 2026-03-09T17:29:20.142 INFO:tasks.workunit.client.0.vm06.stdout:8/76: dwrite f12 [0,4194304] 0 2026-03-09T17:29:20.142 INFO:tasks.workunit.client.0.vm06.stdout:8/77: chown fe 63 1 2026-03-09T17:29:20.148 INFO:tasks.workunit.client.0.vm06.stdout:2/61: creat d3/d4/d12/f15 x:0 0 0 2026-03-09T17:29:20.161 INFO:tasks.workunit.client.0.vm06.stdout:1/109: rmdir d11/d14/d1d 39 2026-03-09T17:29:20.162 INFO:tasks.workunit.client.0.vm06.stdout:0/83: mkdir d7/d11/d19/d1d 0 2026-03-09T17:29:20.162 INFO:tasks.workunit.client.0.vm06.stdout:0/84: chown d7/fe 120 1 2026-03-09T17:29:20.164 INFO:tasks.workunit.client.0.vm06.stdout:3/67: creat dd/f10 x:0 0 0 2026-03-09T17:29:20.169 INFO:tasks.workunit.client.0.vm06.stdout:3/68: dwrite f4 
[4194304,4194304] 0 2026-03-09T17:29:20.182 INFO:tasks.workunit.client.0.vm06.stdout:9/83: creat d3/d15/f17 x:0 0 0 2026-03-09T17:29:20.183 INFO:tasks.workunit.client.0.vm06.stdout:4/99: creat db/f23 x:0 0 0 2026-03-09T17:29:20.192 INFO:tasks.workunit.client.0.vm06.stdout:4/100: dwrite db/f23 [0,4194304] 0 2026-03-09T17:29:20.218 INFO:tasks.workunit.client.0.vm06.stdout:2/62: mknod d3/d4/c16 0 2026-03-09T17:29:20.227 INFO:tasks.workunit.client.0.vm06.stdout:2/63: dwrite d3/f10 [0,4194304] 0 2026-03-09T17:29:20.228 INFO:tasks.workunit.client.0.vm06.stdout:2/64: readlink l1 0 2026-03-09T17:29:20.228 INFO:tasks.workunit.client.0.vm06.stdout:2/65: write d3/d4/fe [186735,53964] 0 2026-03-09T17:29:20.228 INFO:tasks.workunit.client.0.vm06.stdout:2/66: chown d3 5 1 2026-03-09T17:29:20.231 INFO:tasks.workunit.client.0.vm06.stdout:2/67: dread d3/d4/f6 [0,4194304] 0 2026-03-09T17:29:20.231 INFO:tasks.workunit.client.0.vm06.stdout:2/68: chown d3/l5 8 1 2026-03-09T17:29:20.358 INFO:tasks.workunit.client.0.vm06.stdout:7/93: link d5/d7/lb d5/l15 0 2026-03-09T17:29:20.362 INFO:tasks.workunit.client.0.vm06.stdout:0/85: dread d7/f14 [0,4194304] 0 2026-03-09T17:29:20.362 INFO:tasks.workunit.client.0.vm06.stdout:0/86: truncate d7/fb 616113 0 2026-03-09T17:29:20.363 INFO:tasks.workunit.client.0.vm06.stdout:9/84: stat d3/l7 0 2026-03-09T17:29:20.363 INFO:tasks.workunit.client.0.vm06.stdout:9/85: readlink d3/l9 0 2026-03-09T17:29:20.367 INFO:tasks.workunit.client.0.vm06.stdout:6/64: link d6/ld d6/l10 0 2026-03-09T17:29:20.367 INFO:tasks.workunit.client.0.vm06.stdout:6/65: read f2 [8367527,11196] 0 2026-03-09T17:29:20.368 INFO:tasks.workunit.client.0.vm06.stdout:8/78: creat d15/d16/d1a/f1b x:0 0 0 2026-03-09T17:29:20.368 INFO:tasks.workunit.client.0.vm06.stdout:8/79: chown f12 184189094 1 2026-03-09T17:29:20.373 INFO:tasks.workunit.client.0.vm06.stdout:8/80: dwrite fe [4194304,4194304] 0 2026-03-09T17:29:20.374 INFO:tasks.workunit.client.0.vm06.stdout:8/81: chown cc 35924187 1 
2026-03-09T17:29:20.376 INFO:tasks.workunit.client.0.vm06.stdout:7/94: creat d5/f16 x:0 0 0 2026-03-09T17:29:20.376 INFO:tasks.workunit.client.0.vm06.stdout:7/95: fdatasync f2 0 2026-03-09T17:29:20.379 INFO:tasks.workunit.client.0.vm06.stdout:9/86: read d3/fb [105795,30327] 0 2026-03-09T17:29:20.380 INFO:tasks.workunit.client.0.vm06.stdout:9/87: dread - d3/d15/f17 zero size 2026-03-09T17:29:20.382 INFO:tasks.workunit.client.0.vm06.stdout:3/69: unlink dd/le 0 2026-03-09T17:29:20.385 INFO:tasks.workunit.client.0.vm06.stdout:4/101: symlink db/d1d/d21/l24 0 2026-03-09T17:29:20.394 INFO:tasks.workunit.client.0.vm06.stdout:8/82: symlink d15/d16/l1c 0 2026-03-09T17:29:20.398 INFO:tasks.workunit.client.0.vm06.stdout:2/69: rename d3/d4/f6 to d3/d4/da/f17 0 2026-03-09T17:29:20.408 INFO:tasks.workunit.client.0.vm06.stdout:3/70: creat dd/f11 x:0 0 0 2026-03-09T17:29:20.408 INFO:tasks.workunit.client.0.vm06.stdout:3/71: write dd/f11 [458323,122427] 0 2026-03-09T17:29:20.412 INFO:tasks.workunit.client.0.vm06.stdout:4/102: mkdir db/d1d/d21/d25 0 2026-03-09T17:29:20.419 INFO:tasks.workunit.client.0.vm06.stdout:8/83: symlink d15/d16/d1a/l1d 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:5/67: getdents d4/d9 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:9/88: symlink d3/l18 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/72: read f7 [360706,13830] 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:9/89: dread d3/d11/f12 [0,4194304] 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/73: write dd/f11 [498266,60971] 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/74: read - dd/f10 zero size 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/75: truncate f0 5711901 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/76: dread - dd/f10 zero size 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/77: rename dd to dd/d12 22 
2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/78: dread f4 [0,4194304] 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:9/90: dwrite d3/d11/f12 [0,4194304] 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:8/84: mkdir d15/d16/d1e 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:8/85: stat l1 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/79: creat dd/f13 x:0 0 0 2026-03-09T17:29:20.444 INFO:tasks.workunit.client.0.vm06.stdout:3/80: dread - dd/f10 zero size 2026-03-09T17:29:20.446 INFO:tasks.workunit.client.0.vm06.stdout:2/70: sync 2026-03-09T17:29:20.452 INFO:tasks.workunit.client.0.vm06.stdout:3/81: unlink dd/f13 0 2026-03-09T17:29:20.455 INFO:tasks.workunit.client.0.vm06.stdout:9/91: mknod d3/d15/d16/c19 0 2026-03-09T17:29:20.459 INFO:tasks.workunit.client.0.vm06.stdout:8/86: creat d15/d16/d1e/f1f x:0 0 0 2026-03-09T17:29:20.461 INFO:tasks.workunit.client.0.vm06.stdout:8/87: dread - d15/d16/f17 zero size 2026-03-09T17:29:20.461 INFO:tasks.workunit.client.0.vm06.stdout:8/88: write d15/d16/d1a/f1b [264381,118420] 0 2026-03-09T17:29:20.462 INFO:tasks.workunit.client.0.vm06.stdout:5/68: getdents d4 0 2026-03-09T17:29:20.463 INFO:tasks.workunit.client.0.vm06.stdout:9/92: creat d3/d15/f1a x:0 0 0 2026-03-09T17:29:20.464 INFO:tasks.workunit.client.0.vm06.stdout:2/71: unlink l0 0 2026-03-09T17:29:20.465 INFO:tasks.workunit.client.0.vm06.stdout:8/89: symlink d15/d16/d1a/l20 0 2026-03-09T17:29:20.467 INFO:tasks.workunit.client.0.vm06.stdout:5/69: mknod d4/c16 0 2026-03-09T17:29:20.467 INFO:tasks.workunit.client.0.vm06.stdout:3/82: sync 2026-03-09T17:29:20.468 INFO:tasks.workunit.client.0.vm06.stdout:3/83: write dd/f11 [1484571,3291] 0 2026-03-09T17:29:20.469 INFO:tasks.workunit.client.0.vm06.stdout:3/84: write dd/f11 [2297292,114753] 0 2026-03-09T17:29:20.472 INFO:tasks.workunit.client.0.vm06.stdout:2/72: mknod d3/d4/da/c18 0 2026-03-09T17:29:20.477 
INFO:tasks.workunit.client.0.vm06.stdout:3/85: sync 2026-03-09T17:29:20.483 INFO:tasks.workunit.client.0.vm06.stdout:2/73: symlink d3/d4/l19 0 2026-03-09T17:29:20.484 INFO:tasks.workunit.client.0.vm06.stdout:3/86: fdatasync fc 0 2026-03-09T17:29:20.487 INFO:tasks.workunit.client.0.vm06.stdout:2/74: sync 2026-03-09T17:29:20.487 INFO:tasks.workunit.client.0.vm06.stdout:2/75: chown d3 2531122 1 2026-03-09T17:29:20.492 INFO:tasks.workunit.client.0.vm06.stdout:2/76: write d3/d4/fe [1068235,6934] 0 2026-03-09T17:29:20.494 INFO:tasks.workunit.client.0.vm06.stdout:2/77: chown d3/d4/f11 342 1 2026-03-09T17:29:20.502 INFO:tasks.workunit.client.0.vm06.stdout:2/78: creat d3/d4/da/f1a x:0 0 0 2026-03-09T17:29:20.512 INFO:tasks.workunit.client.0.vm06.stdout:2/79: link d3/d4/fe d3/d4/da/f1b 0 2026-03-09T17:29:20.519 INFO:tasks.workunit.client.0.vm06.stdout:2/80: dwrite d3/d4/da/f17 [0,4194304] 0 2026-03-09T17:29:20.533 INFO:tasks.workunit.client.0.vm06.stdout:2/81: write f2 [7045566,46282] 0 2026-03-09T17:29:20.538 INFO:tasks.workunit.client.0.vm06.stdout:2/82: mknod d3/c1c 0 2026-03-09T17:29:20.540 INFO:tasks.workunit.client.0.vm06.stdout:2/83: dread d3/d4/da/f17 [0,4194304] 0 2026-03-09T17:29:20.546 INFO:tasks.workunit.client.0.vm06.stdout:2/84: chown d3/d4/cf 1654411 1 2026-03-09T17:29:20.546 INFO:tasks.workunit.client.0.vm06.stdout:2/85: truncate d3/d4/da/f1a 13381 0 2026-03-09T17:29:20.546 INFO:tasks.workunit.client.0.vm06.stdout:2/86: mknod d3/d4/c1d 0 2026-03-09T17:29:20.551 INFO:tasks.workunit.client.0.vm06.stdout:2/87: creat d3/d4/d12/f1e x:0 0 0 2026-03-09T17:29:20.552 INFO:tasks.workunit.client.0.vm06.stdout:2/88: read d3/d4/da/f1b [380077,105157] 0 2026-03-09T17:29:20.554 INFO:tasks.workunit.client.0.vm06.stdout:2/89: creat d3/d4/f1f x:0 0 0 2026-03-09T17:29:20.558 INFO:tasks.workunit.client.0.vm06.stdout:2/90: write d3/d4/fe [1660178,89457] 0 2026-03-09T17:29:20.636 INFO:tasks.workunit.client.0.vm06.stdout:0/87: write d7/f14 [1118447,73195] 0 2026-03-09T17:29:20.638 
INFO:tasks.workunit.client.0.vm06.stdout:0/88: truncate d7/d11/f1c 641242 0 2026-03-09T17:29:20.639 INFO:tasks.workunit.client.0.vm06.stdout:0/89: rename d7/d11/d19/d1d to d7/d11/d19/d1d/d1e 22 2026-03-09T17:29:20.640 INFO:tasks.workunit.client.0.vm06.stdout:1/110: dwrite d11/f13 [4194304,4194304] 0 2026-03-09T17:29:20.650 INFO:tasks.workunit.client.0.vm06.stdout:6/66: dwrite d6/fb [0,4194304] 0 2026-03-09T17:29:20.666 INFO:tasks.workunit.client.0.vm06.stdout:4/103: rmdir db/d1d 39 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/104: dread - db/d1d/f22 zero size 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:7/96: truncate d5/f10 885982 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:0/90: rmdir d7/d17 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:0/91: truncate d7/fe 637567 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:0/92: stat d7/d11/c15 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/105: dwrite db/fe [0,4194304] 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:6/67: link d6/c8 d6/c11 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/106: write db/fe [3096176,3520] 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:3/87: rmdir dd 39 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:3/88: fdatasync fc 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/107: mkdir db/d1d/d21/d26 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/108: truncate db/f17 12719427 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/109: dwrite db/d1d/f1f [0,4194304] 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/110: write db/fc [1401006,41493] 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:4/111: fsync db/df/f14 0 2026-03-09T17:29:20.700 INFO:tasks.workunit.client.0.vm06.stdout:9/93: write d3/fb 
[1240523,55712] 0 2026-03-09T17:29:20.702 INFO:tasks.workunit.client.0.vm06.stdout:8/90: truncate f12 533046 0 2026-03-09T17:29:20.705 INFO:tasks.workunit.client.0.vm06.stdout:5/70: fsync d4/f11 0 2026-03-09T17:29:20.716 INFO:tasks.workunit.client.0.vm06.stdout:4/112: mknod db/d1d/d21/d25/c27 0 2026-03-09T17:29:20.716 INFO:tasks.workunit.client.0.vm06.stdout:4/113: fsync db/f13 0 2026-03-09T17:29:20.717 INFO:tasks.workunit.client.0.vm06.stdout:2/91: fsync d3/d4/d12/f1e 0 2026-03-09T17:29:20.719 INFO:tasks.workunit.client.0.vm06.stdout:9/94: creat d3/f1b x:0 0 0 2026-03-09T17:29:20.720 INFO:tasks.workunit.client.0.vm06.stdout:9/95: chown d3/f1b 1210543 1 2026-03-09T17:29:20.721 INFO:tasks.workunit.client.0.vm06.stdout:4/114: symlink db/d1d/d21/d25/l28 0 2026-03-09T17:29:20.723 INFO:tasks.workunit.client.0.vm06.stdout:9/96: creat d3/d11/f1c x:0 0 0 2026-03-09T17:29:20.748 INFO:tasks.workunit.client.0.vm06.stdout:9/97: dread - d3/d11/f1c zero size 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/98: dwrite d3/f1b [0,4194304] 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/99: dread - d3/d11/f14 zero size 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/100: dwrite d3/d11/f12 [4194304,4194304] 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:2/92: getdents d3 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/101: write d3/d15/f1a [903792,98116] 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/102: stat d3/l9 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:2/93: dread f2 [0,4194304] 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/103: rename d3/l18 to d3/d11/l1d 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/104: truncate d3/d11/f14 3610 0 2026-03-09T17:29:20.749 INFO:tasks.workunit.client.0.vm06.stdout:9/105: read d3/d11/f12 [5415541,11966] 0 2026-03-09T17:29:20.749 
INFO:tasks.workunit.client.0.vm06.stdout:2/94: creat d3/d4/d12/f20 x:0 0 0 2026-03-09T17:29:20.751 INFO:tasks.workunit.client.0.vm06.stdout:2/95: dwrite d3/d4/f11 [0,4194304] 0 2026-03-09T17:29:20.755 INFO:tasks.workunit.client.0.vm06.stdout:2/96: dread d3/f10 [0,4194304] 0 2026-03-09T17:29:20.761 INFO:tasks.workunit.client.0.vm06.stdout:2/97: creat d3/f21 x:0 0 0 2026-03-09T17:29:20.761 INFO:tasks.workunit.client.0.vm06.stdout:2/98: mkdir d3/d4/d22 0 2026-03-09T17:29:20.763 INFO:tasks.workunit.client.0.vm06.stdout:2/99: dread d3/d4/da/f17 [0,4194304] 0 2026-03-09T17:29:20.765 INFO:tasks.workunit.client.0.vm06.stdout:2/100: write d3/d4/da/f1b [2347038,101088] 0 2026-03-09T17:29:20.773 INFO:tasks.workunit.client.0.vm06.stdout:1/111: sync 2026-03-09T17:29:20.773 INFO:tasks.workunit.client.0.vm06.stdout:1/112: dread - d11/d14/d1c/d1f/f21 zero size 2026-03-09T17:29:20.777 INFO:tasks.workunit.client.0.vm06.stdout:2/101: dwrite d3/d4/da/f1a [0,4194304] 0 2026-03-09T17:29:20.777 INFO:tasks.workunit.client.0.vm06.stdout:1/113: link cc d11/d14/d1d/c22 0 2026-03-09T17:29:20.782 INFO:tasks.workunit.client.0.vm06.stdout:1/114: mknod d11/d14/d1c/d1f/c23 0 2026-03-09T17:29:20.782 INFO:tasks.workunit.client.0.vm06.stdout:1/115: chown f8 4427 1 2026-03-09T17:29:20.783 INFO:tasks.workunit.client.0.vm06.stdout:1/116: chown d11/d14 39570 1 2026-03-09T17:29:20.783 INFO:tasks.workunit.client.0.vm06.stdout:2/102: mknod d3/d4/d22/c23 0 2026-03-09T17:29:20.784 INFO:tasks.workunit.client.0.vm06.stdout:2/103: creat d3/f24 x:0 0 0 2026-03-09T17:29:20.785 INFO:tasks.workunit.client.0.vm06.stdout:2/104: chown d3/c1c 3133 1 2026-03-09T17:29:20.786 INFO:tasks.workunit.client.0.vm06.stdout:2/105: unlink d3/d4/d22/c23 0 2026-03-09T17:29:20.795 INFO:tasks.workunit.client.0.vm06.stdout:4/115: sync 2026-03-09T17:29:20.796 INFO:tasks.workunit.client.0.vm06.stdout:8/91: sync 2026-03-09T17:29:20.796 INFO:tasks.workunit.client.0.vm06.stdout:9/106: sync 2026-03-09T17:29:20.796 
INFO:tasks.workunit.client.0.vm06.stdout:1/117: sync 2026-03-09T17:29:20.800 INFO:tasks.workunit.client.0.vm06.stdout:4/116: dread db/df/f14 [8388608,4194304] 0 2026-03-09T17:29:20.803 INFO:tasks.workunit.client.0.vm06.stdout:4/117: dread fa [0,4194304] 0 2026-03-09T17:29:20.804 INFO:tasks.workunit.client.0.vm06.stdout:4/118: chown db/f15 63253981 1 2026-03-09T17:29:20.806 INFO:tasks.workunit.client.0.vm06.stdout:4/119: dread f6 [0,4194304] 0 2026-03-09T17:29:20.808 INFO:tasks.workunit.client.0.vm06.stdout:8/92: sync 2026-03-09T17:29:20.808 INFO:tasks.workunit.client.0.vm06.stdout:8/93: stat fe 0 2026-03-09T17:29:20.812 INFO:tasks.workunit.client.0.vm06.stdout:1/118: rename c0 to d11/d14/d1d/d1e/c24 0 2026-03-09T17:29:20.812 INFO:tasks.workunit.client.0.vm06.stdout:1/119: stat d11/d14/d1c/d1f 0 2026-03-09T17:29:20.818 INFO:tasks.workunit.client.0.vm06.stdout:6/68: fsync d6/fb 0 2026-03-09T17:29:20.818 INFO:tasks.workunit.client.0.vm06.stdout:8/94: creat d15/d16/f21 x:0 0 0 2026-03-09T17:29:20.818 INFO:tasks.workunit.client.0.vm06.stdout:8/95: chown d15/d16/f17 6717239 1 2026-03-09T17:29:20.824 INFO:tasks.workunit.client.0.vm06.stdout:1/120: mkdir d11/d14/d1c/d1f/d25 0 2026-03-09T17:29:20.830 INFO:tasks.workunit.client.0.vm06.stdout:6/69: mkdir d6/d12 0 2026-03-09T17:29:20.839 INFO:tasks.workunit.client.0.vm06.stdout:6/70: dwrite d6/ff [0,4194304] 0 2026-03-09T17:29:20.848 INFO:tasks.workunit.client.0.vm06.stdout:9/107: dread d3/d11/f14 [0,4194304] 0 2026-03-09T17:29:20.857 INFO:tasks.workunit.client.0.vm06.stdout:7/97: rmdir d5 39 2026-03-09T17:29:20.859 INFO:tasks.workunit.client.0.vm06.stdout:4/120: link db/c1b db/d1d/d21/c29 0 2026-03-09T17:29:20.859 INFO:tasks.workunit.client.0.vm06.stdout:4/121: stat l9 0 2026-03-09T17:29:20.870 INFO:tasks.workunit.client.0.vm06.stdout:0/93: truncate d7/f12 971755 0 2026-03-09T17:29:20.880 INFO:tasks.workunit.client.0.vm06.stdout:3/89: dwrite f4 [0,4194304] 0 2026-03-09T17:29:20.888 
INFO:tasks.workunit.client.0.vm06.stdout:4/122: unlink db/cd 0 2026-03-09T17:29:20.888 INFO:tasks.workunit.client.0.vm06.stdout:4/123: dread - db/df/f18 zero size 2026-03-09T17:29:20.890 INFO:tasks.workunit.client.0.vm06.stdout:8/96: dread f12 [0,4194304] 0 2026-03-09T17:29:20.890 INFO:tasks.workunit.client.0.vm06.stdout:8/97: fdatasync f13 0 2026-03-09T17:29:20.891 INFO:tasks.workunit.client.0.vm06.stdout:8/98: write fd [925677,1218] 0 2026-03-09T17:29:20.893 INFO:tasks.workunit.client.0.vm06.stdout:6/71: rename d6/c8 to d6/d12/c13 0 2026-03-09T17:29:20.895 INFO:tasks.workunit.client.0.vm06.stdout:6/72: dread d6/ff [0,4194304] 0 2026-03-09T17:29:20.908 INFO:tasks.workunit.client.0.vm06.stdout:5/71: write d4/fb [2113770,19916] 0 2026-03-09T17:29:20.917 INFO:tasks.workunit.client.0.vm06.stdout:8/99: dwrite f12 [0,4194304] 0 2026-03-09T17:29:20.933 INFO:tasks.workunit.client.0.vm06.stdout:6/73: rename f2 to d6/d12/f14 0 2026-03-09T17:29:20.938 INFO:tasks.workunit.client.0.vm06.stdout:5/72: read d4/d9/fe [616861,669] 0 2026-03-09T17:29:20.944 INFO:tasks.workunit.client.0.vm06.stdout:2/106: truncate f2 3742204 0 2026-03-09T17:29:20.944 INFO:tasks.workunit.client.0.vm06.stdout:5/73: dwrite d4/d9/f10 [0,4194304] 0 2026-03-09T17:29:20.948 INFO:tasks.workunit.client.0.vm06.stdout:8/100: creat d15/d16/d1a/f22 x:0 0 0 2026-03-09T17:29:20.948 INFO:tasks.workunit.client.0.vm06.stdout:8/101: write f13 [1381987,97226] 0 2026-03-09T17:29:20.949 INFO:tasks.workunit.client.0.vm06.stdout:8/102: chown cc 0 1 2026-03-09T17:29:20.958 INFO:tasks.workunit.client.0.vm06.stdout:9/108: getdents d3 0 2026-03-09T17:29:20.958 INFO:tasks.workunit.client.0.vm06.stdout:9/109: write d3/f1b [3775440,9953] 0 2026-03-09T17:29:20.958 INFO:tasks.workunit.client.0.vm06.stdout:9/110: readlink d3/l7 0 2026-03-09T17:29:20.969 INFO:tasks.workunit.client.0.vm06.stdout:2/107: mknod d3/d4/da/c25 0 2026-03-09T17:29:20.969 INFO:tasks.workunit.client.0.vm06.stdout:2/108: write d3/d4/fe [1370923,116274] 0 
2026-03-09T17:29:20.970 INFO:tasks.workunit.client.0.vm06.stdout:2/109: fdatasync d3/d4/d12/f15 0 2026-03-09T17:29:20.978 INFO:tasks.workunit.client.0.vm06.stdout:9/111: fdatasync d3/d11/f12 0 2026-03-09T17:29:20.981 INFO:tasks.workunit.client.0.vm06.stdout:7/98: getdents d5/d7 0 2026-03-09T17:29:20.982 INFO:tasks.workunit.client.0.vm06.stdout:7/99: chown d5/d12/l13 188498168 1 2026-03-09T17:29:20.993 INFO:tasks.workunit.client.0.vm06.stdout:2/110: rename d3/d4/da/c25 to d3/d4/d12/c26 0 2026-03-09T17:29:21.000 INFO:tasks.workunit.client.0.vm06.stdout:9/112: rmdir d3/d15 39 2026-03-09T17:29:21.013 INFO:tasks.workunit.client.0.vm06.stdout:7/100: unlink d5/d7/f14 0 2026-03-09T17:29:21.025 INFO:tasks.workunit.client.0.vm06.stdout:1/121: truncate d11/f18 3731511 0 2026-03-09T17:29:21.029 INFO:tasks.workunit.client.0.vm06.stdout:4/124: rmdir db 39 2026-03-09T17:29:21.031 INFO:tasks.workunit.client.0.vm06.stdout:0/94: write f5 [5086401,18230] 0 2026-03-09T17:29:21.033 INFO:tasks.workunit.client.0.vm06.stdout:3/90: getdents dd 0 2026-03-09T17:29:21.036 INFO:tasks.workunit.client.0.vm06.stdout:5/74: creat d4/f17 x:0 0 0 2026-03-09T17:29:21.038 INFO:tasks.workunit.client.0.vm06.stdout:2/111: rename d3/c1c to d3/d4/c27 0 2026-03-09T17:29:21.043 INFO:tasks.workunit.client.0.vm06.stdout:7/101: mknod d5/dd/c17 0 2026-03-09T17:29:21.044 INFO:tasks.workunit.client.0.vm06.stdout:1/122: symlink d11/d14/d1d/d1e/l26 0 2026-03-09T17:29:21.044 INFO:tasks.workunit.client.0.vm06.stdout:1/123: stat f1 0 2026-03-09T17:29:21.057 INFO:tasks.workunit.client.0.vm06.stdout:3/91: creat dd/f14 x:0 0 0 2026-03-09T17:29:21.059 INFO:tasks.workunit.client.0.vm06.stdout:2/112: rmdir d3/d4/da 39 2026-03-09T17:29:21.060 INFO:tasks.workunit.client.0.vm06.stdout:8/103: truncate fe 5150419 0 2026-03-09T17:29:21.060 INFO:tasks.workunit.client.0.vm06.stdout:8/104: readlink d15/d16/l1c 0 2026-03-09T17:29:21.066 INFO:tasks.workunit.client.0.vm06.stdout:9/113: dread d3/d15/f1a [0,4194304] 0 
2026-03-09T17:29:21.069 INFO:tasks.workunit.client.0.vm06.stdout:1/124: dread - d11/f19 zero size 2026-03-09T17:29:21.078 INFO:tasks.workunit.client.0.vm06.stdout:0/95: creat d7/d11/d19/d1d/f1f x:0 0 0 2026-03-09T17:29:21.078 INFO:tasks.workunit.client.0.vm06.stdout:3/92: creat dd/f15 x:0 0 0 2026-03-09T17:29:21.078 INFO:tasks.workunit.client.0.vm06.stdout:5/75: mkdir d4/d9/d18 0 2026-03-09T17:29:21.079 INFO:tasks.workunit.client.0.vm06.stdout:3/93: stat dd/f14 0 2026-03-09T17:29:21.079 INFO:tasks.workunit.client.0.vm06.stdout:0/96: dread - d7/d11/d19/d1d/f1f zero size 2026-03-09T17:29:21.092 INFO:tasks.workunit.client.0.vm06.stdout:1/125: creat d11/d14/d1d/d1e/f27 x:0 0 0 2026-03-09T17:29:21.092 INFO:tasks.workunit.client.0.vm06.stdout:1/126: write d11/d14/d1d/d1e/f27 [806511,30646] 0 2026-03-09T17:29:21.094 INFO:tasks.workunit.client.0.vm06.stdout:4/125: link db/d1d/f22 db/df/f2a 0 2026-03-09T17:29:21.100 INFO:tasks.workunit.client.0.vm06.stdout:5/76: mknod d4/d9/c19 0 2026-03-09T17:29:21.100 INFO:tasks.workunit.client.0.vm06.stdout:5/77: dread - d4/f17 zero size 2026-03-09T17:29:21.105 INFO:tasks.workunit.client.0.vm06.stdout:0/97: readlink d7/d11/d19/l1a 0 2026-03-09T17:29:21.106 INFO:tasks.workunit.client.0.vm06.stdout:0/98: write d7/f14 [988327,76801] 0 2026-03-09T17:29:21.106 INFO:tasks.workunit.client.0.vm06.stdout:0/99: write d7/f8 [6671782,25088] 0 2026-03-09T17:29:21.111 INFO:tasks.workunit.client.0.vm06.stdout:0/100: dwrite d7/ff [0,4194304] 0 2026-03-09T17:29:21.113 INFO:tasks.workunit.client.0.vm06.stdout:0/101: write d7/fb [752345,7626] 0 2026-03-09T17:29:21.118 INFO:tasks.workunit.client.0.vm06.stdout:0/102: dwrite d7/fe [0,4194304] 0 2026-03-09T17:29:21.126 INFO:tasks.workunit.client.0.vm06.stdout:0/103: dwrite d7/f10 [0,4194304] 0 2026-03-09T17:29:21.136 INFO:tasks.workunit.client.0.vm06.stdout:0/104: dwrite d7/fe [0,4194304] 0 2026-03-09T17:29:21.141 INFO:tasks.workunit.client.0.vm06.stdout:8/105: link d15/d16/d1a/f1b d15/d16/f23 0 
2026-03-09T17:29:21.142 INFO:tasks.workunit.client.0.vm06.stdout:8/106: dread - d15/d16/d1a/f22 zero size 2026-03-09T17:29:21.153 INFO:tasks.workunit.client.0.vm06.stdout:9/114: rename d3/d11/l1d to d3/d15/l1e 0 2026-03-09T17:29:21.157 INFO:tasks.workunit.client.0.vm06.stdout:1/127: mknod d11/d14/d1d/c28 0 2026-03-09T17:29:21.163 INFO:tasks.workunit.client.0.vm06.stdout:1/128: dwrite f1 [0,4194304] 0 2026-03-09T17:29:21.180 INFO:tasks.workunit.client.0.vm06.stdout:5/78: dwrite d4/f15 [0,4194304] 0 2026-03-09T17:29:21.195 INFO:tasks.workunit.client.0.vm06.stdout:6/74: dwrite d6/d12/f14 [4194304,4194304] 0 2026-03-09T17:29:21.216 INFO:tasks.workunit.client.0.vm06.stdout:3/94: rename l6 to dd/l16 0 2026-03-09T17:29:21.217 INFO:tasks.workunit.client.0.vm06.stdout:3/95: dread - dd/f10 zero size 2026-03-09T17:29:21.218 INFO:tasks.workunit.client.0.vm06.stdout:7/102: write f0 [1510841,57203] 0 2026-03-09T17:29:21.219 INFO:tasks.workunit.client.0.vm06.stdout:7/103: readlink d5/d7/lb 0 2026-03-09T17:29:21.219 INFO:tasks.workunit.client.0.vm06.stdout:7/104: fsync d5/d7/fa 0 2026-03-09T17:29:21.223 INFO:tasks.workunit.client.0.vm06.stdout:9/115: creat d3/d11/f1f x:0 0 0 2026-03-09T17:29:21.240 INFO:tasks.workunit.client.0.vm06.stdout:7/105: dread f2 [4194304,4194304] 0 2026-03-09T17:29:21.295 INFO:tasks.workunit.client.0.vm06.stdout:4/126: creat db/d1d/d21/d26/f2b x:0 0 0 2026-03-09T17:29:21.295 INFO:tasks.workunit.client.0.vm06.stdout:1/129: symlink d11/d14/d1c/l29 0 2026-03-09T17:29:21.296 INFO:tasks.workunit.client.0.vm06.stdout:2/113: getdents d3/d4 0 2026-03-09T17:29:21.302 INFO:tasks.workunit.client.0.vm06.stdout:6/75: mknod d6/c15 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:5/79: write d4/f7 [3976742,43604] 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:0/105: truncate d7/f12 347979 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:0/106: readlink d7/d11/d19/l1a 0 2026-03-09T17:29:21.312 
INFO:tasks.workunit.client.0.vm06.stdout:0/107: read d7/f14 [254684,65321] 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:0/108: write d7/f10 [1950696,60144] 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:9/116: symlink d3/d15/d16/l20 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:7/106: creat d5/f18 x:0 0 0 2026-03-09T17:29:21.312 INFO:tasks.workunit.client.0.vm06.stdout:7/107: write d5/f18 [91613,24028] 0 2026-03-09T17:29:21.313 INFO:tasks.workunit.client.0.vm06.stdout:1/130: mkdir d11/d14/d1d/d1e/d2a 0 2026-03-09T17:29:21.313 INFO:tasks.workunit.client.0.vm06.stdout:1/131: dread - d11/f19 zero size 2026-03-09T17:29:21.314 INFO:tasks.workunit.client.0.vm06.stdout:1/132: stat d11 0 2026-03-09T17:29:21.314 INFO:tasks.workunit.client.0.vm06.stdout:1/133: fsync d11/d14/f17 0 2026-03-09T17:29:21.320 INFO:tasks.workunit.client.0.vm06.stdout:2/114: creat d3/d4/d22/f28 x:0 0 0 2026-03-09T17:29:21.329 INFO:tasks.workunit.client.0.vm06.stdout:9/117: rmdir d3/d15/d16 39 2026-03-09T17:29:21.333 INFO:tasks.workunit.client.0.vm06.stdout:9/118: dwrite d3/d11/f1c [0,4194304] 0 2026-03-09T17:29:21.336 INFO:tasks.workunit.client.0.vm06.stdout:7/108: creat d5/dd/f19 x:0 0 0 2026-03-09T17:29:21.341 INFO:tasks.workunit.client.0.vm06.stdout:4/127: symlink db/l2c 0 2026-03-09T17:29:21.341 INFO:tasks.workunit.client.0.vm06.stdout:4/128: dread db/f13 [0,4194304] 0 2026-03-09T17:29:21.341 INFO:tasks.workunit.client.0.vm06.stdout:2/115: unlink d3/d4/c27 0 2026-03-09T17:29:21.342 INFO:tasks.workunit.client.0.vm06.stdout:5/80: symlink d4/d9/d18/l1a 0 2026-03-09T17:29:21.355 INFO:tasks.workunit.client.0.vm06.stdout:3/96: truncate fc 5576046 0 2026-03-09T17:29:21.356 INFO:tasks.workunit.client.0.vm06.stdout:3/97: chown f4 12 1 2026-03-09T17:29:21.356 INFO:tasks.workunit.client.0.vm06.stdout:9/119: fsync d3/d11/f14 0 2026-03-09T17:29:21.356 INFO:tasks.workunit.client.0.vm06.stdout:7/109: rename d5/d7/fa to d5/dd/f1a 0 
2026-03-09T17:29:21.356 INFO:tasks.workunit.client.0.vm06.stdout:7/110: dwrite d5/f8 [4194304,4194304] 0 2026-03-09T17:29:21.358 INFO:tasks.workunit.client.0.vm06.stdout:4/129: unlink f3 0 2026-03-09T17:29:21.361 INFO:tasks.workunit.client.0.vm06.stdout:2/116: creat d3/f29 x:0 0 0 2026-03-09T17:29:21.362 INFO:tasks.workunit.client.0.vm06.stdout:3/98: symlink dd/l17 0 2026-03-09T17:29:21.364 INFO:tasks.workunit.client.0.vm06.stdout:5/81: dwrite d4/fb [0,4194304] 0 2026-03-09T17:29:21.364 INFO:tasks.workunit.client.0.vm06.stdout:5/82: chown d4/d9/l13 38021868 1 2026-03-09T17:29:21.371 INFO:tasks.workunit.client.0.vm06.stdout:3/99: dread dd/f11 [0,4194304] 0 2026-03-09T17:29:21.383 INFO:tasks.workunit.client.0.vm06.stdout:7/111: fdatasync d5/f18 0 2026-03-09T17:29:21.410 INFO:tasks.workunit.client.0.vm06.stdout:0/109: getdents d7/d11/d19 0 2026-03-09T17:29:21.421 INFO:tasks.workunit.client.0.vm06.stdout:4/130: creat db/df/f2d x:0 0 0 2026-03-09T17:29:21.422 INFO:tasks.workunit.client.0.vm06.stdout:4/131: fsync db/fe 0 2026-03-09T17:29:21.428 INFO:tasks.workunit.client.0.vm06.stdout:1/134: link d11/d14/c15 d11/d14/c2b 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:1/135: readlink d11/d14/d1c/l29 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:4/132: symlink db/df/l2e 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:4/133: dread f7 [4194304,4194304] 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:9/120: creat d3/f21 x:0 0 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:1/136: write f10 [46263,111063] 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:1/137: write f10 [1909188,105102] 0 2026-03-09T17:29:21.446 INFO:tasks.workunit.client.0.vm06.stdout:4/134: creat db/d1d/d21/f2f x:0 0 0 2026-03-09T17:29:21.448 INFO:tasks.workunit.client.0.vm06.stdout:1/138: dwrite d11/d14/d1c/d1f/f21 [0,4194304] 0 2026-03-09T17:29:21.455 
INFO:tasks.workunit.client.0.vm06.stdout:1/139: dwrite d11/f19 [0,4194304] 0 2026-03-09T17:29:21.462 INFO:tasks.workunit.client.0.vm06.stdout:1/140: dwrite f7 [0,4194304] 0 2026-03-09T17:29:21.466 INFO:tasks.workunit.client.0.vm06.stdout:0/110: sync 2026-03-09T17:29:21.467 INFO:tasks.workunit.client.0.vm06.stdout:3/100: sync 2026-03-09T17:29:21.467 INFO:tasks.workunit.client.0.vm06.stdout:0/111: fdatasync d7/fb 0 2026-03-09T17:29:21.468 INFO:tasks.workunit.client.0.vm06.stdout:3/101: chown c3 230795 1 2026-03-09T17:29:21.483 INFO:tasks.workunit.client.0.vm06.stdout:4/135: creat db/df/f30 x:0 0 0 2026-03-09T17:29:21.488 INFO:tasks.workunit.client.0.vm06.stdout:1/141: fsync f8 0 2026-03-09T17:29:21.491 INFO:tasks.workunit.client.0.vm06.stdout:0/112: write d7/f14 [1712823,4977] 0 2026-03-09T17:29:21.508 INFO:tasks.workunit.client.0.vm06.stdout:9/121: mknod d3/d15/d16/c22 0 2026-03-09T17:29:21.513 INFO:tasks.workunit.client.0.vm06.stdout:0/113: creat d7/d11/f20 x:0 0 0 2026-03-09T17:29:21.517 INFO:tasks.workunit.client.0.vm06.stdout:3/102: unlink l9 0 2026-03-09T17:29:21.521 INFO:tasks.workunit.client.0.vm06.stdout:1/142: rename d11/d14/d1d/c22 to d11/d14/d1d/d1e/d2a/c2c 0 2026-03-09T17:29:21.523 INFO:tasks.workunit.client.0.vm06.stdout:1/143: dread d11/f13 [4194304,4194304] 0 2026-03-09T17:29:21.529 INFO:tasks.workunit.client.0.vm06.stdout:0/114: creat d7/d11/d19/f21 x:0 0 0 2026-03-09T17:29:21.540 INFO:tasks.workunit.client.0.vm06.stdout:4/136: truncate db/f17 11181834 0 2026-03-09T17:29:21.542 INFO:tasks.workunit.client.0.vm06.stdout:0/115: sync 2026-03-09T17:29:21.547 INFO:tasks.workunit.client.0.vm06.stdout:9/122: creat d3/d15/f23 x:0 0 0 2026-03-09T17:29:21.555 INFO:tasks.workunit.client.0.vm06.stdout:0/116: mknod d7/d11/c22 0 2026-03-09T17:29:21.556 INFO:tasks.workunit.client.0.vm06.stdout:0/117: write d7/fb [426216,7876] 0 2026-03-09T17:29:21.573 INFO:tasks.workunit.client.0.vm06.stdout:1/144: truncate d11/d14/f17 2770191 0 2026-03-09T17:29:21.575 
INFO:tasks.workunit.client.0.vm06.stdout:9/123: symlink d3/d15/l24 0 2026-03-09T17:29:21.575 INFO:tasks.workunit.client.0.vm06.stdout:9/124: stat d3/d11 0 2026-03-09T17:29:21.576 INFO:tasks.workunit.client.0.vm06.stdout:9/125: chown d3/d11/f12 809801 1 2026-03-09T17:29:21.578 INFO:tasks.workunit.client.0.vm06.stdout:0/118: chown d7/f12 170 1 2026-03-09T17:29:21.580 INFO:tasks.workunit.client.0.vm06.stdout:0/119: read d7/d11/f1c [144599,1410] 0 2026-03-09T17:29:21.583 INFO:tasks.workunit.client.0.vm06.stdout:1/145: creat d11/d14/d1c/d1f/f2d x:0 0 0 2026-03-09T17:29:21.583 INFO:tasks.workunit.client.0.vm06.stdout:9/126: sync 2026-03-09T17:29:21.588 INFO:tasks.workunit.client.0.vm06.stdout:0/120: mkdir d7/d11/d19/d23 0 2026-03-09T17:29:21.591 INFO:tasks.workunit.client.0.vm06.stdout:1/146: creat d11/d14/d1c/f2e x:0 0 0 2026-03-09T17:29:21.594 INFO:tasks.workunit.client.0.vm06.stdout:9/127: mknod d3/c25 0 2026-03-09T17:29:21.595 INFO:tasks.workunit.client.0.vm06.stdout:9/128: chown d3/d15/d16/c22 1894 1 2026-03-09T17:29:21.598 INFO:tasks.workunit.client.0.vm06.stdout:1/147: dwrite d11/f13 [0,4194304] 0 2026-03-09T17:29:21.603 INFO:tasks.workunit.client.0.vm06.stdout:0/121: link d7/d11/f1c d7/d11/d19/f24 0 2026-03-09T17:29:21.613 INFO:tasks.workunit.client.0.vm06.stdout:0/122: rename d7/ff to d7/f25 0 2026-03-09T17:29:21.614 INFO:tasks.workunit.client.0.vm06.stdout:9/129: mkdir d3/d26 0 2026-03-09T17:29:21.616 INFO:tasks.workunit.client.0.vm06.stdout:0/123: creat d7/d11/d19/f26 x:0 0 0 2026-03-09T17:29:21.618 INFO:tasks.workunit.client.0.vm06.stdout:9/130: chown d3/l4 17 1 2026-03-09T17:29:21.622 INFO:tasks.workunit.client.0.vm06.stdout:0/124: link d7/d11/d19/f21 d7/f27 0 2026-03-09T17:29:21.624 INFO:tasks.workunit.client.0.vm06.stdout:0/125: dread d7/fe [0,4194304] 0 2026-03-09T17:29:21.624 INFO:tasks.workunit.client.0.vm06.stdout:0/126: stat d7/cd 0 2026-03-09T17:29:21.624 INFO:tasks.workunit.client.0.vm06.stdout:0/127: read - d7/d11/d19/f26 zero size 
2026-03-09T17:29:21.628 INFO:tasks.workunit.client.0.vm06.stdout:0/128: mkdir d7/d28 0 2026-03-09T17:29:21.629 INFO:tasks.workunit.client.0.vm06.stdout:0/129: write d7/d11/f13 [4167706,59700] 0 2026-03-09T17:29:21.630 INFO:tasks.workunit.client.0.vm06.stdout:9/131: creat d3/f27 x:0 0 0 2026-03-09T17:29:21.635 INFO:tasks.workunit.client.0.vm06.stdout:0/130: creat d7/d11/f29 x:0 0 0 2026-03-09T17:29:21.637 INFO:tasks.workunit.client.0.vm06.stdout:0/131: creat d7/f2a x:0 0 0 2026-03-09T17:29:21.638 INFO:tasks.workunit.client.0.vm06.stdout:9/132: creat d3/d26/f28 x:0 0 0 2026-03-09T17:29:21.640 INFO:tasks.workunit.client.0.vm06.stdout:0/132: symlink d7/l2b 0 2026-03-09T17:29:21.642 INFO:tasks.workunit.client.0.vm06.stdout:0/133: creat d7/d11/f2c x:0 0 0 2026-03-09T17:29:21.645 INFO:tasks.workunit.client.0.vm06.stdout:0/134: mkdir d7/d11/d2d 0 2026-03-09T17:29:21.647 INFO:tasks.workunit.client.0.vm06.stdout:9/133: sync 2026-03-09T17:29:21.648 INFO:tasks.workunit.client.0.vm06.stdout:9/134: write d3/d15/f23 [730542,21342] 0 2026-03-09T17:29:21.649 INFO:tasks.workunit.client.0.vm06.stdout:9/135: truncate d3/d11/f1f 791903 0 2026-03-09T17:29:21.654 INFO:tasks.workunit.client.0.vm06.stdout:9/136: creat d3/d26/f29 x:0 0 0 2026-03-09T17:29:21.685 INFO:tasks.workunit.client.0.vm06.stdout:6/76: rmdir d6 39 2026-03-09T17:29:21.688 INFO:tasks.workunit.client.0.vm06.stdout:6/77: write d6/fb [1018155,44197] 0 2026-03-09T17:29:21.694 INFO:tasks.workunit.client.0.vm06.stdout:0/135: dread d7/f12 [0,4194304] 0 2026-03-09T17:29:21.701 INFO:tasks.workunit.client.0.vm06.stdout:6/78: mknod d6/d12/c16 0 2026-03-09T17:29:21.703 INFO:tasks.workunit.client.0.vm06.stdout:0/136: mknod d7/d11/c2e 0 2026-03-09T17:29:21.703 INFO:tasks.workunit.client.0.vm06.stdout:0/137: chown d7/d11/d19/d1d 22 1 2026-03-09T17:29:21.705 INFO:tasks.workunit.client.0.vm06.stdout:6/79: mkdir d6/d12/d17 0 2026-03-09T17:29:21.708 INFO:tasks.workunit.client.0.vm06.stdout:6/80: dwrite d6/d12/f14 [0,4194304] 0 
2026-03-09T17:29:21.713 INFO:tasks.workunit.client.0.vm06.stdout:0/138: creat d7/d11/d2d/f2f x:0 0 0 2026-03-09T17:29:21.721 INFO:tasks.workunit.client.0.vm06.stdout:6/81: chown d6/d12/c13 3 1 2026-03-09T17:29:21.722 INFO:tasks.workunit.client.0.vm06.stdout:6/82: chown d6/d12/c13 1403 1 2026-03-09T17:29:21.722 INFO:tasks.workunit.client.0.vm06.stdout:6/83: fsync d6/fe 0 2026-03-09T17:29:21.727 INFO:tasks.workunit.client.0.vm06.stdout:9/137: write d3/d11/f14 [64752,120507] 0 2026-03-09T17:29:21.730 INFO:tasks.workunit.client.0.vm06.stdout:6/84: rename d6/d12/c16 to d6/c18 0 2026-03-09T17:29:21.735 INFO:tasks.workunit.client.0.vm06.stdout:6/85: dread d6/d12/f14 [4194304,4194304] 0 2026-03-09T17:29:21.735 INFO:tasks.workunit.client.0.vm06.stdout:2/117: creat d3/d4/da/f2a x:0 0 0 2026-03-09T17:29:21.736 INFO:tasks.workunit.client.0.vm06.stdout:9/138: creat d3/d11/f2a x:0 0 0 2026-03-09T17:29:21.741 INFO:tasks.workunit.client.0.vm06.stdout:6/86: creat d6/d12/f19 x:0 0 0 2026-03-09T17:29:21.745 INFO:tasks.workunit.client.0.vm06.stdout:9/139: mknod d3/d11/c2b 0 2026-03-09T17:29:21.748 INFO:tasks.workunit.client.0.vm06.stdout:9/140: dwrite d3/d11/f1f [0,4194304] 0 2026-03-09T17:29:21.752 INFO:tasks.workunit.client.0.vm06.stdout:6/87: write d6/d12/f14 [2843337,31307] 0 2026-03-09T17:29:21.756 INFO:tasks.workunit.client.0.vm06.stdout:6/88: dwrite d6/fe [0,4194304] 0 2026-03-09T17:29:21.757 INFO:tasks.workunit.client.0.vm06.stdout:6/89: readlink d6/l10 0 2026-03-09T17:29:21.763 INFO:tasks.workunit.client.0.vm06.stdout:9/141: sync 2026-03-09T17:29:21.766 INFO:tasks.workunit.client.0.vm06.stdout:2/118: rmdir d3/d4/d12 39 2026-03-09T17:29:21.774 INFO:tasks.workunit.client.0.vm06.stdout:6/90: creat d6/f1a x:0 0 0 2026-03-09T17:29:21.775 INFO:tasks.workunit.client.0.vm06.stdout:9/142: truncate d3/d15/f1a 225427 0 2026-03-09T17:29:21.775 INFO:tasks.workunit.client.0.vm06.stdout:9/143: fdatasync d3/d15/f23 0 2026-03-09T17:29:21.782 INFO:tasks.workunit.client.0.vm06.stdout:6/91: 
creat d6/d12/d17/f1b x:0 0 0 2026-03-09T17:29:21.785 INFO:tasks.workunit.client.0.vm06.stdout:6/92: dread d6/fe [0,4194304] 0 2026-03-09T17:29:21.786 INFO:tasks.workunit.client.0.vm06.stdout:9/144: mkdir d3/d2c 0 2026-03-09T17:29:21.788 INFO:tasks.workunit.client.0.vm06.stdout:2/119: unlink d3/d4/l19 0 2026-03-09T17:29:21.793 INFO:tasks.workunit.client.0.vm06.stdout:6/93: unlink d6/f1a 0 2026-03-09T17:29:21.794 INFO:tasks.workunit.client.0.vm06.stdout:9/145: chown d3/fb 648743 1 2026-03-09T17:29:21.798 INFO:tasks.workunit.client.0.vm06.stdout:6/94: dwrite d6/d12/d17/f1b [0,4194304] 0 2026-03-09T17:29:21.816 INFO:tasks.workunit.client.0.vm06.stdout:5/83: write d4/d9/fe [359775,28999] 0 2026-03-09T17:29:21.817 INFO:tasks.workunit.client.0.vm06.stdout:5/84: write d4/d9/f10 [3001120,26760] 0 2026-03-09T17:29:21.820 INFO:tasks.workunit.client.0.vm06.stdout:7/112: write d5/f10 [1276190,104665] 0 2026-03-09T17:29:21.830 INFO:tasks.workunit.client.0.vm06.stdout:6/95: rename d6/ff to d6/d12/f1c 0 2026-03-09T17:29:21.831 INFO:tasks.workunit.client.0.vm06.stdout:2/120: mkdir d3/d4/d12/d2b 0 2026-03-09T17:29:21.840 INFO:tasks.workunit.client.0.vm06.stdout:5/85: creat d4/d9/d18/f1b x:0 0 0 2026-03-09T17:29:21.842 INFO:tasks.workunit.client.0.vm06.stdout:7/113: rmdir d5 39 2026-03-09T17:29:21.849 INFO:tasks.workunit.client.0.vm06.stdout:6/96: rename d6/d12/f14 to d6/f1d 0 2026-03-09T17:29:21.849 INFO:tasks.workunit.client.0.vm06.stdout:6/97: chown d6/c15 175994 1 2026-03-09T17:29:21.855 INFO:tasks.workunit.client.0.vm06.stdout:5/86: symlink d4/d9/l1c 0 2026-03-09T17:29:21.856 INFO:tasks.workunit.client.0.vm06.stdout:7/114: chown d5/dd/f19 32207 1 2026-03-09T17:29:21.857 INFO:tasks.workunit.client.0.vm06.stdout:7/115: read d5/dd/ff [6275351,33141] 0 2026-03-09T17:29:21.860 INFO:tasks.workunit.client.0.vm06.stdout:9/146: link d3/d11/c2b d3/d2c/c2d 0 2026-03-09T17:29:21.865 INFO:tasks.workunit.client.0.vm06.stdout:6/98: rename d6/ld to d6/l1e 0 2026-03-09T17:29:21.868 
INFO:tasks.workunit.client.0.vm06.stdout:6/99: dwrite d6/d12/d17/f1b [0,4194304] 0 2026-03-09T17:29:21.873 INFO:tasks.workunit.client.0.vm06.stdout:6/100: dread d6/fb [0,4194304] 0 2026-03-09T17:29:21.875 INFO:tasks.workunit.client.0.vm06.stdout:6/101: dread d6/fe [0,4194304] 0 2026-03-09T17:29:21.875 INFO:tasks.workunit.client.0.vm06.stdout:6/102: chown d6/fe 6 1 2026-03-09T17:29:21.883 INFO:tasks.workunit.client.0.vm06.stdout:9/147: fsync d3/f1b 0 2026-03-09T17:29:21.885 INFO:tasks.workunit.client.0.vm06.stdout:7/116: dwrite d5/dd/f1a [0,4194304] 0 2026-03-09T17:29:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:21 vm06.local ceph-mon[57307]: pgmap v144: 65 pgs: 65 active+clean; 256 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 1.2 MiB/s rd, 14 MiB/s wr, 349 op/s 2026-03-09T17:29:21.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:21 vm09.local ceph-mon[62061]: pgmap v144: 65 pgs: 65 active+clean; 256 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 1.2 MiB/s rd, 14 MiB/s wr, 349 op/s 2026-03-09T17:29:21.897 INFO:tasks.workunit.client.0.vm06.stdout:6/103: mknod d6/d12/c1f 0 2026-03-09T17:29:21.897 INFO:tasks.workunit.client.0.vm06.stdout:6/104: chown c3 28 1 2026-03-09T17:29:21.897 INFO:tasks.workunit.client.0.vm06.stdout:8/107: write fa [3428679,4980] 0 2026-03-09T17:29:21.898 INFO:tasks.workunit.client.0.vm06.stdout:8/108: fdatasync f0 0 2026-03-09T17:29:21.906 INFO:tasks.workunit.client.0.vm06.stdout:7/117: dread d5/f8 [4194304,4194304] 0 2026-03-09T17:29:21.908 INFO:tasks.workunit.client.0.vm06.stdout:0/139: rmdir d7 39 2026-03-09T17:29:21.911 INFO:tasks.workunit.client.0.vm06.stdout:3/103: dwrite f0 [4194304,4194304] 0 2026-03-09T17:29:21.915 INFO:tasks.workunit.client.0.vm06.stdout:6/105: fdatasync d6/d12/f1c 0 2026-03-09T17:29:21.918 INFO:tasks.workunit.client.0.vm06.stdout:8/109: creat d15/d16/f24 x:0 0 0 2026-03-09T17:29:21.923 INFO:tasks.workunit.client.0.vm06.stdout:7/118: creat d5/d7/f1b x:0 0 0 2026-03-09T17:29:21.927 
INFO:tasks.workunit.client.0.vm06.stdout:3/104: readlink dd/l16 0 2026-03-09T17:29:21.930 INFO:tasks.workunit.client.0.vm06.stdout:3/105: dwrite dd/f15 [0,4194304] 0 2026-03-09T17:29:21.935 INFO:tasks.workunit.client.0.vm06.stdout:6/106: symlink d6/l20 0 2026-03-09T17:29:21.944 INFO:tasks.workunit.client.0.vm06.stdout:8/110: rename c2 to d15/d16/d19/c25 0 2026-03-09T17:29:21.948 INFO:tasks.workunit.client.0.vm06.stdout:9/148: creat d3/d15/f2e x:0 0 0 2026-03-09T17:29:21.953 INFO:tasks.workunit.client.0.vm06.stdout:9/149: dwrite d3/d11/f2a [0,4194304] 0 2026-03-09T17:29:21.955 INFO:tasks.workunit.client.0.vm06.stdout:9/150: readlink d3/d11/l13 0 2026-03-09T17:29:21.960 INFO:tasks.workunit.client.0.vm06.stdout:0/140: creat d7/d11/f30 x:0 0 0 2026-03-09T17:29:21.966 INFO:tasks.workunit.client.0.vm06.stdout:3/106: mknod dd/c18 0 2026-03-09T17:29:21.969 INFO:tasks.workunit.client.0.vm06.stdout:6/107: unlink d6/d12/c13 0 2026-03-09T17:29:21.972 INFO:tasks.workunit.client.0.vm06.stdout:4/137: dwrite db/f17 [4194304,4194304] 0 2026-03-09T17:29:21.974 INFO:tasks.workunit.client.0.vm06.stdout:6/108: dread d6/d12/f1c [0,4194304] 0 2026-03-09T17:29:21.975 INFO:tasks.workunit.client.0.vm06.stdout:1/148: write d11/d14/f17 [333381,105888] 0 2026-03-09T17:29:21.981 INFO:tasks.workunit.client.0.vm06.stdout:8/111: creat d15/d16/d19/f26 x:0 0 0 2026-03-09T17:29:21.987 INFO:tasks.workunit.client.0.vm06.stdout:9/151: creat d3/d15/d16/f2f x:0 0 0 2026-03-09T17:29:21.990 INFO:tasks.workunit.client.0.vm06.stdout:2/121: dwrite d3/f10 [4194304,4194304] 0 2026-03-09T17:29:21.995 INFO:tasks.workunit.client.0.vm06.stdout:9/152: dwrite d3/d11/f12 [0,4194304] 0 2026-03-09T17:29:21.997 INFO:tasks.workunit.client.0.vm06.stdout:3/107: truncate f7 994841 0 2026-03-09T17:29:22.001 INFO:tasks.workunit.client.0.vm06.stdout:2/122: dwrite d3/d4/d12/f20 [0,4194304] 0 2026-03-09T17:29:22.005 INFO:tasks.workunit.client.0.vm06.stdout:0/141: dread d7/fe [0,4194304] 0 2026-03-09T17:29:22.009 
INFO:tasks.workunit.client.0.vm06.stdout:9/153: dread d3/d11/f12 [4194304,4194304] 0 2026-03-09T17:29:22.010 INFO:tasks.workunit.client.0.vm06.stdout:1/149: mknod d11/d14/d1d/d1e/d2a/c2f 0 2026-03-09T17:29:22.013 INFO:tasks.workunit.client.0.vm06.stdout:5/87: truncate d4/d9/fe 756954 0 2026-03-09T17:29:22.028 INFO:tasks.workunit.client.0.vm06.stdout:8/112: rmdir d15/d16/d1e 39 2026-03-09T17:29:22.029 INFO:tasks.workunit.client.0.vm06.stdout:8/113: write fd [1957488,125309] 0 2026-03-09T17:29:22.038 INFO:tasks.workunit.client.0.vm06.stdout:7/119: link d5/l15 d5/d12/l1c 0 2026-03-09T17:29:22.047 INFO:tasks.workunit.client.0.vm06.stdout:3/108: mkdir dd/d19 0 2026-03-09T17:29:22.057 INFO:tasks.workunit.client.0.vm06.stdout:2/123: dread d3/d4/fe [0,4194304] 0 2026-03-09T17:29:22.064 INFO:tasks.workunit.client.0.vm06.stdout:1/150: symlink d11/d14/d1c/l30 0 2026-03-09T17:29:22.066 INFO:tasks.workunit.client.0.vm06.stdout:5/88: creat d4/d9/f1d x:0 0 0 2026-03-09T17:29:22.078 INFO:tasks.workunit.client.0.vm06.stdout:8/114: mknod d15/d16/d19/c27 0 2026-03-09T17:29:22.092 INFO:tasks.workunit.client.0.vm06.stdout:4/138: symlink db/d1d/d21/d25/l31 0 2026-03-09T17:29:22.092 INFO:tasks.workunit.client.0.vm06.stdout:4/139: stat c4 0 2026-03-09T17:29:22.093 INFO:tasks.workunit.client.0.vm06.stdout:4/140: write db/d1d/d21/d26/f2b [547772,67534] 0 2026-03-09T17:29:22.098 INFO:tasks.workunit.client.0.vm06.stdout:0/142: link d7/f2a d7/f31 0 2026-03-09T17:29:22.105 INFO:tasks.workunit.client.0.vm06.stdout:9/154: symlink d3/d15/l30 0 2026-03-09T17:29:22.105 INFO:tasks.workunit.client.0.vm06.stdout:9/155: stat d3/l4 0 2026-03-09T17:29:22.106 INFO:tasks.workunit.client.0.vm06.stdout:9/156: chown d3/d26/f29 401204554 1 2026-03-09T17:29:22.110 INFO:tasks.workunit.client.0.vm06.stdout:1/151: creat d11/d14/d1d/f31 x:0 0 0 2026-03-09T17:29:22.110 INFO:tasks.workunit.client.0.vm06.stdout:1/152: fsync d11/f19 0 2026-03-09T17:29:22.113 INFO:tasks.workunit.client.0.vm06.stdout:5/89: rename 
d4/d9/d18/f1b to d4/d9/f1e 0 2026-03-09T17:29:22.117 INFO:tasks.workunit.client.0.vm06.stdout:6/109: truncate d6/d12/f1c 265484 0 2026-03-09T17:29:22.123 INFO:tasks.workunit.client.0.vm06.stdout:8/115: unlink d15/d16/f17 0 2026-03-09T17:29:22.125 INFO:tasks.workunit.client.0.vm06.stdout:8/116: dread d15/d16/d1a/f1b [0,4194304] 0 2026-03-09T17:29:22.135 INFO:tasks.workunit.client.0.vm06.stdout:4/141: mknod db/d1d/d21/d26/c32 0 2026-03-09T17:29:22.144 INFO:tasks.workunit.client.0.vm06.stdout:2/124: dwrite d3/d4/da/f1b [0,4194304] 0 2026-03-09T17:29:22.146 INFO:tasks.workunit.client.0.vm06.stdout:1/153: sync 2026-03-09T17:29:22.149 INFO:tasks.workunit.client.0.vm06.stdout:9/157: chown d3/ce 5013 1 2026-03-09T17:29:22.149 INFO:tasks.workunit.client.0.vm06.stdout:9/158: dread - d3/d26/f29 zero size 2026-03-09T17:29:22.157 INFO:tasks.workunit.client.0.vm06.stdout:0/143: rename d7/cd to d7/d11/d19/d1d/c32 0 2026-03-09T17:29:22.168 INFO:tasks.workunit.client.0.vm06.stdout:0/144: sync 2026-03-09T17:29:22.180 INFO:tasks.workunit.client.0.vm06.stdout:4/142: truncate db/df/f2d 438318 0 2026-03-09T17:29:22.182 INFO:tasks.workunit.client.0.vm06.stdout:2/125: readlink l1 0 2026-03-09T17:29:22.184 INFO:tasks.workunit.client.0.vm06.stdout:2/126: dread d3/d4/da/f1b [0,4194304] 0 2026-03-09T17:29:22.204 INFO:tasks.workunit.client.0.vm06.stdout:8/117: fsync f7 0 2026-03-09T17:29:22.205 INFO:tasks.workunit.client.0.vm06.stdout:8/118: dread - d15/d16/d1a/f22 zero size 2026-03-09T17:29:22.206 INFO:tasks.workunit.client.0.vm06.stdout:0/145: creat d7/f33 x:0 0 0 2026-03-09T17:29:22.207 INFO:tasks.workunit.client.0.vm06.stdout:0/146: chown d7/d11/f2c 7160 1 2026-03-09T17:29:22.207 INFO:tasks.workunit.client.0.vm06.stdout:0/147: readlink d7/la 0 2026-03-09T17:29:22.209 INFO:tasks.workunit.client.0.vm06.stdout:7/120: getdents d5/d12 0 2026-03-09T17:29:22.214 INFO:tasks.workunit.client.0.vm06.stdout:4/143: read db/f23 [51665,2827] 0 2026-03-09T17:29:22.220 
INFO:tasks.workunit.client.0.vm06.stdout:2/127: fsync d3/d4/da/f17 0 2026-03-09T17:29:22.220 INFO:tasks.workunit.client.0.vm06.stdout:2/128: write d3/d4/d12/f20 [1826776,4839] 0 2026-03-09T17:29:22.226 INFO:tasks.workunit.client.0.vm06.stdout:1/154: symlink d11/l32 0 2026-03-09T17:29:22.226 INFO:tasks.workunit.client.0.vm06.stdout:1/155: chown d11/d14/d1c/d1f/f21 1 1 2026-03-09T17:29:22.227 INFO:tasks.workunit.client.0.vm06.stdout:1/156: stat d11/f19 0 2026-03-09T17:29:22.231 INFO:tasks.workunit.client.0.vm06.stdout:5/90: creat d4/f1f x:0 0 0 2026-03-09T17:29:22.231 INFO:tasks.workunit.client.0.vm06.stdout:5/91: fsync d4/d9/f1d 0 2026-03-09T17:29:22.241 INFO:tasks.workunit.client.0.vm06.stdout:7/121: creat d5/d7/f1d x:0 0 0 2026-03-09T17:29:22.242 INFO:tasks.workunit.client.0.vm06.stdout:3/109: link f7 dd/f1a 0 2026-03-09T17:29:22.243 INFO:tasks.workunit.client.0.vm06.stdout:3/110: write fc [5987765,71361] 0 2026-03-09T17:29:22.246 INFO:tasks.workunit.client.0.vm06.stdout:7/122: dwrite d5/dd/f19 [0,4194304] 0 2026-03-09T17:29:22.255 INFO:tasks.workunit.client.0.vm06.stdout:4/144: dread db/f17 [8388608,4194304] 0 2026-03-09T17:29:22.255 INFO:tasks.workunit.client.0.vm06.stdout:2/129: creat d3/d4/da/f2c x:0 0 0 2026-03-09T17:29:22.258 INFO:tasks.workunit.client.0.vm06.stdout:1/157: symlink d11/d14/d1d/l33 0 2026-03-09T17:29:22.258 INFO:tasks.workunit.client.0.vm06.stdout:1/158: truncate d11/f19 4830739 0 2026-03-09T17:29:22.258 INFO:tasks.workunit.client.0.vm06.stdout:1/159: stat f10 0 2026-03-09T17:29:22.261 INFO:tasks.workunit.client.0.vm06.stdout:5/92: rmdir d4/d9/d18 39 2026-03-09T17:29:22.262 INFO:tasks.workunit.client.0.vm06.stdout:5/93: write d4/f17 [570882,30208] 0 2026-03-09T17:29:22.263 INFO:tasks.workunit.client.0.vm06.stdout:6/110: mkdir d6/d12/d17/d21 0 2026-03-09T17:29:22.270 INFO:tasks.workunit.client.0.vm06.stdout:8/119: mkdir d15/d16/d1e/d28 0 2026-03-09T17:29:22.270 INFO:tasks.workunit.client.0.vm06.stdout:0/148: getdents d7/d11/d19/d23 0 
2026-03-09T17:29:22.270 INFO:tasks.workunit.client.0.vm06.stdout:3/111: creat dd/f1b x:0 0 0 2026-03-09T17:29:22.272 INFO:tasks.workunit.client.0.vm06.stdout:3/112: dread dd/f15 [0,4194304] 0 2026-03-09T17:29:22.272 INFO:tasks.workunit.client.0.vm06.stdout:3/113: fsync dd/f15 0 2026-03-09T17:29:22.287 INFO:tasks.workunit.client.0.vm06.stdout:4/145: symlink db/d1d/d21/d25/l33 0 2026-03-09T17:29:22.292 INFO:tasks.workunit.client.0.vm06.stdout:2/130: dwrite d3/d4/f1f [0,4194304] 0 2026-03-09T17:29:22.293 INFO:tasks.workunit.client.0.vm06.stdout:2/131: chown d3/d4/d12/d2b 208 1 2026-03-09T17:29:22.295 INFO:tasks.workunit.client.0.vm06.stdout:1/160: rmdir d11 39 2026-03-09T17:29:22.297 INFO:tasks.workunit.client.0.vm06.stdout:9/159: getdents d3/d26 0 2026-03-09T17:29:22.303 INFO:tasks.workunit.client.0.vm06.stdout:8/120: rename d15/d16/d1e/f1f to d15/d16/d1a/f29 0 2026-03-09T17:29:22.304 INFO:tasks.workunit.client.0.vm06.stdout:8/121: truncate d15/d16/d1a/f29 136954 0 2026-03-09T17:29:22.307 INFO:tasks.workunit.client.0.vm06.stdout:0/149: mknod d7/d11/d19/c34 0 2026-03-09T17:29:22.308 INFO:tasks.workunit.client.0.vm06.stdout:7/123: dwrite d5/dd/ff [0,4194304] 0 2026-03-09T17:29:22.308 INFO:tasks.workunit.client.0.vm06.stdout:0/150: write d7/d11/f13 [2484027,65150] 0 2026-03-09T17:29:22.309 INFO:tasks.workunit.client.0.vm06.stdout:0/151: fsync d7/d11/f20 0 2026-03-09T17:29:22.326 INFO:tasks.workunit.client.0.vm06.stdout:2/132: dwrite d3/d4/da/f1b [0,4194304] 0 2026-03-09T17:29:22.335 INFO:tasks.workunit.client.0.vm06.stdout:9/160: creat d3/d15/d16/f31 x:0 0 0 2026-03-09T17:29:22.335 INFO:tasks.workunit.client.0.vm06.stdout:6/111: creat d6/d12/f22 x:0 0 0 2026-03-09T17:29:22.342 INFO:tasks.workunit.client.0.vm06.stdout:7/124: rmdir d5/d7 39 2026-03-09T17:29:22.342 INFO:tasks.workunit.client.0.vm06.stdout:7/125: truncate f0 1659887 0 2026-03-09T17:29:22.343 INFO:tasks.workunit.client.0.vm06.stdout:7/126: read d5/dd/ff [4921526,102548] 0 2026-03-09T17:29:22.344 
INFO:tasks.workunit.client.0.vm06.stdout:7/127: write d5/f16 [139437,73880] 0 2026-03-09T17:29:22.344 INFO:tasks.workunit.client.0.vm06.stdout:7/128: chown d5 1082190 1 2026-03-09T17:29:22.344 INFO:tasks.workunit.client.0.vm06.stdout:7/129: chown d5/d12 255151 1 2026-03-09T17:29:22.344 INFO:tasks.workunit.client.0.vm06.stdout:7/130: chown d5 121610 1 2026-03-09T17:29:22.344 INFO:tasks.workunit.client.0.vm06.stdout:7/131: chown f0 10 1 2026-03-09T17:29:22.345 INFO:tasks.workunit.client.0.vm06.stdout:7/132: fdatasync f2 0 2026-03-09T17:29:22.345 INFO:tasks.workunit.client.0.vm06.stdout:7/133: stat d5/dd/ff 0 2026-03-09T17:29:22.346 INFO:tasks.workunit.client.0.vm06.stdout:0/152: rename d7/d11/d19/f26 to d7/d11/f35 0 2026-03-09T17:29:22.348 INFO:tasks.workunit.client.0.vm06.stdout:3/114: symlink dd/d19/l1c 0 2026-03-09T17:29:22.350 INFO:tasks.workunit.client.0.vm06.stdout:4/146: mknod db/c34 0 2026-03-09T17:29:22.353 INFO:tasks.workunit.client.0.vm06.stdout:1/161: mkdir d11/d14/d1d/d1e/d2a/d34 0 2026-03-09T17:29:22.355 INFO:tasks.workunit.client.0.vm06.stdout:5/94: creat d4/f20 x:0 0 0 2026-03-09T17:29:22.355 INFO:tasks.workunit.client.0.vm06.stdout:5/95: chown d4/d9/c19 85103866 1 2026-03-09T17:29:22.356 INFO:tasks.workunit.client.0.vm06.stdout:5/96: chown d4/f17 402490 1 2026-03-09T17:29:22.356 INFO:tasks.workunit.client.0.vm06.stdout:5/97: write d4/f17 [941610,65437] 0 2026-03-09T17:29:22.359 INFO:tasks.workunit.client.0.vm06.stdout:3/115: dread f0 [0,4194304] 0 2026-03-09T17:29:22.363 INFO:tasks.workunit.client.0.vm06.stdout:5/98: dwrite d4/f1f [0,4194304] 0 2026-03-09T17:29:22.365 INFO:tasks.workunit.client.0.vm06.stdout:9/161: mknod d3/d11/c32 0 2026-03-09T17:29:22.366 INFO:tasks.workunit.client.0.vm06.stdout:9/162: fsync d3/d11/f1c 0 2026-03-09T17:29:22.368 INFO:tasks.workunit.client.0.vm06.stdout:8/122: link d15/d16/d19/f26 d15/d16/d1e/d28/f2a 0 2026-03-09T17:29:22.376 INFO:tasks.workunit.client.0.vm06.stdout:2/133: rename d3/d4/da to d3/d4/d12/d2b/d2d 0 
2026-03-09T17:29:22.377 INFO:tasks.workunit.client.0.vm06.stdout:2/134: write d3/d4/d12/d2b/d2d/f1a [1658013,95778] 0
2026-03-09T17:29:22.378 INFO:tasks.workunit.client.0.vm06.stdout:0/153: write d7/d11/d19/f24 [1173286,96704] 0
2026-03-09T17:29:22.382 INFO:tasks.workunit.client.0.vm06.stdout:0/154: dwrite d7/fb [0,4194304] 0
2026-03-09T17:29:22.383 INFO:tasks.workunit.client.0.vm06.stdout:4/147: creat db/d1d/d21/d25/f35 x:0 0 0
2026-03-09T17:29:22.390 INFO:tasks.workunit.client.0.vm06.stdout:3/116: mkdir dd/d1d 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:5/99: rmdir d4/d9 39
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:6/112: mknod d6/c23 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:8/123: mkdir d15/d16/d19/d2b 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:7/134: readlink d5/d7/lb 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:7/135: chown d5/d7/f1b 28251 1
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:2/135: creat d3/d4/d12/f2e x:0 0 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:2/136: read - d3/d4/d12/f1e zero size
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:2/137: write d3/d4/d12/d2b/d2d/f1a [3165623,56285] 0
2026-03-09T17:29:22.403 INFO:tasks.workunit.client.0.vm06.stdout:2/138: read d3/d4/d12/f20 [3494035,84948] 0
2026-03-09T17:29:22.406 INFO:tasks.workunit.client.0.vm06.stdout:0/155: creat d7/f36 x:0 0 0
2026-03-09T17:29:22.407 INFO:tasks.workunit.client.0.vm06.stdout:1/162: getdents d11/d14/d1c/d1f/d25 0
2026-03-09T17:29:22.408 INFO:tasks.workunit.client.0.vm06.stdout:4/148: creat db/df/f36 x:0 0 0
2026-03-09T17:29:22.409 INFO:tasks.workunit.client.0.vm06.stdout:3/117: mkdir dd/d19/d1e 0
2026-03-09T17:29:22.411 INFO:tasks.workunit.client.0.vm06.stdout:8/124: mknod d15/d16/d1e/c2c 0
2026-03-09T17:29:22.412 INFO:tasks.workunit.client.0.vm06.stdout:7/136: rmdir d5/dd 39
2026-03-09T17:29:22.417 INFO:tasks.workunit.client.0.vm06.stdout:2/139: dread d3/d4/f11 [0,4194304] 0
2026-03-09T17:29:22.417 INFO:tasks.workunit.client.0.vm06.stdout:2/140: stat d3/c9 0
2026-03-09T17:29:22.419 INFO:tasks.workunit.client.0.vm06.stdout:0/156: mkdir d7/d11/d19/d37 0
2026-03-09T17:29:22.419 INFO:tasks.workunit.client.0.vm06.stdout:0/157: write d7/f36 [332296,107247] 0
2026-03-09T17:29:22.424 INFO:tasks.workunit.client.0.vm06.stdout:9/163: sync
2026-03-09T17:29:22.424 INFO:tasks.workunit.client.0.vm06.stdout:5/100: sync
2026-03-09T17:29:22.425 INFO:tasks.workunit.client.0.vm06.stdout:9/164: dread - d3/d15/f2e zero size
2026-03-09T17:29:22.426 INFO:tasks.workunit.client.0.vm06.stdout:0/158: dwrite d7/f36 [0,4194304] 0
2026-03-09T17:29:22.428 INFO:tasks.workunit.client.0.vm06.stdout:9/165: chown d3/d15/f17 12092 1
2026-03-09T17:29:22.433 INFO:tasks.workunit.client.0.vm06.stdout:1/163: mknod d11/d14/d1d/d1e/c35 0
2026-03-09T17:29:22.434 INFO:tasks.workunit.client.0.vm06.stdout:4/149: mkdir db/d1d/d21/d37 0
2026-03-09T17:29:22.434 INFO:tasks.workunit.client.0.vm06.stdout:1/164: chown d11/l32 2181825 1
2026-03-09T17:29:22.439 INFO:tasks.workunit.client.0.vm06.stdout:3/118: rename dd/l16 to dd/d19/l1f 0
2026-03-09T17:29:22.440 INFO:tasks.workunit.client.0.vm06.stdout:7/137: read d5/f8 [7767590,121389] 0
2026-03-09T17:29:22.449 INFO:tasks.workunit.client.0.vm06.stdout:9/166: creat d3/d26/f33 x:0 0 0
2026-03-09T17:29:22.455 INFO:tasks.workunit.client.0.vm06.stdout:4/150: creat db/d1d/d21/d25/f38 x:0 0 0
2026-03-09T17:29:22.455 INFO:tasks.workunit.client.0.vm06.stdout:1/165: mknod d11/d14/d1d/c36 0
2026-03-09T17:29:22.455 INFO:tasks.workunit.client.0.vm06.stdout:8/125: rename d15/c18 to d15/d16/d1e/c2d 0
2026-03-09T17:29:22.455 INFO:tasks.workunit.client.0.vm06.stdout:8/126: dread - d15/d16/f21 zero size
2026-03-09T17:29:22.455 INFO:tasks.workunit.client.0.vm06.stdout:3/119: symlink dd/d19/l20 0
2026-03-09T17:29:22.458 INFO:tasks.workunit.client.0.vm06.stdout:3/120: dwrite dd/f14 [0,4194304] 0
2026-03-09T17:29:22.479 INFO:tasks.workunit.client.0.vm06.stdout:9/167: write d3/d11/f12 [4776832,121716] 0
2026-03-09T17:29:22.482 INFO:tasks.workunit.client.0.vm06.stdout:1/166: rename d11/d14/d1d/d1e/f27 to d11/d14/d1c/f37 0
2026-03-09T17:29:22.490 INFO:tasks.workunit.client.0.vm06.stdout:3/121: dread dd/f11 [0,4194304] 0
2026-03-09T17:29:22.491 INFO:tasks.workunit.client.0.vm06.stdout:7/138: symlink d5/l1e 0
2026-03-09T17:29:22.492 INFO:tasks.workunit.client.0.vm06.stdout:7/139: fdatasync d5/d7/f1b 0
2026-03-09T17:29:22.492 INFO:tasks.workunit.client.0.vm06.stdout:7/140: stat d5/d7/lb 0
2026-03-09T17:29:22.494 INFO:tasks.workunit.client.0.vm06.stdout:2/141: link d3/f10 d3/d4/d22/f2f 0
2026-03-09T17:29:22.497 INFO:tasks.workunit.client.0.vm06.stdout:2/142: dread d3/d4/d22/f2f [0,4194304] 0
2026-03-09T17:29:22.497 INFO:tasks.workunit.client.0.vm06.stdout:5/101: creat d4/f21 x:0 0 0
2026-03-09T17:29:22.501 INFO:tasks.workunit.client.0.vm06.stdout:9/168: mkdir d3/d15/d16/d34 0
2026-03-09T17:29:22.507 INFO:tasks.workunit.client.0.vm06.stdout:8/127: rename f11 to d15/f2e 0
2026-03-09T17:29:22.510 INFO:tasks.workunit.client.0.vm06.stdout:0/159: truncate d7/fb 2613448 0
2026-03-09T17:29:22.514 INFO:tasks.workunit.client.0.vm06.stdout:0/160: dwrite d7/f14 [0,4194304] 0
2026-03-09T17:29:22.529 INFO:tasks.workunit.client.0.vm06.stdout:2/143: mknod d3/d4/d12/c30 0
2026-03-09T17:29:22.531 INFO:tasks.workunit.client.0.vm06.stdout:6/113: mknod d6/c24 0
2026-03-09T17:29:22.532 INFO:tasks.workunit.client.0.vm06.stdout:9/169: mkdir d3/d26/d35 0
2026-03-09T17:29:22.533 INFO:tasks.workunit.client.0.vm06.stdout:4/151: creat db/f39 x:0 0 0
2026-03-09T17:29:22.536 INFO:tasks.workunit.client.0.vm06.stdout:0/161: mknod d7/d11/d2d/c38 0
2026-03-09T17:29:22.536 INFO:tasks.workunit.client.0.vm06.stdout:0/162: readlink d7/la 0
2026-03-09T17:29:22.537 INFO:tasks.workunit.client.0.vm06.stdout:0/163: chown d7/d11/f2c 209 1
2026-03-09T17:29:22.538 INFO:tasks.workunit.client.0.vm06.stdout:5/102: mkdir d4/d22 0
2026-03-09T17:29:22.539 INFO:tasks.workunit.client.0.vm06.stdout:2/144: creat d3/d4/d12/f31 x:0 0 0
2026-03-09T17:29:22.541 INFO:tasks.workunit.client.0.vm06.stdout:6/114: read d6/d12/f1c [88133,12102] 0
2026-03-09T17:29:22.542 INFO:tasks.workunit.client.0.vm06.stdout:6/115: dread - d6/d12/f22 zero size
2026-03-09T17:29:22.546 INFO:tasks.workunit.client.0.vm06.stdout:4/152: readlink db/l19 0
2026-03-09T17:29:22.547 INFO:tasks.workunit.client.0.vm06.stdout:4/153: truncate db/df/f36 408875 0
2026-03-09T17:29:22.547 INFO:tasks.workunit.client.0.vm06.stdout:4/154: stat db/d1d/d21 0
2026-03-09T17:29:22.555 INFO:tasks.workunit.client.0.vm06.stdout:2/145: creat d3/d4/d12/d2b/f32 x:0 0 0
2026-03-09T17:29:22.555 INFO:tasks.workunit.client.0.vm06.stdout:2/146: truncate d3/f21 567329 0
2026-03-09T17:29:22.563 INFO:tasks.workunit.client.0.vm06.stdout:3/122: write f7 [1495325,98950] 0
2026-03-09T17:29:22.576 INFO:tasks.workunit.client.0.vm06.stdout:1/167: dwrite d11/f18 [0,4194304] 0
2026-03-09T17:29:22.583 INFO:tasks.workunit.client.0.vm06.stdout:8/128: truncate f12 3227982 0
2026-03-09T17:29:22.584 INFO:tasks.workunit.client.0.vm06.stdout:7/141: truncate d5/dd/f1a 1948640 0
2026-03-09T17:29:22.584 INFO:tasks.workunit.client.0.vm06.stdout:6/116: creat d6/d12/d17/d21/f25 x:0 0 0
2026-03-09T17:29:22.588 INFO:tasks.workunit.client.0.vm06.stdout:9/170: dwrite d3/d15/f1a [0,4194304] 0
2026-03-09T17:29:22.591 INFO:tasks.workunit.client.0.vm06.stdout:4/155: write db/d1d/f22 [223095,115738] 0
2026-03-09T17:29:22.595 INFO:tasks.workunit.client.0.vm06.stdout:4/156: dwrite db/df/f2a [0,4194304] 0
2026-03-09T17:29:22.600 INFO:tasks.workunit.client.0.vm06.stdout:4/157: readlink db/d1d/d21/d25/l28 0
2026-03-09T17:29:22.619 INFO:tasks.workunit.client.0.vm06.stdout:0/164: read d7/fb [569573,76629] 0
2026-03-09T17:29:22.619 INFO:tasks.workunit.client.0.vm06.stdout:2/147: dwrite d3/f10 [0,4194304] 0
2026-03-09T17:29:22.621 INFO:tasks.workunit.client.0.vm06.stdout:4/158: dread db/df/f36 [0,4194304] 0
2026-03-09T17:29:22.622 INFO:tasks.workunit.client.0.vm06.stdout:2/148: truncate d3/d4/d12/d2b/d2d/f2a 991744 0
2026-03-09T17:29:22.627 INFO:tasks.workunit.client.0.vm06.stdout:5/103: mknod d4/d9/c23 0
2026-03-09T17:29:22.635 INFO:tasks.workunit.client.0.vm06.stdout:1/168: unlink d11/d14/d1d/c28 0
2026-03-09T17:29:22.635 INFO:tasks.workunit.client.0.vm06.stdout:1/169: read f7 [2702364,64706] 0
2026-03-09T17:29:22.638 INFO:tasks.workunit.client.0.vm06.stdout:1/170: dwrite d11/d14/f17 [0,4194304] 0
2026-03-09T17:29:22.639 INFO:tasks.workunit.client.0.vm06.stdout:1/171: fdatasync f7 0
2026-03-09T17:29:22.642 INFO:tasks.workunit.client.0.vm06.stdout:8/129: creat d15/d16/d1e/f2f x:0 0 0
2026-03-09T17:29:22.650 INFO:tasks.workunit.client.0.vm06.stdout:0/165: mkdir d7/d11/d19/d1d/d39 0
2026-03-09T17:29:22.652 INFO:tasks.workunit.client.0.vm06.stdout:4/159: rename db/d1d/d21/d26/f2b to db/d1d/f3a 0
2026-03-09T17:29:22.653 INFO:tasks.workunit.client.0.vm06.stdout:4/160: stat db/d1d/d21/d25/f38 0
2026-03-09T17:29:22.658 INFO:tasks.workunit.client.0.vm06.stdout:1/172: dwrite d11/d14/d1c/f37 [0,4194304] 0
2026-03-09T17:29:22.661 INFO:tasks.workunit.client.0.vm06.stdout:9/171: mkdir d3/d15/d36 0
2026-03-09T17:29:22.662 INFO:tasks.workunit.client.0.vm06.stdout:9/172: read d3/fb [202311,71469] 0
2026-03-09T17:29:22.668 INFO:tasks.workunit.client.0.vm06.stdout:9/173: dwrite d3/f21 [0,4194304] 0
2026-03-09T17:29:22.678 INFO:tasks.workunit.client.0.vm06.stdout:8/130: mkdir d15/d16/d1e/d30 0
2026-03-09T17:29:22.681 INFO:tasks.workunit.client.0.vm06.stdout:0/166: rename d7/f33 to d7/d11/d2d/f3a 0
2026-03-09T17:29:22.683 INFO:tasks.workunit.client.0.vm06.stdout:4/161: dread db/f13 [0,4194304] 0
2026-03-09T17:29:22.685 INFO:tasks.workunit.client.0.vm06.stdout:5/104: fdatasync d4/d9/fe 0
2026-03-09T17:29:22.685 INFO:tasks.workunit.client.0.vm06.stdout:5/105: readlink d4/d9/l13 0
2026-03-09T17:29:22.686 INFO:tasks.workunit.client.0.vm06.stdout:5/106: read - d4/f20 zero size
2026-03-09T17:29:22.686 INFO:tasks.workunit.client.0.vm06.stdout:5/107: dread - d4/d9/f14 zero size
2026-03-09T17:29:22.689 INFO:tasks.workunit.client.0.vm06.stdout:1/173: creat d11/d14/d1d/d1e/d2a/f38 x:0 0 0
2026-03-09T17:29:22.689 INFO:tasks.workunit.client.0.vm06.stdout:1/174: readlink d11/l32 0
2026-03-09T17:29:22.689 INFO:tasks.workunit.client.0.vm06.stdout:1/175: readlink d11/d14/d1d/l33 0
2026-03-09T17:29:22.707 INFO:tasks.workunit.client.0.vm06.stdout:2/149: truncate d3/d4/d12/d2b/d2d/f17 4162935 0
2026-03-09T17:29:22.711 INFO:tasks.workunit.client.0.vm06.stdout:3/123: dwrite dd/f11 [0,4194304] 0
2026-03-09T17:29:22.718 INFO:tasks.workunit.client.0.vm06.stdout:7/142: dwrite d5/dd/ff [4194304,4194304] 0
2026-03-09T17:29:22.723 INFO:tasks.workunit.client.0.vm06.stdout:6/117: write d6/fb [3375372,43309] 0
2026-03-09T17:29:22.737 INFO:tasks.workunit.client.0.vm06.stdout:1/176: truncate f7 1167745 0
2026-03-09T17:29:22.751 INFO:tasks.workunit.client.0.vm06.stdout:2/150: rmdir d3/d4/d22 39
2026-03-09T17:29:22.753 INFO:tasks.workunit.client.0.vm06.stdout:1/177: rmdir d11/d14/d1d/d1e/d2a 39
2026-03-09T17:29:22.756 INFO:tasks.workunit.client.0.vm06.stdout:1/178: dwrite d11/d14/f17 [0,4194304] 0
2026-03-09T17:29:22.767 INFO:tasks.workunit.client.0.vm06.stdout:9/174: rename d3/d15/d16/d34 to d3/d15/d37 0
2026-03-09T17:29:22.771 INFO:tasks.workunit.client.0.vm06.stdout:0/167: rmdir d7/d28 0
2026-03-09T17:29:22.771 INFO:tasks.workunit.client.0.vm06.stdout:0/168: write d7/f8 [7384918,47290] 0
2026-03-09T17:29:22.771 INFO:tasks.workunit.client.0.vm06.stdout:0/169: fsync d7/d11/f29 0
2026-03-09T17:29:22.771 INFO:tasks.workunit.client.0.vm06.stdout:0/170: write d7/d11/d19/f24 [717173,15121] 0
2026-03-09T17:29:22.771 INFO:tasks.workunit.client.0.vm06.stdout:0/171: dread - d7/d11/d2d/f2f zero size
2026-03-09T17:29:22.774 INFO:tasks.workunit.client.0.vm06.stdout:2/151: unlink d3/d4/d12/d2b/d2d/f2c 0
2026-03-09T17:29:22.775 INFO:tasks.workunit.client.0.vm06.stdout:3/124: creat dd/d1d/f21 x:0 0 0
2026-03-09T17:29:22.777 INFO:tasks.workunit.client.0.vm06.stdout:6/118: unlink d6/lc 0
2026-03-09T17:29:22.777 INFO:tasks.workunit.client.0.vm06.stdout:6/119: dread - d6/d12/f19 zero size
2026-03-09T17:29:22.780 INFO:tasks.workunit.client.0.vm06.stdout:8/131: getdents d15/d16/d1e/d28 0
2026-03-09T17:29:22.782 INFO:tasks.workunit.client.0.vm06.stdout:7/143: sync
2026-03-09T17:29:22.787 INFO:tasks.workunit.client.0.vm06.stdout:8/132: dwrite d15/d16/d1e/f2f [0,4194304] 0
2026-03-09T17:29:22.789 INFO:tasks.workunit.client.0.vm06.stdout:7/144: dwrite d5/f16 [0,4194304] 0
2026-03-09T17:29:22.790 INFO:tasks.workunit.client.0.vm06.stdout:8/133: stat d15/d16/d1e/d28 0
2026-03-09T17:29:22.790 INFO:tasks.workunit.client.0.vm06.stdout:7/145: write d5/f8 [8530090,5382] 0
2026-03-09T17:29:22.791 INFO:tasks.workunit.client.0.vm06.stdout:8/134: read - d15/d16/d1a/f22 zero size
2026-03-09T17:29:22.792 INFO:tasks.workunit.client.0.vm06.stdout:8/135: fdatasync fe 0
2026-03-09T17:29:22.792 INFO:tasks.workunit.client.0.vm06.stdout:8/136: fdatasync fa 0
2026-03-09T17:29:22.793 INFO:tasks.workunit.client.0.vm06.stdout:6/120: fsync d6/fb 0
2026-03-09T17:29:22.806 INFO:tasks.workunit.client.0.vm06.stdout:4/162: rename db/d1d/d21/d25/c27 to db/d1d/d21/c3b 0
2026-03-09T17:29:22.806 INFO:tasks.workunit.client.0.vm06.stdout:9/175: mknod d3/d11/c38 0
2026-03-09T17:29:22.806 INFO:tasks.workunit.client.0.vm06.stdout:2/152: mkdir d3/d33 0
2026-03-09T17:29:22.807 INFO:tasks.workunit.client.0.vm06.stdout:2/153: chown d3/f21 77 1
2026-03-09T17:29:22.811 INFO:tasks.workunit.client.0.vm06.stdout:3/125: rename dd/f11 to dd/f22 0
2026-03-09T17:29:22.815 INFO:tasks.workunit.client.0.vm06.stdout:3/126: dwrite f7 [0,4194304] 0
2026-03-09T17:29:22.815 INFO:tasks.workunit.client.0.vm06.stdout:3/127: chown dd/f22 29 1
2026-03-09T17:29:22.828 INFO:tasks.workunit.client.0.vm06.stdout:5/108: getdents d4 0
2026-03-09T17:29:22.839 INFO:tasks.workunit.client.0.vm06.stdout:6/121: dread d6/fe [0,4194304] 0
2026-03-09T17:29:22.841 INFO:tasks.workunit.client.0.vm06.stdout:4/163: mkdir db/d1d/d21/d26/d3c 0
2026-03-09T17:29:22.842 INFO:tasks.workunit.client.0.vm06.stdout:4/164: chown db/df/f2d 6025 1
2026-03-09T17:29:22.845 INFO:tasks.workunit.client.0.vm06.stdout:4/165: dwrite db/d1d/d21/d25/f38 [0,4194304] 0
2026-03-09T17:29:22.851 INFO:tasks.workunit.client.0.vm06.stdout:4/166: dwrite db/d1d/d21/f2f [0,4194304] 0
2026-03-09T17:29:22.871 INFO:tasks.workunit.client.0.vm06.stdout:7/146: mkdir d5/d1f 0
2026-03-09T17:29:22.875 INFO:tasks.workunit.client.0.vm06.stdout:6/122: chown d6/c15 1177547 1
2026-03-09T17:29:22.879 INFO:tasks.workunit.client.0.vm06.stdout:1/179: rename d11/l1a to d11/d14/d1c/d1f/d25/l39 0
2026-03-09T17:29:22.884 INFO:tasks.workunit.client.0.vm06.stdout:0/172: link d7/f2a d7/d11/d2d/f3b 0
2026-03-09T17:29:22.885 INFO:tasks.workunit.client.0.vm06.stdout:3/128: creat dd/d19/d1e/f23 x:0 0 0
2026-03-09T17:29:22.885 INFO:tasks.workunit.client.0.vm06.stdout:3/129: write dd/d1d/f21 [4906,67124] 0
2026-03-09T17:29:22.886 INFO:tasks.workunit.client.0.vm06.stdout:3/130: truncate dd/f22 4566470 0
2026-03-09T17:29:22.886 INFO:tasks.workunit.client.0.vm06.stdout:3/131: truncate f7 4336429 0
2026-03-09T17:29:22.892 INFO:tasks.workunit.client.0.vm06.stdout:6/123: creat d6/d12/d17/d21/f26 x:0 0 0
2026-03-09T17:29:22.895 INFO:tasks.workunit.client.0.vm06.stdout:2/154: rename d3/d33 to d3/d4/d12/d34 0
2026-03-09T17:29:22.899 INFO:tasks.workunit.client.0.vm06.stdout:2/155: dwrite d3/d4/d12/f1e [0,4194304] 0
2026-03-09T17:29:22.901 INFO:tasks.workunit.client.0.vm06.stdout:2/156: fdatasync d3/d4/d12/d2b/f32 0
2026-03-09T17:29:22.903 INFO:tasks.workunit.client.0.vm06.stdout:1/180: mkdir d11/d14/d1c/d3a 0
2026-03-09T17:29:22.903 INFO:tasks.workunit.client.0.vm06.stdout:4/167: symlink db/l3d 0
2026-03-09T17:29:22.904 INFO:tasks.workunit.client.0.vm06.stdout:0/173: mkdir d7/d11/d19/d3c 0
2026-03-09T17:29:22.904 INFO:tasks.workunit.client.0.vm06.stdout:4/168: read f6 [93928,102459] 0
2026-03-09T17:29:22.905 INFO:tasks.workunit.client.0.vm06.stdout:5/109: creat d4/d9/f24 x:0 0 0
2026-03-09T17:29:22.909 INFO:tasks.workunit.client.0.vm06.stdout:7/147: symlink d5/l20 0
2026-03-09T17:29:22.909 INFO:tasks.workunit.client.0.vm06.stdout:7/148: write f0 [804486,57896] 0
2026-03-09T17:29:22.913 INFO:tasks.workunit.client.0.vm06.stdout:7/149: dwrite d5/d7/f1d [0,4194304] 0
2026-03-09T17:29:22.915 INFO:tasks.workunit.client.0.vm06.stdout:7/150: write f2 [6168,121363] 0
2026-03-09T17:29:22.928 INFO:tasks.workunit.client.0.vm06.stdout:6/124: mkdir d6/d12/d17/d27 0
2026-03-09T17:29:22.929 INFO:tasks.workunit.client.0.vm06.stdout:6/125: write d6/d12/d17/d21/f26 [718180,51942] 0
2026-03-09T17:29:22.930 INFO:tasks.workunit.client.0.vm06.stdout:8/137: truncate f13 1175984 0
2026-03-09T17:29:22.931 INFO:tasks.workunit.client.0.vm06.stdout:8/138: readlink d15/d16/d1a/l20 0
2026-03-09T17:29:22.933 INFO:tasks.workunit.client.0.vm06.stdout:2/157: creat d3/d4/d12/f35 x:0 0 0
2026-03-09T17:29:22.937 INFO:tasks.workunit.client.0.vm06.stdout:9/176: getdents d3/d15 0
2026-03-09T17:29:22.947 INFO:tasks.workunit.client.0.vm06.stdout:5/110: mknod d4/c25 0
2026-03-09T17:29:22.949 INFO:tasks.workunit.client.0.vm06.stdout:4/169: mknod db/df/c3e 0
2026-03-09T17:29:22.955 INFO:tasks.workunit.client.0.vm06.stdout:7/151: rename d5/c6 to d5/dd/c21 0
2026-03-09T17:29:22.959 INFO:tasks.workunit.client.0.vm06.stdout:7/152: dwrite d5/d7/f1b [0,4194304] 0
2026-03-09T17:29:22.960 INFO:tasks.workunit.client.0.vm06.stdout:7/153: write d5/f16 [228554,42432] 0
2026-03-09T17:29:22.966 INFO:tasks.workunit.client.0.vm06.stdout:7/154: dwrite d5/f8 [4194304,4194304] 0
2026-03-09T17:29:22.968 INFO:tasks.workunit.client.0.vm06.stdout:3/132: truncate dd/f1a 3414771 0
2026-03-09T17:29:22.970 INFO:tasks.workunit.client.0.vm06.stdout:0/174: dwrite d7/f12 [0,4194304] 0
2026-03-09T17:29:22.980 INFO:tasks.workunit.client.0.vm06.stdout:0/175: stat d7/f36 0
2026-03-09T17:29:22.981 INFO:tasks.workunit.client.0.vm06.stdout:0/176: write d7/d11/f30 [544128,61192] 0
2026-03-09T17:29:22.981 INFO:tasks.workunit.client.0.vm06.stdout:0/177: dread d7/d11/f1c [0,4194304] 0
2026-03-09T17:29:22.981 INFO:tasks.workunit.client.0.vm06.stdout:0/178: stat d7/la 0
2026-03-09T17:29:22.981 INFO:tasks.workunit.client.0.vm06.stdout:8/139: chown d15/d16/d19/c25 56133 1
2026-03-09T17:29:22.988 INFO:tasks.workunit.client.0.vm06.stdout:9/177: creat d3/d15/d37/f39 x:0 0 0
2026-03-09T17:29:22.989 INFO:tasks.workunit.client.0.vm06.stdout:9/178: read - d3/f27 zero size
2026-03-09T17:29:22.989 INFO:tasks.workunit.client.0.vm06.stdout:9/179: chown d3/d15/f1a 130447 1
2026-03-09T17:29:22.993 INFO:tasks.workunit.client.0.vm06.stdout:4/170: symlink db/df/l3f 0
2026-03-09T17:29:22.994 INFO:tasks.workunit.client.0.vm06.stdout:4/171: truncate db/f39 807067 0
2026-03-09T17:29:22.999 INFO:tasks.workunit.client.0.vm06.stdout:7/155: creat d5/dd/f22 x:0 0 0
2026-03-09T17:29:23.003 INFO:tasks.workunit.client.0.vm06.stdout:3/133: dwrite dd/f22 [0,4194304] 0
2026-03-09T17:29:23.008 INFO:tasks.workunit.client.0.vm06.stdout:6/126: unlink d6/l10 0
2026-03-09T17:29:23.015 INFO:tasks.workunit.client.0.vm06.stdout:6/127: stat d6/d12/d17/d27 0
2026-03-09T17:29:23.016 INFO:tasks.workunit.client.0.vm06.stdout:8/140: rmdir d15/d16/d1e 39
2026-03-09T17:29:23.017 INFO:tasks.workunit.client.0.vm06.stdout:8/141: chown d15/d16/d19 227 1
2026-03-09T17:29:23.019 INFO:tasks.workunit.client.0.vm06.stdout:2/158: truncate d3/d4/d12/d2b/d2d/f17 2411370 0
2026-03-09T17:29:23.024 INFO:tasks.workunit.client.0.vm06.stdout:9/180: unlink d3/d11/f12 0
2026-03-09T17:29:23.025 INFO:tasks.workunit.client.0.vm06.stdout:1/181: creat d11/d14/d1d/d1e/d2a/d34/f3b x:0 0 0
2026-03-09T17:29:23.030 INFO:tasks.workunit.client.0.vm06.stdout:1/182: dwrite d11/d14/d1c/f37 [0,4194304] 0
2026-03-09T17:29:23.034 INFO:tasks.workunit.client.0.vm06.stdout:3/134: truncate fc 1823918 0
2026-03-09T17:29:23.040 INFO:tasks.workunit.client.0.vm06.stdout:1/183: dwrite d11/d14/f17 [4194304,4194304] 0
2026-03-09T17:29:23.043 INFO:tasks.workunit.client.0.vm06.stdout:1/184: dread d11/f18 [0,4194304] 0
2026-03-09T17:29:23.057 INFO:tasks.workunit.client.0.vm06.stdout:0/179: symlink d7/d11/d19/d37/l3d 0
2026-03-09T17:29:23.059 INFO:tasks.workunit.client.0.vm06.stdout:8/142: fsync d15/f2e 0
2026-03-09T17:29:23.062 INFO:tasks.workunit.client.0.vm06.stdout:2/159: mkdir d3/d4/d12/d2b/d36 0
2026-03-09T17:29:23.068 INFO:tasks.workunit.client.0.vm06.stdout:5/111: dread d4/d9/fe [0,4194304] 0
2026-03-09T17:29:23.075 INFO:tasks.workunit.client.0.vm06.stdout:9/181: dread d3/fb [0,4194304] 0
2026-03-09T17:29:23.093 INFO:tasks.workunit.client.0.vm06.stdout:7/156: symlink d5/l23 0
2026-03-09T17:29:23.110 INFO:tasks.workunit.client.0.vm06.stdout:8/143: mkdir d15/d31 0
2026-03-09T17:29:23.110 INFO:tasks.workunit.client.0.vm06.stdout:8/144: fsync d15/f2e 0
2026-03-09T17:29:23.116 INFO:tasks.workunit.client.0.vm06.stdout:2/160: dread - d3/d4/d22/f28 zero size
2026-03-09T17:29:23.116 INFO:tasks.workunit.client.0.vm06.stdout:2/161: write d3/d4/d22/f28 [394043,47211] 0
2026-03-09T17:29:23.123 INFO:tasks.workunit.client.0.vm06.stdout:8/145: dread fa [0,4194304] 0
2026-03-09T17:29:23.128 INFO:tasks.workunit.client.0.vm06.stdout:4/172: link c4 db/d1d/d21/d25/c40 0
2026-03-09T17:29:23.130 INFO:tasks.workunit.client.0.vm06.stdout:7/157: mknod d5/d7/c24 0
2026-03-09T17:29:23.132 INFO:tasks.workunit.client.0.vm06.stdout:3/135: rename dd/d19/l1f to dd/l24 0
2026-03-09T17:29:23.136 INFO:tasks.workunit.client.0.vm06.stdout:1/185: symlink d11/d14/d1d/d1e/l3c 0
2026-03-09T17:29:23.137 INFO:tasks.workunit.client.0.vm06.stdout:1/186: write d11/d14/d1c/d1f/f2d [420063,94223] 0
2026-03-09T17:29:23.140 INFO:tasks.workunit.client.0.vm06.stdout:9/182: write d3/fb [1551804,8655] 0
2026-03-09T17:29:23.140 INFO:tasks.workunit.client.0.vm06.stdout:9/183: dread - d3/d15/f17 zero size
2026-03-09T17:29:23.141 INFO:tasks.workunit.client.0.vm06.stdout:9/184: write d3/d26/f29 [50718,25263] 0
2026-03-09T17:29:23.142 INFO:tasks.workunit.client.0.vm06.stdout:9/185: truncate d3/d15/d16/f31 1004686 0
2026-03-09T17:29:23.158 INFO:tasks.workunit.client.0.vm06.stdout:4/173: dread db/df/f36 [0,4194304] 0
2026-03-09T17:29:23.160 INFO:tasks.workunit.client.0.vm06.stdout:7/158: symlink d5/d7/l25 0
2026-03-09T17:29:23.165 INFO:tasks.workunit.client.0.vm06.stdout:7/159: dwrite d5/f16 [0,4194304] 0
2026-03-09T17:29:23.165 INFO:tasks.workunit.client.0.vm06.stdout:0/180: rename d7/d11/d19/c34 to d7/c3e 0
2026-03-09T17:29:23.168 INFO:tasks.workunit.client.0.vm06.stdout:3/136: mkdir dd/d19/d25 0
2026-03-09T17:29:23.170 INFO:tasks.workunit.client.0.vm06.stdout:1/187: mknod d11/d14/d1d/d1e/d2a/c3d 0
2026-03-09T17:29:23.174 INFO:tasks.workunit.client.0.vm06.stdout:8/146: write d15/d16/d1a/f1b [846549,122216] 0
2026-03-09T17:29:23.175 INFO:tasks.workunit.client.0.vm06.stdout:2/162: dwrite d3/d4/d12/f20 [4194304,4194304] 0
2026-03-09T17:29:23.183 INFO:tasks.workunit.client.0.vm06.stdout:6/128: link d6/l1e d6/d12/d17/l28 0
2026-03-09T17:29:23.183 INFO:tasks.workunit.client.0.vm06.stdout:6/129: chown d6/d12/c1f 344 1
2026-03-09T17:29:23.185 INFO:tasks.workunit.client.0.vm06.stdout:5/112: creat d4/f26 x:0 0 0
2026-03-09T17:29:23.196 INFO:tasks.workunit.client.0.vm06.stdout:0/181: rename d7/f25 to d7/d11/d2d/f3f 0
2026-03-09T17:29:23.198 INFO:tasks.workunit.client.0.vm06.stdout:3/137: dread fc [0,4194304] 0
2026-03-09T17:29:23.202 INFO:tasks.workunit.client.0.vm06.stdout:3/138: dwrite dd/d19/d1e/f23 [0,4194304] 0
2026-03-09T17:29:23.208 INFO:tasks.workunit.client.0.vm06.stdout:1/188: symlink d11/d14/d1d/d1e/d2a/d34/l3e 0
2026-03-09T17:29:23.225 INFO:tasks.workunit.client.0.vm06.stdout:9/186: link d3/f1b d3/d15/d37/f3a 0
2026-03-09T17:29:23.244 INFO:tasks.workunit.client.0.vm06.stdout:7/160: symlink d5/l26 0
2026-03-09T17:29:23.248 INFO:tasks.workunit.client.0.vm06.stdout:7/161: dwrite f0 [0,4194304] 0
2026-03-09T17:29:23.249 INFO:tasks.workunit.client.0.vm06.stdout:7/162: readlink d5/l20 0
2026-03-09T17:29:23.254 INFO:tasks.workunit.client.0.vm06.stdout:7/163: dwrite d5/f18 [0,4194304] 0
2026-03-09T17:29:23.287 INFO:tasks.workunit.client.0.vm06.stdout:8/147: symlink d15/d16/d1e/d28/l32 0
2026-03-09T17:29:23.291 INFO:tasks.workunit.client.0.vm06.stdout:2/163: mkdir d3/d4/d12/d2b/d36/d37 0
2026-03-09T17:29:23.291 INFO:tasks.workunit.client.0.vm06.stdout:2/164: write d3/f21 [1319314,83325] 0
2026-03-09T17:29:23.294 INFO:tasks.workunit.client.0.vm06.stdout:9/187: symlink d3/d15/d16/l3b 0
2026-03-09T17:29:23.294 INFO:tasks.workunit.client.0.vm06.stdout:4/174: truncate db/f23 2108625 0
2026-03-09T17:29:23.298 INFO:tasks.workunit.client.0.vm06.stdout:6/130: link d6/fb d6/d12/d17/f29 0
2026-03-09T17:29:23.303 INFO:tasks.workunit.client.0.vm06.stdout:5/113: truncate d4/f5 953108 0
2026-03-09T17:29:23.306 INFO:tasks.workunit.client.0.vm06.stdout:7/164: mknod d5/d12/c27 0
2026-03-09T17:29:23.309 INFO:tasks.workunit.client.0.vm06.stdout:7/165: dread d5/dd/f19 [0,4194304] 0
2026-03-09T17:29:23.311 INFO:tasks.workunit.client.0.vm06.stdout:0/182: fsync d7/f31 0
2026-03-09T17:29:23.311 INFO:tasks.workunit.client.0.vm06.stdout:0/183: stat l4 0
2026-03-09T17:29:23.315 INFO:tasks.workunit.client.0.vm06.stdout:0/184: dwrite d7/d11/f20 [0,4194304] 0
2026-03-09T17:29:23.324 INFO:tasks.workunit.client.0.vm06.stdout:1/189: rename cc to d11/d14/d1d/c3f 0
2026-03-09T17:29:23.327 INFO:tasks.workunit.client.0.vm06.stdout:6/131: symlink d6/d12/d17/l2a 0
2026-03-09T17:29:23.328 INFO:tasks.workunit.client.0.vm06.stdout:5/114: truncate d4/fd 1532041 0
2026-03-09T17:29:23.331 INFO:tasks.workunit.client.0.vm06.stdout:7/166: unlink f2 0
2026-03-09T17:29:23.335 INFO:tasks.workunit.client.0.vm06.stdout:8/148: creat d15/d31/f33 x:0 0 0
2026-03-09T17:29:23.338 INFO:tasks.workunit.client.0.vm06.stdout:2/165: mkdir d3/d4/d38 0
2026-03-09T17:29:23.338 INFO:tasks.workunit.client.0.vm06.stdout:2/166: dread - d3/d4/d12/f2e zero size
2026-03-09T17:29:23.338 INFO:tasks.workunit.client.0.vm06.stdout:2/167: chown d3/l5 444055183 1
2026-03-09T17:29:23.342 INFO:tasks.workunit.client.0.vm06.stdout:5/115: dwrite d4/fb [4194304,4194304] 0
2026-03-09T17:29:23.343 INFO:tasks.workunit.client.0.vm06.stdout:5/116: truncate d4/d9/f14 755975 0
2026-03-09T17:29:23.343 INFO:tasks.workunit.client.0.vm06.stdout:5/117: stat d4/d9/f24 0
2026-03-09T17:29:23.349 INFO:tasks.workunit.client.0.vm06.stdout:9/188: rename d3/d2c/c2d to d3/d15/d36/c3c 0
2026-03-09T17:29:23.352 INFO:tasks.workunit.client.0.vm06.stdout:9/189: dread d3/d11/f1f [0,4194304] 0
2026-03-09T17:29:23.365 INFO:tasks.workunit.client.0.vm06.stdout:8/149: creat d15/d16/d1e/f34 x:0 0 0
2026-03-09T17:29:23.367 INFO:tasks.workunit.client.0.vm06.stdout:3/139: truncate f4 6427672 0
2026-03-09T17:29:23.382 INFO:tasks.workunit.client.0.vm06.stdout:0/185: write d7/fe [3981311,97395] 0
2026-03-09T17:29:23.387 INFO:tasks.workunit.client.0.vm06.stdout:4/175: rename db/df/l2e to db/d1d/d21/d25/l41 0
2026-03-09T17:29:23.401 INFO:tasks.workunit.client.0.vm06.stdout:4/176: dread db/d1d/f1f [0,4194304] 0
2026-03-09T17:29:23.404 INFO:tasks.workunit.client.0.vm06.stdout:3/140: creat dd/f26 x:0 0 0
2026-03-09T17:29:23.404 INFO:tasks.workunit.client.0.vm06.stdout:3/141: fsync dd/f10 0
2026-03-09T17:29:23.406 INFO:tasks.workunit.client.0.vm06.stdout:2/168: symlink d3/d4/l39 0
2026-03-09T17:29:23.407 INFO:tasks.workunit.client.0.vm06.stdout:6/132: link d6/d12/d17/l28 d6/d12/d17/l2b 0
2026-03-09T17:29:23.408 INFO:tasks.workunit.client.0.vm06.stdout:6/133: dread - d6/d12/f19 zero size
2026-03-09T17:29:23.408 INFO:tasks.workunit.client.0.vm06.stdout:6/134: write d6/d12/d17/f1b [3539370,45914] 0
2026-03-09T17:29:23.414 INFO:tasks.workunit.client.0.vm06.stdout:0/186: creat d7/d11/d19/d1d/f40 x:0 0 0
2026-03-09T17:29:23.414 INFO:tasks.workunit.client.0.vm06.stdout:0/187: dread - d7/f27 zero size
2026-03-09T17:29:23.416 INFO:tasks.workunit.client.0.vm06.stdout:7/167: link d5/d7/f1b d5/f28 0
2026-03-09T17:29:23.418 INFO:tasks.workunit.client.0.vm06.stdout:7/168: dwrite d5/f10 [0,4194304] 0
2026-03-09T17:29:23.424 INFO:tasks.workunit.client.0.vm06.stdout:1/190: rename f1 to d11/d14/d1d/d1e/d2a/f40 0
2026-03-09T17:29:23.428 INFO:tasks.workunit.client.0.vm06.stdout:1/191: dwrite f10 [0,4194304] 0
2026-03-09T17:29:23.429 INFO:tasks.workunit.client.0.vm06.stdout:1/192: chown d11/d14/d1c 2 1
2026-03-09T17:29:23.430 INFO:tasks.workunit.client.0.vm06.stdout:1/193: write d11/d14/f17 [1358737,62398] 0
2026-03-09T17:29:23.440 INFO:tasks.workunit.client.0.vm06.stdout:5/118: write d4/d9/fe [1115658,22394] 0
2026-03-09T17:29:23.446 INFO:tasks.workunit.client.0.vm06.stdout:5/119: fsync d4/f26 0
2026-03-09T17:29:23.446 INFO:tasks.workunit.client.0.vm06.stdout:5/120: dwrite d4/d9/fe [0,4194304] 0
2026-03-09T17:29:23.451 INFO:tasks.workunit.client.0.vm06.stdout:9/190: creat d3/d26/d35/f3d x:0 0 0
2026-03-09T17:29:23.453 INFO:tasks.workunit.client.0.vm06.stdout:3/142: mknod dd/d1d/c27 0
2026-03-09T17:29:23.460 INFO:tasks.workunit.client.0.vm06.stdout:6/135: rmdir d6 39
2026-03-09T17:29:23.466 INFO:tasks.workunit.client.0.vm06.stdout:7/169: rmdir d5 39
2026-03-09T17:29:23.474 INFO:tasks.workunit.client.0.vm06.stdout:3/143: mkdir dd/d19/d28 0
2026-03-09T17:29:23.476 INFO:tasks.workunit.client.0.vm06.stdout:2/169: creat d3/d4/d12/d2b/d36/d37/f3a x:0 0 0
2026-03-09T17:29:23.481 INFO:tasks.workunit.client.0.vm06.stdout:7/170: write d5/f8 [418971,77523] 0
2026-03-09T17:29:23.483 INFO:tasks.workunit.client.0.vm06.stdout:6/136: dwrite d6/d12/d17/d21/f25 [0,4194304] 0
2026-03-09T17:29:23.483 INFO:tasks.workunit.client.0.vm06.stdout:6/137: chown d6 41 1
2026-03-09T17:29:23.486 INFO:tasks.workunit.client.0.vm06.stdout:7/171: dread d5/f18 [0,4194304] 0
2026-03-09T17:29:23.487 INFO:tasks.workunit.client.0.vm06.stdout:7/172: write d5/f10 [813321,53615] 0
2026-03-09T17:29:23.491 INFO:tasks.workunit.client.0.vm06.stdout:7/173: dread d5/f10 [0,4194304] 0
2026-03-09T17:29:23.493 INFO:tasks.workunit.client.0.vm06.stdout:8/150: link d15/d16/d1e/c2d d15/d16/d1e/d30/c35 0
2026-03-09T17:29:23.495 INFO:tasks.workunit.client.0.vm06.stdout:9/191: mknod d3/c3e 0
2026-03-09T17:29:23.522 INFO:tasks.workunit.client.0.vm06.stdout:7/174: creat d5/dd/f29 x:0 0 0
2026-03-09T17:29:23.524 INFO:tasks.workunit.client.0.vm06.stdout:1/194: creat d11/d14/d1d/d1e/f41 x:0 0 0
2026-03-09T17:29:23.531 INFO:tasks.workunit.client.0.vm06.stdout:9/192: rmdir d3/d15/d16 39
2026-03-09T17:29:23.537 INFO:tasks.workunit.client.0.vm06.stdout:8/151: dread f7 [0,4194304] 0
2026-03-09T17:29:23.539 INFO:tasks.workunit.client.0.vm06.stdout:2/170: rename d3/d4/d12/d2b/d2d/f17 to d3/f3b 0
2026-03-09T17:29:23.539 INFO:tasks.workunit.client.0.vm06.stdout:2/171: fdatasync d3/d4/d12/f2e 0
2026-03-09T17:29:23.542 INFO:tasks.workunit.client.0.vm06.stdout:2/172: dread d3/d4/d12/f20 [4194304,4194304] 0
2026-03-09T17:29:23.542 INFO:tasks.workunit.client.0.vm06.stdout:0/188: write d7/f2a [770317,35562] 0
2026-03-09T17:29:23.554 INFO:tasks.workunit.client.0.vm06.stdout:1/195: mkdir d11/d14/d1d/d42 0
2026-03-09T17:29:23.558 INFO:tasks.workunit.client.0.vm06.stdout:8/152: mknod d15/c36 0
2026-03-09T17:29:23.562 INFO:tasks.workunit.client.0.vm06.stdout:8/153: dwrite d15/f2e [0,4194304] 0
2026-03-09T17:29:23.563 INFO:tasks.workunit.client.0.vm06.stdout:8/154: dread - d15/d16/d1e/f34 zero size
2026-03-09T17:29:23.565 INFO:tasks.workunit.client.0.vm06.stdout:8/155: write d15/d16/d1e/f2f [5132540,54430] 0
2026-03-09T17:29:23.568 INFO:tasks.workunit.client.0.vm06.stdout:5/121: rename d4/d9/d18/l1a to d4/d22/l27 0
2026-03-09T17:29:23.573 INFO:tasks.workunit.client.0.vm06.stdout:0/189: mknod d7/d11/d2d/c41 0
2026-03-09T17:29:23.574 INFO:tasks.workunit.client.0.vm06.stdout:0/190: dread - d7/d11/d2d/f2f zero size
2026-03-09T17:29:23.574 INFO:tasks.workunit.client.0.vm06.stdout:0/191: fsync d7/d11/d19/d1d/f1f 0
2026-03-09T17:29:23.577 INFO:tasks.workunit.client.0.vm06.stdout:3/144: write f0 [2115839,41417] 0
2026-03-09T17:29:23.582 INFO:tasks.workunit.client.0.vm06.stdout:1/196: unlink d11/d14/d1c/d1f/f2d 0
2026-03-09T17:29:23.592 INFO:tasks.workunit.client.0.vm06.stdout:5/122: chown d4/f15 237475337 1
2026-03-09T17:29:23.595 INFO:tasks.workunit.client.0.vm06.stdout:0/192: mknod d7/d11/c42 0
2026-03-09T17:29:23.596 INFO:tasks.workunit.client.0.vm06.stdout:0/193: chown d7/d11/d19/l1a 58 1
2026-03-09T17:29:23.598 INFO:tasks.workunit.client.0.vm06.stdout:6/138: getdents d6/d12/d17/d21 0
2026-03-09T17:29:23.598 INFO:tasks.workunit.client.0.vm06.stdout:6/139: fsync d6/d12/d17/f1b 0
2026-03-09T17:29:23.601 INFO:tasks.workunit.client.0.vm06.stdout:4/177: dwrite db/f23 [0,4194304] 0
2026-03-09T17:29:23.606 INFO:tasks.workunit.client.0.vm06.stdout:3/145: dwrite dd/f15 [0,4194304] 0
2026-03-09T17:29:23.608 INFO:tasks.workunit.client.0.vm06.stdout:3/146: chown f0 128172 1
2026-03-09T17:29:23.615 INFO:tasks.workunit.client.0.vm06.stdout:1/197: creat d11/d14/d1d/d1e/d2a/f43 x:0 0 0
2026-03-09T17:29:23.617 INFO:tasks.workunit.client.0.vm06.stdout:8/156: symlink d15/d16/l37 0
2026-03-09T17:29:23.618 INFO:tasks.workunit.client.0.vm06.stdout:8/157: write f0 [3784666,106620] 0
2026-03-09T17:29:23.633 INFO:tasks.workunit.client.0.vm06.stdout:8/158: write d15/d31/f33 [498625,68577] 0
2026-03-09T17:29:23.633 INFO:tasks.workunit.client.0.vm06.stdout:9/193: write d3/f1b [3008421,1239] 0
2026-03-09T17:29:23.633 INFO:tasks.workunit.client.0.vm06.stdout:5/123: rename d4/f15 to d4/d9/d18/f28 0
2026-03-09T17:29:23.633 INFO:tasks.workunit.client.0.vm06.stdout:2/173: creat d3/d4/f3c x:0 0 0
2026-03-09T17:29:23.633 INFO:tasks.workunit.client.0.vm06.stdout:2/174: truncate d3/d4/d12/d2b/f32 654418 0
2026-03-09T17:29:23.634 INFO:tasks.workunit.client.0.vm06.stdout:2/175: chown d3/d4/f3c 0 1
2026-03-09T17:29:23.634 INFO:tasks.workunit.client.0.vm06.stdout:2/176: dread - d3/d4/d12/f15 zero size
2026-03-09T17:29:23.634 INFO:tasks.workunit.client.0.vm06.stdout:2/177: fsync d3/d4/fe 0
2026-03-09T17:29:23.638 INFO:tasks.workunit.client.0.vm06.stdout:4/178: creat db/d1d/d21/f42 x:0 0 0
2026-03-09T17:29:23.639 INFO:tasks.workunit.client.0.vm06.stdout:4/179: dread - db/d1d/d21/d25/f35 zero size
2026-03-09T17:29:23.640 INFO:tasks.workunit.client.0.vm06.stdout:4/180: dread db/df/f2d [0,4194304] 0
2026-03-09T17:29:23.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:23 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:29:23.642 INFO:tasks.workunit.client.0.vm06.stdout:7/175: write d5/dd/f1a [2694360,29293] 0
2026-03-09T17:29:23.642 INFO:tasks.workunit.client.0.vm06.stdout:7/176: fsync d5/dd/f29 0
2026-03-09T17:29:23.643 INFO:tasks.workunit.client.0.vm06.stdout:7/177: readlink d5/l26 0
2026-03-09T17:29:23.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:23 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:29:23.645 INFO:tasks.workunit.client.0.vm06.stdout:7/178: dwrite d5/dd/f29 [0,4194304] 0
2026-03-09T17:29:23.647 INFO:tasks.workunit.client.0.vm06.stdout:7/179: read - d5/dd/f22 zero size
2026-03-09T17:29:23.651 INFO:tasks.workunit.client.0.vm06.stdout:7/180: write d5/f16 [5200672,87129] 0
2026-03-09T17:29:23.654 INFO:tasks.workunit.client.0.vm06.stdout:7/181: dread f0 [0,4194304] 0
2026-03-09T17:29:23.654 INFO:tasks.workunit.client.0.vm06.stdout:7/182: chown d5/d7/l25 111 1
2026-03-09T17:29:23.656 INFO:tasks.workunit.client.0.vm06.stdout:7/183: dread d5/dd/f29 [0,4194304] 0
2026-03-09T17:29:23.656 INFO:tasks.workunit.client.0.vm06.stdout:7/184: stat d5/f18 0
2026-03-09T17:29:23.658 INFO:tasks.workunit.client.0.vm06.stdout:0/194: rename d7/d11/d2d/f3b to d7/d11/d19/d1d/f43 0
2026-03-09T17:29:23.663 INFO:tasks.workunit.client.0.vm06.stdout:2/178: symlink d3/d4/d22/l3d 0
2026-03-09T17:29:23.663 INFO:tasks.workunit.client.0.vm06.stdout:2/179: write d3/d4/f3c [760114,53012] 0
2026-03-09T17:29:23.664 INFO:tasks.workunit.client.0.vm06.stdout:2/180: read d3/d4/fe [1752046,75678] 0
2026-03-09T17:29:23.665 INFO:tasks.workunit.client.0.vm06.stdout:6/140: symlink d6/d12/d17/d27/l2c 0
2026-03-09T17:29:23.670 INFO:tasks.workunit.client.0.vm06.stdout:1/198: creat d11/d14/d1d/d42/f44 x:0 0 0
2026-03-09T17:29:23.676 INFO:tasks.workunit.client.0.vm06.stdout:9/194: fdatasync d3/d15/f17 0
2026-03-09T17:29:23.678 INFO:tasks.workunit.client.0.vm06.stdout:7/185: creat d5/dd/f2a x:0 0 0
2026-03-09T17:29:23.680 INFO:tasks.workunit.client.0.vm06.stdout:0/195: creat d7/d11/d2d/f44 x:0 0 0
2026-03-09T17:29:23.683 INFO:tasks.workunit.client.0.vm06.stdout:7/186: dwrite d5/d7/f1d [0,4194304] 0
2026-03-09T17:29:23.689 INFO:tasks.workunit.client.0.vm06.stdout:2/181: symlink d3/d4/d12/d2b/d36/l3e 0
2026-03-09T17:29:23.690 INFO:tasks.workunit.client.0.vm06.stdout:6/141: mkdir d6/d12/d2d 0
2026-03-09T17:29:23.701 INFO:tasks.workunit.client.0.vm06.stdout:1/199: rmdir d11/d14/d1c/d1f 39
2026-03-09T17:29:23.703 INFO:tasks.workunit.client.0.vm06.stdout:8/159: truncate fa 4359595 0
2026-03-09T17:29:23.704 INFO:tasks.workunit.client.0.vm06.stdout:8/160: chown d15/d31/f33 1 1
2026-03-09T17:29:23.707 INFO:tasks.workunit.client.0.vm06.stdout:9/195: dread - d3/d15/d16/f2f zero size
2026-03-09T17:29:23.711 INFO:tasks.workunit.client.0.vm06.stdout:4/181: dread f7 [4194304,4194304] 0
2026-03-09T17:29:23.711 INFO:tasks.workunit.client.0.vm06.stdout:4/182: stat db/d1d/d21/d25/l33 0
2026-03-09T17:29:23.713 INFO:tasks.workunit.client.0.vm06.stdout:4/183: dread db/f13 [0,4194304] 0
2026-03-09T17:29:23.718 INFO:tasks.workunit.client.0.vm06.stdout:0/196: symlink d7/d11/d2d/l45 0
2026-03-09T17:29:23.719 INFO:tasks.workunit.client.0.vm06.stdout:5/124: creat d4/d9/f29 x:0 0 0 2026-03-09T17:29:23.722 INFO:tasks.workunit.client.0.vm06.stdout:6/142: chown d6/c18 1694 1 2026-03-09T17:29:23.723 INFO:tasks.workunit.client.0.vm06.stdout:1/200: rmdir d11/d14 39 2026-03-09T17:29:23.724 INFO:tasks.workunit.client.0.vm06.stdout:9/196: symlink d3/d2c/l3f 0 2026-03-09T17:29:23.726 INFO:tasks.workunit.client.0.vm06.stdout:0/197: rename d7/d11/d19/d1d/f1f to d7/f46 0 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:0/198: chown d7/d11/f30 10 1 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:5/125: truncate d4/f20 340675 0 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:6/143: unlink d6/fe 0 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:6/144: fdatasync d6/d12/f19 0 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:8/161: symlink d15/d16/d1e/d30/l38 0 2026-03-09T17:29:23.734 INFO:tasks.workunit.client.0.vm06.stdout:8/162: write d15/d16/d1e/f2f [2342904,50588] 0 2026-03-09T17:29:23.736 INFO:tasks.workunit.client.0.vm06.stdout:0/199: mknod d7/d11/c47 0 2026-03-09T17:29:23.739 INFO:tasks.workunit.client.0.vm06.stdout:9/197: dwrite d3/d15/f23 [0,4194304] 0 2026-03-09T17:29:23.741 INFO:tasks.workunit.client.0.vm06.stdout:3/147: sync 2026-03-09T17:29:23.745 INFO:tasks.workunit.client.0.vm06.stdout:5/126: symlink d4/d9/l2a 0 2026-03-09T17:29:23.745 INFO:tasks.workunit.client.0.vm06.stdout:6/145: mknod d6/d12/d2d/c2e 0 2026-03-09T17:29:23.745 INFO:tasks.workunit.client.0.vm06.stdout:1/201: creat d11/d14/d1c/d1f/f45 x:0 0 0 2026-03-09T17:29:23.755 INFO:tasks.workunit.client.0.vm06.stdout:9/198: chown d3/cd 11 1 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:3/148: rename dd/d1d/f21 to dd/d1d/f29 0 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:3/149: dread - dd/f10 zero size 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:5/127: 
rmdir d4/d9/d18 39 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:5/128: chown d4/d9/c23 831920 1 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:1/202: write d11/d14/d1d/d1e/d2a/f40 [1271749,72076] 0 2026-03-09T17:29:23.762 INFO:tasks.workunit.client.0.vm06.stdout:1/203: dwrite d11/d14/d1d/d1e/f41 [0,4194304] 0 2026-03-09T17:29:23.764 INFO:tasks.workunit.client.0.vm06.stdout:1/204: dread - d11/d14/d1d/d1e/d2a/f38 zero size 2026-03-09T17:29:23.764 INFO:tasks.workunit.client.0.vm06.stdout:8/163: sync 2026-03-09T17:29:23.770 INFO:tasks.workunit.client.0.vm06.stdout:0/200: mknod d7/d11/d19/d3c/c48 0 2026-03-09T17:29:23.772 INFO:tasks.workunit.client.0.vm06.stdout:9/199: symlink d3/d2c/l40 0 2026-03-09T17:29:23.773 INFO:tasks.workunit.client.0.vm06.stdout:9/200: fsync d3/f1b 0 2026-03-09T17:29:23.773 INFO:tasks.workunit.client.0.vm06.stdout:9/201: dread - d3/f27 zero size 2026-03-09T17:29:23.773 INFO:tasks.workunit.client.0.vm06.stdout:9/202: stat d3/f21 0 2026-03-09T17:29:23.778 INFO:tasks.workunit.client.0.vm06.stdout:5/129: read d4/f11 [286942,62605] 0 2026-03-09T17:29:23.782 INFO:tasks.workunit.client.0.vm06.stdout:1/205: unlink d11/d14/d1d/d1e/f41 0 2026-03-09T17:29:23.783 INFO:tasks.workunit.client.0.vm06.stdout:1/206: write d11/d14/d1d/d1e/d2a/d34/f3b [339420,35999] 0 2026-03-09T17:29:23.787 INFO:tasks.workunit.client.0.vm06.stdout:9/203: write d3/d15/f2e [74168,98744] 0 2026-03-09T17:29:23.792 INFO:tasks.workunit.client.0.vm06.stdout:9/204: dwrite d3/f1b [0,4194304] 0 2026-03-09T17:29:23.800 INFO:tasks.workunit.client.0.vm06.stdout:9/205: dwrite d3/d11/f14 [0,4194304] 0 2026-03-09T17:29:23.801 INFO:tasks.workunit.client.0.vm06.stdout:9/206: stat d3/d11/f1c 0 2026-03-09T17:29:23.804 INFO:tasks.workunit.client.0.vm06.stdout:3/150: unlink dd/l24 0 2026-03-09T17:29:23.805 INFO:tasks.workunit.client.0.vm06.stdout:3/151: read - dd/f26 zero size 2026-03-09T17:29:23.807 INFO:tasks.workunit.client.0.vm06.stdout:9/207: dwrite d3/f21 
[0,4194304] 0 2026-03-09T17:29:23.825 INFO:tasks.workunit.client.0.vm06.stdout:1/207: rmdir d11 39 2026-03-09T17:29:23.831 INFO:tasks.workunit.client.0.vm06.stdout:7/187: truncate d5/dd/f29 3335799 0 2026-03-09T17:29:23.833 INFO:tasks.workunit.client.0.vm06.stdout:8/164: truncate f13 1477964 0 2026-03-09T17:29:23.834 INFO:tasks.workunit.client.0.vm06.stdout:8/165: read d15/d31/f33 [187035,106616] 0 2026-03-09T17:29:23.835 INFO:tasks.workunit.client.0.vm06.stdout:8/166: write d15/d31/f33 [263149,61182] 0 2026-03-09T17:29:23.836 INFO:tasks.workunit.client.0.vm06.stdout:0/201: creat d7/d11/d19/d23/f49 x:0 0 0 2026-03-09T17:29:23.836 INFO:tasks.workunit.client.0.vm06.stdout:2/182: rmdir d3/d4 39 2026-03-09T17:29:23.840 INFO:tasks.workunit.client.0.vm06.stdout:9/208: stat d3/d15/d36 0 2026-03-09T17:29:23.841 INFO:tasks.workunit.client.0.vm06.stdout:5/130: symlink d4/l2b 0 2026-03-09T17:29:23.841 INFO:tasks.workunit.client.0.vm06.stdout:5/131: dread - d4/d9/f1d zero size 2026-03-09T17:29:23.847 INFO:tasks.workunit.client.0.vm06.stdout:6/146: link d6/d12/d17/l2a d6/l2f 0 2026-03-09T17:29:23.849 INFO:tasks.workunit.client.0.vm06.stdout:9/209: dread d3/d15/d16/f31 [0,4194304] 0 2026-03-09T17:29:23.852 INFO:tasks.workunit.client.0.vm06.stdout:9/210: dwrite d3/fb [0,4194304] 0 2026-03-09T17:29:23.863 INFO:tasks.workunit.client.0.vm06.stdout:7/188: mkdir d5/d7/d2b 0 2026-03-09T17:29:23.863 INFO:tasks.workunit.client.0.vm06.stdout:7/189: chown d5/d7/d2b 1 1 2026-03-09T17:29:23.866 INFO:tasks.workunit.client.0.vm06.stdout:0/202: fsync d7/f2a 0 2026-03-09T17:29:23.866 INFO:tasks.workunit.client.0.vm06.stdout:2/183: readlink d3/d4/d22/l3d 0 2026-03-09T17:29:23.866 INFO:tasks.workunit.client.0.vm06.stdout:3/152: symlink dd/d19/d28/l2a 0 2026-03-09T17:29:23.867 INFO:tasks.workunit.client.0.vm06.stdout:2/184: stat d3/d4/d12 0 2026-03-09T17:29:23.869 INFO:tasks.workunit.client.0.vm06.stdout:9/211: rename d3/c25 to d3/d11/c41 0 2026-03-09T17:29:23.869 
INFO:tasks.workunit.client.0.vm06.stdout:9/212: readlink d3/d2c/l40 0 2026-03-09T17:29:23.871 INFO:tasks.workunit.client.0.vm06.stdout:1/208: mkdir d11/d14/d1d/d42/d46 0 2026-03-09T17:29:23.872 INFO:tasks.workunit.client.0.vm06.stdout:1/209: write d11/d14/d1c/f2e [874880,67319] 0 2026-03-09T17:29:23.875 INFO:tasks.workunit.client.0.vm06.stdout:7/190: dread f0 [0,4194304] 0 2026-03-09T17:29:23.877 INFO:tasks.workunit.client.0.vm06.stdout:8/167: chown f13 1 1 2026-03-09T17:29:23.879 INFO:tasks.workunit.client.0.vm06.stdout:5/132: sync 2026-03-09T17:29:23.883 INFO:tasks.workunit.client.0.vm06.stdout:8/168: dwrite d15/d16/f21 [0,4194304] 0 2026-03-09T17:29:23.885 INFO:tasks.workunit.client.0.vm06.stdout:8/169: fsync d15/f2e 0 2026-03-09T17:29:23.885 INFO:tasks.workunit.client.0.vm06.stdout:5/133: dread d4/f5 [0,4194304] 0 2026-03-09T17:29:23.889 INFO:tasks.workunit.client.0.vm06.stdout:3/153: creat dd/d19/f2b x:0 0 0 2026-03-09T17:29:23.895 INFO:tasks.workunit.client.0.vm06.stdout:6/147: symlink d6/l30 0 2026-03-09T17:29:23.895 INFO:tasks.workunit.client.0.vm06.stdout:9/213: write d3/d11/f1f [5184766,63314] 0 2026-03-09T17:29:23.899 INFO:tasks.workunit.client.0.vm06.stdout:9/214: dwrite d3/d15/f23 [0,4194304] 0 2026-03-09T17:29:23.908 INFO:tasks.workunit.client.0.vm06.stdout:7/191: creat d5/d12/f2c x:0 0 0 2026-03-09T17:29:23.908 INFO:tasks.workunit.client.0.vm06.stdout:7/192: fsync d5/dd/f22 0 2026-03-09T17:29:23.910 INFO:tasks.workunit.client.0.vm06.stdout:8/170: rmdir d15/d16/d1e/d28 39 2026-03-09T17:29:23.911 INFO:tasks.workunit.client.0.vm06.stdout:8/171: chown d15/d16/d1e/d30 5 1 2026-03-09T17:29:23.913 INFO:tasks.workunit.client.0.vm06.stdout:5/134: symlink d4/d22/l2c 0 2026-03-09T17:29:23.914 INFO:tasks.workunit.client.0.vm06.stdout:3/154: mkdir dd/d19/d2c 0 2026-03-09T17:29:23.915 INFO:tasks.workunit.client.0.vm06.stdout:8/172: dwrite d15/d16/d1e/f2f [0,4194304] 0 2026-03-09T17:29:23.929 INFO:tasks.workunit.client.0.vm06.stdout:7/193: unlink d5/d12/c27 0 
2026-03-09T17:29:23.929 INFO:tasks.workunit.client.0.vm06.stdout:7/194: write d5/dd/f1a [582821,103639] 0 2026-03-09T17:29:23.935 INFO:tasks.workunit.client.0.vm06.stdout:5/135: readlink d4/l12 0 2026-03-09T17:29:23.943 INFO:tasks.workunit.client.0.vm06.stdout:5/136: stat d4/c16 0 2026-03-09T17:29:23.943 INFO:tasks.workunit.client.0.vm06.stdout:6/148: rename d6/f1d to d6/d12/f31 0 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:6/149: chown d6/d12/d17/l2b 54235819 1 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:9/215: mknod d3/c42 0 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:1/210: creat d11/d14/d1d/d1e/f47 x:0 0 0 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:8/173: sync 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:7/195: unlink d5/l20 0 2026-03-09T17:29:23.944 INFO:tasks.workunit.client.0.vm06.stdout:7/196: stat d5/f28 0 2026-03-09T17:29:23.948 INFO:tasks.workunit.client.0.vm06.stdout:6/150: readlink d6/d12/d17/l2a 0 2026-03-09T17:29:23.949 INFO:tasks.workunit.client.0.vm06.stdout:9/216: chown d3/d11/c2b 2637371 1 2026-03-09T17:29:23.952 INFO:tasks.workunit.client.0.vm06.stdout:1/211: mknod d11/d14/d1d/d1e/d2a/c48 0 2026-03-09T17:29:23.953 INFO:tasks.workunit.client.0.vm06.stdout:7/197: creat d5/dd/f2d x:0 0 0 2026-03-09T17:29:23.955 INFO:tasks.workunit.client.0.vm06.stdout:6/151: creat d6/d12/d17/f32 x:0 0 0 2026-03-09T17:29:23.956 INFO:tasks.workunit.client.0.vm06.stdout:9/217: rename d3/cd to d3/d15/d36/c43 0 2026-03-09T17:29:23.956 INFO:tasks.workunit.client.0.vm06.stdout:9/218: truncate d3/d26/f33 521783 0 2026-03-09T17:29:23.959 INFO:tasks.workunit.client.0.vm06.stdout:7/198: unlink d5/dd/f2a 0 2026-03-09T17:29:23.960 INFO:tasks.workunit.client.0.vm06.stdout:5/137: creat d4/f2d x:0 0 0 2026-03-09T17:29:23.962 INFO:tasks.workunit.client.0.vm06.stdout:6/152: creat d6/d12/d17/d21/f33 x:0 0 0 2026-03-09T17:29:23.963 INFO:tasks.workunit.client.0.vm06.stdout:1/212: 
symlink d11/l49 0 2026-03-09T17:29:23.965 INFO:tasks.workunit.client.0.vm06.stdout:9/219: dread - d3/d15/d37/f39 zero size 2026-03-09T17:29:23.965 INFO:tasks.workunit.client.0.vm06.stdout:9/220: chown d3/d2c/l3f 120010 1 2026-03-09T17:29:23.966 INFO:tasks.workunit.client.0.vm06.stdout:9/221: write d3/d15/f23 [539555,5805] 0 2026-03-09T17:29:23.969 INFO:tasks.workunit.client.0.vm06.stdout:7/199: symlink d5/d7/l2e 0 2026-03-09T17:29:23.971 INFO:tasks.workunit.client.0.vm06.stdout:5/138: mknod d4/d22/c2e 0 2026-03-09T17:29:23.975 INFO:tasks.workunit.client.0.vm06.stdout:1/213: rename d11/d14/d1c/d1f/d25 to d11/d14/d1d/d4a 0 2026-03-09T17:29:23.984 INFO:tasks.workunit.client.0.vm06.stdout:9/222: creat d3/d26/d35/f44 x:0 0 0 2026-03-09T17:29:23.987 INFO:tasks.workunit.client.0.vm06.stdout:4/184: truncate f6 308398 0 2026-03-09T17:29:23.992 INFO:tasks.workunit.client.0.vm06.stdout:4/185: dwrite db/df/f2a [0,4194304] 0 2026-03-09T17:29:23.992 INFO:tasks.workunit.client.0.vm06.stdout:0/203: getdents d7/d11 0 2026-03-09T17:29:23.993 INFO:tasks.workunit.client.0.vm06.stdout:0/204: chown d7/d11/d2d/f2f 21588612 1 2026-03-09T17:29:24.010 INFO:tasks.workunit.client.0.vm06.stdout:5/139: chown d4/f21 13322759 1 2026-03-09T17:29:24.012 INFO:tasks.workunit.client.0.vm06.stdout:1/214: dread d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0 2026-03-09T17:29:24.012 INFO:tasks.workunit.client.0.vm06.stdout:1/215: chown d11/d14/d1d/c36 0 1 2026-03-09T17:29:24.019 INFO:tasks.workunit.client.0.vm06.stdout:0/205: rename d7/f10 to d7/d11/d19/d1d/d39/f4a 0 2026-03-09T17:29:24.020 INFO:tasks.workunit.client.0.vm06.stdout:7/200: symlink d5/l2f 0 2026-03-09T17:29:24.028 INFO:tasks.workunit.client.0.vm06.stdout:7/201: dread d5/f16 [0,4194304] 0 2026-03-09T17:29:24.034 INFO:tasks.workunit.client.0.vm06.stdout:5/140: sync 2026-03-09T17:29:24.035 INFO:tasks.workunit.client.0.vm06.stdout:5/141: chown d4/d9 115366 1 2026-03-09T17:29:24.060 INFO:tasks.workunit.client.0.vm06.stdout:8/174: truncate f13 920215 0 
2026-03-09T17:29:24.060 INFO:tasks.workunit.client.0.vm06.stdout:8/175: read d15/d16/f21 [2933320,112030] 0 2026-03-09T17:29:24.070 INFO:tasks.workunit.client.0.vm06.stdout:0/206: symlink d7/l4b 0 2026-03-09T17:29:24.074 INFO:tasks.workunit.client.0.vm06.stdout:0/207: dwrite d7/d11/f29 [0,4194304] 0 2026-03-09T17:29:24.085 INFO:tasks.workunit.client.0.vm06.stdout:7/202: dread d5/dd/f29 [0,4194304] 0 2026-03-09T17:29:24.089 INFO:tasks.workunit.client.0.vm06.stdout:9/223: truncate d3/d11/f1c 115102 0 2026-03-09T17:29:24.089 INFO:tasks.workunit.client.0.vm06.stdout:9/224: write d3/d11/f14 [591977,38316] 0 2026-03-09T17:29:24.090 INFO:tasks.workunit.client.0.vm06.stdout:9/225: fdatasync d3/d15/d37/f3a 0 2026-03-09T17:29:24.090 INFO:tasks.workunit.client.0.vm06.stdout:9/226: dread - d3/d15/f17 zero size 2026-03-09T17:29:24.095 INFO:tasks.workunit.client.0.vm06.stdout:9/227: dwrite d3/d26/f28 [0,4194304] 0 2026-03-09T17:29:24.097 INFO:tasks.workunit.client.0.vm06.stdout:8/176: mkdir d15/d39 0 2026-03-09T17:29:24.100 INFO:tasks.workunit.client.0.vm06.stdout:8/177: dread d15/d16/d1a/f29 [0,4194304] 0 2026-03-09T17:29:24.101 INFO:tasks.workunit.client.0.vm06.stdout:1/216: rename d11/d14/c15 to d11/d14/d1d/c4b 0 2026-03-09T17:29:24.107 INFO:tasks.workunit.client.0.vm06.stdout:2/185: dwrite d3/d4/f11 [0,4194304] 0 2026-03-09T17:29:24.121 INFO:tasks.workunit.client.0.vm06.stdout:5/142: mknod d4/d9/c2f 0 2026-03-09T17:29:24.123 INFO:tasks.workunit.client.0.vm06.stdout:7/203: unlink d5/f10 0 2026-03-09T17:29:24.126 INFO:tasks.workunit.client.0.vm06.stdout:7/204: dwrite d5/dd/ff [4194304,4194304] 0 2026-03-09T17:29:24.128 INFO:tasks.workunit.client.0.vm06.stdout:7/205: dread - d5/d12/f2c zero size 2026-03-09T17:29:24.129 INFO:tasks.workunit.client.0.vm06.stdout:7/206: chown d5/d12/l1c 391007 1 2026-03-09T17:29:24.138 INFO:tasks.workunit.client.0.vm06.stdout:3/155: dwrite f4 [4194304,4194304] 0 2026-03-09T17:29:24.143 INFO:tasks.workunit.client.0.vm06.stdout:9/228: symlink 
d3/d26/l45 0 2026-03-09T17:29:24.144 INFO:tasks.workunit.client.0.vm06.stdout:4/186: getdents db/df 0 2026-03-09T17:29:24.145 INFO:tasks.workunit.client.0.vm06.stdout:4/187: chown db/l19 61338 1 2026-03-09T17:29:24.151 INFO:tasks.workunit.client.0.vm06.stdout:1/217: creat d11/d14/d1c/d1f/f4c x:0 0 0 2026-03-09T17:29:24.153 INFO:tasks.workunit.client.0.vm06.stdout:2/186: symlink d3/d4/d12/d2b/d36/l3f 0 2026-03-09T17:29:24.154 INFO:tasks.workunit.client.0.vm06.stdout:2/187: write d3/d4/d22/f2f [8259349,101277] 0 2026-03-09T17:29:24.155 INFO:tasks.workunit.client.0.vm06.stdout:2/188: write d3/d4/f3c [23403,122576] 0 2026-03-09T17:29:24.165 INFO:tasks.workunit.client.0.vm06.stdout:6/153: truncate d6/d12/d17/d21/f25 2090532 0 2026-03-09T17:29:24.166 INFO:tasks.workunit.client.0.vm06.stdout:6/154: fdatasync d6/d12/d17/d21/f26 0 2026-03-09T17:29:24.181 INFO:tasks.workunit.client.0.vm06.stdout:1/218: unlink d11/d14/d1d/d1e/d2a/c3d 0 2026-03-09T17:29:24.184 INFO:tasks.workunit.client.0.vm06.stdout:1/219: dread f8 [0,4194304] 0 2026-03-09T17:29:24.185 INFO:tasks.workunit.client.0.vm06.stdout:1/220: truncate d11/d14/d1c/f2e 1614646 0 2026-03-09T17:29:24.188 INFO:tasks.workunit.client.0.vm06.stdout:0/208: link d7/d11/d19/f24 d7/d11/d19/d1d/f4c 0 2026-03-09T17:29:24.189 INFO:tasks.workunit.client.0.vm06.stdout:0/209: write d7/d11/d2d/f2f [847137,2766] 0 2026-03-09T17:29:24.190 INFO:tasks.workunit.client.0.vm06.stdout:5/143: rmdir d4 39 2026-03-09T17:29:24.193 INFO:tasks.workunit.client.0.vm06.stdout:6/155: unlink d6/d12/f19 0 2026-03-09T17:29:24.198 INFO:tasks.workunit.client.0.vm06.stdout:3/156: mkdir dd/d19/d25/d2d 0 2026-03-09T17:29:24.198 INFO:tasks.workunit.client.0.vm06.stdout:3/157: dread - dd/d19/f2b zero size 2026-03-09T17:29:24.202 INFO:tasks.workunit.client.0.vm06.stdout:8/178: dwrite f7 [0,4194304] 0 2026-03-09T17:29:24.207 INFO:tasks.workunit.client.0.vm06.stdout:4/188: symlink db/d1d/d21/d37/l43 0 2026-03-09T17:29:24.207 
INFO:tasks.workunit.client.0.vm06.stdout:2/189: mknod d3/d4/c40 0 2026-03-09T17:29:24.208 INFO:tasks.workunit.client.0.vm06.stdout:4/189: write db/f23 [4718572,118180] 0 2026-03-09T17:29:24.208 INFO:tasks.workunit.client.0.vm06.stdout:0/210: chown d7/d11/d19/f24 2057871 1 2026-03-09T17:29:24.217 INFO:tasks.workunit.client.0.vm06.stdout:7/207: creat d5/f30 x:0 0 0 2026-03-09T17:29:24.221 INFO:tasks.workunit.client.0.vm06.stdout:9/229: creat d3/d15/f46 x:0 0 0 2026-03-09T17:29:24.222 INFO:tasks.workunit.client.0.vm06.stdout:8/179: read fe [4236494,18170] 0 2026-03-09T17:29:24.226 INFO:tasks.workunit.client.0.vm06.stdout:0/211: creat d7/d11/d19/d1d/f4d x:0 0 0 2026-03-09T17:29:24.227 INFO:tasks.workunit.client.0.vm06.stdout:0/212: dread - d7/d11/d19/d1d/f4d zero size 2026-03-09T17:29:24.229 INFO:tasks.workunit.client.0.vm06.stdout:2/190: creat d3/d4/d12/d2b/d36/d37/f41 x:0 0 0 2026-03-09T17:29:24.230 INFO:tasks.workunit.client.0.vm06.stdout:2/191: fsync d3/f10 0 2026-03-09T17:29:24.241 INFO:tasks.workunit.client.0.vm06.stdout:0/213: dwrite d7/fb [0,4194304] 0 2026-03-09T17:29:24.247 INFO:tasks.workunit.client.0.vm06.stdout:5/144: write f0 [1199624,60712] 0 2026-03-09T17:29:24.251 INFO:tasks.workunit.client.0.vm06.stdout:2/192: creat d3/d4/d12/f42 x:0 0 0 2026-03-09T17:29:24.259 INFO:tasks.workunit.client.0.vm06.stdout:7/208: sync 2026-03-09T17:29:24.260 INFO:tasks.workunit.client.0.vm06.stdout:9/230: sync 2026-03-09T17:29:24.260 INFO:tasks.workunit.client.0.vm06.stdout:7/209: readlink d5/l23 0 2026-03-09T17:29:24.261 INFO:tasks.workunit.client.0.vm06.stdout:9/231: write d3/d26/f33 [980466,87582] 0 2026-03-09T17:29:24.263 INFO:tasks.workunit.client.0.vm06.stdout:0/214: unlink d7/d11/d2d/f3f 0 2026-03-09T17:29:24.269 INFO:tasks.workunit.client.0.vm06.stdout:2/193: mkdir d3/d4/d22/d43 0 2026-03-09T17:29:24.281 INFO:tasks.workunit.client.0.vm06.stdout:8/180: creat d15/d16/f3a x:0 0 0 2026-03-09T17:29:24.281 INFO:tasks.workunit.client.0.vm06.stdout:0/215: mknod 
d7/d11/d19/d1d/d39/c4e 0 2026-03-09T17:29:24.288 INFO:tasks.workunit.client.0.vm06.stdout:1/221: link d11/d14/c2b d11/d14/d1d/d1e/c4d 0 2026-03-09T17:29:24.291 INFO:tasks.workunit.client.0.vm06.stdout:6/156: write d6/d12/f1c [909103,73305] 0 2026-03-09T17:29:24.292 INFO:tasks.workunit.client.0.vm06.stdout:4/190: write db/df/f14 [11750084,79081] 0 2026-03-09T17:29:24.297 INFO:tasks.workunit.client.0.vm06.stdout:9/232: truncate d3/d11/f1c 89994 0 2026-03-09T17:29:24.301 INFO:tasks.workunit.client.0.vm06.stdout:0/216: creat d7/d11/d19/d37/f4f x:0 0 0 2026-03-09T17:29:24.302 INFO:tasks.workunit.client.0.vm06.stdout:0/217: stat d7/f12 0 2026-03-09T17:29:24.303 INFO:tasks.workunit.client.0.vm06.stdout:0/218: write d7/d11/f30 [951150,63884] 0 2026-03-09T17:29:24.304 INFO:tasks.workunit.client.0.vm06.stdout:9/233: dwrite d3/f21 [0,4194304] 0 2026-03-09T17:29:24.307 INFO:tasks.workunit.client.0.vm06.stdout:9/234: write d3/f27 [1016189,97674] 0 2026-03-09T17:29:24.307 INFO:tasks.workunit.client.0.vm06.stdout:0/219: dread - d7/d11/d19/d23/f49 zero size 2026-03-09T17:29:24.308 INFO:tasks.workunit.client.0.vm06.stdout:9/235: chown d3/d11/f14 753222931 1 2026-03-09T17:29:24.309 INFO:tasks.workunit.client.0.vm06.stdout:9/236: write d3/d15/f17 [314498,106210] 0 2026-03-09T17:29:24.309 INFO:tasks.workunit.client.0.vm06.stdout:0/220: write d7/d11/d2d/f2f [1802584,2686] 0 2026-03-09T17:29:24.312 INFO:tasks.workunit.client.0.vm06.stdout:0/221: truncate d7/d11/d19/d1d/f4d 673474 0 2026-03-09T17:29:24.318 INFO:tasks.workunit.client.0.vm06.stdout:3/158: dwrite dd/f1a [4194304,4194304] 0 2026-03-09T17:29:24.321 INFO:tasks.workunit.client.0.vm06.stdout:3/159: stat dd/d19/d1e/f23 0 2026-03-09T17:29:24.338 INFO:tasks.workunit.client.0.vm06.stdout:7/210: dwrite d5/dd/f19 [4194304,4194304] 0 2026-03-09T17:29:24.338 INFO:tasks.workunit.client.0.vm06.stdout:5/145: symlink d4/l30 0 2026-03-09T17:29:24.343 INFO:tasks.workunit.client.0.vm06.stdout:1/222: creat d11/d14/d1d/f4e x:0 0 0 
2026-03-09T17:29:24.344 INFO:tasks.workunit.client.0.vm06.stdout:1/223: chown d11/d14/d1d/d1e/d2a/d34/l3e 2122 1 2026-03-09T17:29:24.344 INFO:tasks.workunit.client.0.vm06.stdout:1/224: write d11/d14/d1d/d1e/d2a/d34/f3b [1006050,127407] 0 2026-03-09T17:29:24.352 INFO:tasks.workunit.client.0.vm06.stdout:8/181: link d15/d16/d1e/f2f d15/d16/d1e/d30/f3b 0 2026-03-09T17:29:24.356 INFO:tasks.workunit.client.0.vm06.stdout:4/191: mkdir db/d1d/d21/d44 0 2026-03-09T17:29:24.366 INFO:tasks.workunit.client.0.vm06.stdout:0/222: creat d7/f50 x:0 0 0 2026-03-09T17:29:24.376 INFO:tasks.workunit.client.0.vm06.stdout:5/146: stat d4/d9/d18/f28 0 2026-03-09T17:29:24.378 INFO:tasks.workunit.client.0.vm06.stdout:7/211: write d5/dd/f29 [4302159,125740] 0 2026-03-09T17:29:24.386 INFO:tasks.workunit.client.0.vm06.stdout:8/182: dwrite d15/d16/f21 [0,4194304] 0 2026-03-09T17:29:24.388 INFO:tasks.workunit.client.0.vm06.stdout:6/157: fdatasync d6/d12/d17/d21/f25 0 2026-03-09T17:29:24.391 INFO:tasks.workunit.client.0.vm06.stdout:0/223: creat d7/d11/d19/d1d/d39/f51 x:0 0 0 2026-03-09T17:29:24.392 INFO:tasks.workunit.client.0.vm06.stdout:5/147: creat d4/d9/d18/f31 x:0 0 0 2026-03-09T17:29:24.393 INFO:tasks.workunit.client.0.vm06.stdout:5/148: write d4/d9/f29 [299092,34779] 0 2026-03-09T17:29:24.397 INFO:tasks.workunit.client.0.vm06.stdout:0/224: dwrite f5 [4194304,4194304] 0 2026-03-09T17:29:24.399 INFO:tasks.workunit.client.0.vm06.stdout:5/149: read f0 [673100,23811] 0 2026-03-09T17:29:24.406 INFO:tasks.workunit.client.0.vm06.stdout:8/183: dread - d15/d16/d1e/d28/f2a zero size 2026-03-09T17:29:24.407 INFO:tasks.workunit.client.0.vm06.stdout:8/184: write d15/d16/d1e/f34 [88704,77463] 0 2026-03-09T17:29:24.410 INFO:tasks.workunit.client.0.vm06.stdout:6/158: unlink d6/c15 0 2026-03-09T17:29:24.412 INFO:tasks.workunit.client.0.vm06.stdout:9/237: rename d3/d11/c32 to d3/c47 0 2026-03-09T17:29:24.416 INFO:tasks.workunit.client.0.vm06.stdout:9/238: dwrite d3/f27 [0,4194304] 0 2026-03-09T17:29:24.422 
INFO:tasks.workunit.client.0.vm06.stdout:0/225: unlink d7/d11/d19/d1d/f4d 0 2026-03-09T17:29:24.427 INFO:tasks.workunit.client.0.vm06.stdout:0/226: dwrite d7/d11/d2d/f44 [0,4194304] 0 2026-03-09T17:29:24.433 INFO:tasks.workunit.client.0.vm06.stdout:6/159: readlink d6/d12/d17/l2a 0 2026-03-09T17:29:24.434 INFO:tasks.workunit.client.0.vm06.stdout:6/160: chown d6/l1e 258315276 1 2026-03-09T17:29:24.434 INFO:tasks.workunit.client.0.vm06.stdout:6/161: dread - d6/d12/d17/f32 zero size 2026-03-09T17:29:24.435 INFO:tasks.workunit.client.0.vm06.stdout:6/162: write d6/d12/d17/f1b [2507905,113633] 0 2026-03-09T17:29:24.441 INFO:tasks.workunit.client.0.vm06.stdout:5/150: mknod d4/d9/c32 0 2026-03-09T17:29:24.442 INFO:tasks.workunit.client.0.vm06.stdout:5/151: chown d4/fb 396224731 1 2026-03-09T17:29:24.442 INFO:tasks.workunit.client.0.vm06.stdout:5/152: stat d4/l12 0 2026-03-09T17:29:24.445 INFO:tasks.workunit.client.0.vm06.stdout:0/227: symlink d7/d11/d19/d1d/l52 0 2026-03-09T17:29:24.445 INFO:tasks.workunit.client.0.vm06.stdout:0/228: write d7/f31 [1307017,21246] 0 2026-03-09T17:29:24.445 INFO:tasks.workunit.client.0.vm06.stdout:0/229: stat d7/c16 0 2026-03-09T17:29:24.453 INFO:tasks.workunit.client.0.vm06.stdout:8/185: mkdir d15/d39/d3c 0 2026-03-09T17:29:24.456 INFO:tasks.workunit.client.0.vm06.stdout:9/239: mkdir d3/d15/d48 0 2026-03-09T17:29:24.460 INFO:tasks.workunit.client.0.vm06.stdout:9/240: dwrite d3/d26/f29 [0,4194304] 0 2026-03-09T17:29:24.460 INFO:tasks.workunit.client.0.vm06.stdout:9/241: readlink d3/d15/d16/l20 0 2026-03-09T17:29:24.468 INFO:tasks.workunit.client.0.vm06.stdout:1/225: link d11/d14/d1d/c3f d11/d14/c4f 0 2026-03-09T17:29:24.471 INFO:tasks.workunit.client.0.vm06.stdout:5/153: dwrite d4/f17 [0,4194304] 0 2026-03-09T17:29:24.473 INFO:tasks.workunit.client.0.vm06.stdout:0/230: fdatasync d7/d11/d19/f21 0 2026-03-09T17:29:24.477 INFO:tasks.workunit.client.0.vm06.stdout:8/186: mkdir d15/d16/d19/d3d 0 2026-03-09T17:29:24.479 
INFO:tasks.workunit.client.0.vm06.stdout:8/187: dread fa [0,4194304] 0 2026-03-09T17:29:24.487 INFO:tasks.workunit.client.0.vm06.stdout:9/242: creat d3/d15/d36/f49 x:0 0 0 2026-03-09T17:29:24.488 INFO:tasks.workunit.client.0.vm06.stdout:9/243: chown d3/d11/f14 1 1 2026-03-09T17:29:24.499 INFO:tasks.workunit.client.0.vm06.stdout:5/154: dwrite d4/f1f [4194304,4194304] 0 2026-03-09T17:29:24.501 INFO:tasks.workunit.client.0.vm06.stdout:0/231: mkdir d7/d11/d19/d53 0 2026-03-09T17:29:24.511 INFO:tasks.workunit.client.0.vm06.stdout:9/244: rename d3/d15/d37/f39 to d3/d15/d16/f4a 0 2026-03-09T17:29:24.515 INFO:tasks.workunit.client.0.vm06.stdout:0/232: fsync d7/d11/d19/f21 0 2026-03-09T17:29:24.516 INFO:tasks.workunit.client.0.vm06.stdout:0/233: fdatasync d7/f14 0 2026-03-09T17:29:24.516 INFO:tasks.workunit.client.0.vm06.stdout:0/234: truncate d7/d11/f2c 879841 0 2026-03-09T17:29:24.517 INFO:tasks.workunit.client.0.vm06.stdout:0/235: readlink d7/l4b 0 2026-03-09T17:29:24.522 INFO:tasks.workunit.client.0.vm06.stdout:6/163: link d6/l20 d6/d12/d17/l34 0 2026-03-09T17:29:24.545 INFO:tasks.workunit.client.0.vm06.stdout:5/155: dread d4/d9/d18/f28 [0,4194304] 0 2026-03-09T17:29:24.546 INFO:tasks.workunit.client.0.vm06.stdout:0/236: symlink d7/d11/d19/d37/l54 0 2026-03-09T17:29:24.547 INFO:tasks.workunit.client.0.vm06.stdout:0/237: write d7/d11/f29 [4595823,67684] 0 2026-03-09T17:29:24.551 INFO:tasks.workunit.client.0.vm06.stdout:6/164: mknod d6/d12/c35 0 2026-03-09T17:29:24.560 INFO:tasks.workunit.client.0.vm06.stdout:0/238: fdatasync d7/d11/f29 0 2026-03-09T17:29:24.565 INFO:tasks.workunit.client.0.vm06.stdout:6/165: fdatasync d6/d12/d17/f1b 0 2026-03-09T17:29:24.567 INFO:tasks.workunit.client.0.vm06.stdout:2/194: dwrite f2 [0,4194304] 0 2026-03-09T17:29:24.579 INFO:tasks.workunit.client.0.vm06.stdout:9/245: rename d3/d15/d16/f2f to d3/f4b 0 2026-03-09T17:29:24.581 INFO:tasks.workunit.client.0.vm06.stdout:0/239: creat d7/d11/d19/d3c/f55 x:0 0 0 2026-03-09T17:29:24.583 
INFO:tasks.workunit.client.0.vm06.stdout:5/156: mknod d4/c33 0 2026-03-09T17:29:24.585 INFO:tasks.workunit.client.0.vm06.stdout:6/166: symlink d6/d12/d17/d27/l36 0 2026-03-09T17:29:24.585 INFO:tasks.workunit.client.0.vm06.stdout:6/167: fdatasync d6/d12/d17/f32 0 2026-03-09T17:29:24.589 INFO:tasks.workunit.client.0.vm06.stdout:3/160: dwrite f7 [8388608,4194304] 0 2026-03-09T17:29:24.601 INFO:tasks.workunit.client.0.vm06.stdout:0/240: unlink d7/f8 0 2026-03-09T17:29:24.614 INFO:tasks.workunit.client.0.vm06.stdout:5/157: symlink d4/d9/d18/l34 0 2026-03-09T17:29:24.614 INFO:tasks.workunit.client.0.vm06.stdout:3/161: mkdir dd/d1d/d2e 0 2026-03-09T17:29:24.614 INFO:tasks.workunit.client.0.vm06.stdout:0/241: creat d7/f56 x:0 0 0 2026-03-09T17:29:24.616 INFO:tasks.workunit.client.0.vm06.stdout:0/242: creat d7/d11/d19/f57 x:0 0 0 2026-03-09T17:29:24.617 INFO:tasks.workunit.client.0.vm06.stdout:0/243: write d7/d11/d19/d23/f49 [971993,16139] 0 2026-03-09T17:29:24.619 INFO:tasks.workunit.client.0.vm06.stdout:5/158: dread f0 [0,4194304] 0 2026-03-09T17:29:24.620 INFO:tasks.workunit.client.0.vm06.stdout:5/159: read d4/f20 [197504,116923] 0 2026-03-09T17:29:24.622 INFO:tasks.workunit.client.0.vm06.stdout:3/162: creat dd/d1d/d2e/f2f x:0 0 0 2026-03-09T17:29:24.625 INFO:tasks.workunit.client.0.vm06.stdout:3/163: dread f0 [0,4194304] 0 2026-03-09T17:29:24.629 INFO:tasks.workunit.client.0.vm06.stdout:2/195: sync 2026-03-09T17:29:24.633 INFO:tasks.workunit.client.0.vm06.stdout:2/196: dwrite f2 [0,4194304] 0 2026-03-09T17:29:24.633 INFO:tasks.workunit.client.0.vm06.stdout:6/168: sync 2026-03-09T17:29:24.634 INFO:tasks.workunit.client.0.vm06.stdout:2/197: fdatasync d3/f24 0 2026-03-09T17:29:24.635 INFO:tasks.workunit.client.0.vm06.stdout:7/212: dwrite d5/f28 [0,4194304] 0 2026-03-09T17:29:24.659 INFO:tasks.workunit.client.0.vm06.stdout:2/198: dread d3/f10 [4194304,4194304] 0 2026-03-09T17:29:24.661 INFO:tasks.workunit.client.0.vm06.stdout:2/199: truncate d3/d4/d12/d2b/d36/d37/f3a 684611 
0 2026-03-09T17:29:24.664 INFO:tasks.workunit.client.0.vm06.stdout:4/192: truncate db/d1d/f3a 613415 0 2026-03-09T17:29:24.678 INFO:tasks.workunit.client.0.vm06.stdout:6/169: dread d6/d12/d17/d21/f26 [0,4194304] 0 2026-03-09T17:29:24.683 INFO:tasks.workunit.client.0.vm06.stdout:6/170: dwrite d6/d12/d17/d21/f33 [0,4194304] 0 2026-03-09T17:29:24.687 INFO:tasks.workunit.client.0.vm06.stdout:5/160: mkdir d4/d9/d35 0 2026-03-09T17:29:24.693 INFO:tasks.workunit.client.0.vm06.stdout:5/161: dwrite d4/d9/f1d [0,4194304] 0 2026-03-09T17:29:24.693 INFO:tasks.workunit.client.0.vm06.stdout:6/171: dwrite d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:24.694 INFO:tasks.workunit.client.0.vm06.stdout:6/172: truncate d6/d12/f1c 1242408 0 2026-03-09T17:29:24.700 INFO:tasks.workunit.client.0.vm06.stdout:3/164: creat dd/d19/d2c/f30 x:0 0 0 2026-03-09T17:29:24.710 INFO:tasks.workunit.client.0.vm06.stdout:0/244: rmdir d7/d11/d19/d53 0 2026-03-09T17:29:24.720 INFO:tasks.workunit.client.0.vm06.stdout:0/245: dread d7/fb [0,4194304] 0 2026-03-09T17:29:24.727 INFO:tasks.workunit.client.0.vm06.stdout:2/200: dread d3/f3b [0,4194304] 0 2026-03-09T17:29:24.731 INFO:tasks.workunit.client.0.vm06.stdout:2/201: stat d3/d4/d12/f35 0 2026-03-09T17:29:24.731 INFO:tasks.workunit.client.0.vm06.stdout:4/193: mkdir db/d1d/d21/d26/d3c/d45 0 2026-03-09T17:29:24.734 INFO:tasks.workunit.client.0.vm06.stdout:0/246: mknod d7/d11/d2d/c58 0 2026-03-09T17:29:24.736 INFO:tasks.workunit.client.0.vm06.stdout:6/173: chown d6/l20 23735746 1 2026-03-09T17:29:24.746 INFO:tasks.workunit.client.0.vm06.stdout:4/194: mknod db/df/c46 0 2026-03-09T17:29:24.749 INFO:tasks.workunit.client.0.vm06.stdout:7/213: getdents d5/dd 0 2026-03-09T17:29:24.752 INFO:tasks.workunit.client.0.vm06.stdout:2/202: mkdir d3/d44 0 2026-03-09T17:29:24.782 INFO:tasks.workunit.client.0.vm06.stdout:4/195: truncate db/d1d/f1f 4247008 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:7/214: mknod d5/d7/c31 0 2026-03-09T17:29:24.783 
INFO:tasks.workunit.client.0.vm06.stdout:7/215: chown d5/d1f 359 1 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:7/216: dread - d5/f30 zero size 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:6/174: link d6/d12/f31 d6/d12/d17/d27/f37 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:6/175: readlink d6/d12/d17/d27/l2c 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:2/203: rmdir d3/d4/d12 39 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:4/196: symlink db/d1d/d21/d37/l47 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:4/197: chown db/d1d/d21/f2f 0 1 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:7/217: rename d5/f30 to d5/d12/f32 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:2/204: dread d3/d4/d12/d2b/d2d/f1b [0,4194304] 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:4/198: mknod db/d1d/c48 0 2026-03-09T17:29:24.783 INFO:tasks.workunit.client.0.vm06.stdout:4/199: dwrite db/df/f30 [0,4194304] 0 2026-03-09T17:29:24.791 INFO:tasks.workunit.client.0.vm06.stdout:6/176: dread d6/d12/d17/f29 [0,4194304] 0 2026-03-09T17:29:24.804 INFO:tasks.workunit.client.0.vm06.stdout:1/226: truncate d11/d14/d1c/d1f/f21 1570747 0 2026-03-09T17:29:24.804 INFO:tasks.workunit.client.0.vm06.stdout:1/227: truncate d11/d14/d1d/f4e 295221 0 2026-03-09T17:29:24.804 INFO:tasks.workunit.client.0.vm06.stdout:1/228: dwrite d11/d14/d1d/d1e/d2a/d34/f3b [0,4194304] 0 2026-03-09T17:29:24.805 INFO:tasks.workunit.client.0.vm06.stdout:0/247: sync 2026-03-09T17:29:24.808 INFO:tasks.workunit.client.0.vm06.stdout:2/205: unlink d3/d4/fe 0 2026-03-09T17:29:24.809 INFO:tasks.workunit.client.0.vm06.stdout:2/206: stat d3/d4/d12/f1e 0 2026-03-09T17:29:24.813 INFO:tasks.workunit.client.0.vm06.stdout:2/207: dwrite d3/d4/d12/f15 [0,4194304] 0 2026-03-09T17:29:24.826 INFO:tasks.workunit.client.0.vm06.stdout:4/200: dread f6 [0,4194304] 0 
2026-03-09T17:29:24.826 INFO:tasks.workunit.client.0.vm06.stdout:4/201: readlink db/l3d 0 2026-03-09T17:29:24.830 INFO:tasks.workunit.client.0.vm06.stdout:4/202: dwrite db/f23 [4194304,4194304] 0 2026-03-09T17:29:24.831 INFO:tasks.workunit.client.0.vm06.stdout:4/203: read - db/d1d/d21/f42 zero size 2026-03-09T17:29:24.831 INFO:tasks.workunit.client.0.vm06.stdout:4/204: readlink db/d1d/d21/l24 0 2026-03-09T17:29:24.835 INFO:tasks.workunit.client.0.vm06.stdout:8/188: write f13 [1709341,119899] 0 2026-03-09T17:29:24.835 INFO:tasks.workunit.client.0.vm06.stdout:6/177: creat d6/d12/d17/d21/f38 x:0 0 0 2026-03-09T17:29:24.845 INFO:tasks.workunit.client.0.vm06.stdout:7/218: mknod d5/c33 0 2026-03-09T17:29:24.846 INFO:tasks.workunit.client.0.vm06.stdout:7/219: dread - d5/d12/f2c zero size 2026-03-09T17:29:24.846 INFO:tasks.workunit.client.0.vm06.stdout:7/220: chown d5/f8 4430308 1 2026-03-09T17:29:24.851 INFO:tasks.workunit.client.0.vm06.stdout:7/221: dwrite d5/d7/f1d [0,4194304] 0 2026-03-09T17:29:24.853 INFO:tasks.workunit.client.0.vm06.stdout:7/222: write d5/dd/f1a [827931,88842] 0 2026-03-09T17:29:24.857 INFO:tasks.workunit.client.0.vm06.stdout:1/229: creat d11/d14/d1d/d1e/d2a/f50 x:0 0 0 2026-03-09T17:29:24.863 INFO:tasks.workunit.client.0.vm06.stdout:0/248: mkdir d7/d11/d19/d1d/d59 0 2026-03-09T17:29:24.863 INFO:tasks.workunit.client.0.vm06.stdout:9/246: fsync d3/f4b 0 2026-03-09T17:29:24.867 INFO:tasks.workunit.client.0.vm06.stdout:2/208: symlink d3/d4/d12/d2b/d36/l45 0 2026-03-09T17:29:24.870 INFO:tasks.workunit.client.0.vm06.stdout:4/205: stat db/df/c16 0 2026-03-09T17:29:24.871 INFO:tasks.workunit.client.0.vm06.stdout:6/178: creat d6/d12/d2d/f39 x:0 0 0 2026-03-09T17:29:24.887 INFO:tasks.workunit.client.0.vm06.stdout:0/249: dwrite d7/f46 [0,4194304] 0 2026-03-09T17:29:24.891 INFO:tasks.workunit.client.0.vm06.stdout:9/247: mkdir d3/d15/d36/d4c 0 2026-03-09T17:29:24.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:24 vm06.local ceph-mon[57307]: pgmap 
v145: 65 pgs: 65 active+clean; 396 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 2.7 MiB/s rd, 36 MiB/s wr, 375 op/s 2026-03-09T17:29:24.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:24 vm09.local ceph-mon[62061]: pgmap v145: 65 pgs: 65 active+clean; 396 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 2.7 MiB/s rd, 36 MiB/s wr, 375 op/s 2026-03-09T17:29:24.902 INFO:tasks.workunit.client.0.vm06.stdout:2/209: unlink d3/d4/d12/l14 0 2026-03-09T17:29:24.906 INFO:tasks.workunit.client.0.vm06.stdout:2/210: dwrite d3/f24 [0,4194304] 0 2026-03-09T17:29:24.913 INFO:tasks.workunit.client.0.vm06.stdout:2/211: chown d3/d4 7326404 1 2026-03-09T17:29:24.913 INFO:tasks.workunit.client.0.vm06.stdout:4/206: mkdir db/d1d/d21/d37/d49 0 2026-03-09T17:29:24.918 INFO:tasks.workunit.client.0.vm06.stdout:6/179: write d6/d12/d17/d27/f37 [1834155,66478] 0 2026-03-09T17:29:24.926 INFO:tasks.workunit.client.0.vm06.stdout:7/223: mkdir d5/d1f/d34 0 2026-03-09T17:29:24.927 INFO:tasks.workunit.client.0.vm06.stdout:7/224: write d5/dd/f19 [8650985,43987] 0 2026-03-09T17:29:24.940 INFO:tasks.workunit.client.0.vm06.stdout:0/250: truncate d7/d11/d19/f21 891515 0 2026-03-09T17:29:24.945 INFO:tasks.workunit.client.0.vm06.stdout:9/248: mkdir d3/d15/d36/d4d 0 2026-03-09T17:29:24.952 INFO:tasks.workunit.client.0.vm06.stdout:6/180: creat d6/d12/d17/d21/f3a x:0 0 0 2026-03-09T17:29:24.955 INFO:tasks.workunit.client.0.vm06.stdout:7/225: unlink d5/dd/f2d 0 2026-03-09T17:29:24.955 INFO:tasks.workunit.client.0.vm06.stdout:7/226: readlink d5/d7/l2e 0 2026-03-09T17:29:24.957 INFO:tasks.workunit.client.0.vm06.stdout:0/251: mkdir d7/d11/d5a 0 2026-03-09T17:29:24.959 INFO:tasks.workunit.client.0.vm06.stdout:9/249: chown d3/c6 125471 1 2026-03-09T17:29:24.963 INFO:tasks.workunit.client.0.vm06.stdout:9/250: dwrite d3/d26/d35/f3d [0,4194304] 0 2026-03-09T17:29:24.964 INFO:tasks.workunit.client.0.vm06.stdout:9/251: chown d3/d15/d48 389012 1 2026-03-09T17:29:24.968 
INFO:tasks.workunit.client.0.vm06.stdout:5/162: truncate d4/f1f 253349 0 2026-03-09T17:29:24.968 INFO:tasks.workunit.client.0.vm06.stdout:5/163: chown d4/d9 261997 1 2026-03-09T17:29:24.974 INFO:tasks.workunit.client.0.vm06.stdout:2/212: mkdir d3/d4/d46 0 2026-03-09T17:29:24.976 INFO:tasks.workunit.client.0.vm06.stdout:3/165: truncate dd/d19/d1e/f23 853516 0 2026-03-09T17:29:24.979 INFO:tasks.workunit.client.0.vm06.stdout:4/207: creat db/d1d/d21/d26/d3c/d45/f4a x:0 0 0 2026-03-09T17:29:24.979 INFO:tasks.workunit.client.0.vm06.stdout:4/208: write db/f39 [360078,74779] 0 2026-03-09T17:29:24.989 INFO:tasks.workunit.client.0.vm06.stdout:5/164: readlink d4/d22/l27 0 2026-03-09T17:29:25.000 INFO:tasks.workunit.client.0.vm06.stdout:5/165: stat d4/d9/d18 0 2026-03-09T17:29:25.000 INFO:tasks.workunit.client.0.vm06.stdout:5/166: fsync d4/d9/f24 0 2026-03-09T17:29:25.000 INFO:tasks.workunit.client.0.vm06.stdout:6/181: write d6/fb [4997892,39308] 0 2026-03-09T17:29:25.000 INFO:tasks.workunit.client.0.vm06.stdout:3/166: rename f0 to dd/d19/d25/f31 0 2026-03-09T17:29:25.000 INFO:tasks.workunit.client.0.vm06.stdout:3/167: read - dd/d19/d2c/f30 zero size 2026-03-09T17:29:25.002 INFO:tasks.workunit.client.0.vm06.stdout:9/252: sync 2026-03-09T17:29:25.008 INFO:tasks.workunit.client.0.vm06.stdout:4/209: dread db/fc [0,4194304] 0 2026-03-09T17:29:25.012 INFO:tasks.workunit.client.0.vm06.stdout:6/182: dread d6/d12/f31 [4194304,4194304] 0 2026-03-09T17:29:25.015 INFO:tasks.workunit.client.0.vm06.stdout:4/210: dwrite db/df/f18 [0,4194304] 0 2026-03-09T17:29:25.020 INFO:tasks.workunit.client.0.vm06.stdout:0/252: mkdir d7/d11/d19/d1d/d59/d5b 0 2026-03-09T17:29:25.021 INFO:tasks.workunit.client.0.vm06.stdout:2/213: symlink d3/l47 0 2026-03-09T17:29:25.022 INFO:tasks.workunit.client.0.vm06.stdout:2/214: chown d3/f24 1326166052 1 2026-03-09T17:29:25.041 INFO:tasks.workunit.client.0.vm06.stdout:0/253: dread f5 [0,4194304] 0 2026-03-09T17:29:25.044 
INFO:tasks.workunit.client.0.vm06.stdout:8/189: truncate d15/d16/f21 4104940 0 2026-03-09T17:29:25.045 INFO:tasks.workunit.client.0.vm06.stdout:8/190: write fd [578559,124556] 0 2026-03-09T17:29:25.048 INFO:tasks.workunit.client.0.vm06.stdout:9/253: chown d3/d15/f1a 13 1 2026-03-09T17:29:25.050 INFO:tasks.workunit.client.0.vm06.stdout:1/230: truncate d11/d14/d1d/d1e/d2a/d34/f3b 1900346 0 2026-03-09T17:29:25.053 INFO:tasks.workunit.client.0.vm06.stdout:1/231: chown d11/d14/d1d/d1e/d2a/d34 63364 1 2026-03-09T17:29:25.054 INFO:tasks.workunit.client.0.vm06.stdout:1/232: chown d11/d14/d1d/d1e/d2a/f50 8362160 1 2026-03-09T17:29:25.055 INFO:tasks.workunit.client.0.vm06.stdout:0/254: sync 2026-03-09T17:29:25.077 INFO:tasks.workunit.client.0.vm06.stdout:6/183: rmdir d6/d12/d17/d21 39 2026-03-09T17:29:25.090 INFO:tasks.workunit.client.0.vm06.stdout:8/191: creat d15/f3e x:0 0 0 2026-03-09T17:29:25.091 INFO:tasks.workunit.client.0.vm06.stdout:8/192: write d15/d16/d1a/f22 [690552,15707] 0 2026-03-09T17:29:25.091 INFO:tasks.workunit.client.0.vm06.stdout:8/193: chown d15/d16/d1e/d28/f2a 71 1 2026-03-09T17:29:25.092 INFO:tasks.workunit.client.0.vm06.stdout:8/194: write d15/f2e [5108982,102016] 0 2026-03-09T17:29:25.096 INFO:tasks.workunit.client.0.vm06.stdout:3/168: write dd/d19/d1e/f23 [1577913,59960] 0 2026-03-09T17:29:25.098 INFO:tasks.workunit.client.0.vm06.stdout:8/195: sync 2026-03-09T17:29:25.108 INFO:tasks.workunit.client.0.vm06.stdout:0/255: truncate d7/d11/f1c 151651 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:0/256: fsync d7/f31 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:4/211: unlink db/c11 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:7/227: getdents d5/d1f 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:5/167: creat d4/f36 x:0 0 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:5/168: readlink d4/l30 0 2026-03-09T17:29:25.137 
INFO:tasks.workunit.client.0.vm06.stdout:5/169: stat d4/c33 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:6/184: write d6/d12/d17/d21/f38 [497453,42088] 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:9/254: getdents d3/d15/d48 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:9/255: write d3/d11/f14 [4988028,67330] 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:9/256: fsync d3/d15/f46 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:9/257: dread d3/d11/f1f [4194304,4194304] 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:1/233: symlink d11/d14/d1c/l51 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:1/234: chown d11/d14/d1c 1735251206 1 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:1/235: write d11/d14/d1c/f37 [2979626,57425] 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:0/257: mknod d7/d11/d19/d1d/c5c 0 2026-03-09T17:29:25.137 INFO:tasks.workunit.client.0.vm06.stdout:4/212: mkdir db/d1d/d21/d25/d4b 0 2026-03-09T17:29:25.138 INFO:tasks.workunit.client.0.vm06.stdout:6/185: mknod d6/d12/d17/d27/c3b 0 2026-03-09T17:29:25.140 INFO:tasks.workunit.client.0.vm06.stdout:6/186: dread d6/d12/f31 [4194304,4194304] 0 2026-03-09T17:29:25.140 INFO:tasks.workunit.client.0.vm06.stdout:6/187: chown d6/d12/d17 84768 1 2026-03-09T17:29:25.145 INFO:tasks.workunit.client.0.vm06.stdout:9/258: rename d3/d11/l13 to d3/d2c/l4e 0 2026-03-09T17:29:25.151 INFO:tasks.workunit.client.0.vm06.stdout:1/236: creat d11/d14/d1d/d42/f52 x:0 0 0 2026-03-09T17:29:25.156 INFO:tasks.workunit.client.0.vm06.stdout:1/237: write d11/d14/d1c/f37 [2523977,89100] 0 2026-03-09T17:29:25.156 INFO:tasks.workunit.client.0.vm06.stdout:0/258: mkdir d7/d11/d5d 0 2026-03-09T17:29:25.157 INFO:tasks.workunit.client.0.vm06.stdout:6/188: fdatasync d6/d12/d17/d27/f37 0 2026-03-09T17:29:25.160 INFO:tasks.workunit.client.0.vm06.stdout:8/196: creat 
d15/d16/f3f x:0 0 0 2026-03-09T17:29:25.164 INFO:tasks.workunit.client.0.vm06.stdout:8/197: dwrite d15/f3e [0,4194304] 0 2026-03-09T17:29:25.172 INFO:tasks.workunit.client.0.vm06.stdout:5/170: sync 2026-03-09T17:29:25.173 INFO:tasks.workunit.client.0.vm06.stdout:7/228: sync 2026-03-09T17:29:25.188 INFO:tasks.workunit.client.0.vm06.stdout:6/189: creat d6/d12/d17/d21/f3c x:0 0 0 2026-03-09T17:29:25.189 INFO:tasks.workunit.client.0.vm06.stdout:3/169: getdents dd/d19/d2c 0 2026-03-09T17:29:25.192 INFO:tasks.workunit.client.0.vm06.stdout:2/215: creat d3/d4/d12/d2b/d2d/f48 x:0 0 0 2026-03-09T17:29:25.199 INFO:tasks.workunit.client.0.vm06.stdout:9/259: mknod d3/d15/d48/c4f 0 2026-03-09T17:29:25.204 INFO:tasks.workunit.client.0.vm06.stdout:7/229: creat d5/d12/f35 x:0 0 0 2026-03-09T17:29:25.205 INFO:tasks.workunit.client.0.vm06.stdout:7/230: write d5/dd/ff [645016,23014] 0 2026-03-09T17:29:25.208 INFO:tasks.workunit.client.0.vm06.stdout:7/231: dwrite d5/d12/f35 [0,4194304] 0 2026-03-09T17:29:25.242 INFO:tasks.workunit.client.0.vm06.stdout:6/190: creat d6/d12/d17/d27/f3d x:0 0 0 2026-03-09T17:29:25.242 INFO:tasks.workunit.client.0.vm06.stdout:6/191: write d6/d12/f1c [2049189,128442] 0 2026-03-09T17:29:25.263 INFO:tasks.workunit.client.0.vm06.stdout:5/171: mknod d4/d9/d35/c37 0 2026-03-09T17:29:25.266 INFO:tasks.workunit.client.0.vm06.stdout:8/198: truncate d15/d16/d1a/f1b 688669 0 2026-03-09T17:29:25.274 INFO:tasks.workunit.client.0.vm06.stdout:1/238: link d11/d14/d1d/d42/f52 d11/d14/d1d/d1e/f53 0 2026-03-09T17:29:25.282 INFO:tasks.workunit.client.0.vm06.stdout:4/213: getdents db/d1d/d21 0 2026-03-09T17:29:25.286 INFO:tasks.workunit.client.0.vm06.stdout:6/192: mkdir d6/d12/d17/d21/d3e 0 2026-03-09T17:29:25.291 INFO:tasks.workunit.client.0.vm06.stdout:2/216: creat d3/d4/d38/f49 x:0 0 0 2026-03-09T17:29:25.294 INFO:tasks.workunit.client.0.vm06.stdout:7/232: mknod d5/c36 0 2026-03-09T17:29:25.297 INFO:tasks.workunit.client.0.vm06.stdout:5/172: unlink d4/c33 0 
2026-03-09T17:29:25.298 INFO:tasks.workunit.client.0.vm06.stdout:5/173: stat d4/l12 0 2026-03-09T17:29:25.300 INFO:tasks.workunit.client.0.vm06.stdout:8/199: creat d15/d39/f40 x:0 0 0 2026-03-09T17:29:25.304 INFO:tasks.workunit.client.0.vm06.stdout:0/259: getdents d7/d11/d19/d37 0 2026-03-09T17:29:25.305 INFO:tasks.workunit.client.0.vm06.stdout:0/260: chown d7/d11/d19/d23/f49 1 1 2026-03-09T17:29:25.308 INFO:tasks.workunit.client.0.vm06.stdout:4/214: symlink db/d1d/d21/l4c 0 2026-03-09T17:29:25.312 INFO:tasks.workunit.client.0.vm06.stdout:6/193: rmdir d6/d12/d17/d27 39 2026-03-09T17:29:25.313 INFO:tasks.workunit.client.0.vm06.stdout:6/194: write d6/d12/d17/d21/f3c [772853,79350] 0 2026-03-09T17:29:25.314 INFO:tasks.workunit.client.0.vm06.stdout:6/195: truncate d6/d12/d17/d21/f38 864068 0 2026-03-09T17:29:25.323 INFO:tasks.workunit.client.0.vm06.stdout:6/196: dread d6/d12/f31 [0,4194304] 0 2026-03-09T17:29:25.327 INFO:tasks.workunit.client.0.vm06.stdout:7/233: rename d5/c36 to d5/d1f/c37 0 2026-03-09T17:29:25.328 INFO:tasks.workunit.client.0.vm06.stdout:7/234: dread - d5/d12/f2c zero size 2026-03-09T17:29:25.331 INFO:tasks.workunit.client.0.vm06.stdout:5/174: dread d4/f20 [0,4194304] 0 2026-03-09T17:29:25.336 INFO:tasks.workunit.client.0.vm06.stdout:1/239: dwrite d11/d14/d1d/d1e/f53 [0,4194304] 0 2026-03-09T17:29:25.352 INFO:tasks.workunit.client.0.vm06.stdout:4/215: creat db/df/f4d x:0 0 0 2026-03-09T17:29:25.352 INFO:tasks.workunit.client.0.vm06.stdout:3/170: getdents dd/d19/d25 0 2026-03-09T17:29:25.357 INFO:tasks.workunit.client.0.vm06.stdout:9/260: getdents d3 0 2026-03-09T17:29:25.362 INFO:tasks.workunit.client.0.vm06.stdout:5/175: unlink d4/d9/fe 0 2026-03-09T17:29:25.362 INFO:tasks.workunit.client.0.vm06.stdout:8/200: unlink d15/d16/d1e/c2d 0 2026-03-09T17:29:25.362 INFO:tasks.workunit.client.0.vm06.stdout:8/201: write f0 [3206186,51500] 0 2026-03-09T17:29:25.369 INFO:tasks.workunit.client.0.vm06.stdout:0/261: symlink d7/d11/d19/d1d/d59/d5b/l5e 0 
2026-03-09T17:29:25.372 INFO:tasks.workunit.client.0.vm06.stdout:3/171: creat dd/d19/d28/f32 x:0 0 0 2026-03-09T17:29:25.373 INFO:tasks.workunit.client.0.vm06.stdout:2/217: truncate d3/d4/d22/f2f 694986 0 2026-03-09T17:29:25.374 INFO:tasks.workunit.client.0.vm06.stdout:6/197: mknod d6/d12/d17/c3f 0 2026-03-09T17:29:25.374 INFO:tasks.workunit.client.0.vm06.stdout:1/240: dread f7 [0,4194304] 0 2026-03-09T17:29:25.375 INFO:tasks.workunit.client.0.vm06.stdout:1/241: write d11/d14/d1c/f37 [4022097,84646] 0 2026-03-09T17:29:25.376 INFO:tasks.workunit.client.0.vm06.stdout:9/261: rmdir d3/d15/d16 39 2026-03-09T17:29:25.376 INFO:tasks.workunit.client.0.vm06.stdout:1/242: write d11/d14/d1d/d1e/f53 [2071284,4746] 0 2026-03-09T17:29:25.381 INFO:tasks.workunit.client.0.vm06.stdout:1/243: dwrite d11/d14/d1d/d1e/f47 [0,4194304] 0 2026-03-09T17:29:25.383 INFO:tasks.workunit.client.0.vm06.stdout:5/176: sync 2026-03-09T17:29:25.383 INFO:tasks.workunit.client.0.vm06.stdout:0/262: sync 2026-03-09T17:29:25.384 INFO:tasks.workunit.client.0.vm06.stdout:0/263: truncate d7/f31 1977257 0 2026-03-09T17:29:25.395 INFO:tasks.workunit.client.0.vm06.stdout:0/264: dread d7/f31 [0,4194304] 0 2026-03-09T17:29:25.395 INFO:tasks.workunit.client.0.vm06.stdout:0/265: dread - d7/d11/f35 zero size 2026-03-09T17:29:25.395 INFO:tasks.workunit.client.0.vm06.stdout:0/266: readlink d7/l2b 0 2026-03-09T17:29:25.405 INFO:tasks.workunit.client.0.vm06.stdout:7/235: truncate d5/f8 6480327 0 2026-03-09T17:29:25.405 INFO:tasks.workunit.client.0.vm06.stdout:3/172: mknod dd/d19/d25/c33 0 2026-03-09T17:29:25.426 INFO:tasks.workunit.client.0.vm06.stdout:8/202: mknod d15/d16/d1e/c41 0 2026-03-09T17:29:25.426 INFO:tasks.workunit.client.0.vm06.stdout:0/267: dwrite d7/d11/d2d/f3a [0,4194304] 0 2026-03-09T17:29:25.427 INFO:tasks.workunit.client.0.vm06.stdout:8/203: write fd [2741740,102940] 0 2026-03-09T17:29:25.427 INFO:tasks.workunit.client.0.vm06.stdout:0/268: readlink d7/la 0 2026-03-09T17:29:25.428 
INFO:tasks.workunit.client.0.vm06.stdout:0/269: truncate d7/d11/d19/d3c/f55 566046 0 2026-03-09T17:29:25.431 INFO:tasks.workunit.client.0.vm06.stdout:4/216: creat db/d1d/d21/d25/d4b/f4e x:0 0 0 2026-03-09T17:29:25.444 INFO:tasks.workunit.client.0.vm06.stdout:7/236: creat d5/d12/f38 x:0 0 0 2026-03-09T17:29:25.445 INFO:tasks.workunit.client.0.vm06.stdout:7/237: write d5/dd/f19 [6581550,121021] 0 2026-03-09T17:29:25.445 INFO:tasks.workunit.client.0.vm06.stdout:7/238: stat d5/f16 0 2026-03-09T17:29:25.450 INFO:tasks.workunit.client.0.vm06.stdout:3/173: unlink dd/d1d/d2e/f2f 0 2026-03-09T17:29:25.453 INFO:tasks.workunit.client.0.vm06.stdout:2/218: dread d3/d4/d12/d2b/d2d/f2a [0,4194304] 0 2026-03-09T17:29:25.453 INFO:tasks.workunit.client.0.vm06.stdout:6/198: mkdir d6/d12/d17/d27/d40 0 2026-03-09T17:29:25.454 INFO:tasks.workunit.client.0.vm06.stdout:2/219: stat d3/d4/d12/d2b/d36/l3e 0 2026-03-09T17:29:25.454 INFO:tasks.workunit.client.0.vm06.stdout:2/220: chown d3/cd 5375225 1 2026-03-09T17:29:25.459 INFO:tasks.workunit.client.0.vm06.stdout:1/244: rename d11/d14/d1d/d1e/c24 to d11/d14/d1d/d42/c54 0 2026-03-09T17:29:25.464 INFO:tasks.workunit.client.0.vm06.stdout:0/270: symlink d7/d11/d19/d1d/d59/l5f 0 2026-03-09T17:29:25.465 INFO:tasks.workunit.client.0.vm06.stdout:4/217: dread - db/d1d/d21/d26/d3c/d45/f4a zero size 2026-03-09T17:29:25.472 INFO:tasks.workunit.client.0.vm06.stdout:9/262: truncate d3/d26/f28 639408 0 2026-03-09T17:29:25.472 INFO:tasks.workunit.client.0.vm06.stdout:9/263: fsync d3/f1b 0 2026-03-09T17:29:25.473 INFO:tasks.workunit.client.0.vm06.stdout:3/174: creat dd/d1d/f34 x:0 0 0 2026-03-09T17:29:25.475 INFO:tasks.workunit.client.0.vm06.stdout:6/199: unlink d6/d12/d17/l2b 0 2026-03-09T17:29:25.480 INFO:tasks.workunit.client.0.vm06.stdout:8/204: rename d15/f2e to d15/d16/d1a/f42 0 2026-03-09T17:29:25.480 INFO:tasks.workunit.client.0.vm06.stdout:8/205: fsync f13 0 2026-03-09T17:29:25.481 INFO:tasks.workunit.client.0.vm06.stdout:8/206: chown 
d15/d16/d19/d2b 6 1 2026-03-09T17:29:25.484 INFO:tasks.workunit.client.0.vm06.stdout:0/271: creat d7/d11/d19/d23/f60 x:0 0 0 2026-03-09T17:29:25.485 INFO:tasks.workunit.client.0.vm06.stdout:0/272: chown d7/fb 1 1 2026-03-09T17:29:25.487 INFO:tasks.workunit.client.0.vm06.stdout:7/239: symlink d5/d1f/d34/l39 0 2026-03-09T17:29:25.489 INFO:tasks.workunit.client.0.vm06.stdout:7/240: dread d5/d7/f1d [0,4194304] 0 2026-03-09T17:29:25.492 INFO:tasks.workunit.client.0.vm06.stdout:3/175: rmdir dd/d19 39 2026-03-09T17:29:25.496 INFO:tasks.workunit.client.0.vm06.stdout:3/176: dwrite dd/f1a [0,4194304] 0 2026-03-09T17:29:25.508 INFO:tasks.workunit.client.0.vm06.stdout:6/200: unlink d6/d12/d17/d21/f38 0 2026-03-09T17:29:25.508 INFO:tasks.workunit.client.0.vm06.stdout:6/201: dread - d6/d12/d17/d21/f3a zero size 2026-03-09T17:29:25.512 INFO:tasks.workunit.client.0.vm06.stdout:6/202: dread d6/d12/d17/d21/f3c [0,4194304] 0 2026-03-09T17:29:25.515 INFO:tasks.workunit.client.0.vm06.stdout:6/203: dwrite d6/d12/f31 [4194304,4194304] 0 2026-03-09T17:29:25.517 INFO:tasks.workunit.client.0.vm06.stdout:6/204: write d6/d12/f22 [171455,100390] 0 2026-03-09T17:29:25.535 INFO:tasks.workunit.client.0.vm06.stdout:2/221: dwrite d3/f3b [0,4194304] 0 2026-03-09T17:29:25.537 INFO:tasks.workunit.client.0.vm06.stdout:2/222: truncate d3/d4/d12/f35 250737 0 2026-03-09T17:29:25.538 INFO:tasks.workunit.client.0.vm06.stdout:2/223: stat d3/d4/d12/d2b 0 2026-03-09T17:29:25.546 INFO:tasks.workunit.client.0.vm06.stdout:1/245: creat d11/d14/d1d/d42/d46/f55 x:0 0 0 2026-03-09T17:29:25.549 INFO:tasks.workunit.client.0.vm06.stdout:1/246: chown d11/d14/d1d/d42 265599 1 2026-03-09T17:29:25.550 INFO:tasks.workunit.client.0.vm06.stdout:5/177: getdents d4/d22 0 2026-03-09T17:29:25.551 INFO:tasks.workunit.client.0.vm06.stdout:0/273: read d7/d11/d19/f21 [677916,75261] 0 2026-03-09T17:29:25.552 INFO:tasks.workunit.client.0.vm06.stdout:0/274: chown d7/d11/d2d/l45 217826555 1 2026-03-09T17:29:25.557 
INFO:tasks.workunit.client.0.vm06.stdout:7/241: creat d5/d1f/f3a x:0 0 0 2026-03-09T17:29:25.565 INFO:tasks.workunit.client.0.vm06.stdout:8/207: mkdir d15/d16/d19/d2b/d43 0 2026-03-09T17:29:25.568 INFO:tasks.workunit.client.0.vm06.stdout:1/247: creat d11/d14/d1d/f56 x:0 0 0 2026-03-09T17:29:25.571 INFO:tasks.workunit.client.0.vm06.stdout:0/275: symlink d7/d11/d2d/l61 0 2026-03-09T17:29:25.580 INFO:tasks.workunit.client.0.vm06.stdout:0/276: dread d7/d11/d19/d1d/d39/f4a [0,4194304] 0 2026-03-09T17:29:25.581 INFO:tasks.workunit.client.0.vm06.stdout:0/277: write d7/f56 [207921,110676] 0 2026-03-09T17:29:25.585 INFO:tasks.workunit.client.0.vm06.stdout:7/242: mknod d5/d7/c3b 0 2026-03-09T17:29:25.593 INFO:tasks.workunit.client.0.vm06.stdout:7/243: dread d5/dd/f1a [0,4194304] 0 2026-03-09T17:29:25.605 INFO:tasks.workunit.client.0.vm06.stdout:2/224: mknod d3/d4/d22/d43/c4a 0 2026-03-09T17:29:25.606 INFO:tasks.workunit.client.0.vm06.stdout:2/225: write d3/f24 [3533742,24800] 0 2026-03-09T17:29:25.607 INFO:tasks.workunit.client.0.vm06.stdout:2/226: fdatasync d3/d4/d38/f49 0 2026-03-09T17:29:25.607 INFO:tasks.workunit.client.0.vm06.stdout:2/227: chown d3/d4/c40 24092314 1 2026-03-09T17:29:25.611 INFO:tasks.workunit.client.0.vm06.stdout:1/248: mkdir d11/d14/d1c/d1f/d57 0 2026-03-09T17:29:25.613 INFO:tasks.workunit.client.0.vm06.stdout:5/178: mknod d4/d9/c38 0 2026-03-09T17:29:25.615 INFO:tasks.workunit.client.0.vm06.stdout:4/218: link db/df/f36 db/f4f 0 2026-03-09T17:29:25.621 INFO:tasks.workunit.client.0.vm06.stdout:6/205: truncate d6/d12/d17/f32 1938981 0 2026-03-09T17:29:25.622 INFO:tasks.workunit.client.0.vm06.stdout:6/206: dread d6/d12/d17/d21/f26 [0,4194304] 0 2026-03-09T17:29:25.623 INFO:tasks.workunit.client.0.vm06.stdout:0/278: mknod d7/d11/c62 0 2026-03-09T17:29:25.626 INFO:tasks.workunit.client.0.vm06.stdout:9/264: getdents d3/d15/d37 0 2026-03-09T17:29:25.630 INFO:tasks.workunit.client.0.vm06.stdout:6/207: dread d6/d12/d17/d21/f25 [0,4194304] 0 
2026-03-09T17:29:25.630 INFO:tasks.workunit.client.0.vm06.stdout:6/208: dread - d6/d12/d2d/f39 zero size 2026-03-09T17:29:25.631 INFO:tasks.workunit.client.0.vm06.stdout:3/177: symlink dd/d19/d25/d2d/l35 0 2026-03-09T17:29:25.636 INFO:tasks.workunit.client.0.vm06.stdout:3/178: dwrite dd/f1b [0,4194304] 0 2026-03-09T17:29:25.637 INFO:tasks.workunit.client.0.vm06.stdout:3/179: fdatasync dd/f26 0 2026-03-09T17:29:25.638 INFO:tasks.workunit.client.0.vm06.stdout:3/180: truncate dd/d19/f2b 200124 0 2026-03-09T17:29:25.638 INFO:tasks.workunit.client.0.vm06.stdout:3/181: write f4 [6581197,10816] 0 2026-03-09T17:29:25.639 INFO:tasks.workunit.client.0.vm06.stdout:3/182: chown dd/d19/d28 41 1 2026-03-09T17:29:25.640 INFO:tasks.workunit.client.0.vm06.stdout:3/183: write dd/f1b [1431832,14367] 0 2026-03-09T17:29:25.644 INFO:tasks.workunit.client.0.vm06.stdout:2/228: creat d3/d4/d22/f4b x:0 0 0 2026-03-09T17:29:25.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:25 vm09.local ceph-mon[62061]: pgmap v146: 65 pgs: 65 active+clean; 510 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 3.5 MiB/s rd, 54 MiB/s wr, 312 op/s 2026-03-09T17:29:25.645 INFO:tasks.workunit.client.0.vm06.stdout:1/249: mkdir d11/d14/d1d/d1e/d2a/d34/d58 0 2026-03-09T17:29:25.646 INFO:tasks.workunit.client.0.vm06.stdout:1/250: chown d11/d14/d1d/d42/f44 603825 1 2026-03-09T17:29:25.646 INFO:tasks.workunit.client.0.vm06.stdout:1/251: readlink d11/l32 0 2026-03-09T17:29:25.647 INFO:tasks.workunit.client.0.vm06.stdout:1/252: dread - d11/d14/d1d/d42/d46/f55 zero size 2026-03-09T17:29:25.648 INFO:tasks.workunit.client.0.vm06.stdout:1/253: read - d11/d14/d1d/d1e/d2a/f38 zero size 2026-03-09T17:29:25.658 INFO:tasks.workunit.client.0.vm06.stdout:5/179: creat d4/d9/d35/f39 x:0 0 0 2026-03-09T17:29:25.658 INFO:tasks.workunit.client.0.vm06.stdout:5/180: fsync d4/f17 0 2026-03-09T17:29:25.663 INFO:tasks.workunit.client.0.vm06.stdout:0/279: symlink d7/l63 0 2026-03-09T17:29:25.666 
INFO:tasks.workunit.client.0.vm06.stdout:9/265: readlink d3/d15/l30 0 2026-03-09T17:29:25.666 INFO:tasks.workunit.client.0.vm06.stdout:9/266: chown d3 166040 1 2026-03-09T17:29:25.668 INFO:tasks.workunit.client.0.vm06.stdout:7/244: dwrite d5/f16 [0,4194304] 0 2026-03-09T17:29:25.672 INFO:tasks.workunit.client.0.vm06.stdout:9/267: dwrite d3/d15/f17 [0,4194304] 0 2026-03-09T17:29:25.676 INFO:tasks.workunit.client.0.vm06.stdout:3/184: creat dd/d19/f36 x:0 0 0 2026-03-09T17:29:25.678 INFO:tasks.workunit.client.0.vm06.stdout:2/229: symlink d3/d4/d12/d2b/d36/l4c 0 2026-03-09T17:29:25.689 INFO:tasks.workunit.client.0.vm06.stdout:3/185: dwrite dd/f1b [0,4194304] 0 2026-03-09T17:29:25.696 INFO:tasks.workunit.client.0.vm06.stdout:3/186: dwrite dd/d1d/f34 [0,4194304] 0 2026-03-09T17:29:25.697 INFO:tasks.workunit.client.0.vm06.stdout:1/254: sync 2026-03-09T17:29:25.698 INFO:tasks.workunit.client.0.vm06.stdout:3/187: chown dd/d19/d28/l2a 221640 1 2026-03-09T17:29:25.698 INFO:tasks.workunit.client.0.vm06.stdout:1/255: chown d11/d14/d1d/l33 84269189 1 2026-03-09T17:29:25.708 INFO:tasks.workunit.client.0.vm06.stdout:6/209: mknod d6/c41 0 2026-03-09T17:29:25.708 INFO:tasks.workunit.client.0.vm06.stdout:6/210: fsync d6/d12/f31 0 2026-03-09T17:29:25.709 INFO:tasks.workunit.client.0.vm06.stdout:6/211: write d6/d12/d17/d27/f37 [932901,15520] 0 2026-03-09T17:29:25.710 INFO:tasks.workunit.client.0.vm06.stdout:9/268: readlink d3/d15/l1e 0 2026-03-09T17:29:25.714 INFO:tasks.workunit.client.0.vm06.stdout:9/269: write d3/d11/f2a [585114,85296] 0 2026-03-09T17:29:25.714 INFO:tasks.workunit.client.0.vm06.stdout:9/270: chown d3/c42 30910968 1 2026-03-09T17:29:25.721 INFO:tasks.workunit.client.0.vm06.stdout:8/208: dwrite d15/d16/f23 [0,4194304] 0 2026-03-09T17:29:25.721 INFO:tasks.workunit.client.0.vm06.stdout:8/209: write d15/d16/d1a/f22 [992234,51230] 0 2026-03-09T17:29:25.728 INFO:tasks.workunit.client.0.vm06.stdout:8/210: dwrite d15/d16/d1e/f34 [0,4194304] 0 2026-03-09T17:29:25.737 
INFO:tasks.workunit.client.0.vm06.stdout:0/280: mkdir d7/d11/d5d/d64 0 2026-03-09T17:29:25.755 INFO:tasks.workunit.client.0.vm06.stdout:9/271: creat d3/f50 x:0 0 0 2026-03-09T17:29:25.757 INFO:tasks.workunit.client.0.vm06.stdout:5/181: link d4/d9/d18/f31 d4/f3a 0 2026-03-09T17:29:25.758 INFO:tasks.workunit.client.0.vm06.stdout:5/182: truncate d4/d9/f24 623095 0 2026-03-09T17:29:25.763 INFO:tasks.workunit.client.0.vm06.stdout:8/211: rmdir d15/d16/d1e/d28 39 2026-03-09T17:29:25.769 INFO:tasks.workunit.client.0.vm06.stdout:6/212: link d6/fb d6/d12/d17/d21/d3e/f42 0 2026-03-09T17:29:25.769 INFO:tasks.workunit.client.0.vm06.stdout:6/213: chown d6/d12/d17/c3f 570481684 1 2026-03-09T17:29:25.771 INFO:tasks.workunit.client.0.vm06.stdout:8/212: mknod d15/d16/d19/c44 0 2026-03-09T17:29:25.775 INFO:tasks.workunit.client.0.vm06.stdout:1/256: creat d11/d14/f59 x:0 0 0 2026-03-09T17:29:25.775 INFO:tasks.workunit.client.0.vm06.stdout:1/257: dread - d11/d14/d1d/d1e/d2a/f38 zero size 2026-03-09T17:29:25.776 INFO:tasks.workunit.client.0.vm06.stdout:1/258: write d11/d14/f59 [406144,108503] 0 2026-03-09T17:29:25.780 INFO:tasks.workunit.client.0.vm06.stdout:1/259: dwrite d11/d14/d1d/f56 [0,4194304] 0 2026-03-09T17:29:25.784 INFO:tasks.workunit.client.0.vm06.stdout:2/230: rename d3/d4/c13 to d3/d4/d12/d2b/d2d/c4d 0 2026-03-09T17:29:25.785 INFO:tasks.workunit.client.0.vm06.stdout:2/231: write d3/d4/d38/f49 [320346,121077] 0 2026-03-09T17:29:25.790 INFO:tasks.workunit.client.0.vm06.stdout:1/260: write d11/f18 [4227744,30622] 0 2026-03-09T17:29:25.806 INFO:tasks.workunit.client.0.vm06.stdout:0/281: getdents d7/d11/d19/d1d 0 2026-03-09T17:29:25.807 INFO:tasks.workunit.client.0.vm06.stdout:0/282: write d7/f12 [4860643,107637] 0 2026-03-09T17:29:25.807 INFO:tasks.workunit.client.0.vm06.stdout:1/261: dwrite d11/d14/f17 [4194304,4194304] 0 2026-03-09T17:29:25.807 INFO:tasks.workunit.client.0.vm06.stdout:6/214: mknod d6/d12/d17/c43 0 2026-03-09T17:29:25.807 
INFO:tasks.workunit.client.0.vm06.stdout:2/232: mknod d3/d4/d12/d2b/d36/d37/c4e 0 2026-03-09T17:29:25.807 INFO:tasks.workunit.client.0.vm06.stdout:2/233: dread d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0 2026-03-09T17:29:25.808 INFO:tasks.workunit.client.0.vm06.stdout:5/183: sync 2026-03-09T17:29:25.809 INFO:tasks.workunit.client.0.vm06.stdout:0/283: symlink d7/d11/d19/d1d/d39/l65 0 2026-03-09T17:29:25.814 INFO:tasks.workunit.client.0.vm06.stdout:2/234: unlink d3/d4/d38/f49 0 2026-03-09T17:29:25.815 INFO:tasks.workunit.client.0.vm06.stdout:1/262: rename d11/d14/d1d/d1e/c4d to d11/d14/d1d/d1e/d2a/c5a 0 2026-03-09T17:29:25.816 INFO:tasks.workunit.client.0.vm06.stdout:5/184: symlink d4/d22/l3b 0 2026-03-09T17:29:25.818 INFO:tasks.workunit.client.0.vm06.stdout:6/215: link d6/d12/d17/d21/d3e/f42 d6/d12/d17/d21/f44 0 2026-03-09T17:29:25.819 INFO:tasks.workunit.client.0.vm06.stdout:6/216: write d6/d12/d2d/f39 [585020,109804] 0 2026-03-09T17:29:25.823 INFO:tasks.workunit.client.0.vm06.stdout:1/263: rmdir d11 39 2026-03-09T17:29:25.825 INFO:tasks.workunit.client.0.vm06.stdout:2/235: creat d3/d4/d46/f4f x:0 0 0 2026-03-09T17:29:25.829 INFO:tasks.workunit.client.0.vm06.stdout:2/236: dwrite d3/d4/f3c [0,4194304] 0 2026-03-09T17:29:25.843 INFO:tasks.workunit.client.0.vm06.stdout:1/264: write d11/d14/d1c/f37 [4799461,14400] 0 2026-03-09T17:29:25.846 INFO:tasks.workunit.client.0.vm06.stdout:6/217: mknod d6/d12/d17/d27/d40/c45 0 2026-03-09T17:29:25.851 INFO:tasks.workunit.client.0.vm06.stdout:5/185: dread d4/f11 [0,4194304] 0 2026-03-09T17:29:25.861 INFO:tasks.workunit.client.0.vm06.stdout:5/186: creat d4/d9/d18/f3c x:0 0 0 2026-03-09T17:29:25.865 INFO:tasks.workunit.client.0.vm06.stdout:5/187: dwrite d4/f36 [0,4194304] 0 2026-03-09T17:29:25.865 INFO:tasks.workunit.client.0.vm06.stdout:5/188: fsync d4/f21 0 2026-03-09T17:29:25.869 INFO:tasks.workunit.client.0.vm06.stdout:2/237: rename d3/d4/d12/d2b/d2d/f1a to d3/d4/d38/f50 0 2026-03-09T17:29:25.873 
INFO:tasks.workunit.client.0.vm06.stdout:1/265: link d11/d14/d1d/d1e/d2a/f43 d11/d14/d1c/f5b 0 2026-03-09T17:29:25.874 INFO:tasks.workunit.client.0.vm06.stdout:1/266: read - d11/d14/d1d/d1e/d2a/f43 zero size 2026-03-09T17:29:25.876 INFO:tasks.workunit.client.0.vm06.stdout:6/218: creat d6/f46 x:0 0 0 2026-03-09T17:29:25.879 INFO:tasks.workunit.client.0.vm06.stdout:1/267: creat d11/d14/d1d/d1e/d2a/d34/f5c x:0 0 0 2026-03-09T17:29:25.883 INFO:tasks.workunit.client.0.vm06.stdout:1/268: stat d11/d14/d1d/d1e/d2a/c2c 0 2026-03-09T17:29:25.883 INFO:tasks.workunit.client.0.vm06.stdout:1/269: stat d11 0 2026-03-09T17:29:25.883 INFO:tasks.workunit.client.0.vm06.stdout:1/270: readlink d11/l49 0 2026-03-09T17:29:25.885 INFO:tasks.workunit.client.0.vm06.stdout:1/271: read f10 [988313,108573] 0 2026-03-09T17:29:25.885 INFO:tasks.workunit.client.0.vm06.stdout:1/272: fdatasync f10 0 2026-03-09T17:29:25.887 INFO:tasks.workunit.client.0.vm06.stdout:2/238: sync 2026-03-09T17:29:25.890 INFO:tasks.workunit.client.0.vm06.stdout:1/273: dwrite d11/d14/d1d/d1e/d2a/f50 [0,4194304] 0 2026-03-09T17:29:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:25 vm06.local ceph-mon[57307]: pgmap v146: 65 pgs: 65 active+clean; 510 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 3.5 MiB/s rd, 54 MiB/s wr, 312 op/s 2026-03-09T17:29:25.898 INFO:tasks.workunit.client.0.vm06.stdout:1/274: dwrite d11/d14/d1d/d1e/d2a/f50 [0,4194304] 0 2026-03-09T17:29:25.908 INFO:tasks.workunit.client.0.vm06.stdout:6/219: mkdir d6/d47 0 2026-03-09T17:29:25.915 INFO:tasks.workunit.client.0.vm06.stdout:1/275: symlink d11/l5d 0 2026-03-09T17:29:25.917 INFO:tasks.workunit.client.0.vm06.stdout:2/239: link d3/l47 d3/d4/d12/d2b/d2d/l51 0 2026-03-09T17:29:25.920 INFO:tasks.workunit.client.0.vm06.stdout:1/276: rename d11/d14/d1d/c4b to d11/d14/d1c/d1f/d57/c5e 0 2026-03-09T17:29:25.926 INFO:tasks.workunit.client.0.vm06.stdout:1/277: mkdir d11/d14/d1c/d5f 0 2026-03-09T17:29:25.927 
INFO:tasks.workunit.client.0.vm06.stdout:1/278: write d11/d14/d1d/d42/f52 [245487,94032] 0 2026-03-09T17:29:25.932 INFO:tasks.workunit.client.0.vm06.stdout:1/279: unlink d11/d14/d1d/d1e/l26 0 2026-03-09T17:29:25.937 INFO:tasks.workunit.client.0.vm06.stdout:1/280: link d11/d14/d1c/f37 d11/d14/d1d/d1e/d2a/d34/f60 0 2026-03-09T17:29:25.939 INFO:tasks.workunit.client.0.vm06.stdout:6/220: sync 2026-03-09T17:29:25.944 INFO:tasks.workunit.client.0.vm06.stdout:1/281: dwrite d11/d14/d1d/d1e/d2a/f38 [0,4194304] 0 2026-03-09T17:29:25.951 INFO:tasks.workunit.client.0.vm06.stdout:3/188: getdents dd/d19 0 2026-03-09T17:29:25.953 INFO:tasks.workunit.client.0.vm06.stdout:3/189: dread dd/d1d/f29 [0,4194304] 0 2026-03-09T17:29:25.956 INFO:tasks.workunit.client.0.vm06.stdout:2/240: dread d3/d4/d12/f20 [0,4194304] 0 2026-03-09T17:29:25.957 INFO:tasks.workunit.client.0.vm06.stdout:2/241: chown d3/d4/d12/c30 1 1 2026-03-09T17:29:25.963 INFO:tasks.workunit.client.0.vm06.stdout:6/221: rename d6/d12/d17/f1b to d6/d12/d2d/f48 0 2026-03-09T17:29:25.964 INFO:tasks.workunit.client.0.vm06.stdout:2/242: dwrite d3/d4/d12/f1e [0,4194304] 0 2026-03-09T17:29:25.971 INFO:tasks.workunit.client.0.vm06.stdout:1/282: symlink d11/d14/d1d/d1e/d2a/d34/l61 0 2026-03-09T17:29:25.972 INFO:tasks.workunit.client.0.vm06.stdout:1/283: truncate d11/d14/d1d/d42/d46/f55 765907 0 2026-03-09T17:29:25.980 INFO:tasks.workunit.client.0.vm06.stdout:4/219: truncate db/d1d/d21/d25/f38 488794 0 2026-03-09T17:29:25.984 INFO:tasks.workunit.client.0.vm06.stdout:7/245: dwrite f0 [0,4194304] 0 2026-03-09T17:29:25.986 INFO:tasks.workunit.client.0.vm06.stdout:1/284: dread d11/d14/d1c/f2e [0,4194304] 0 2026-03-09T17:29:25.999 INFO:tasks.workunit.client.0.vm06.stdout:3/190: link dd/f1b dd/d19/d2c/f37 0 2026-03-09T17:29:26.007 INFO:tasks.workunit.client.0.vm06.stdout:6/222: creat d6/d47/f49 x:0 0 0 2026-03-09T17:29:26.007 INFO:tasks.workunit.client.0.vm06.stdout:4/220: sync 2026-03-09T17:29:26.008 
INFO:tasks.workunit.client.0.vm06.stdout:4/221: chown db/df/l3f 1194068560 1 2026-03-09T17:29:26.034 INFO:tasks.workunit.client.0.vm06.stdout:7/246: rename d5/d12/l1c to d5/d1f/d34/l3c 0 2026-03-09T17:29:26.034 INFO:tasks.workunit.client.0.vm06.stdout:7/247: dread - d5/d12/f32 zero size 2026-03-09T17:29:26.035 INFO:tasks.workunit.client.0.vm06.stdout:7/248: chown d5/d1f/d34/l39 5171000 1 2026-03-09T17:29:26.035 INFO:tasks.workunit.client.0.vm06.stdout:7/249: rename d5/d7 to d5/d7/d3d 22 2026-03-09T17:29:26.043 INFO:tasks.workunit.client.0.vm06.stdout:1/285: dwrite d11/d14/d1c/f5b [0,4194304] 0 2026-03-09T17:29:26.067 INFO:tasks.workunit.client.0.vm06.stdout:9/272: write d3/d26/f29 [584851,54999] 0 2026-03-09T17:29:26.069 INFO:tasks.workunit.client.0.vm06.stdout:8/213: write d15/d16/d1e/d28/f2a [329962,87355] 0 2026-03-09T17:29:26.080 INFO:tasks.workunit.client.0.vm06.stdout:8/214: write d15/d16/d1a/f42 [5060692,107013] 0 2026-03-09T17:29:26.086 INFO:tasks.workunit.client.0.vm06.stdout:8/215: creat d15/d39/f45 x:0 0 0 2026-03-09T17:29:26.092 INFO:tasks.workunit.client.0.vm06.stdout:0/284: truncate d7/f12 4148829 0 2026-03-09T17:29:26.094 INFO:tasks.workunit.client.0.vm06.stdout:5/189: getdents d4/d9/d18 0 2026-03-09T17:29:26.095 INFO:tasks.workunit.client.0.vm06.stdout:5/190: truncate d4/f3a 716890 0 2026-03-09T17:29:26.098 INFO:tasks.workunit.client.0.vm06.stdout:5/191: dwrite d4/f7 [0,4194304] 0 2026-03-09T17:29:26.110 INFO:tasks.workunit.client.0.vm06.stdout:5/192: mkdir d4/d9/d18/d3d 0 2026-03-09T17:29:26.114 INFO:tasks.workunit.client.0.vm06.stdout:5/193: getdents d4/d22 0 2026-03-09T17:29:26.143 INFO:tasks.workunit.client.0.vm06.stdout:9/273: rename d3/d15/d16/l3b to d3/d15/l51 0 2026-03-09T17:29:26.144 INFO:tasks.workunit.client.0.vm06.stdout:7/250: getdents d5/d1f/d34 0 2026-03-09T17:29:26.154 INFO:tasks.workunit.client.0.vm06.stdout:3/191: dwrite fc [0,4194304] 0 2026-03-09T17:29:26.157 INFO:tasks.workunit.client.0.vm06.stdout:8/216: rename d15/d16/d1e/f2f 
to d15/d16/d19/d2b/f46 0 2026-03-09T17:29:26.157 INFO:tasks.workunit.client.0.vm06.stdout:0/285: rename d7/d11/d19 to d7/d11/d19/d1d/d39/d66 22 2026-03-09T17:29:26.157 INFO:tasks.workunit.client.0.vm06.stdout:8/217: fdatasync d15/d16/f3f 0 2026-03-09T17:29:26.161 INFO:tasks.workunit.client.0.vm06.stdout:4/222: write db/d1d/f3a [1383736,60973] 0 2026-03-09T17:29:26.162 INFO:tasks.workunit.client.0.vm06.stdout:4/223: write db/df/f14 [10354410,114565] 0 2026-03-09T17:29:26.165 INFO:tasks.workunit.client.0.vm06.stdout:9/274: symlink d3/d15/d36/l52 0 2026-03-09T17:29:26.165 INFO:tasks.workunit.client.0.vm06.stdout:9/275: fdatasync d3/d26/d35/f44 0 2026-03-09T17:29:26.167 INFO:tasks.workunit.client.0.vm06.stdout:7/251: sync 2026-03-09T17:29:26.167 INFO:tasks.workunit.client.0.vm06.stdout:9/276: truncate d3/d15/f46 527952 0 2026-03-09T17:29:26.175 INFO:tasks.workunit.client.0.vm06.stdout:0/286: dwrite d7/d11/d19/d1d/f43 [0,4194304] 0 2026-03-09T17:29:26.181 INFO:tasks.workunit.client.0.vm06.stdout:9/277: dwrite d3/d15/d36/f49 [0,4194304] 0 2026-03-09T17:29:26.186 INFO:tasks.workunit.client.0.vm06.stdout:8/218: mkdir d15/d16/d1a/d47 0 2026-03-09T17:29:26.191 INFO:tasks.workunit.client.0.vm06.stdout:9/278: dwrite d3/d26/f29 [0,4194304] 0 2026-03-09T17:29:26.209 INFO:tasks.workunit.client.0.vm06.stdout:0/287: truncate d7/fb 3882775 0 2026-03-09T17:29:26.216 INFO:tasks.workunit.client.0.vm06.stdout:9/279: symlink d3/d15/d36/l53 0 2026-03-09T17:29:26.222 INFO:tasks.workunit.client.0.vm06.stdout:9/280: dwrite d3/f1b [0,4194304] 0 2026-03-09T17:29:26.225 INFO:tasks.workunit.client.0.vm06.stdout:2/243: creat d3/d4/f52 x:0 0 0 2026-03-09T17:29:26.228 INFO:tasks.workunit.client.0.vm06.stdout:9/281: read d3/d15/f1a [3888125,49206] 0 2026-03-09T17:29:26.231 INFO:tasks.workunit.client.0.vm06.stdout:4/224: dread db/d1d/f3a [0,4194304] 0 2026-03-09T17:29:26.240 INFO:tasks.workunit.client.0.vm06.stdout:9/282: rename d3/d15/d37 to d3/d15/d16/d54 0 2026-03-09T17:29:26.254 
INFO:tasks.workunit.client.0.vm06.stdout:4/225: dwrite db/d1d/d21/f42 [0,4194304] 0 2026-03-09T17:29:26.266 INFO:tasks.workunit.client.0.vm06.stdout:6/223: link d6/d12/f31 d6/f4a 0 2026-03-09T17:29:26.271 INFO:tasks.workunit.client.0.vm06.stdout:9/283: creat d3/d15/d36/d4c/f55 x:0 0 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:9/284: write d3/d15/d36/d4c/f55 [145586,86138] 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:7/252: rmdir d5/dd 39 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:6/224: dread d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:6/225: stat d6 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:8/219: unlink cb 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:8/220: chown d15/d16/d1a/l1d 7737 1 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:9/285: symlink d3/d26/l56 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:3/192: truncate dd/d19/d25/f31 3917920 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:8/221: mkdir d15/d31/d48 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:0/288: unlink d7/c16 0 2026-03-09T17:29:26.294 INFO:tasks.workunit.client.0.vm06.stdout:8/222: dread fd [0,4194304] 0 2026-03-09T17:29:26.299 INFO:tasks.workunit.client.0.vm06.stdout:4/226: symlink db/d1d/d21/d44/l50 0 2026-03-09T17:29:26.303 INFO:tasks.workunit.client.0.vm06.stdout:9/286: creat d3/d26/f57 x:0 0 0 2026-03-09T17:29:26.313 INFO:tasks.workunit.client.0.vm06.stdout:7/253: creat d5/dd/f3e x:0 0 0 2026-03-09T17:29:26.313 INFO:tasks.workunit.client.0.vm06.stdout:5/194: rmdir d4/d9 39 2026-03-09T17:29:26.319 INFO:tasks.workunit.client.0.vm06.stdout:0/289: symlink d7/d11/d19/d1d/d39/l67 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:0/290: write d7/d11/f13 [5196724,2270] 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:8/223: 
truncate fa 4031038 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:8/224: stat d15/d16/d19/c25 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:7/254: readlink d5/l15 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:0/291: creat d7/d11/d19/f68 x:0 0 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:8/225: mknod d15/d31/c49 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:7/255: unlink d5/d1f/d34/l39 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:5/195: creat d4/d9/d18/f3e x:0 0 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:5/196: write d4/fb [6618412,118678] 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:5/197: dread - d4/d9/d18/f3c zero size 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:3/193: creat dd/f38 x:0 0 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:9/287: creat d3/d15/f58 x:0 0 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:7/256: mkdir d5/d1f/d34/d3f 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:7/257: chown d5/d12 450913385 1 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:5/198: creat d4/d22/f3f x:0 0 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:0/292: fsync d7/d11/d19/d1d/f4c 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:8/226: readlink d15/d16/d1e/d30/l38 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:4/227: rmdir db/d1d/d21/d37/d49 0 2026-03-09T17:29:26.360 INFO:tasks.workunit.client.0.vm06.stdout:7/258: symlink d5/d7/l40 0 2026-03-09T17:29:26.365 INFO:tasks.workunit.client.0.vm06.stdout:7/259: creat d5/d1f/d34/f41 x:0 0 0 2026-03-09T17:29:26.367 INFO:tasks.workunit.client.0.vm06.stdout:8/227: mknod d15/d16/d1e/d30/c4a 0 2026-03-09T17:29:26.375 INFO:tasks.workunit.client.0.vm06.stdout:3/194: getdents dd/d1d 0 2026-03-09T17:29:26.385 
INFO:tasks.workunit.client.0.vm06.stdout:8/228: rename f0 to d15/d39/f4b 0 2026-03-09T17:29:26.387 INFO:tasks.workunit.client.0.vm06.stdout:9/288: getdents d3/d11 0 2026-03-09T17:29:26.388 INFO:tasks.workunit.client.0.vm06.stdout:9/289: write d3/d15/f23 [3622753,2844] 0 2026-03-09T17:29:26.389 INFO:tasks.workunit.client.0.vm06.stdout:1/286: symlink d11/d14/l62 0 2026-03-09T17:29:26.392 INFO:tasks.workunit.client.0.vm06.stdout:2/244: truncate d3/d4/f1f 2145654 0 2026-03-09T17:29:26.396 INFO:tasks.workunit.client.0.vm06.stdout:7/260: getdents d5/d1f/d34/d3f 0 2026-03-09T17:29:26.396 INFO:tasks.workunit.client.0.vm06.stdout:3/195: symlink dd/d19/l39 0 2026-03-09T17:29:26.396 INFO:tasks.workunit.client.0.vm06.stdout:8/229: chown f12 291322063 1 2026-03-09T17:29:26.406 INFO:tasks.workunit.client.0.vm06.stdout:6/226: dwrite d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:26.419 INFO:tasks.workunit.client.0.vm06.stdout:0/293: rmdir d7/d11/d19 39 2026-03-09T17:29:26.421 INFO:tasks.workunit.client.0.vm06.stdout:4/228: getdents db/d1d/d21/d26/d3c 0 2026-03-09T17:29:26.426 INFO:tasks.workunit.client.0.vm06.stdout:1/287: dread d11/d14/f59 [0,4194304] 0 2026-03-09T17:29:26.444 INFO:tasks.workunit.client.0.vm06.stdout:3/196: creat dd/d1d/d2e/f3a x:0 0 0 2026-03-09T17:29:26.455 INFO:tasks.workunit.client.0.vm06.stdout:5/199: dwrite d4/f3a [0,4194304] 0 2026-03-09T17:29:26.465 INFO:tasks.workunit.client.0.vm06.stdout:1/288: symlink d11/d14/d1d/d1e/d2a/l63 0 2026-03-09T17:29:26.469 INFO:tasks.workunit.client.0.vm06.stdout:2/245: link f2 d3/d4/d22/d43/f53 0 2026-03-09T17:29:26.470 INFO:tasks.workunit.client.0.vm06.stdout:2/246: truncate d3/d4/f52 308889 0 2026-03-09T17:29:26.471 INFO:tasks.workunit.client.0.vm06.stdout:2/247: write d3/f21 [785380,107816] 0 2026-03-09T17:29:26.472 INFO:tasks.workunit.client.0.vm06.stdout:7/261: creat d5/d7/d2b/f42 x:0 0 0 2026-03-09T17:29:26.473 INFO:tasks.workunit.client.0.vm06.stdout:7/262: write d5/f28 [1710596,34263] 0 2026-03-09T17:29:26.474 
INFO:tasks.workunit.client.0.vm06.stdout:3/197: rmdir dd/d19/d2c 39 2026-03-09T17:29:26.477 INFO:tasks.workunit.client.0.vm06.stdout:3/198: dwrite f7 [8388608,4194304] 0 2026-03-09T17:29:26.481 INFO:tasks.workunit.client.0.vm06.stdout:4/229: dread db/d1d/d21/f2f [0,4194304] 0 2026-03-09T17:29:26.491 INFO:tasks.workunit.client.0.vm06.stdout:5/200: mkdir d4/d9/d35/d40 0 2026-03-09T17:29:26.494 INFO:tasks.workunit.client.0.vm06.stdout:5/201: dread d4/d9/d18/f31 [0,4194304] 0 2026-03-09T17:29:26.504 INFO:tasks.workunit.client.0.vm06.stdout:1/289: mkdir d11/d14/d1d/d1e/d2a/d34/d64 0 2026-03-09T17:29:26.507 INFO:tasks.workunit.client.0.vm06.stdout:2/248: creat d3/d4/d12/d34/f54 x:0 0 0 2026-03-09T17:29:26.508 INFO:tasks.workunit.client.0.vm06.stdout:2/249: read - d3/d4/d12/d2b/d36/d37/f41 zero size 2026-03-09T17:29:26.509 INFO:tasks.workunit.client.0.vm06.stdout:7/263: rmdir d5 39 2026-03-09T17:29:26.509 INFO:tasks.workunit.client.0.vm06.stdout:2/250: read d3/d4/d12/d2b/f32 [546845,38088] 0 2026-03-09T17:29:26.514 INFO:tasks.workunit.client.0.vm06.stdout:3/199: symlink dd/d19/d28/l3b 0 2026-03-09T17:29:26.515 INFO:tasks.workunit.client.0.vm06.stdout:3/200: write dd/d19/d1e/f23 [2600422,105336] 0 2026-03-09T17:29:26.526 INFO:tasks.workunit.client.0.vm06.stdout:2/251: mknod d3/d4/d12/d34/c55 0 2026-03-09T17:29:26.527 INFO:tasks.workunit.client.0.vm06.stdout:8/230: link d15/d16/d1e/d30/f3b d15/d16/d19/d3d/f4c 0 2026-03-09T17:29:26.530 INFO:tasks.workunit.client.0.vm06.stdout:8/231: dwrite f7 [0,4194304] 0 2026-03-09T17:29:26.547 INFO:tasks.workunit.client.0.vm06.stdout:3/201: rename ca to dd/d19/d1e/c3c 0 2026-03-09T17:29:26.548 INFO:tasks.workunit.client.0.vm06.stdout:3/202: dread - dd/d1d/d2e/f3a zero size 2026-03-09T17:29:26.551 INFO:tasks.workunit.client.0.vm06.stdout:6/227: write d6/d12/d17/d27/f37 [5501705,44463] 0 2026-03-09T17:29:26.557 INFO:tasks.workunit.client.0.vm06.stdout:9/290: write d3/d26/f28 [347283,125575] 0 2026-03-09T17:29:26.558 
INFO:tasks.workunit.client.0.vm06.stdout:3/203: read f4 [6517893,64178] 0 2026-03-09T17:29:26.559 INFO:tasks.workunit.client.0.vm06.stdout:2/252: dread d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0 2026-03-09T17:29:26.560 INFO:tasks.workunit.client.0.vm06.stdout:2/253: truncate d3/d4/d12/d34/f54 652463 0 2026-03-09T17:29:26.576 INFO:tasks.workunit.client.0.vm06.stdout:4/230: creat db/f51 x:0 0 0 2026-03-09T17:29:26.577 INFO:tasks.workunit.client.0.vm06.stdout:6/228: chown d6/l20 29483 1 2026-03-09T17:29:26.577 INFO:tasks.workunit.client.0.vm06.stdout:8/232: fsync f7 0 2026-03-09T17:29:26.578 INFO:tasks.workunit.client.0.vm06.stdout:8/233: readlink d15/d16/l1c 0 2026-03-09T17:29:26.579 INFO:tasks.workunit.client.0.vm06.stdout:5/202: creat d4/d9/f41 x:0 0 0 2026-03-09T17:29:26.582 INFO:tasks.workunit.client.0.vm06.stdout:5/203: dread d4/f3a [0,4194304] 0 2026-03-09T17:29:26.583 INFO:tasks.workunit.client.0.vm06.stdout:6/229: dread d6/d12/f22 [0,4194304] 0 2026-03-09T17:29:26.584 INFO:tasks.workunit.client.0.vm06.stdout:1/290: link d11/d14/d1d/f31 d11/d14/d1d/d1e/f65 0 2026-03-09T17:29:26.585 INFO:tasks.workunit.client.0.vm06.stdout:7/264: symlink d5/d1f/d34/d3f/l43 0 2026-03-09T17:29:26.586 INFO:tasks.workunit.client.0.vm06.stdout:7/265: read - d5/d7/d2b/f42 zero size 2026-03-09T17:29:26.587 INFO:tasks.workunit.client.0.vm06.stdout:9/291: creat d3/d11/f59 x:0 0 0 2026-03-09T17:29:26.589 INFO:tasks.workunit.client.0.vm06.stdout:6/230: dread d6/f4a [0,4194304] 0 2026-03-09T17:29:26.589 INFO:tasks.workunit.client.0.vm06.stdout:6/231: chown d6/d12/d17/d27/f37 2 1 2026-03-09T17:29:26.591 INFO:tasks.workunit.client.0.vm06.stdout:1/291: dread d11/d14/d1d/f56 [0,4194304] 0 2026-03-09T17:29:26.596 INFO:tasks.workunit.client.0.vm06.stdout:4/231: creat db/d1d/d21/d26/d3c/f52 x:0 0 0 2026-03-09T17:29:26.596 INFO:tasks.workunit.client.0.vm06.stdout:4/232: chown db/d1d/c1e 205119 1 2026-03-09T17:29:26.597 INFO:tasks.workunit.client.0.vm06.stdout:4/233: read db/d1d/f3a [194635,45020] 0 
2026-03-09T17:29:26.597 INFO:tasks.workunit.client.0.vm06.stdout:4/234: readlink db/d1d/d21/d37/l43 0 2026-03-09T17:29:26.598 INFO:tasks.workunit.client.0.vm06.stdout:4/235: chown db/d1d/d21/d44/l50 92 1 2026-03-09T17:29:26.598 INFO:tasks.workunit.client.0.vm06.stdout:4/236: dread - db/d1d/d21/d26/d3c/d45/f4a zero size 2026-03-09T17:29:26.604 INFO:tasks.workunit.client.0.vm06.stdout:8/234: unlink fd 0 2026-03-09T17:29:26.605 INFO:tasks.workunit.client.0.vm06.stdout:0/294: truncate d7/d11/d19/d1d/d39/f4a 3954977 0 2026-03-09T17:29:26.606 INFO:tasks.workunit.client.0.vm06.stdout:0/295: read d7/d11/f29 [527758,24597] 0 2026-03-09T17:29:26.610 INFO:tasks.workunit.client.0.vm06.stdout:7/266: chown d5/f8 1 1 2026-03-09T17:29:26.611 INFO:tasks.workunit.client.0.vm06.stdout:7/267: write d5/d1f/f3a [1015448,5862] 0 2026-03-09T17:29:26.611 INFO:tasks.workunit.client.0.vm06.stdout:7/268: fsync d5/d1f/d34/f41 0 2026-03-09T17:29:26.612 INFO:tasks.workunit.client.0.vm06.stdout:9/292: creat d3/d15/d36/d4c/f5a x:0 0 0 2026-03-09T17:29:26.615 INFO:tasks.workunit.client.0.vm06.stdout:7/269: dread f0 [0,4194304] 0 2026-03-09T17:29:26.618 INFO:tasks.workunit.client.0.vm06.stdout:9/293: dwrite d3/d11/f2a [0,4194304] 0 2026-03-09T17:29:26.624 INFO:tasks.workunit.client.0.vm06.stdout:6/232: rename d6/l30 to d6/d12/d17/d21/l4b 0 2026-03-09T17:29:26.633 INFO:tasks.workunit.client.0.vm06.stdout:4/237: creat db/d1d/d21/d25/f53 x:0 0 0 2026-03-09T17:29:26.633 INFO:tasks.workunit.client.0.vm06.stdout:4/238: write db/d1d/d21/d26/d3c/d45/f4a [748755,41941] 0 2026-03-09T17:29:26.638 INFO:tasks.workunit.client.0.vm06.stdout:0/296: fdatasync d7/f12 0 2026-03-09T17:29:26.657 INFO:tasks.workunit.client.0.vm06.stdout:5/204: mknod d4/d9/d18/d3d/c42 0 2026-03-09T17:29:26.657 INFO:tasks.workunit.client.0.vm06.stdout:0/297: dwrite d7/d11/d19/d23/f49 [0,4194304] 0 2026-03-09T17:29:26.657 INFO:tasks.workunit.client.0.vm06.stdout:0/298: read d7/f27 [318420,101316] 0 2026-03-09T17:29:26.657 
INFO:tasks.workunit.client.0.vm06.stdout:0/299: chown d7/d11/f35 63994 1 2026-03-09T17:29:26.657 INFO:tasks.workunit.client.0.vm06.stdout:9/294: creat d3/d15/d16/d54/f5b x:0 0 0 2026-03-09T17:29:26.657 INFO:tasks.workunit.client.0.vm06.stdout:1/292: fdatasync d11/d14/d1c/d1f/f21 0 2026-03-09T17:29:26.661 INFO:tasks.workunit.client.0.vm06.stdout:0/300: mknod d7/d11/d19/d37/c69 0 2026-03-09T17:29:26.663 INFO:tasks.workunit.client.0.vm06.stdout:2/254: getdents d3/d4/d46 0 2026-03-09T17:29:26.664 INFO:tasks.workunit.client.0.vm06.stdout:3/204: link dd/d19/d28/l3b dd/d19/d2c/l3d 0 2026-03-09T17:29:26.664 INFO:tasks.workunit.client.0.vm06.stdout:3/205: read dd/f1b [3601269,109435] 0 2026-03-09T17:29:26.667 INFO:tasks.workunit.client.0.vm06.stdout:1/293: unlink d11/d14/d1c/f37 0 2026-03-09T17:29:26.671 INFO:tasks.workunit.client.0.vm06.stdout:9/295: link d3/d15/f17 d3/d15/d16/f5c 0 2026-03-09T17:29:26.706 INFO:tasks.workunit.client.0.vm06.stdout:1/294: symlink d11/d14/d1c/d1f/d57/l66 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:4/239: link db/f13 db/d1d/d21/d37/f54 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:8/235: getdents d15/d16/d1e 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:5/205: rename d4/f36 to d4/d9/f43 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:0/301: creat d7/d11/d5d/d64/f6a x:0 0 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:1/295: chown d11/d14/d1d/d1e/d2a/d34/f60 0 1 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:0/302: dread d7/d11/d19/d1d/f43 [0,4194304] 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:0/303: fdatasync d7/d11/d19/f68 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:5/206: read d4/fd [239191,21950] 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:3/206: getdents dd/d1d 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:3/207: truncate dd/f10 
1003234 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:3/208: chown dd/d19/l1c 34225 1 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:9/296: mknod d3/d15/d36/d4d/c5d 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:9/297: dwrite d3/d15/d36/d4c/f5a [0,4194304] 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:0/304: creat d7/d11/d5d/d64/f6b x:0 0 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:5/207: creat d4/d9/d18/d3d/f44 x:0 0 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:5/208: dwrite d4/f2d [0,4194304] 0 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:5/209: dread - d4/f26 zero size 2026-03-09T17:29:26.716 INFO:tasks.workunit.client.0.vm06.stdout:8/236: mknod d15/d16/d1e/d30/c4d 0 2026-03-09T17:29:26.717 INFO:tasks.workunit.client.0.vm06.stdout:3/209: mknod dd/d1d/d2e/c3e 0 2026-03-09T17:29:26.717 INFO:tasks.workunit.client.0.vm06.stdout:1/296: mknod d11/d14/c67 0 2026-03-09T17:29:26.717 INFO:tasks.workunit.client.0.vm06.stdout:0/305: mknod d7/d11/d19/d1d/d59/d5b/c6c 0 2026-03-09T17:29:26.717 INFO:tasks.workunit.client.0.vm06.stdout:0/306: chown d7/d11/d2d/f44 615237542 1 2026-03-09T17:29:26.718 INFO:tasks.workunit.client.0.vm06.stdout:5/210: creat d4/d22/f45 x:0 0 0 2026-03-09T17:29:26.720 INFO:tasks.workunit.client.0.vm06.stdout:8/237: creat d15/d16/d1e/f4e x:0 0 0 2026-03-09T17:29:26.722 INFO:tasks.workunit.client.0.vm06.stdout:8/238: stat d15/d16/d19/c25 0 2026-03-09T17:29:26.722 INFO:tasks.workunit.client.0.vm06.stdout:8/239: truncate d15/d39/f40 853692 0 2026-03-09T17:29:26.726 INFO:tasks.workunit.client.0.vm06.stdout:3/210: dwrite dd/d19/d25/f31 [4194304,4194304] 0 2026-03-09T17:29:26.730 INFO:tasks.workunit.client.0.vm06.stdout:0/307: dread d7/d11/f29 [0,4194304] 0 2026-03-09T17:29:26.736 INFO:tasks.workunit.client.0.vm06.stdout:5/211: mkdir d4/d22/d46 0 2026-03-09T17:29:26.737 INFO:tasks.workunit.client.0.vm06.stdout:9/298: 
rename d3/d26/d35/f44 to d3/d15/f5e 0 2026-03-09T17:29:26.741 INFO:tasks.workunit.client.0.vm06.stdout:8/240: creat d15/d16/d19/f4f x:0 0 0 2026-03-09T17:29:26.743 INFO:tasks.workunit.client.0.vm06.stdout:3/211: creat dd/d19/d1e/f3f x:0 0 0 2026-03-09T17:29:26.745 INFO:tasks.workunit.client.0.vm06.stdout:1/297: link d11/d14/d1d/d1e/d2a/d34/f3b d11/d14/d1c/d1f/f68 0 2026-03-09T17:29:26.747 INFO:tasks.workunit.client.0.vm06.stdout:0/308: creat d7/d11/d19/d37/f6d x:0 0 0 2026-03-09T17:29:26.747 INFO:tasks.workunit.client.0.vm06.stdout:0/309: chown d7/d11/d19/f21 147085 1 2026-03-09T17:29:26.748 INFO:tasks.workunit.client.0.vm06.stdout:1/298: dread d11/d14/f17 [4194304,4194304] 0 2026-03-09T17:29:26.748 INFO:tasks.workunit.client.0.vm06.stdout:0/310: readlink d7/d11/d19/l1a 0 2026-03-09T17:29:26.748 INFO:tasks.workunit.client.0.vm06.stdout:0/311: write d7/f12 [878244,47380] 0 2026-03-09T17:29:26.749 INFO:tasks.workunit.client.0.vm06.stdout:5/212: mknod d4/d22/c47 0 2026-03-09T17:29:26.752 INFO:tasks.workunit.client.0.vm06.stdout:5/213: creat d4/d9/d18/f48 x:0 0 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:0/312: symlink d7/d11/d19/d1d/d39/l6e 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:0/313: write d7/d11/d19/f57 [235062,108861] 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:0/314: read - d7/f50 zero size 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:1/299: mkdir d11/d69 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:0/315: mkdir d7/d11/d19/d23/d6f 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:0/316: write d7/d11/d2d/f3a [2983495,37700] 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:5/214: creat d4/f49 x:0 0 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:1/300: creat d11/d14/d1d/d1e/d2a/d34/d58/f6a x:0 0 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:5/215: rename d4/d9/f10 to 
d4/d9/d18/f4a 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:5/216: write d4/d22/f3f [145889,70084] 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:5/217: creat d4/d9/d18/f4b x:0 0 0 2026-03-09T17:29:26.786 INFO:tasks.workunit.client.0.vm06.stdout:5/218: symlink d4/d9/l4c 0 2026-03-09T17:29:26.814 INFO:tasks.workunit.client.0.vm06.stdout:2/255: truncate d3/d4/f1f 1030981 0 2026-03-09T17:29:26.815 INFO:tasks.workunit.client.0.vm06.stdout:2/256: mknod d3/d4/d38/c56 0 2026-03-09T17:29:26.820 INFO:tasks.workunit.client.0.vm06.stdout:3/212: dread dd/f15 [0,4194304] 0 2026-03-09T17:29:26.821 INFO:tasks.workunit.client.0.vm06.stdout:3/213: write dd/d19/d1e/f23 [3727087,8536] 0 2026-03-09T17:29:26.824 INFO:tasks.workunit.client.0.vm06.stdout:3/214: dwrite f7 [8388608,4194304] 0 2026-03-09T17:29:26.827 INFO:tasks.workunit.client.0.vm06.stdout:3/215: write dd/d19/f36 [159669,11777] 0 2026-03-09T17:29:26.827 INFO:tasks.workunit.client.0.vm06.stdout:3/216: chown fc 1701 1 2026-03-09T17:29:26.830 INFO:tasks.workunit.client.0.vm06.stdout:3/217: dwrite dd/f14 [0,4194304] 0 2026-03-09T17:29:26.848 INFO:tasks.workunit.client.0.vm06.stdout:2/257: link d3/d4/c16 d3/d4/d12/d2b/d2d/c57 0 2026-03-09T17:29:26.848 INFO:tasks.workunit.client.0.vm06.stdout:2/258: read - d3/d4/d12/f2e zero size 2026-03-09T17:29:26.849 INFO:tasks.workunit.client.0.vm06.stdout:2/259: fdatasync d3/d4/d22/f28 0 2026-03-09T17:29:26.853 INFO:tasks.workunit.client.0.vm06.stdout:6/233: write d6/d12/d17/f29 [591119,34950] 0 2026-03-09T17:29:26.854 INFO:tasks.workunit.client.0.vm06.stdout:3/218: creat dd/f40 x:0 0 0 2026-03-09T17:29:26.858 INFO:tasks.workunit.client.0.vm06.stdout:3/219: dwrite dd/f10 [0,4194304] 0 2026-03-09T17:29:26.870 INFO:tasks.workunit.client.0.vm06.stdout:3/220: dwrite dd/f22 [0,4194304] 0 2026-03-09T17:29:26.871 INFO:tasks.workunit.client.0.vm06.stdout:6/234: rmdir d6/d12/d17/d21/d3e 39 2026-03-09T17:29:26.871 
INFO:tasks.workunit.client.0.vm06.stdout:6/235: write d6/d12/d2d/f39 [467957,66111] 0 2026-03-09T17:29:26.875 INFO:tasks.workunit.client.0.vm06.stdout:3/221: stat dd/d19/d2c/l3d 0 2026-03-09T17:29:26.890 INFO:tasks.workunit.client.0.vm06.stdout:4/240: dwrite db/f13 [0,4194304] 0 2026-03-09T17:29:26.895 INFO:tasks.workunit.client.0.vm06.stdout:4/241: creat db/f55 x:0 0 0 2026-03-09T17:29:26.900 INFO:tasks.workunit.client.0.vm06.stdout:4/242: dread db/d1d/d21/d25/f38 [0,4194304] 0 2026-03-09T17:29:26.900 INFO:tasks.workunit.client.0.vm06.stdout:4/243: symlink db/d1d/d21/d26/d3c/d45/l56 0 2026-03-09T17:29:26.903 INFO:tasks.workunit.client.0.vm06.stdout:4/244: mkdir db/d57 0 2026-03-09T17:29:26.904 INFO:tasks.workunit.client.0.vm06.stdout:4/245: rmdir db/df 39 2026-03-09T17:29:26.905 INFO:tasks.workunit.client.0.vm06.stdout:4/246: rename db to db/d1d/d58 22 2026-03-09T17:29:26.908 INFO:tasks.workunit.client.0.vm06.stdout:4/247: dwrite db/f51 [0,4194304] 0 2026-03-09T17:29:26.918 INFO:tasks.workunit.client.0.vm06.stdout:4/248: mkdir db/d59 0 2026-03-09T17:29:26.926 INFO:tasks.workunit.client.0.vm06.stdout:8/241: dwrite d15/d16/d1e/f4e [0,4194304] 0 2026-03-09T17:29:26.927 INFO:tasks.workunit.client.0.vm06.stdout:8/242: chown d15/d39/f45 774 1 2026-03-09T17:29:26.927 INFO:tasks.workunit.client.0.vm06.stdout:8/243: dread - d15/d39/f45 zero size 2026-03-09T17:29:26.929 INFO:tasks.workunit.client.0.vm06.stdout:8/244: dread f7 [0,4194304] 0 2026-03-09T17:29:26.937 INFO:tasks.workunit.client.0.vm06.stdout:1/301: write d11/d14/d1c/d1f/f21 [1188845,52098] 0 2026-03-09T17:29:26.938 INFO:tasks.workunit.client.0.vm06.stdout:9/299: sync 2026-03-09T17:29:26.939 INFO:tasks.workunit.client.0.vm06.stdout:9/300: dread - d3/d11/f59 zero size 2026-03-09T17:29:26.942 INFO:tasks.workunit.client.0.vm06.stdout:1/302: dwrite d11/d14/d1c/d1f/f4c [0,4194304] 0 2026-03-09T17:29:26.947 INFO:tasks.workunit.client.0.vm06.stdout:5/219: sync 2026-03-09T17:29:26.957 
INFO:tasks.workunit.client.0.vm06.stdout:3/222: sync 2026-03-09T17:29:26.957 INFO:tasks.workunit.client.0.vm06.stdout:2/260: sync 2026-03-09T17:29:26.957 INFO:tasks.workunit.client.0.vm06.stdout:9/301: truncate d3/d11/f1c 486754 0 2026-03-09T17:29:26.957 INFO:tasks.workunit.client.0.vm06.stdout:1/303: truncate d11/d14/d1d/d1e/d2a/f40 3508600 0 2026-03-09T17:29:26.957 INFO:tasks.workunit.client.0.vm06.stdout:4/249: truncate db/fe 3834018 0 2026-03-09T17:29:26.957 INFO:tasks.workunit.client.0.vm06.stdout:9/302: readlink d3/d2c/l40 0 2026-03-09T17:29:26.964 INFO:tasks.workunit.client.0.vm06.stdout:2/261: rmdir d3/d4/d12/d2b 39 2026-03-09T17:29:26.969 INFO:tasks.workunit.client.0.vm06.stdout:3/223: creat dd/d19/d1e/f41 x:0 0 0 2026-03-09T17:29:26.969 INFO:tasks.workunit.client.0.vm06.stdout:3/224: write dd/d1d/f34 [3638219,93871] 0 2026-03-09T17:29:26.970 INFO:tasks.workunit.client.0.vm06.stdout:3/225: readlink dd/d19/l20 0 2026-03-09T17:29:26.975 INFO:tasks.workunit.client.0.vm06.stdout:6/236: read d6/d12/d17/d21/d3e/f42 [2131297,75977] 0 2026-03-09T17:29:26.975 INFO:tasks.workunit.client.0.vm06.stdout:6/237: chown d6 48 1 2026-03-09T17:29:26.979 INFO:tasks.workunit.client.0.vm06.stdout:8/245: creat d15/d16/f50 x:0 0 0 2026-03-09T17:29:26.981 INFO:tasks.workunit.client.0.vm06.stdout:0/317: truncate d7/d11/d19/d3c/f55 355835 0 2026-03-09T17:29:26.991 INFO:tasks.workunit.client.0.vm06.stdout:4/250: symlink db/d1d/d21/d44/l5a 0 2026-03-09T17:29:26.991 INFO:tasks.workunit.client.0.vm06.stdout:4/251: write db/f13 [4232509,114872] 0 2026-03-09T17:29:26.996 INFO:tasks.workunit.client.0.vm06.stdout:2/262: creat d3/d4/d38/f58 x:0 0 0 2026-03-09T17:29:26.997 INFO:tasks.workunit.client.0.vm06.stdout:5/220: link d4/f21 d4/d9/d35/f4d 0 2026-03-09T17:29:26.998 INFO:tasks.workunit.client.0.vm06.stdout:5/221: rename d4/d9/d18 to d4/d9/d18/d3d/d4e 22 2026-03-09T17:29:27.000 INFO:tasks.workunit.client.0.vm06.stdout:6/238: symlink d6/d12/d2d/l4c 0 2026-03-09T17:29:27.001 
INFO:tasks.workunit.client.0.vm06.stdout:0/318: fdatasync f5 0 2026-03-09T17:29:27.002 INFO:tasks.workunit.client.0.vm06.stdout:0/319: chown l4 737 1 2026-03-09T17:29:27.007 INFO:tasks.workunit.client.0.vm06.stdout:1/304: write d11/d14/d1d/d1e/d2a/d34/f5c [734759,108196] 0 2026-03-09T17:29:27.024 INFO:tasks.workunit.client.0.vm06.stdout:5/222: symlink d4/d22/l4f 0 2026-03-09T17:29:27.024 INFO:tasks.workunit.client.0.vm06.stdout:5/223: fdatasync d4/d22/f45 0 2026-03-09T17:29:27.029 INFO:tasks.workunit.client.0.vm06.stdout:5/224: dwrite d4/fb [0,4194304] 0 2026-03-09T17:29:27.030 INFO:tasks.workunit.client.0.vm06.stdout:5/225: stat d4/d9/d35 0 2026-03-09T17:29:27.030 INFO:tasks.workunit.client.0.vm06.stdout:5/226: read - d4/f49 zero size 2026-03-09T17:29:27.036 INFO:tasks.workunit.client.0.vm06.stdout:6/239: mkdir d6/d47/d4d 0 2026-03-09T17:29:27.042 INFO:tasks.workunit.client.0.vm06.stdout:6/240: dwrite d6/d12/d17/d21/f26 [0,4194304] 0 2026-03-09T17:29:27.061 INFO:tasks.workunit.client.0.vm06.stdout:1/305: dread d11/d14/d1d/d1e/d2a/d34/f5c [0,4194304] 0 2026-03-09T17:29:27.061 INFO:tasks.workunit.client.0.vm06.stdout:2/263: mknod d3/c59 0 2026-03-09T17:29:27.062 INFO:tasks.workunit.client.0.vm06.stdout:2/264: write d3/f29 [803097,81441] 0 2026-03-09T17:29:27.063 INFO:tasks.workunit.client.0.vm06.stdout:2/265: write d3/f24 [500467,81734] 0 2026-03-09T17:29:27.066 INFO:tasks.workunit.client.0.vm06.stdout:1/306: dwrite d11/d14/d1c/f5b [0,4194304] 0 2026-03-09T17:29:27.083 INFO:tasks.workunit.client.0.vm06.stdout:2/266: dread d3/d4/d12/f15 [0,4194304] 0 2026-03-09T17:29:27.085 INFO:tasks.workunit.client.0.vm06.stdout:8/246: creat d15/d16/f51 x:0 0 0 2026-03-09T17:29:27.092 INFO:tasks.workunit.client.0.vm06.stdout:9/303: write d3/d15/f1a [2778939,92461] 0 2026-03-09T17:29:27.101 INFO:tasks.workunit.client.0.vm06.stdout:3/226: write f4 [6668481,12799] 0 2026-03-09T17:29:27.104 INFO:tasks.workunit.client.0.vm06.stdout:7/270: dread d5/f18 [0,4194304] 0 
2026-03-09T17:29:27.105 INFO:tasks.workunit.client.0.vm06.stdout:7/271: truncate d5/dd/f3e 358805 0 2026-03-09T17:29:27.106 INFO:tasks.workunit.client.0.vm06.stdout:4/252: write db/d1d/d21/f2f [868852,103985] 0 2026-03-09T17:29:27.128 INFO:tasks.workunit.client.0.vm06.stdout:2/267: chown d3/d4/d12/d2b/f32 24 1 2026-03-09T17:29:27.138 INFO:tasks.workunit.client.0.vm06.stdout:2/268: dread d3/d4/d38/f50 [0,4194304] 0 2026-03-09T17:29:27.139 INFO:tasks.workunit.client.0.vm06.stdout:2/269: chown d3/d4/d38/f58 147773 1 2026-03-09T17:29:27.144 INFO:tasks.workunit.client.0.vm06.stdout:0/320: link d7/d11/d19/d1d/c5c d7/c70 0 2026-03-09T17:29:27.150 INFO:tasks.workunit.client.0.vm06.stdout:7/272: symlink d5/d1f/d34/d3f/l44 0 2026-03-09T17:29:27.167 INFO:tasks.workunit.client.0.vm06.stdout:5/227: rename d4/d9 to d4/d50 0 2026-03-09T17:29:27.170 INFO:tasks.workunit.client.0.vm06.stdout:1/307: write d11/d14/f17 [6380138,16917] 0 2026-03-09T17:29:27.171 INFO:tasks.workunit.client.0.vm06.stdout:1/308: chown d11/d14/d1c/d1f 4968237 1 2026-03-09T17:29:27.176 INFO:tasks.workunit.client.0.vm06.stdout:0/321: symlink d7/d11/d19/d1d/d59/d5b/l71 0 2026-03-09T17:29:27.180 INFO:tasks.workunit.client.0.vm06.stdout:3/227: mknod dd/c42 0 2026-03-09T17:29:27.180 INFO:tasks.workunit.client.0.vm06.stdout:9/304: write d3/d15/f17 [1996256,75499] 0 2026-03-09T17:29:27.182 INFO:tasks.workunit.client.0.vm06.stdout:4/253: rename db/fe to db/d1d/f5b 0 2026-03-09T17:29:27.183 INFO:tasks.workunit.client.0.vm06.stdout:4/254: fdatasync db/d1d/d21/d25/f35 0 2026-03-09T17:29:27.188 INFO:tasks.workunit.client.0.vm06.stdout:4/255: dwrite db/d1d/d21/d26/d3c/d45/f4a [0,4194304] 0 2026-03-09T17:29:27.188 INFO:tasks.workunit.client.0.vm06.stdout:7/273: dread d5/dd/f1a [0,4194304] 0 2026-03-09T17:29:27.189 INFO:tasks.workunit.client.0.vm06.stdout:7/274: write d5/d1f/f3a [535395,90636] 0 2026-03-09T17:29:27.194 INFO:tasks.workunit.client.0.vm06.stdout:4/256: dwrite db/d1d/d21/f2f [4194304,4194304] 0 
2026-03-09T17:29:27.194 INFO:tasks.workunit.client.0.vm06.stdout:8/247: creat d15/d16/f52 x:0 0 0
2026-03-09T17:29:27.212 INFO:tasks.workunit.client.0.vm06.stdout:0/322: dread d7/fe [0,4194304] 0
2026-03-09T17:29:27.216 INFO:tasks.workunit.client.0.vm06.stdout:8/248: dread d15/d16/d19/f26 [0,4194304] 0
2026-03-09T17:29:27.222 INFO:tasks.workunit.client.0.vm06.stdout:4/257: symlink db/d1d/d21/d25/d4b/l5c 0
2026-03-09T17:29:27.225 INFO:tasks.workunit.client.0.vm06.stdout:0/323: fsync d7/d11/f1c 0
2026-03-09T17:29:27.226 INFO:tasks.workunit.client.0.vm06.stdout:6/241: link d6/d12/d17/d21/l4b d6/d12/d17/d21/l4e 0
2026-03-09T17:29:27.229 INFO:tasks.workunit.client.0.vm06.stdout:8/249: rename cc to d15/d31/c53 0
2026-03-09T17:29:27.233 INFO:tasks.workunit.client.0.vm06.stdout:8/250: dwrite d15/d16/f51 [0,4194304] 0
2026-03-09T17:29:27.236 INFO:tasks.workunit.client.0.vm06.stdout:4/258: mkdir db/d1d/d21/d26/d3c/d5d 0
2026-03-09T17:29:27.242 INFO:tasks.workunit.client.0.vm06.stdout:0/324: mknod d7/d11/d19/d37/c72 0
2026-03-09T17:29:27.242 INFO:tasks.workunit.client.0.vm06.stdout:8/251: dwrite d15/d16/f3f [0,4194304] 0
2026-03-09T17:29:27.243 INFO:tasks.workunit.client.0.vm06.stdout:2/270: getdents d3/d4 0
2026-03-09T17:29:27.247 INFO:tasks.workunit.client.0.vm06.stdout:7/275: sync
2026-03-09T17:29:27.250 INFO:tasks.workunit.client.0.vm06.stdout:4/259: rename db/l19 to db/d1d/l5e 0
2026-03-09T17:29:27.252 INFO:tasks.workunit.client.0.vm06.stdout:1/309: getdents d11/d14/d1c/d1f 0
2026-03-09T17:29:27.262 INFO:tasks.workunit.client.0.vm06.stdout:2/271: truncate d3/d4/d12/f15 5122533 0
2026-03-09T17:29:27.265 INFO:tasks.workunit.client.0.vm06.stdout:7/276: mkdir d5/d1f/d34/d3f/d45 0
2026-03-09T17:29:27.265 INFO:tasks.workunit.client.0.vm06.stdout:3/228: link dd/c42 dd/d19/c43 0
2026-03-09T17:29:27.272 INFO:tasks.workunit.client.0.vm06.stdout:2/272: read d3/d4/f1f [703404,83699] 0
2026-03-09T17:29:27.272 INFO:tasks.workunit.client.0.vm06.stdout:2/273: write d3/d4/f52 [1148372,16255] 0
2026-03-09T17:29:27.276 INFO:tasks.workunit.client.0.vm06.stdout:7/277: dread f0 [0,4194304] 0
2026-03-09T17:29:27.276 INFO:tasks.workunit.client.0.vm06.stdout:3/229: mkdir dd/d19/d25/d44 0
2026-03-09T17:29:27.276 INFO:tasks.workunit.client.0.vm06.stdout:7/278: truncate d5/d12/f2c 712628 0
2026-03-09T17:29:27.277 INFO:tasks.workunit.client.0.vm06.stdout:3/230: write dd/d19/d25/f31 [1106248,57247] 0
2026-03-09T17:29:27.278 INFO:tasks.workunit.client.0.vm06.stdout:3/231: truncate dd/f38 14702 0
2026-03-09T17:29:27.288 INFO:tasks.workunit.client.0.vm06.stdout:1/310: mkdir d11/d6b 0
2026-03-09T17:29:27.290 INFO:tasks.workunit.client.0.vm06.stdout:6/242: getdents d6/d12/d17 0
2026-03-09T17:29:27.291 INFO:tasks.workunit.client.0.vm06.stdout:6/243: write d6/f46 [657856,106804] 0
2026-03-09T17:29:27.294 INFO:tasks.workunit.client.0.vm06.stdout:8/252: rmdir d15/d31/d48 0
2026-03-09T17:29:27.299 INFO:tasks.workunit.client.0.vm06.stdout:2/274: dread d3/d4/d22/f2f [0,4194304] 0
2026-03-09T17:29:27.300 INFO:tasks.workunit.client.0.vm06.stdout:2/275: fdatasync d3/d4/f3c 0
2026-03-09T17:29:27.302 INFO:tasks.workunit.client.0.vm06.stdout:7/279: mkdir d5/d1f/d34/d46 0
2026-03-09T17:29:27.303 INFO:tasks.workunit.client.0.vm06.stdout:1/311: mknod d11/d14/d1d/d1e/d2a/c6c 0
2026-03-09T17:29:27.306 INFO:tasks.workunit.client.0.vm06.stdout:2/276: readlink d3/l47 0
2026-03-09T17:29:27.312 INFO:tasks.workunit.client.0.vm06.stdout:9/305: dwrite d3/d15/d16/f31 [0,4194304] 0
2026-03-09T17:29:27.314 INFO:tasks.workunit.client.0.vm06.stdout:9/306: readlink d3/d2c/l40 0
2026-03-09T17:29:27.314 INFO:tasks.workunit.client.0.vm06.stdout:9/307: fdatasync d3/d26/f28 0
2026-03-09T17:29:27.315 INFO:tasks.workunit.client.0.vm06.stdout:9/308: dread - d3/d15/f58 zero size
2026-03-09T17:29:27.327 INFO:tasks.workunit.client.0.vm06.stdout:5/228: truncate d4/f2d 2632141 0
2026-03-09T17:29:27.333 INFO:tasks.workunit.client.0.vm06.stdout:2/277: unlink d3/d4/l39 0
2026-03-09T17:29:27.335 INFO:tasks.workunit.client.0.vm06.stdout:4/260: rename db/d1d/d21/d26/d3c to db/d59/d5f 0
2026-03-09T17:29:27.341 INFO:tasks.workunit.client.0.vm06.stdout:1/312: mknod d11/d69/c6d 0
2026-03-09T17:29:27.341 INFO:tasks.workunit.client.0.vm06.stdout:1/313: write d11/d14/f17 [9133139,115249] 0
2026-03-09T17:29:27.347 INFO:tasks.workunit.client.0.vm06.stdout:0/325: write d7/d11/d19/d1d/f43 [158418,53090] 0
2026-03-09T17:29:27.352 INFO:tasks.workunit.client.0.vm06.stdout:8/253: creat d15/d16/f54 x:0 0 0
2026-03-09T17:29:27.353 INFO:tasks.workunit.client.0.vm06.stdout:8/254: truncate d15/d39/f45 933847 0
2026-03-09T17:29:27.361 INFO:tasks.workunit.client.0.vm06.stdout:3/232: rename dd/d19/f36 to dd/d1d/f45 0
2026-03-09T17:29:27.366 INFO:tasks.workunit.client.0.vm06.stdout:6/244: write d6/d12/d17/d27/f37 [5440430,5272] 0
2026-03-09T17:29:27.368 INFO:tasks.workunit.client.0.vm06.stdout:1/314: mknod d11/d14/d1d/d1e/d2a/c6e 0
2026-03-09T17:29:27.372 INFO:tasks.workunit.client.0.vm06.stdout:0/326: mknod d7/d11/d5d/c73 0
2026-03-09T17:29:27.381 INFO:tasks.workunit.client.0.vm06.stdout:2/278: chown d3/d4 3371072 1
2026-03-09T17:29:27.385 INFO:tasks.workunit.client.0.vm06.stdout:9/309: creat d3/f5f x:0 0 0
2026-03-09T17:29:27.388 INFO:tasks.workunit.client.0.vm06.stdout:7/280: getdents d5 0
2026-03-09T17:29:27.392 INFO:tasks.workunit.client.0.vm06.stdout:1/315: dread d11/d14/d1c/f2e [0,4194304] 0
2026-03-09T17:29:27.395 INFO:tasks.workunit.client.0.vm06.stdout:0/327: mknod d7/d11/d19/d23/c74 0
2026-03-09T17:29:27.400 INFO:tasks.workunit.client.0.vm06.stdout:2/279: dread - d3/d4/d38/f58 zero size
2026-03-09T17:29:27.403 INFO:tasks.workunit.client.0.vm06.stdout:3/233: mknod dd/d19/c46 0
2026-03-09T17:29:27.407 INFO:tasks.workunit.client.0.vm06.stdout:4/261: dwrite db/df/f2d [0,4194304] 0
2026-03-09T17:29:27.409 INFO:tasks.workunit.client.0.vm06.stdout:7/281: creat d5/d1f/d34/f47 x:0 0 0
2026-03-09T17:29:27.415 INFO:tasks.workunit.client.0.vm06.stdout:8/255: mkdir d15/d16/d1e/d30/d55 0
2026-03-09T17:29:27.418 INFO:tasks.workunit.client.0.vm06.stdout:5/229: rename d4/l30 to d4/d50/l51 0
2026-03-09T17:29:27.422 INFO:tasks.workunit.client.0.vm06.stdout:4/262: rmdir db/d1d/d21/d26 39
2026-03-09T17:29:27.430 INFO:tasks.workunit.client.0.vm06.stdout:6/245: getdents d6/d12/d17/d21/d3e 0
2026-03-09T17:29:27.433 INFO:tasks.workunit.client.0.vm06.stdout:0/328: creat d7/d11/f75 x:0 0 0
2026-03-09T17:29:27.434 INFO:tasks.workunit.client.0.vm06.stdout:0/329: write d7/f14 [3846578,56030] 0
2026-03-09T17:29:27.435 INFO:tasks.workunit.client.0.vm06.stdout:8/256: mknod d15/d16/d1a/d47/c56 0
2026-03-09T17:29:27.437 INFO:tasks.workunit.client.0.vm06.stdout:5/230: mkdir d4/d52 0
2026-03-09T17:29:27.440 INFO:tasks.workunit.client.0.vm06.stdout:1/316: sync
2026-03-09T17:29:27.441 INFO:tasks.workunit.client.0.vm06.stdout:6/246: sync
2026-03-09T17:29:27.442 INFO:tasks.workunit.client.0.vm06.stdout:5/231: dwrite d4/f7 [0,4194304] 0
2026-03-09T17:29:27.449 INFO:tasks.workunit.client.0.vm06.stdout:6/247: dwrite d6/d12/d17/f32 [4194304,4194304] 0
2026-03-09T17:29:27.452 INFO:tasks.workunit.client.0.vm06.stdout:6/248: truncate d6/d12/d17/d21/f26 5174134 0
2026-03-09T17:29:27.453 INFO:tasks.workunit.client.0.vm06.stdout:6/249: read d6/f46 [761887,40214] 0
2026-03-09T17:29:27.469 INFO:tasks.workunit.client.0.vm06.stdout:9/310: truncate d3/d11/f2a 3065256 0
2026-03-09T17:29:27.471 INFO:tasks.workunit.client.0.vm06.stdout:4/263: rename db/d1d/d21/d25/l28 to db/d1d/d21/d26/l60 0
2026-03-09T17:29:27.472 INFO:tasks.workunit.client.0.vm06.stdout:4/264: write db/d1d/d21/d25/f53 [402644,112039] 0
2026-03-09T17:29:27.474 INFO:tasks.workunit.client.0.vm06.stdout:0/330: creat d7/f76 x:0 0 0
2026-03-09T17:29:27.476 INFO:tasks.workunit.client.0.vm06.stdout:8/257: rmdir d15/d16/d1e/d30 39
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/282: dwrite d5/f8 [0,4194304] 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:2/280: getdents d3 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/283: chown d5/d1f/d34/d3f 8738 1
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/284: dread - d5/dd/f22 zero size
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/285: write d5/d7/f1b [3120150,77972] 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/286: write d5/f16 [2066534,109592] 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:1/317: unlink d11/d14/d1c/f5b 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/287: write d5/dd/ff [1113531,85063] 0
2026-03-09T17:29:27.486 INFO:tasks.workunit.client.0.vm06.stdout:7/288: chown d5/d7/f1d 0 1
2026-03-09T17:29:27.493 INFO:tasks.workunit.client.0.vm06.stdout:5/232: mknod d4/d50/d35/c53 0
2026-03-09T17:29:27.493 INFO:tasks.workunit.client.0.vm06.stdout:5/233: readlink d4/d50/l13 0
2026-03-09T17:29:27.496 INFO:tasks.workunit.client.0.vm06.stdout:3/234: getdents dd/d19/d1e 0
2026-03-09T17:29:27.497 INFO:tasks.workunit.client.0.vm06.stdout:9/311: read d3/d15/f2e [86303,64006] 0
2026-03-09T17:29:27.499 INFO:tasks.workunit.client.0.vm06.stdout:6/250: dwrite d6/d12/d17/f29 [0,4194304] 0
2026-03-09T17:29:27.510 INFO:tasks.workunit.client.0.vm06.stdout:0/331: creat d7/d11/d19/d3c/f77 x:0 0 0
2026-03-09T17:29:27.511 INFO:tasks.workunit.client.0.vm06.stdout:0/332: write d7/d11/f13 [408948,99659] 0
2026-03-09T17:29:27.515 INFO:tasks.workunit.client.0.vm06.stdout:8/258: dread d15/d16/d19/d2b/f46 [4194304,4194304] 0
2026-03-09T17:29:27.516 INFO:tasks.workunit.client.0.vm06.stdout:8/259: stat c3 0
2026-03-09T17:29:27.516 INFO:tasks.workunit.client.0.vm06.stdout:2/281: dwrite d3/d4/f3c [0,4194304] 0
2026-03-09T17:29:27.520 INFO:tasks.workunit.client.0.vm06.stdout:1/318: fdatasync d11/f13 0
2026-03-09T17:29:27.520 INFO:tasks.workunit.client.0.vm06.stdout:1/319: chown d11/d69/c6d 396 1
2026-03-09T17:29:27.524 INFO:tasks.workunit.client.0.vm06.stdout:1/320: dwrite d11/d14/d1d/f4e [0,4194304] 0
2026-03-09T17:29:27.536 INFO:tasks.workunit.client.0.vm06.stdout:5/234: unlink d4/d50/c23 0
2026-03-09T17:29:27.548 INFO:tasks.workunit.client.0.vm06.stdout:8/260: mknod d15/d16/d1a/d47/c57 0
2026-03-09T17:29:27.548 INFO:tasks.workunit.client.0.vm06.stdout:8/261: write d15/f3e [4125532,97335] 0
2026-03-09T17:29:27.549 INFO:tasks.workunit.client.0.vm06.stdout:8/262: truncate d15/d16/d1a/f22 1318019 0
2026-03-09T17:29:27.552 INFO:tasks.workunit.client.0.vm06.stdout:2/282: creat d3/f5a x:0 0 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:5/235: write d4/d50/d18/f4a [2620251,109083] 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:3/235: mknod dd/d19/c47 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:9/312: rename d3/d26/d35/f3d to d3/d15/d36/d4d/f60 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:8/263: mkdir d15/d31/d58 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:8/264: dread - d15/d16/f50 zero size
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:9/313: dwrite d3/d26/f33 [0,4194304] 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:9/314: read - d3/d11/f59 zero size
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:7/289: rmdir d5/d1f/d34/d3f/d45 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:1/321: truncate d11/d14/d1d/d1e/d2a/f40 2453500 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:9/315: dread d3/d15/d16/d54/f3a [0,4194304] 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:1/322: readlink d11/d14/d1d/d1e/d2a/l63 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:3/236: mkdir dd/d19/d25/d48 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:3/237: chown dd/d1d/d2e 212 1
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:5/236: write d4/d50/d18/f28 [5211663,3580] 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:7/290: creat d5/dd/f48 x:0 0 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:1/323: symlink d11/d14/d1c/d1f/l6f 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:8/265: truncate d15/d16/f21 4835087 0
2026-03-09T17:29:27.577 INFO:tasks.workunit.client.0.vm06.stdout:8/266: write d15/d16/f50 [111873,122110] 0
2026-03-09T17:29:27.578 INFO:tasks.workunit.client.0.vm06.stdout:8/267: write d15/d16/f54 [1037811,26510] 0
2026-03-09T17:29:27.578 INFO:tasks.workunit.client.0.vm06.stdout:5/237: creat d4/d50/d18/d3d/f54 x:0 0 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:3/238: mknod dd/d19/d25/d44/c49 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:9/316: link d3/f27 d3/d15/d36/d4d/f61 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:9/317: fsync d3/d15/d36/d4c/f5a 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:5/238: mkdir d4/d52/d55 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:7/291: link d5/d7/lb d5/d1f/l49 0
2026-03-09T17:29:27.588 INFO:tasks.workunit.client.0.vm06.stdout:5/239: rename d4/d50/d18/f28 to d4/d22/f56 0
2026-03-09T17:29:27.590 INFO:tasks.workunit.client.0.vm06.stdout:7/292: rename d5/d12/l13 to d5/d12/l4a 0
2026-03-09T17:29:27.593 INFO:tasks.workunit.client.0.vm06.stdout:5/240: symlink d4/d50/d35/l57 0
2026-03-09T17:29:27.599 INFO:tasks.workunit.client.0.vm06.stdout:3/239: creat dd/f4a x:0 0 0
2026-03-09T17:29:27.599 INFO:tasks.workunit.client.0.vm06.stdout:3/240: creat dd/d1d/f4b x:0 0 0
2026-03-09T17:29:27.599 INFO:tasks.workunit.client.0.vm06.stdout:3/241: read - dd/d19/d28/f32 zero size
2026-03-09T17:29:27.599 INFO:tasks.workunit.client.0.vm06.stdout:8/268: link fe d15/d16/d1e/f59 0
2026-03-09T17:29:27.599 INFO:tasks.workunit.client.0.vm06.stdout:5/241: creat d4/d22/d46/f58 x:0 0 0
2026-03-09T17:29:27.600 INFO:tasks.workunit.client.0.vm06.stdout:3/242: creat dd/d19/d25/d48/f4c x:0 0 0
2026-03-09T17:29:27.601 INFO:tasks.workunit.client.0.vm06.stdout:5/242: creat d4/d22/d46/f59 x:0 0 0
2026-03-09T17:29:27.602 INFO:tasks.workunit.client.0.vm06.stdout:3/243: symlink dd/d19/d1e/l4d 0
2026-03-09T17:29:27.606 INFO:tasks.workunit.client.0.vm06.stdout:3/244: dread dd/f10 [0,4194304] 0
2026-03-09T17:29:27.619 INFO:tasks.workunit.client.0.vm06.stdout:3/245: mkdir dd/d1d/d4e 0
2026-03-09T17:29:27.624 INFO:tasks.workunit.client.0.vm06.stdout:5/243: link d4/d50/c2f d4/d22/c5a 0
2026-03-09T17:29:27.625 INFO:tasks.workunit.client.0.vm06.stdout:3/246: dwrite dd/f10 [0,4194304] 0
2026-03-09T17:29:27.631 INFO:tasks.workunit.client.0.vm06.stdout:5/244: creat d4/d52/f5b x:0 0 0
2026-03-09T17:29:27.631 INFO:tasks.workunit.client.0.vm06.stdout:5/245: truncate d4/f49 341372 0
2026-03-09T17:29:27.637 INFO:tasks.workunit.client.0.vm06.stdout:3/247: creat dd/d19/d25/f4f x:0 0 0
2026-03-09T17:29:27.640 INFO:tasks.workunit.client.0.vm06.stdout:5/246: dwrite d4/d22/f56 [0,4194304] 0
2026-03-09T17:29:27.641 INFO:tasks.workunit.client.0.vm06.stdout:5/247: chown d4/d50/c19 1 1
2026-03-09T17:29:27.647 INFO:tasks.workunit.client.0.vm06.stdout:3/248: link dd/c18 dd/d19/d25/c50 0
2026-03-09T17:29:27.648 INFO:tasks.workunit.client.0.vm06.stdout:3/249: creat dd/f51 x:0 0 0
2026-03-09T17:29:27.649 INFO:tasks.workunit.client.0.vm06.stdout:3/250: rmdir dd/d19/d28 39
2026-03-09T17:29:27.651 INFO:tasks.workunit.client.0.vm06.stdout:3/251: dread dd/d1d/f29 [0,4194304] 0
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/252: symlink dd/d19/d25/l52 0
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/253: creat dd/d19/d25/f53 x:0 0 0
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/254: dread - dd/d1d/f4b zero size
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/255: unlink fc 0
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/256: fsync dd/d19/d2c/f30 0
2026-03-09T17:29:27.661 INFO:tasks.workunit.client.0.vm06.stdout:3/257: write dd/f22 [5019783,65352] 0
2026-03-09T17:29:28.122 INFO:tasks.workunit.client.0.vm06.stdout:4/265: dwrite db/fc [0,4194304] 0
2026-03-09T17:29:28.141 INFO:tasks.workunit.client.0.vm06.stdout:4/266: rmdir db/d1d/d21/d44 39
2026-03-09T17:29:28.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:27 vm06.local ceph-mon[57307]: pgmap v147: 65 pgs: 65 active+clean; 658 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 9.4 MiB/s rd, 81 MiB/s wr, 340 op/s
2026-03-09T17:29:28.143 INFO:tasks.workunit.client.0.vm06.stdout:0/333: write d7/f27 [961575,46879] 0
2026-03-09T17:29:28.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:27 vm09.local ceph-mon[62061]: pgmap v147: 65 pgs: 65 active+clean; 658 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 9.4 MiB/s rd, 81 MiB/s wr, 340 op/s
2026-03-09T17:29:28.147 INFO:tasks.workunit.client.0.vm06.stdout:4/267: unlink db/d59/d5f/f52 0
2026-03-09T17:29:28.150 INFO:tasks.workunit.client.0.vm06.stdout:0/334: read d7/f36 [4192988,126967] 0
2026-03-09T17:29:28.151 INFO:tasks.workunit.client.0.vm06.stdout:0/335: stat d7/d11/d2d/f44 0
2026-03-09T17:29:28.155 INFO:tasks.workunit.client.0.vm06.stdout:4/268: creat db/d59/d5f/d45/f61 x:0 0 0
2026-03-09T17:29:28.159 INFO:tasks.workunit.client.0.vm06.stdout:0/336: rmdir d7/d11/d5d 39
2026-03-09T17:29:28.160 INFO:tasks.workunit.client.0.vm06.stdout:6/251: truncate d6/fb 4840868 0
2026-03-09T17:29:28.167 INFO:tasks.workunit.client.0.vm06.stdout:0/337: read - d7/d11/d5d/d64/f6a zero size
2026-03-09T17:29:28.171 INFO:tasks.workunit.client.0.vm06.stdout:4/269: creat db/d59/d5f/d5d/f62 x:0 0 0
2026-03-09T17:29:28.175 INFO:tasks.workunit.client.0.vm06.stdout:6/252: dread d6/f46 [0,4194304] 0
2026-03-09T17:29:28.177 INFO:tasks.workunit.client.0.vm06.stdout:2/283: chown d3/d4/d12/d2b/d2d/c18 3600161 1
2026-03-09T17:29:28.179 INFO:tasks.workunit.client.0.vm06.stdout:0/338: rename d7/d11/d19/d1d/f43 to d7/d11/d2d/f78 0
2026-03-09T17:29:28.186 INFO:tasks.workunit.client.0.vm06.stdout:0/339: dwrite d7/f14 [0,4194304] 0
2026-03-09T17:29:28.209 INFO:tasks.workunit.client.0.vm06.stdout:1/324: dwrite d11/d14/d1d/d1e/d2a/d34/f60 [0,4194304] 0
2026-03-09T17:29:28.231 INFO:tasks.workunit.client.0.vm06.stdout:1/325: creat d11/d14/d1d/d42/f70 x:0 0 0
2026-03-09T17:29:28.237 INFO:tasks.workunit.client.0.vm06.stdout:9/318: truncate d3/d15/d36/d4d/f60 3847387 0
2026-03-09T17:29:28.241 INFO:tasks.workunit.client.0.vm06.stdout:0/340: symlink d7/d11/l79 0
2026-03-09T17:29:28.244 INFO:tasks.workunit.client.0.vm06.stdout:6/253: rename d6/d12/d17/d21 to d6/d4f 0
2026-03-09T17:29:28.250 INFO:tasks.workunit.client.0.vm06.stdout:0/341: rename d7/d11/d19/d3c/f77 to d7/d11/d19/d1d/d59/f7a 0
2026-03-09T17:29:28.250 INFO:tasks.workunit.client.0.vm06.stdout:0/342: dread - d7/d11/d5d/d64/f6b zero size
2026-03-09T17:29:28.252 INFO:tasks.workunit.client.0.vm06.stdout:7/293: truncate d5/f16 195761 0
2026-03-09T17:29:28.253 INFO:tasks.workunit.client.0.vm06.stdout:7/294: truncate d5/d12/f32 731102 0
2026-03-09T17:29:28.255 INFO:tasks.workunit.client.0.vm06.stdout:6/254: write d6/d12/d2d/f48 [4686822,64718] 0
2026-03-09T17:29:28.257 INFO:tasks.workunit.client.0.vm06.stdout:1/326: symlink d11/d14/d1d/d1e/l71 0
2026-03-09T17:29:28.258 INFO:tasks.workunit.client.0.vm06.stdout:1/327: dread - d11/d14/d1c/d1f/f45 zero size
2026-03-09T17:29:28.258 INFO:tasks.workunit.client.0.vm06.stdout:1/328: stat d11/d14 0
2026-03-09T17:29:28.268 INFO:tasks.workunit.client.0.vm06.stdout:0/343: symlink d7/d11/d19/d1d/d39/l7b 0
2026-03-09T17:29:28.271 INFO:tasks.workunit.client.0.vm06.stdout:7/295: creat d5/f4b x:0 0 0
2026-03-09T17:29:28.273 INFO:tasks.workunit.client.0.vm06.stdout:7/296: dread f0 [0,4194304] 0
2026-03-09T17:29:28.275 INFO:tasks.workunit.client.0.vm06.stdout:5/248: truncate d4/d22/f56 468989 0
2026-03-09T17:29:28.278 INFO:tasks.workunit.client.0.vm06.stdout:1/329: fdatasync d11/d14/d1d/d1e/f65 0
2026-03-09T17:29:28.288 INFO:tasks.workunit.client.0.vm06.stdout:0/344: dread d7/d11/d19/d3c/f55 [0,4194304] 0
2026-03-09T17:29:28.289 INFO:tasks.workunit.client.0.vm06.stdout:3/258: truncate dd/d1d/f34 425849 0
2026-03-09T17:29:28.293 INFO:tasks.workunit.client.0.vm06.stdout:5/249: dwrite d4/d50/f1e [0,4194304] 0
2026-03-09T17:29:28.298 INFO:tasks.workunit.client.0.vm06.stdout:0/345: symlink d7/d11/d19/l7c 0
2026-03-09T17:29:28.301 INFO:tasks.workunit.client.0.vm06.stdout:5/250: creat d4/d50/d18/f5c x:0 0 0
2026-03-09T17:29:28.305 INFO:tasks.workunit.client.0.vm06.stdout:5/251: dwrite d4/d50/f41 [0,4194304] 0
2026-03-09T17:29:28.311 INFO:tasks.workunit.client.0.vm06.stdout:3/259: getdents dd/d1d/d4e 0
2026-03-09T17:29:28.316 INFO:tasks.workunit.client.0.vm06.stdout:5/252: dwrite d4/d50/d35/f39 [0,4194304] 0
2026-03-09T17:29:28.316 INFO:tasks.workunit.client.0.vm06.stdout:5/253: fsync d4/d50/f24 0
2026-03-09T17:29:28.331 INFO:tasks.workunit.client.0.vm06.stdout:0/346: link d7/d11/f75 d7/d11/d19/d1d/d39/f7d 0
2026-03-09T17:29:28.332 INFO:tasks.workunit.client.0.vm06.stdout:0/347: chown d7/d11/d19/d1d/d39/f7d 16549 1
2026-03-09T17:29:28.340 INFO:tasks.workunit.client.0.vm06.stdout:3/260: getdents dd/d1d/d4e 0
2026-03-09T17:29:28.344 INFO:tasks.workunit.client.0.vm06.stdout:3/261: symlink dd/d1d/d4e/l54 0
2026-03-09T17:29:28.347 INFO:tasks.workunit.client.0.vm06.stdout:3/262: dwrite dd/f22 [4194304,4194304] 0
2026-03-09T17:29:28.347 INFO:tasks.workunit.client.0.vm06.stdout:3/263: fdatasync dd/d1d/d2e/f3a 0
2026-03-09T17:29:28.348 INFO:tasks.workunit.client.0.vm06.stdout:3/264: fdatasync dd/d19/d1e/f3f 0
2026-03-09T17:29:28.359 INFO:tasks.workunit.client.0.vm06.stdout:0/348: dread d7/f56 [0,4194304] 0
2026-03-09T17:29:28.360 INFO:tasks.workunit.client.0.vm06.stdout:0/349: stat d7/d11/d2d/l61 0
2026-03-09T17:29:28.360 INFO:tasks.workunit.client.0.vm06.stdout:3/265: creat dd/d19/d25/d2d/f55 x:0 0 0
2026-03-09T17:29:28.362 INFO:tasks.workunit.client.0.vm06.stdout:3/266: fsync dd/f15 0
2026-03-09T17:29:28.372 INFO:tasks.workunit.client.0.vm06.stdout:3/267: rename dd/d19/d25/f31 to dd/d19/d25/f56 0
2026-03-09T17:29:28.372 INFO:tasks.workunit.client.0.vm06.stdout:3/268: chown dd/d19/d2c/f37 3993 1
2026-03-09T17:29:28.375 INFO:tasks.workunit.client.0.vm06.stdout:3/269: creat dd/d19/d25/d44/f57 x:0 0 0
2026-03-09T17:29:28.381 INFO:tasks.workunit.client.0.vm06.stdout:3/270: creat dd/d19/d28/f58 x:0 0 0
2026-03-09T17:29:28.382 INFO:tasks.workunit.client.0.vm06.stdout:3/271: truncate dd/d19/d1e/f23 4555141 0
2026-03-09T17:29:28.382 INFO:tasks.workunit.client.0.vm06.stdout:3/272: chown dd/d19/d28/l3b 4844485 1
2026-03-09T17:29:28.386 INFO:tasks.workunit.client.0.vm06.stdout:3/273: mkdir dd/d59 0
2026-03-09T17:29:28.397 INFO:tasks.workunit.client.0.vm06.stdout:1/330: sync
2026-03-09T17:29:28.404 INFO:tasks.workunit.client.0.vm06.stdout:1/331: creat d11/d14/d1c/d3a/f72 x:0 0 0
2026-03-09T17:29:28.409 INFO:tasks.workunit.client.0.vm06.stdout:0/350: sync
2026-03-09T17:29:28.410 INFO:tasks.workunit.client.0.vm06.stdout:0/351: dread - d7/d11/d5d/d64/f6b zero size
2026-03-09T17:29:28.426 INFO:tasks.workunit.client.0.vm06.stdout:6/255: rename d6/d4f/d3e/f42 to d6/d47/d4d/f50 0
2026-03-09T17:29:28.437 INFO:tasks.workunit.client.0.vm06.stdout:4/270: write db/d1d/d21/d25/f38 [1417652,17379] 0
2026-03-09T17:29:28.442 INFO:tasks.workunit.client.0.vm06.stdout:2/284: dwrite d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0
2026-03-09T17:29:28.452 INFO:tasks.workunit.client.0.vm06.stdout:4/271: sync
2026-03-09T17:29:28.453 INFO:tasks.workunit.client.0.vm06.stdout:2/285: sync
2026-03-09T17:29:28.457 INFO:tasks.workunit.client.0.vm06.stdout:9/319: rmdir d3/d15/d36 39
2026-03-09T17:29:28.463 INFO:tasks.workunit.client.0.vm06.stdout:4/272: symlink db/d59/d5f/d45/l63 0
2026-03-09T17:29:28.463 INFO:tasks.workunit.client.0.vm06.stdout:4/273: fsync db/df/f14 0
2026-03-09T17:29:28.471 INFO:tasks.workunit.client.0.vm06.stdout:9/320: creat d3/d15/d36/d4d/f62 x:0 0 0
2026-03-09T17:29:28.477 INFO:tasks.workunit.client.0.vm06.stdout:4/274: rename db/d1d/d21/d26/l60 to db/d59/l64 0
2026-03-09T17:29:28.481 INFO:tasks.workunit.client.0.vm06.stdout:4/275: dwrite db/d1d/f22 [0,4194304] 0
2026-03-09T17:29:28.487 INFO:tasks.workunit.client.0.vm06.stdout:4/276: write db/d1d/f3a [146747,81314] 0
2026-03-09T17:29:28.499 INFO:tasks.workunit.client.0.vm06.stdout:7/297: stat d5/d1f/d34 0
2026-03-09T17:29:28.503 INFO:tasks.workunit.client.0.vm06.stdout:8/269: dwrite d15/d16/d1e/f59 [0,4194304] 0
2026-03-09T17:29:28.507 INFO:tasks.workunit.client.0.vm06.stdout:4/277: fdatasync db/f4f 0
2026-03-09T17:29:28.513 INFO:tasks.workunit.client.0.vm06.stdout:7/298: rmdir d5 39
2026-03-09T17:29:28.516 INFO:tasks.workunit.client.0.vm06.stdout:9/321: getdents d3/d15/d36/d4c 0
2026-03-09T17:29:28.518 INFO:tasks.workunit.client.0.vm06.stdout:8/270: symlink d15/l5a 0
2026-03-09T17:29:28.518 INFO:tasks.workunit.client.0.vm06.stdout:8/271: chown d15/d16/f52 7846 1
2026-03-09T17:29:28.520 INFO:tasks.workunit.client.0.vm06.stdout:8/272: dread d15/d16/f51 [0,4194304] 0
2026-03-09T17:29:28.524 INFO:tasks.workunit.client.0.vm06.stdout:4/278: rename db/d1d/d21/d25/l31 to db/d57/l65 0
2026-03-09T17:29:28.526 INFO:tasks.workunit.client.0.vm06.stdout:7/299: fsync d5/f4b 0
2026-03-09T17:29:28.529 INFO:tasks.workunit.client.0.vm06.stdout:5/254: rmdir d4/d50 39
2026-03-09T17:29:28.530 INFO:tasks.workunit.client.0.vm06.stdout:4/279: unlink db/d1d/d21/d25/c40 0
2026-03-09T17:29:28.534 INFO:tasks.workunit.client.0.vm06.stdout:5/255: fdatasync d4/f20 0
2026-03-09T17:29:28.538 INFO:tasks.workunit.client.0.vm06.stdout:7/300: rename d5/d12/f38 to d5/d1f/d34/d46/f4c 0
2026-03-09T17:29:28.539 INFO:tasks.workunit.client.0.vm06.stdout:5/256: chown d4/d50/d35/d40 908085 1
2026-03-09T17:29:28.542 INFO:tasks.workunit.client.0.vm06.stdout:4/280: link db/df/f2d db/d1d/d21/d25/d4b/f66 0
2026-03-09T17:29:28.545 INFO:tasks.workunit.client.0.vm06.stdout:7/301: dread - d5/dd/f22 zero size
2026-03-09T17:29:28.547 INFO:tasks.workunit.client.0.vm06.stdout:7/302: fsync d5/f8 0
2026-03-09T17:29:28.551 INFO:tasks.workunit.client.0.vm06.stdout:7/303: truncate d5/d7/d2b/f42 196319 0
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:5/257: link f0 d4/d22/f5d 0
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:4/281: rename db/f4f to db/d1d/d21/f67 0
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:7/304: mknod d5/dd/c4d 0
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:5/258: rmdir d4/d50 39
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:7/305: rename d5/f28 to d5/d1f/d34/d46/f4e 0
2026-03-09T17:29:28.570 INFO:tasks.workunit.client.0.vm06.stdout:5/259: write d4/d50/f43 [3108562,101708] 0
2026-03-09T17:29:28.572 INFO:tasks.workunit.client.0.vm06.stdout:5/260: dwrite d4/d50/f1e [4194304,4194304] 0
2026-03-09T17:29:28.577 INFO:tasks.workunit.client.0.vm06.stdout:9/322: dread d3/d15/d36/d4d/f61 [0,4194304] 0
2026-03-09T17:29:28.577 INFO:tasks.workunit.client.0.vm06.stdout:4/282: creat db/f68 x:0 0 0
2026-03-09T17:29:28.586 INFO:tasks.workunit.client.0.vm06.stdout:1/332: dwrite d11/d14/d1d/d1e/d2a/f43 [0,4194304] 0
2026-03-09T17:29:28.587 INFO:tasks.workunit.client.0.vm06.stdout:0/352: dwrite d7/d11/f1c [0,4194304] 0
2026-03-09T17:29:28.591 INFO:tasks.workunit.client.0.vm06.stdout:0/353: stat d7/d11/d19/f24 0
2026-03-09T17:29:28.595 INFO:tasks.workunit.client.0.vm06.stdout:5/261: dwrite d4/d50/d18/f48 [0,4194304] 0
2026-03-09T17:29:28.596 INFO:tasks.workunit.client.0.vm06.stdout:9/323: dwrite d3/d15/d36/f49 [4194304,4194304] 0
2026-03-09T17:29:28.598 INFO:tasks.workunit.client.0.vm06.stdout:5/262: truncate d4/d50/d18/d3d/f54 716878 0
2026-03-09T17:29:28.602 INFO:tasks.workunit.client.0.vm06.stdout:9/324: dread d3/f27 [0,4194304] 0
2026-03-09T17:29:28.603 INFO:tasks.workunit.client.0.vm06.stdout:9/325: chown d3/d26/f33 13860440 1
2026-03-09T17:29:28.613 INFO:tasks.workunit.client.0.vm06.stdout:4/283: mkdir db/d1d/d21/d37/d69 0
2026-03-09T17:29:28.614 INFO:tasks.workunit.client.0.vm06.stdout:6/256: truncate d6/f4a 6527215 0
2026-03-09T17:29:28.628 INFO:tasks.workunit.client.0.vm06.stdout:1/333: rename d11/d14/d1d/d1e/f53 to d11/d14/d1d/f73 0
2026-03-09T17:29:28.632 INFO:tasks.workunit.client.0.vm06.stdout:0/354: dread d7/d11/d19/d1d/d39/f4a [0,4194304] 0
2026-03-09T17:29:28.632 INFO:tasks.workunit.client.0.vm06.stdout:4/284: fsync db/d1d/f3a 0
2026-03-09T17:29:28.636 INFO:tasks.workunit.client.0.vm06.stdout:0/355: dread d7/d11/f1c [0,4194304] 0
2026-03-09T17:29:28.636 INFO:tasks.workunit.client.0.vm06.stdout:4/285: dwrite db/f68 [0,4194304] 0
2026-03-09T17:29:28.639 INFO:tasks.workunit.client.0.vm06.stdout:0/356: chown d7/f36 229 1
2026-03-09T17:29:28.643 INFO:tasks.workunit.client.0.vm06.stdout:0/357: dread d7/d11/d19/d3c/f55 [0,4194304] 0
2026-03-09T17:29:28.649 INFO:tasks.workunit.client.0.vm06.stdout:4/286: dwrite db/d1d/d21/d25/d4b/f4e [0,4194304] 0
2026-03-09T17:29:28.654 INFO:tasks.workunit.client.0.vm06.stdout:6/257: creat d6/d4f/d3e/f51 x:0 0 0
2026-03-09T17:29:28.663 INFO:tasks.workunit.client.0.vm06.stdout:1/334: creat d11/d14/d1d/d1e/d2a/f74 x:0 0 0
2026-03-09T17:29:28.663 INFO:tasks.workunit.client.0.vm06.stdout:1/335: chown f10 124543890 1
2026-03-09T17:29:28.672 INFO:tasks.workunit.client.0.vm06.stdout:0/358: symlink d7/d11/d19/d37/l7e 0
2026-03-09T17:29:28.673 INFO:tasks.workunit.client.0.vm06.stdout:4/287: mknod db/d59/d5f/d45/c6a 0
2026-03-09T17:29:28.676 INFO:tasks.workunit.client.0.vm06.stdout:4/288: dread db/df/f2d [0,4194304] 0
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:0/359: rename d7/d11/d19/d1d/d59/f7a to d7/d11/d5d/d64/f7f 0
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:0/360: dread - d7/d11/d19/d37/f6d zero size
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:4/289: mkdir db/d57/d6b 0
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:5/263: getdents d4/d50/d18/d3d 0
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:5/264: fsync d4/f11 0
2026-03-09T17:29:28.690 INFO:tasks.workunit.client.0.vm06.stdout:0/361: mkdir d7/d11/d19/d1d/d80 0
2026-03-09T17:29:28.693 INFO:tasks.workunit.client.0.vm06.stdout:0/362: rename d7/d11/d19/d1d/l52 to d7/d11/d19/d1d/d59/d5b/l81 0
2026-03-09T17:29:28.695 INFO:tasks.workunit.client.0.vm06.stdout:4/290: rename db/d1d/d21/d25/l33 to db/d1d/d21/d44/l6c 0
2026-03-09T17:29:28.699 INFO:tasks.workunit.client.0.vm06.stdout:4/291: dwrite db/d1d/d21/d25/f35 [0,4194304] 0
2026-03-09T17:29:28.706 INFO:tasks.workunit.client.0.vm06.stdout:0/363: unlink d7/d11/d2d/f44 0
2026-03-09T17:29:28.706 INFO:tasks.workunit.client.0.vm06.stdout:5/265: rename d4/f20 to d4/f5e 0
2026-03-09T17:29:28.707 INFO:tasks.workunit.client.0.vm06.stdout:4/292: mkdir db/d59/d5f/d6d 0
2026-03-09T17:29:28.707 INFO:tasks.workunit.client.0.vm06.stdout:5/266: creat d4/d52/f5f x:0 0 0
2026-03-09T17:29:28.708 INFO:tasks.workunit.client.0.vm06.stdout:0/364: dwrite d7/f27 [0,4194304] 0
2026-03-09T17:29:28.724 INFO:tasks.workunit.client.0.vm06.stdout:5/267: mknod d4/d22/d46/c60 0
2026-03-09T17:29:28.729 INFO:tasks.workunit.client.0.vm06.stdout:4/293: creat db/d1d/d21/f6e x:0 0 0
2026-03-09T17:29:28.738 INFO:tasks.workunit.client.0.vm06.stdout:5/268: creat d4/d50/f61 x:0 0 0
2026-03-09T17:29:28.738 INFO:tasks.workunit.client.0.vm06.stdout:5/269: write d4/d50/f1e [7857836,82309] 0
2026-03-09T17:29:28.744 INFO:tasks.workunit.client.0.vm06.stdout:5/270: symlink d4/d50/d18/l62 0
2026-03-09T17:29:28.748 INFO:tasks.workunit.client.0.vm06.stdout:4/294: link db/d59/d5f/d5d/f62 db/f6f 0
2026-03-09T17:29:28.751 INFO:tasks.workunit.client.0.vm06.stdout:2/286: truncate d3/f21 1128856 0
2026-03-09T17:29:28.755 INFO:tasks.workunit.client.0.vm06.stdout:4/295: creat db/d1d/d21/d26/f70 x:0 0 0
2026-03-09T17:29:28.755 INFO:tasks.workunit.client.0.vm06.stdout:4/296: fdatasync db/f68 0
2026-03-09T17:29:28.755 INFO:tasks.workunit.client.0.vm06.stdout:4/297: write db/d1d/d21/d25/d4b/f4e [2138834,38814] 0
2026-03-09T17:29:28.755 INFO:tasks.workunit.client.0.vm06.stdout:5/271: mknod d4/c63 0
2026-03-09T17:29:28.755 INFO:tasks.workunit.client.0.vm06.stdout:5/272: fdatasync d4/d50/d18/f4b 0
2026-03-09T17:29:28.761 INFO:tasks.workunit.client.0.vm06.stdout:8/273: write d15/d16/f24 [656745,71245] 0
2026-03-09T17:29:28.764 INFO:tasks.workunit.client.0.vm06.stdout:5/273: mkdir d4/d22/d64 0
2026-03-09T17:29:28.765 INFO:tasks.workunit.client.0.vm06.stdout:5/274: fdatasync d4/d22/d46/f59 0
2026-03-09T17:29:28.765 INFO:tasks.workunit.client.0.vm06.stdout:5/275: readlink d4/d22/l2c 0
2026-03-09T17:29:28.768 INFO:tasks.workunit.client.0.vm06.stdout:5/276: creat d4/d22/d64/f65 x:0 0 0
2026-03-09T17:29:28.770 INFO:tasks.workunit.client.0.vm06.stdout:5/277: symlink d4/d22/d46/l66 0
2026-03-09T17:29:28.771 INFO:tasks.workunit.client.0.vm06.stdout:5/278: chown d4/d50/d18 1629555 1
2026-03-09T17:29:28.775 INFO:tasks.workunit.client.0.vm06.stdout:5/279: dwrite d4/d50/f1d [0,4194304] 0
2026-03-09T17:29:28.777 INFO:tasks.workunit.client.0.vm06.stdout:5/280: write d4/f26 [385302,65749] 0
2026-03-09T17:29:28.780 INFO:tasks.workunit.client.0.vm06.stdout:5/281: stat d4/d50/d35/c53 0
2026-03-09T17:29:28.791 INFO:tasks.workunit.client.0.vm06.stdout:5/282: mknod d4/d22/d46/c67 0
2026-03-09T17:29:28.794 INFO:tasks.workunit.client.0.vm06.stdout:5/283: mknod d4/c68 0
2026-03-09T17:29:28.796 INFO:tasks.workunit.client.0.vm06.stdout:5/284: rename d4/d22/f56 to d4/d50/d18/d3d/f69 0
2026-03-09T17:29:28.801 INFO:tasks.workunit.client.0.vm06.stdout:5/285: dwrite d4/d52/f5f [0,4194304] 0
2026-03-09T17:29:28.808 INFO:tasks.workunit.client.0.vm06.stdout:2/287: dread d3/d4/d22/f28 [0,4194304] 0
2026-03-09T17:29:28.814 INFO:tasks.workunit.client.0.vm06.stdout:7/306: getdents d5 0
2026-03-09T17:29:28.820 INFO:tasks.workunit.client.0.vm06.stdout:5/286: mknod d4/d22/d46/c6a 0
2026-03-09T17:29:28.824 INFO:tasks.workunit.client.0.vm06.stdout:2/288: stat d3/d4/d12/d2b/d2d/f2a 0
2026-03-09T17:29:28.825 INFO:tasks.workunit.client.0.vm06.stdout:7/307: mknod d5/d12/c4f 0
2026-03-09T17:29:28.830 INFO:tasks.workunit.client.0.vm06.stdout:7/308: rename d5/d7/f1b to d5/d7/d2b/f50 0
2026-03-09T17:29:28.837 INFO:tasks.workunit.client.0.vm06.stdout:7/309: write d5/d7/d2b/f50 [4608856,89600] 0
2026-03-09T17:29:28.840 INFO:tasks.workunit.client.0.vm06.stdout:1/336: getdents d11/d14/d1d/d1e 0
2026-03-09T17:29:28.843 INFO:tasks.workunit.client.0.vm06.stdout:7/310: fdatasync d5/f18 0
2026-03-09T17:29:28.845 INFO:tasks.workunit.client.0.vm06.stdout:1/337: dread d11/d14/d1d/d1e/d2a/d34/f5c [0,4194304] 0
2026-03-09T17:29:28.845 INFO:tasks.workunit.client.0.vm06.stdout:1/338: read - d11/d14/d1d/f31 zero size
2026-03-09T17:29:28.845 INFO:tasks.workunit.client.0.vm06.stdout:3/274: truncate dd/d1d/f34 391715 0
2026-03-09T17:29:28.846 INFO:tasks.workunit.client.0.vm06.stdout:7/311: dread d5/dd/ff [0,4194304] 0
2026-03-09T17:29:28.848 INFO:tasks.workunit.client.0.vm06.stdout:7/312: read - d5/dd/f48 zero size
2026-03-09T17:29:28.849 INFO:tasks.workunit.client.0.vm06.stdout:7/313: stat d5/dd/f19 0
2026-03-09T17:29:28.852 INFO:tasks.workunit.client.0.vm06.stdout:3/275: creat dd/d1d/d4e/f5a x:0 0 0
2026-03-09T17:29:28.855 INFO:tasks.workunit.client.0.vm06.stdout:9/326: dwrite d3/d11/f2a [0,4194304] 0
2026-03-09T17:29:28.858 INFO:tasks.workunit.client.0.vm06.stdout:1/339: mknod d11/d14/d1c/d5f/c75 0
2026-03-09T17:29:28.867 INFO:tasks.workunit.client.0.vm06.stdout:3/276: mkdir dd/d5b 0
2026-03-09T17:29:28.867 INFO:tasks.workunit.client.0.vm06.stdout:3/277: fsync dd/d19/d28/f32 0
2026-03-09T17:29:28.871 INFO:tasks.workunit.client.0.vm06.stdout:6/258: dwrite d6/fb [0,4194304] 0
2026-03-09T17:29:28.876 INFO:tasks.workunit.client.0.vm06.stdout:3/278: dwrite dd/d19/d25/f53 [0,4194304] 0
2026-03-09T17:29:28.884 INFO:tasks.workunit.client.0.vm06.stdout:3/279: dread - dd/d19/d25/d44/f57 zero size
2026-03-09T17:29:28.884 INFO:tasks.workunit.client.0.vm06.stdout:3/280: fdatasync f4 0 2026-03-09T17:29:28.898 INFO:tasks.workunit.client.0.vm06.stdout:0/365: truncate d7/d11/d2d/f3a 1044618 0 2026-03-09T17:29:28.899 INFO:tasks.workunit.client.0.vm06.stdout:1/340: link d11/d69/c6d d11/d69/c76 0 2026-03-09T17:29:28.900 INFO:tasks.workunit.client.0.vm06.stdout:1/341: write d11/d14/f17 [7856977,18131] 0 2026-03-09T17:29:28.903 INFO:tasks.workunit.client.0.vm06.stdout:0/366: unlink d7/d11/f35 0 2026-03-09T17:29:28.906 INFO:tasks.workunit.client.0.vm06.stdout:2/289: unlink d3/f21 0 2026-03-09T17:29:28.913 INFO:tasks.workunit.client.0.vm06.stdout:4/298: write f6 [1155332,65937] 0 2026-03-09T17:29:28.915 INFO:tasks.workunit.client.0.vm06.stdout:1/342: dread d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0 2026-03-09T17:29:28.919 INFO:tasks.workunit.client.0.vm06.stdout:8/274: write d15/d16/d19/d2b/f46 [32241,36788] 0 2026-03-09T17:29:28.920 INFO:tasks.workunit.client.0.vm06.stdout:8/275: write d15/d16/f50 [1138447,106858] 0 2026-03-09T17:29:28.924 INFO:tasks.workunit.client.0.vm06.stdout:8/276: dread d15/d16/f24 [0,4194304] 0 2026-03-09T17:29:28.929 INFO:tasks.workunit.client.0.vm06.stdout:1/343: truncate d11/d14/d1d/d1e/d2a/d34/f3b 2139651 0 2026-03-09T17:29:28.931 INFO:tasks.workunit.client.0.vm06.stdout:2/290: symlink d3/d4/d12/d34/l5b 0 2026-03-09T17:29:28.937 INFO:tasks.workunit.client.0.vm06.stdout:4/299: truncate db/d1d/f5b 1317790 0 2026-03-09T17:29:28.943 INFO:tasks.workunit.client.0.vm06.stdout:4/300: creat db/d1d/d21/d37/f71 x:0 0 0 2026-03-09T17:29:28.946 INFO:tasks.workunit.client.0.vm06.stdout:4/301: dwrite db/df/f14 [0,4194304] 0 2026-03-09T17:29:28.947 INFO:tasks.workunit.client.0.vm06.stdout:4/302: chown db/d1d/d21/d26/f70 1 1 2026-03-09T17:29:28.949 INFO:tasks.workunit.client.0.vm06.stdout:2/291: unlink d3/f24 0 2026-03-09T17:29:28.954 INFO:tasks.workunit.client.0.vm06.stdout:0/367: sync 2026-03-09T17:29:28.954 INFO:tasks.workunit.client.0.vm06.stdout:5/287: dwrite 
d4/f5 [4194304,4194304] 0 2026-03-09T17:29:28.972 INFO:tasks.workunit.client.0.vm06.stdout:5/288: sync 2026-03-09T17:29:28.974 INFO:tasks.workunit.client.0.vm06.stdout:2/292: truncate d3/d4/d12/f20 6345065 0 2026-03-09T17:29:28.976 INFO:tasks.workunit.client.0.vm06.stdout:7/314: write f0 [472690,63608] 0 2026-03-09T17:29:28.979 INFO:tasks.workunit.client.0.vm06.stdout:5/289: dwrite d4/d50/d18/d3d/f44 [0,4194304] 0 2026-03-09T17:29:28.980 INFO:tasks.workunit.client.0.vm06.stdout:5/290: chown d4/d22/f3f 13821 1 2026-03-09T17:29:28.986 INFO:tasks.workunit.client.0.vm06.stdout:5/291: dwrite d4/d50/f29 [0,4194304] 0 2026-03-09T17:29:28.987 INFO:tasks.workunit.client.0.vm06.stdout:5/292: chown d4/d22/l2c 105622956 1 2026-03-09T17:29:28.987 INFO:tasks.workunit.client.0.vm06.stdout:5/293: chown d4/d22/l3b 53938 1 2026-03-09T17:29:28.995 INFO:tasks.workunit.client.0.vm06.stdout:9/327: write d3/d15/d36/d4d/f60 [1485602,23020] 0 2026-03-09T17:29:28.996 INFO:tasks.workunit.client.0.vm06.stdout:7/315: dread d5/d7/d2b/f50 [0,4194304] 0 2026-03-09T17:29:28.999 INFO:tasks.workunit.client.0.vm06.stdout:8/277: getdents d15/d16/d19 0 2026-03-09T17:29:29.003 INFO:tasks.workunit.client.0.vm06.stdout:9/328: dwrite d3/d15/d36/d4d/f62 [0,4194304] 0 2026-03-09T17:29:29.019 INFO:tasks.workunit.client.0.vm06.stdout:8/278: dwrite d15/d39/f45 [0,4194304] 0 2026-03-09T17:29:29.025 INFO:tasks.workunit.client.0.vm06.stdout:6/259: write d6/d12/f22 [1146173,83158] 0 2026-03-09T17:29:29.035 INFO:tasks.workunit.client.0.vm06.stdout:3/281: dwrite dd/f1b [0,4194304] 0 2026-03-09T17:29:29.036 INFO:tasks.workunit.client.0.vm06.stdout:3/282: read - dd/d19/d28/f32 zero size 2026-03-09T17:29:29.043 INFO:tasks.workunit.client.0.vm06.stdout:5/294: creat d4/d50/d35/f6b x:0 0 0 2026-03-09T17:29:29.049 INFO:tasks.workunit.client.0.vm06.stdout:8/279: dread d15/f3e [0,4194304] 0 2026-03-09T17:29:29.049 INFO:tasks.workunit.client.0.vm06.stdout:7/316: mkdir d5/d1f/d34/d46/d51 0 2026-03-09T17:29:29.055 
INFO:tasks.workunit.client.0.vm06.stdout:9/329: read - d3/f4b zero size 2026-03-09T17:29:29.055 INFO:tasks.workunit.client.0.vm06.stdout:9/330: fsync d3/d15/d16/d54/f5b 0 2026-03-09T17:29:29.061 INFO:tasks.workunit.client.0.vm06.stdout:6/260: mkdir d6/d4f/d3e/d52 0 2026-03-09T17:29:29.064 INFO:tasks.workunit.client.0.vm06.stdout:2/293: symlink d3/d4/d12/l5c 0 2026-03-09T17:29:29.073 INFO:tasks.workunit.client.0.vm06.stdout:3/283: unlink dd/f40 0 2026-03-09T17:29:29.074 INFO:tasks.workunit.client.0.vm06.stdout:3/284: read - dd/d19/d25/d48/f4c zero size 2026-03-09T17:29:29.075 INFO:tasks.workunit.client.0.vm06.stdout:1/344: write d11/f13 [6159144,125061] 0 2026-03-09T17:29:29.078 INFO:tasks.workunit.client.0.vm06.stdout:5/295: creat d4/d52/f6c x:0 0 0 2026-03-09T17:29:29.082 INFO:tasks.workunit.client.0.vm06.stdout:5/296: dwrite d4/d50/d18/f3c [0,4194304] 0 2026-03-09T17:29:29.092 INFO:tasks.workunit.client.0.vm06.stdout:8/280: dread d15/d16/f24 [0,4194304] 0 2026-03-09T17:29:29.094 INFO:tasks.workunit.client.0.vm06.stdout:1/345: dread d11/d14/d1c/d1f/f68 [0,4194304] 0 2026-03-09T17:29:29.095 INFO:tasks.workunit.client.0.vm06.stdout:9/331: symlink d3/d15/d36/d4d/l63 0 2026-03-09T17:29:29.096 INFO:tasks.workunit.client.0.vm06.stdout:9/332: chown d3/d15/d36/d4d/f62 79911 1 2026-03-09T17:29:29.101 INFO:tasks.workunit.client.0.vm06.stdout:4/303: getdents db/d59 0 2026-03-09T17:29:29.103 INFO:tasks.workunit.client.0.vm06.stdout:2/294: read d3/f10 [6793,109593] 0 2026-03-09T17:29:29.104 INFO:tasks.workunit.client.0.vm06.stdout:0/368: getdents d7 0 2026-03-09T17:29:29.110 INFO:tasks.workunit.client.0.vm06.stdout:5/297: symlink d4/d50/d18/l6d 0 2026-03-09T17:29:29.114 INFO:tasks.workunit.client.0.vm06.stdout:1/346: unlink d11/d14/d1c/d3a/f72 0 2026-03-09T17:29:29.114 INFO:tasks.workunit.client.0.vm06.stdout:1/347: dread - d11/d14/d1d/d1e/d2a/f74 zero size 2026-03-09T17:29:29.115 INFO:tasks.workunit.client.0.vm06.stdout:1/348: read d11/d14/d1d/d1e/d2a/f38 [3604285,36368] 0 
2026-03-09T17:29:29.115 INFO:tasks.workunit.client.0.vm06.stdout:1/349: readlink d11/d14/d1c/d1f/d57/l66 0 2026-03-09T17:29:29.116 INFO:tasks.workunit.client.0.vm06.stdout:8/281: sync 2026-03-09T17:29:29.117 INFO:tasks.workunit.client.0.vm06.stdout:1/350: chown d11/d14/d1c/l29 6695 1 2026-03-09T17:29:29.118 INFO:tasks.workunit.client.0.vm06.stdout:1/351: dread d11/d14/d1c/f2e [0,4194304] 0 2026-03-09T17:29:29.121 INFO:tasks.workunit.client.0.vm06.stdout:0/369: rename d7/d11/d19/d1d/d39/l67 to d7/d11/d19/d23/d6f/l82 0 2026-03-09T17:29:29.123 INFO:tasks.workunit.client.0.vm06.stdout:3/285: mkdir dd/d5c 0 2026-03-09T17:29:29.125 INFO:tasks.workunit.client.0.vm06.stdout:3/286: dread - dd/d1d/f4b zero size 2026-03-09T17:29:29.126 INFO:tasks.workunit.client.0.vm06.stdout:3/287: truncate dd/f51 590579 0 2026-03-09T17:29:29.126 INFO:tasks.workunit.client.0.vm06.stdout:3/288: stat dd/f1b 0 2026-03-09T17:29:29.126 INFO:tasks.workunit.client.0.vm06.stdout:5/298: creat d4/d22/d46/f6e x:0 0 0 2026-03-09T17:29:29.126 INFO:tasks.workunit.client.0.vm06.stdout:5/299: write d4/d50/d18/f5c [318633,68385] 0 2026-03-09T17:29:29.126 INFO:tasks.workunit.client.0.vm06.stdout:8/282: sync 2026-03-09T17:29:29.127 INFO:tasks.workunit.client.0.vm06.stdout:8/283: write d15/d16/d1e/f4e [353633,30724] 0 2026-03-09T17:29:29.149 INFO:tasks.workunit.client.0.vm06.stdout:6/261: truncate d6/d47/d4d/f50 4071951 0 2026-03-09T17:29:29.153 INFO:tasks.workunit.client.0.vm06.stdout:0/370: rename d7/d11/d19/d1d/d39/l65 to d7/d11/d19/d3c/l83 0 2026-03-09T17:29:29.155 INFO:tasks.workunit.client.0.vm06.stdout:9/333: truncate d3/d15/d16/f31 535535 0 2026-03-09T17:29:29.158 INFO:tasks.workunit.client.0.vm06.stdout:4/304: dwrite db/f15 [0,4194304] 0 2026-03-09T17:29:29.160 INFO:tasks.workunit.client.0.vm06.stdout:4/305: write db/d1d/d21/d25/f38 [2426865,23842] 0 2026-03-09T17:29:29.163 INFO:tasks.workunit.client.0.vm06.stdout:2/295: dwrite d3/d4/d12/f15 [4194304,4194304] 0 2026-03-09T17:29:29.172 
INFO:tasks.workunit.client.0.vm06.stdout:7/317: getdents d5/d7/d2b 0 2026-03-09T17:29:29.176 INFO:tasks.workunit.client.0.vm06.stdout:1/352: symlink d11/d14/l77 0 2026-03-09T17:29:29.176 INFO:tasks.workunit.client.0.vm06.stdout:6/262: mkdir d6/d12/d53 0 2026-03-09T17:29:29.179 INFO:tasks.workunit.client.0.vm06.stdout:0/371: symlink d7/d11/d5d/l84 0 2026-03-09T17:29:29.180 INFO:tasks.workunit.client.0.vm06.stdout:9/334: creat d3/d15/d48/f64 x:0 0 0 2026-03-09T17:29:29.186 INFO:tasks.workunit.client.0.vm06.stdout:2/296: dwrite d3/d4/d38/f50 [0,4194304] 0 2026-03-09T17:29:29.189 INFO:tasks.workunit.client.0.vm06.stdout:5/300: mkdir d4/d50/d35/d40/d6f 0 2026-03-09T17:29:29.189 INFO:tasks.workunit.client.0.vm06.stdout:8/284: mknod d15/d16/d19/d3d/c5b 0 2026-03-09T17:29:29.190 INFO:tasks.workunit.client.0.vm06.stdout:5/301: dread - d4/d22/d64/f65 zero size 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:8/285: write d15/d16/d19/f4f [859468,11037] 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:5/302: read d4/d52/f5f [1614411,86890] 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:8/286: write d15/d16/f52 [273125,47912] 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:1/353: symlink d11/d14/d1c/d1f/d57/l78 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:0/372: write d7/d11/d19/d1d/f40 [156852,114649] 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:9/335: mkdir d3/d11/d65 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:4/306: unlink db/d1d/l5e 0 2026-03-09T17:29:29.205 INFO:tasks.workunit.client.0.vm06.stdout:2/297: symlink d3/d4/d12/d34/l5d 0 2026-03-09T17:29:29.206 INFO:tasks.workunit.client.0.vm06.stdout:8/287: unlink d15/c36 0 2026-03-09T17:29:29.206 INFO:tasks.workunit.client.0.vm06.stdout:7/318: read d5/d7/f1d [2160982,53856] 0 2026-03-09T17:29:29.206 INFO:tasks.workunit.client.0.vm06.stdout:7/319: write d5/f4b [701489,690] 0 
2026-03-09T17:29:29.209 INFO:tasks.workunit.client.0.vm06.stdout:6/263: symlink d6/d4f/l54 0 2026-03-09T17:29:29.211 INFO:tasks.workunit.client.0.vm06.stdout:3/289: getdents dd/d1d/d4e 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/290: chown dd/d19/d28/l2a 60393127 1 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/291: dwrite dd/d19/d2c/f37 [0,4194304] 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/292: write dd/d19/d2c/f37 [4079303,24242] 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/293: readlink dd/l17 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:4/307: symlink db/d59/d5f/d5d/l72 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/294: dwrite dd/f1b [0,4194304] 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/295: chown dd/d19/d1e 2 1 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:2/298: mknod d3/d4/d38/c5e 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:7/320: creat d5/d7/d2b/f52 x:0 0 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:2/299: write d3/d4/d12/f35 [1290424,74224] 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:7/321: write d5/d7/d2b/f42 [1166907,73726] 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:6/264: creat d6/d47/d4d/f55 x:0 0 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:0/373: mkdir d7/d11/d19/d1d/d80/d85 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:0/374: fsync d7/d11/f29 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:9/336: truncate d3/f1b 3184246 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:4/308: fdatasync db/d1d/d21/d25/d4b/f66 0 2026-03-09T17:29:29.243 INFO:tasks.workunit.client.0.vm06.stdout:3/296: mknod dd/d19/d25/d44/c5d 0 2026-03-09T17:29:29.248 INFO:tasks.workunit.client.0.vm06.stdout:5/303: dread d4/d50/d18/f31 
[0,4194304] 0 2026-03-09T17:29:29.250 INFO:tasks.workunit.client.0.vm06.stdout:0/375: unlink d7/f31 0 2026-03-09T17:29:29.255 INFO:tasks.workunit.client.0.vm06.stdout:7/322: getdents d5/d1f/d34/d46/d51 0 2026-03-09T17:29:29.257 INFO:tasks.workunit.client.0.vm06.stdout:9/337: creat d3/d11/d65/f66 x:0 0 0 2026-03-09T17:29:29.258 INFO:tasks.workunit.client.0.vm06.stdout:0/376: mknod d7/d11/d19/d1d/d80/c86 0 2026-03-09T17:29:29.258 INFO:tasks.workunit.client.0.vm06.stdout:0/377: chown d7/d11/d19/d37/l54 982 1 2026-03-09T17:29:29.260 INFO:tasks.workunit.client.0.vm06.stdout:4/309: rename c4 to db/d57/c73 0 2026-03-09T17:29:29.262 INFO:tasks.workunit.client.0.vm06.stdout:7/323: stat d5/l15 0 2026-03-09T17:29:29.262 INFO:tasks.workunit.client.0.vm06.stdout:7/324: dread - d5/d1f/d34/f41 zero size 2026-03-09T17:29:29.264 INFO:tasks.workunit.client.0.vm06.stdout:2/300: getdents d3/d4/d46 0 2026-03-09T17:29:29.266 INFO:tasks.workunit.client.0.vm06.stdout:0/378: mkdir d7/d11/d19/d1d/d87 0 2026-03-09T17:29:29.268 INFO:tasks.workunit.client.0.vm06.stdout:5/304: rename d4/d50/f29 to d4/d22/d64/f70 0 2026-03-09T17:29:29.271 INFO:tasks.workunit.client.0.vm06.stdout:6/265: getdents d6/d12/d2d 0 2026-03-09T17:29:29.276 INFO:tasks.workunit.client.0.vm06.stdout:4/310: rename db/d1d/d21/d37/l43 to db/d59/d5f/d5d/l74 0 2026-03-09T17:29:29.283 INFO:tasks.workunit.client.0.vm06.stdout:9/338: link d3/d15/d36/d4d/l63 d3/d15/d16/l67 0 2026-03-09T17:29:29.283 INFO:tasks.workunit.client.0.vm06.stdout:8/288: dread d15/d16/d1e/f4e [0,4194304] 0 2026-03-09T17:29:29.284 INFO:tasks.workunit.client.0.vm06.stdout:0/379: rename d7/d11/d19/d23/d6f to d7/d88 0 2026-03-09T17:29:29.288 INFO:tasks.workunit.client.0.vm06.stdout:0/380: dwrite d7/f76 [0,4194304] 0 2026-03-09T17:29:29.288 INFO:tasks.workunit.client.0.vm06.stdout:9/339: mkdir d3/d15/d16/d54/d68 0 2026-03-09T17:29:29.290 INFO:tasks.workunit.client.0.vm06.stdout:9/340: truncate d3/d15/d36/d4c/f55 245687 0 2026-03-09T17:29:29.297 
INFO:tasks.workunit.client.0.vm06.stdout:9/341: write d3/d15/d16/f4a [208534,14049] 0 2026-03-09T17:29:29.299 INFO:tasks.workunit.client.0.vm06.stdout:8/289: mkdir d15/d16/d1e/d5c 0 2026-03-09T17:29:29.300 INFO:tasks.workunit.client.0.vm06.stdout:8/290: chown d15/d39/f4b 10 1 2026-03-09T17:29:29.300 INFO:tasks.workunit.client.0.vm06.stdout:4/311: rmdir db/d57/d6b 0 2026-03-09T17:29:29.301 INFO:tasks.workunit.client.0.vm06.stdout:0/381: mkdir d7/d11/d89 0 2026-03-09T17:29:29.303 INFO:tasks.workunit.client.0.vm06.stdout:9/342: symlink d3/d15/d16/d54/d68/l69 0 2026-03-09T17:29:29.304 INFO:tasks.workunit.client.0.vm06.stdout:0/382: rename d7/d11/f2c to d7/d11/d19/d1d/f8a 0 2026-03-09T17:29:29.305 INFO:tasks.workunit.client.0.vm06.stdout:8/291: creat d15/d39/d3c/f5d x:0 0 0 2026-03-09T17:29:29.307 INFO:tasks.workunit.client.0.vm06.stdout:0/383: mkdir d7/d11/d19/d8b 0 2026-03-09T17:29:29.307 INFO:tasks.workunit.client.0.vm06.stdout:0/384: fdatasync d7/d11/d19/d1d/d39/f51 0 2026-03-09T17:29:29.309 INFO:tasks.workunit.client.0.vm06.stdout:8/292: mkdir d15/d16/d1e/d28/d5e 0 2026-03-09T17:29:29.310 INFO:tasks.workunit.client.0.vm06.stdout:0/385: write d7/d11/d2d/f78 [2397174,12753] 0 2026-03-09T17:29:29.311 INFO:tasks.workunit.client.0.vm06.stdout:8/293: mkdir d15/d16/d19/d3d/d5f 0 2026-03-09T17:29:29.311 INFO:tasks.workunit.client.0.vm06.stdout:8/294: chown d15/d16/d1e/f4e 1 1 2026-03-09T17:29:29.312 INFO:tasks.workunit.client.0.vm06.stdout:8/295: rmdir d15/d16/d1e 39 2026-03-09T17:29:29.433 INFO:tasks.workunit.client.0.vm06.stdout:9/343: read d3/d11/f14 [703684,110009] 0 2026-03-09T17:29:29.437 INFO:tasks.workunit.client.0.vm06.stdout:9/344: mkdir d3/d15/d36/d4c/d6a 0 2026-03-09T17:29:29.438 INFO:tasks.workunit.client.0.vm06.stdout:9/345: stat d3/d15/d36/d4d/f60 0 2026-03-09T17:29:29.438 INFO:tasks.workunit.client.0.vm06.stdout:9/346: chown d3/d15/d16/l20 4068 1 2026-03-09T17:29:29.445 INFO:tasks.workunit.client.0.vm06.stdout:3/297: sync 2026-03-09T17:29:29.448 
INFO:tasks.workunit.client.0.vm06.stdout:3/298: link dd/d1d/d2e/c3e dd/d59/c5e 0 2026-03-09T17:29:29.449 INFO:tasks.workunit.client.0.vm06.stdout:5/305: sync 2026-03-09T17:29:29.449 INFO:tasks.workunit.client.0.vm06.stdout:4/312: sync 2026-03-09T17:29:29.449 INFO:tasks.workunit.client.0.vm06.stdout:4/313: readlink db/d59/d5f/d45/l56 0 2026-03-09T17:29:29.450 INFO:tasks.workunit.client.0.vm06.stdout:4/314: chown db/df/c46 461 1 2026-03-09T17:29:29.450 INFO:tasks.workunit.client.0.vm06.stdout:4/315: readlink db/df/l3f 0 2026-03-09T17:29:29.459 INFO:tasks.workunit.client.0.vm06.stdout:4/316: creat db/d1d/d21/d37/d69/f75 x:0 0 0 2026-03-09T17:29:29.459 INFO:tasks.workunit.client.0.vm06.stdout:5/306: creat d4/f71 x:0 0 0 2026-03-09T17:29:29.461 INFO:tasks.workunit.client.0.vm06.stdout:4/317: creat db/d59/f76 x:0 0 0 2026-03-09T17:29:29.462 INFO:tasks.workunit.client.0.vm06.stdout:5/307: mknod d4/d52/d55/c72 0 2026-03-09T17:29:29.463 INFO:tasks.workunit.client.0.vm06.stdout:4/318: symlink db/d59/d5f/d45/l77 0 2026-03-09T17:29:29.463 INFO:tasks.workunit.client.0.vm06.stdout:4/319: read - db/f55 zero size 2026-03-09T17:29:29.464 INFO:tasks.workunit.client.0.vm06.stdout:4/320: write db/d1d/f3a [782307,44738] 0 2026-03-09T17:29:29.466 INFO:tasks.workunit.client.0.vm06.stdout:5/308: creat d4/d50/d18/f73 x:0 0 0 2026-03-09T17:29:29.466 INFO:tasks.workunit.client.0.vm06.stdout:5/309: chown d4/d22/d46/c6a 61155 1 2026-03-09T17:29:29.488 INFO:tasks.workunit.client.0.vm06.stdout:5/310: truncate d4/d22/d46/f58 713847 0 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:8/296: dread d15/d16/f21 [0,4194304] 0 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:5/311: dwrite d4/d50/d18/f3e [0,4194304] 0 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:5/312: write d4/f71 [111313,25351] 0 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:4/321: mkdir db/d1d/d21/d37/d69/d78 0 2026-03-09T17:29:29.489 
INFO:tasks.workunit.client.0.vm06.stdout:4/322: dread - db/df/f4d zero size 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:4/323: dread - db/d1d/d21/d26/f70 zero size 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:4/324: symlink db/df/l79 0 2026-03-09T17:29:29.489 INFO:tasks.workunit.client.0.vm06.stdout:4/325: fdatasync db/d59/d5f/d45/f4a 0 2026-03-09T17:29:29.492 INFO:tasks.workunit.client.0.vm06.stdout:4/326: dwrite db/d1d/d21/d25/f53 [0,4194304] 0 2026-03-09T17:29:29.493 INFO:tasks.workunit.client.0.vm06.stdout:4/327: stat db/d59/d5f 0 2026-03-09T17:29:29.493 INFO:tasks.workunit.client.0.vm06.stdout:8/297: symlink d15/d16/d1e/d30/l60 0 2026-03-09T17:29:29.495 INFO:tasks.workunit.client.0.vm06.stdout:4/328: dread db/d1d/f22 [0,4194304] 0 2026-03-09T17:29:29.498 INFO:tasks.workunit.client.0.vm06.stdout:8/298: rmdir d15/d16/d19/d3d 39 2026-03-09T17:29:29.502 INFO:tasks.workunit.client.0.vm06.stdout:8/299: creat d15/d16/d19/f61 x:0 0 0 2026-03-09T17:29:29.504 INFO:tasks.workunit.client.0.vm06.stdout:8/300: mknod d15/c62 0 2026-03-09T17:29:29.504 INFO:tasks.workunit.client.0.vm06.stdout:8/301: write fe [801717,1047] 0 2026-03-09T17:29:29.509 INFO:tasks.workunit.client.0.vm06.stdout:8/302: link d15/d16/f21 d15/d16/d19/d2b/f63 0 2026-03-09T17:29:29.515 INFO:tasks.workunit.client.0.vm06.stdout:8/303: rename d15/d16/f3a to d15/d16/d1e/f64 0 2026-03-09T17:29:29.521 INFO:tasks.workunit.client.0.vm06.stdout:8/304: rename d15/d16/f21 to d15/d16/d1e/d5c/f65 0 2026-03-09T17:29:29.525 INFO:tasks.workunit.client.0.vm06.stdout:3/299: sync 2026-03-09T17:29:29.525 INFO:tasks.workunit.client.0.vm06.stdout:3/300: write dd/d1d/d2e/f3a [752123,78924] 0 2026-03-09T17:29:29.527 INFO:tasks.workunit.client.0.vm06.stdout:8/305: creat d15/d16/f66 x:0 0 0 2026-03-09T17:29:29.530 INFO:tasks.workunit.client.0.vm06.stdout:8/306: mkdir d15/d39/d67 0 2026-03-09T17:29:29.531 INFO:tasks.workunit.client.0.vm06.stdout:8/307: truncate d15/d16/d1e/f64 418324 0 
2026-03-09T17:29:29.531 INFO:tasks.workunit.client.0.vm06.stdout:1/354: write d11/d14/d1d/d1e/f65 [1033429,61219] 0 2026-03-09T17:29:29.535 INFO:tasks.workunit.client.0.vm06.stdout:8/308: read d15/d16/d19/f26 [285073,90184] 0 2026-03-09T17:29:29.536 INFO:tasks.workunit.client.0.vm06.stdout:1/355: mknod d11/d14/d1c/d1f/d57/c79 0 2026-03-09T17:29:29.537 INFO:tasks.workunit.client.0.vm06.stdout:1/356: write d11/d14/d1d/d1e/f65 [682494,55938] 0 2026-03-09T17:29:29.538 INFO:tasks.workunit.client.0.vm06.stdout:3/301: creat dd/f5f x:0 0 0 2026-03-09T17:29:29.542 INFO:tasks.workunit.client.0.vm06.stdout:1/357: fsync d11/d14/d1d/d1e/d2a/f40 0 2026-03-09T17:29:29.542 INFO:tasks.workunit.client.0.vm06.stdout:1/358: fdatasync d11/d14/f17 0 2026-03-09T17:29:29.543 INFO:tasks.workunit.client.0.vm06.stdout:1/359: write d11/d14/d1d/d42/f70 [812895,122382] 0 2026-03-09T17:29:29.546 INFO:tasks.workunit.client.0.vm06.stdout:8/309: mknod d15/d16/c68 0 2026-03-09T17:29:29.552 INFO:tasks.workunit.client.0.vm06.stdout:1/360: truncate f7 1548517 0 2026-03-09T17:29:29.553 INFO:tasks.workunit.client.0.vm06.stdout:8/310: unlink fa 0 2026-03-09T17:29:29.556 INFO:tasks.workunit.client.0.vm06.stdout:1/361: mknod d11/d14/c7a 0 2026-03-09T17:29:29.558 INFO:tasks.workunit.client.0.vm06.stdout:1/362: mkdir d11/d14/d1c/d1f/d57/d7b 0 2026-03-09T17:29:29.562 INFO:tasks.workunit.client.0.vm06.stdout:1/363: dwrite d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:29:29.563 INFO:tasks.workunit.client.0.vm06.stdout:1/364: stat d11/d14/d1c/d1f/f4c 0 2026-03-09T17:29:29.566 INFO:tasks.workunit.client.0.vm06.stdout:7/325: write d5/d1f/d34/d46/f4e [974686,124267] 0 2026-03-09T17:29:29.570 INFO:tasks.workunit.client.0.vm06.stdout:1/365: creat d11/d14/d1d/f7c x:0 0 0 2026-03-09T17:29:29.573 INFO:tasks.workunit.client.0.vm06.stdout:7/326: getdents d5/dd 0 2026-03-09T17:29:29.574 INFO:tasks.workunit.client.0.vm06.stdout:7/327: fsync d5/dd/f3e 0 2026-03-09T17:29:29.574 INFO:tasks.workunit.client.0.vm06.stdout:7/328: 
chown d5/d7/l25 5901351 1 2026-03-09T17:29:29.577 INFO:tasks.workunit.client.0.vm06.stdout:7/329: dwrite d5/d12/f35 [0,4194304] 0 2026-03-09T17:29:29.578 INFO:tasks.workunit.client.0.vm06.stdout:7/330: write d5/dd/f48 [25546,109661] 0 2026-03-09T17:29:29.585 INFO:tasks.workunit.client.0.vm06.stdout:7/331: mknod d5/c53 0 2026-03-09T17:29:29.628 INFO:tasks.workunit.client.0.vm06.stdout:8/311: sync 2026-03-09T17:29:29.636 INFO:tasks.workunit.client.0.vm06.stdout:8/312: dread d15/d39/f4b [0,4194304] 0 2026-03-09T17:29:29.641 INFO:tasks.workunit.client.0.vm06.stdout:0/386: fsync d7/f76 0 2026-03-09T17:29:29.646 INFO:tasks.workunit.client.0.vm06.stdout:8/313: read d15/d16/f51 [1044099,105284] 0 2026-03-09T17:29:29.650 INFO:tasks.workunit.client.0.vm06.stdout:8/314: mknod d15/d39/d3c/c69 0 2026-03-09T17:29:29.660 INFO:tasks.workunit.client.0.vm06.stdout:6/266: dwrite d6/f46 [0,4194304] 0 2026-03-09T17:29:29.667 INFO:tasks.workunit.client.0.vm06.stdout:4/329: unlink db/d59/d5f/d5d/l74 0 2026-03-09T17:29:29.675 INFO:tasks.workunit.client.0.vm06.stdout:4/330: mkdir db/d1d/d21/d26/d7a 0 2026-03-09T17:29:29.676 INFO:tasks.workunit.client.0.vm06.stdout:6/267: rename d6/d12/d2d/f48 to d6/f56 0 2026-03-09T17:29:29.678 INFO:tasks.workunit.client.0.vm06.stdout:6/268: symlink d6/d4f/d3e/d52/l57 0 2026-03-09T17:29:29.680 INFO:tasks.workunit.client.0.vm06.stdout:6/269: symlink d6/d47/l58 0 2026-03-09T17:29:29.684 INFO:tasks.workunit.client.0.vm06.stdout:6/270: link d6/d12/d17/d27/c3b d6/c59 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/271: chown d6/d4f/f25 6008237 1 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/272: readlink d6/l2f 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/273: symlink d6/d4f/l5a 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/274: creat d6/d12/d53/f5b x:0 0 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/275: creat d6/f5c x:0 0 0 2026-03-09T17:29:29.692 
INFO:tasks.workunit.client.0.vm06.stdout:6/276: mknod d6/d12/d2d/c5d 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/277: creat d6/d12/d2d/f5e x:0 0 0 2026-03-09T17:29:29.692 INFO:tasks.workunit.client.0.vm06.stdout:6/278: dread - d6/d47/d4d/f55 zero size 2026-03-09T17:29:29.696 INFO:tasks.workunit.client.0.vm06.stdout:6/279: dwrite d6/d12/f22 [0,4194304] 0 2026-03-09T17:29:29.698 INFO:tasks.workunit.client.0.vm06.stdout:6/280: readlink d6/d4f/l4e 0 2026-03-09T17:29:29.699 INFO:tasks.workunit.client.0.vm06.stdout:6/281: chown d6/f46 225 1 2026-03-09T17:29:29.765 INFO:tasks.workunit.client.0.vm06.stdout:0/387: rmdir d7/d11 39 2026-03-09T17:29:29.768 INFO:tasks.workunit.client.0.vm06.stdout:9/347: dread d3/d11/f1c [0,4194304] 0 2026-03-09T17:29:29.774 INFO:tasks.workunit.client.0.vm06.stdout:0/388: symlink d7/d11/d2d/l8c 0 2026-03-09T17:29:29.775 INFO:tasks.workunit.client.0.vm06.stdout:9/348: fdatasync d3/d15/d36/d4d/f61 0 2026-03-09T17:29:29.776 INFO:tasks.workunit.client.0.vm06.stdout:9/349: write d3/d15/d16/f5c [4454150,49136] 0 2026-03-09T17:29:29.783 INFO:tasks.workunit.client.0.vm06.stdout:9/350: stat d3/d15/d36/c43 0 2026-03-09T17:29:29.795 INFO:tasks.workunit.client.0.vm06.stdout:5/313: rmdir d4/d50/d18 39 2026-03-09T17:29:29.799 INFO:tasks.workunit.client.0.vm06.stdout:5/314: dread d4/d52/f5f [0,4194304] 0 2026-03-09T17:29:29.800 INFO:tasks.workunit.client.0.vm06.stdout:2/301: write d3/d4/d12/f20 [4322255,47930] 0 2026-03-09T17:29:29.801 INFO:tasks.workunit.client.0.vm06.stdout:2/302: readlink d3/d4/d12/d2b/d36/l3e 0 2026-03-09T17:29:29.806 INFO:tasks.workunit.client.0.vm06.stdout:2/303: creat d3/d4/d22/d43/f5f x:0 0 0 2026-03-09T17:29:29.808 INFO:tasks.workunit.client.0.vm06.stdout:2/304: mkdir d3/d4/d38/d60 0 2026-03-09T17:29:29.809 INFO:tasks.workunit.client.0.vm06.stdout:2/305: stat d3/d4/d12/d34/l5d 0 2026-03-09T17:29:29.811 INFO:tasks.workunit.client.0.vm06.stdout:2/306: symlink d3/l61 0 2026-03-09T17:29:29.816 
INFO:tasks.workunit.client.0.vm06.stdout:9/351: sync 2026-03-09T17:29:29.817 INFO:tasks.workunit.client.0.vm06.stdout:9/352: write d3/d15/d16/f4a [501026,9846] 0 2026-03-09T17:29:29.817 INFO:tasks.workunit.client.0.vm06.stdout:9/353: chown d3/d26/f57 0 1 2026-03-09T17:29:29.818 INFO:tasks.workunit.client.0.vm06.stdout:9/354: chown d3/d15/d16/l20 13 1 2026-03-09T17:29:29.818 INFO:tasks.workunit.client.0.vm06.stdout:9/355: readlink d3/d15/l1e 0 2026-03-09T17:29:29.819 INFO:tasks.workunit.client.0.vm06.stdout:9/356: truncate d3/d15/f46 1410391 0 2026-03-09T17:29:29.822 INFO:tasks.workunit.client.0.vm06.stdout:9/357: symlink d3/d15/d16/l6b 0 2026-03-09T17:29:29.826 INFO:tasks.workunit.client.0.vm06.stdout:9/358: rename d3/d15/d16/d54 to d3/d26/d6c 0 2026-03-09T17:29:29.832 INFO:tasks.workunit.client.0.vm06.stdout:9/359: mkdir d3/d6d 0 2026-03-09T17:29:29.836 INFO:tasks.workunit.client.0.vm06.stdout:9/360: dwrite d3/d15/d36/f49 [4194304,4194304] 0 2026-03-09T17:29:29.837 INFO:tasks.workunit.client.0.vm06.stdout:9/361: fdatasync d3/d15/d16/f5c 0 2026-03-09T17:29:29.856 INFO:tasks.workunit.client.0.vm06.stdout:9/362: dread d3/fb [0,4194304] 0 2026-03-09T17:29:29.858 INFO:tasks.workunit.client.0.vm06.stdout:9/363: mknod d3/d15/d36/c6e 0 2026-03-09T17:29:29.862 INFO:tasks.workunit.client.0.vm06.stdout:9/364: dread d3/d15/f2e [0,4194304] 0 2026-03-09T17:29:29.863 INFO:tasks.workunit.client.0.vm06.stdout:9/365: dread - d3/d11/d65/f66 zero size 2026-03-09T17:29:29.865 INFO:tasks.workunit.client.0.vm06.stdout:3/302: write dd/d1d/f34 [1009637,110767] 0 2026-03-09T17:29:29.867 INFO:tasks.workunit.client.0.vm06.stdout:9/366: creat d3/d26/d35/f6f x:0 0 0 2026-03-09T17:29:29.870 INFO:tasks.workunit.client.0.vm06.stdout:3/303: read dd/f14 [3516224,70951] 0 2026-03-09T17:29:29.874 INFO:tasks.workunit.client.0.vm06.stdout:3/304: creat dd/d5c/f60 x:0 0 0 2026-03-09T17:29:29.874 INFO:tasks.workunit.client.0.vm06.stdout:3/305: chown dd/d19/d25 3 1 2026-03-09T17:29:29.875 
INFO:tasks.workunit.client.0.vm06.stdout:1/366: write d11/d14/d1d/f56 [4419141,88225] 0 2026-03-09T17:29:29.877 INFO:tasks.workunit.client.0.vm06.stdout:3/306: symlink dd/d5c/l61 0 2026-03-09T17:29:29.878 INFO:tasks.workunit.client.0.vm06.stdout:3/307: dread dd/d1d/f29 [0,4194304] 0 2026-03-09T17:29:29.878 INFO:tasks.workunit.client.0.vm06.stdout:3/308: readlink dd/d19/d2c/l3d 0 2026-03-09T17:29:29.883 INFO:tasks.workunit.client.0.vm06.stdout:3/309: dwrite dd/d1d/d2e/f3a [0,4194304] 0 2026-03-09T17:29:29.891 INFO:tasks.workunit.client.0.vm06.stdout:7/332: write d5/d7/f1d [3787254,48112] 0 2026-03-09T17:29:29.893 INFO:tasks.workunit.client.0.vm06.stdout:3/310: dwrite dd/d1d/f4b [0,4194304] 0 2026-03-09T17:29:29.904 INFO:tasks.workunit.client.0.vm06.stdout:7/333: write d5/f8 [6693172,101322] 0 2026-03-09T17:29:29.908 INFO:tasks.workunit.client.0.vm06.stdout:3/311: symlink dd/d19/d1e/l62 0 2026-03-09T17:29:29.911 INFO:tasks.workunit.client.0.vm06.stdout:3/312: dread dd/d1d/d2e/f3a [0,4194304] 0 2026-03-09T17:29:29.912 INFO:tasks.workunit.client.0.vm06.stdout:3/313: truncate dd/d5c/f60 257605 0 2026-03-09T17:29:29.918 INFO:tasks.workunit.client.0.vm06.stdout:1/367: dread d11/d14/d1d/f56 [0,4194304] 0 2026-03-09T17:29:29.918 INFO:tasks.workunit.client.0.vm06.stdout:7/334: rmdir d5/dd 39 2026-03-09T17:29:29.919 INFO:tasks.workunit.client.0.vm06.stdout:7/335: read - d5/d7/d2b/f52 zero size 2026-03-09T17:29:29.920 INFO:tasks.workunit.client.0.vm06.stdout:3/314: write dd/d1d/f29 [328929,110790] 0 2026-03-09T17:29:29.921 INFO:tasks.workunit.client.0.vm06.stdout:3/315: chown dd/d19/l39 434 1 2026-03-09T17:29:29.926 INFO:tasks.workunit.client.0.vm06.stdout:3/316: symlink dd/d19/d25/l63 0 2026-03-09T17:29:29.930 INFO:tasks.workunit.client.0.vm06.stdout:3/317: write dd/d19/d1e/f3f [20904,59131] 0 2026-03-09T17:29:29.930 INFO:tasks.workunit.client.0.vm06.stdout:1/368: getdents d11/d6b 0 2026-03-09T17:29:29.930 INFO:tasks.workunit.client.0.vm06.stdout:1/369: chown d11/d14 28 1 
2026-03-09T17:29:29.930 INFO:tasks.workunit.client.0.vm06.stdout:3/318: read - dd/d19/d25/d2d/f55 zero size 2026-03-09T17:29:29.931 INFO:tasks.workunit.client.0.vm06.stdout:7/336: creat d5/d1f/d34/f54 x:0 0 0 2026-03-09T17:29:29.933 INFO:tasks.workunit.client.0.vm06.stdout:3/319: chown lb 775008162 1 2026-03-09T17:29:29.933 INFO:tasks.workunit.client.0.vm06.stdout:8/315: truncate d15/d16/d1e/f4e 3901697 0 2026-03-09T17:29:29.937 INFO:tasks.workunit.client.0.vm06.stdout:8/316: dwrite d15/d16/d1e/d30/f3b [0,4194304] 0 2026-03-09T17:29:29.938 INFO:tasks.workunit.client.0.vm06.stdout:8/317: fsync f7 0 2026-03-09T17:29:29.948 INFO:tasks.workunit.client.0.vm06.stdout:1/370: truncate d11/d14/d1d/d1e/d2a/f40 3296520 0 2026-03-09T17:29:29.949 INFO:tasks.workunit.client.0.vm06.stdout:1/371: stat d11/d14/d1c/d1f/f68 0 2026-03-09T17:29:29.954 INFO:tasks.workunit.client.0.vm06.stdout:7/337: dread d5/d12/f35 [0,4194304] 0 2026-03-09T17:29:29.954 INFO:tasks.workunit.client.0.vm06.stdout:7/338: dread - d5/d1f/d34/f54 zero size 2026-03-09T17:29:29.956 INFO:tasks.workunit.client.0.vm06.stdout:3/320: rename dd/d19/d25/c33 to dd/d19/d2c/c64 0 2026-03-09T17:29:29.958 INFO:tasks.workunit.client.0.vm06.stdout:8/318: write d15/d16/d1e/f64 [953338,2927] 0 2026-03-09T17:29:29.961 INFO:tasks.workunit.client.0.vm06.stdout:3/321: dwrite dd/d19/d25/f53 [0,4194304] 0 2026-03-09T17:29:29.963 INFO:tasks.workunit.client.0.vm06.stdout:3/322: chown dd/d19/d25/d44/c5d 3128 1 2026-03-09T17:29:29.970 INFO:tasks.workunit.client.0.vm06.stdout:3/323: write dd/f38 [905204,104907] 0 2026-03-09T17:29:29.972 INFO:tasks.workunit.client.0.vm06.stdout:7/339: creat d5/d1f/d34/d46/f55 x:0 0 0 2026-03-09T17:29:29.977 INFO:tasks.workunit.client.0.vm06.stdout:1/372: sync 2026-03-09T17:29:29.979 INFO:tasks.workunit.client.0.vm06.stdout:3/324: dread dd/f14 [0,4194304] 0 2026-03-09T17:29:29.991 INFO:tasks.workunit.client.0.vm06.stdout:8/319: write d15/d16/d1e/f59 [2293295,112120] 0 2026-03-09T17:29:30.000 
INFO:tasks.workunit.client.0.vm06.stdout:3/325: dwrite dd/d1d/f45 [0,4194304] 0 2026-03-09T17:29:30.002 INFO:tasks.workunit.client.0.vm06.stdout:3/326: chown dd/d19/d1e/f23 4470171 1 2026-03-09T17:29:30.004 INFO:tasks.workunit.client.0.vm06.stdout:7/340: unlink d5/l2f 0 2026-03-09T17:29:30.014 INFO:tasks.workunit.client.0.vm06.stdout:1/373: symlink d11/d14/d1c/d1f/d57/d7b/l7d 0 2026-03-09T17:29:30.018 INFO:tasks.workunit.client.0.vm06.stdout:4/331: write fa [2035050,75838] 0 2026-03-09T17:29:30.023 INFO:tasks.workunit.client.0.vm06.stdout:1/374: dread - d11/d14/d1d/d1e/d2a/d34/d58/f6a zero size 2026-03-09T17:29:30.030 INFO:tasks.workunit.client.0.vm06.stdout:7/341: rename d5/dd/ff to d5/d1f/f56 0 2026-03-09T17:29:30.031 INFO:tasks.workunit.client.0.vm06.stdout:7/342: dread - d5/d1f/d34/f54 zero size 2026-03-09T17:29:30.031 INFO:tasks.workunit.client.0.vm06.stdout:7/343: readlink d5/d7/l2e 0 2026-03-09T17:29:30.034 INFO:tasks.workunit.client.0.vm06.stdout:4/332: creat db/d59/d5f/d6d/f7b x:0 0 0 2026-03-09T17:29:30.035 INFO:tasks.workunit.client.0.vm06.stdout:8/320: getdents d15 0 2026-03-09T17:29:30.037 INFO:tasks.workunit.client.0.vm06.stdout:7/344: mknod d5/dd/c57 0 2026-03-09T17:29:30.037 INFO:tasks.workunit.client.0.vm06.stdout:7/345: dread - d5/d1f/d34/f47 zero size 2026-03-09T17:29:30.038 INFO:tasks.workunit.client.0.vm06.stdout:1/375: sync 2026-03-09T17:29:30.041 INFO:tasks.workunit.client.0.vm06.stdout:1/376: dwrite d11/d14/f17 [0,4194304] 0 2026-03-09T17:29:30.047 INFO:tasks.workunit.client.0.vm06.stdout:4/333: mknod db/d1d/d21/d37/d69/c7c 0 2026-03-09T17:29:30.047 INFO:tasks.workunit.client.0.vm06.stdout:3/327: getdents dd/d1d 0 2026-03-09T17:29:30.051 INFO:tasks.workunit.client.0.vm06.stdout:7/346: creat d5/d7/f58 x:0 0 0 2026-03-09T17:29:30.052 INFO:tasks.workunit.client.0.vm06.stdout:7/347: dread d5/d7/d2b/f50 [4194304,4194304] 0 2026-03-09T17:29:30.053 INFO:tasks.workunit.client.0.vm06.stdout:1/377: symlink d11/d14/d1d/d1e/l7e 0 2026-03-09T17:29:30.053 
INFO:tasks.workunit.client.0.vm06.stdout:1/378: chown d11/d14/d1c/d1f/d57/l66 675216265 1 2026-03-09T17:29:30.056 INFO:tasks.workunit.client.0.vm06.stdout:1/379: dwrite d11/d14/d1c/d1f/f4c [0,4194304] 0 2026-03-09T17:29:30.064 INFO:tasks.workunit.client.0.vm06.stdout:8/321: rename f13 to d15/d16/d19/d3d/f6a 0 2026-03-09T17:29:30.068 INFO:tasks.workunit.client.0.vm06.stdout:8/322: dread d15/d16/f50 [0,4194304] 0 2026-03-09T17:29:30.070 INFO:tasks.workunit.client.0.vm06.stdout:7/348: creat d5/d7/d2b/f59 x:0 0 0 2026-03-09T17:29:30.073 INFO:tasks.workunit.client.0.vm06.stdout:1/380: creat d11/d14/d1c/d1f/f7f x:0 0 0 2026-03-09T17:29:30.075 INFO:tasks.workunit.client.0.vm06.stdout:4/334: mknod db/d1d/d21/d37/d69/d78/c7d 0 2026-03-09T17:29:30.075 INFO:tasks.workunit.client.0.vm06.stdout:4/335: fsync db/d1d/d21/f2f 0 2026-03-09T17:29:30.079 INFO:tasks.workunit.client.0.vm06.stdout:3/328: mkdir dd/d5b/d65 0 2026-03-09T17:29:30.080 INFO:tasks.workunit.client.0.vm06.stdout:3/329: write dd/d19/d28/f32 [476207,48913] 0 2026-03-09T17:29:30.080 INFO:tasks.workunit.client.0.vm06.stdout:3/330: write f7 [5987500,108266] 0 2026-03-09T17:29:30.087 INFO:tasks.workunit.client.0.vm06.stdout:8/323: rename d15/d16/d1a/d47/c56 to d15/c6b 0 2026-03-09T17:29:30.091 INFO:tasks.workunit.client.0.vm06.stdout:7/349: mknod d5/d1f/c5a 0 2026-03-09T17:29:30.091 INFO:tasks.workunit.client.0.vm06.stdout:3/331: dread dd/d19/d28/f32 [0,4194304] 0 2026-03-09T17:29:30.092 INFO:tasks.workunit.client.0.vm06.stdout:3/332: write dd/f1a [10631698,13463] 0 2026-03-09T17:29:30.093 INFO:tasks.workunit.client.0.vm06.stdout:3/333: chown dd/d1d/d4e/f5a 1620623 1 2026-03-09T17:29:30.102 INFO:tasks.workunit.client.0.vm06.stdout:8/324: mkdir d15/d39/d3c/d6c 0 2026-03-09T17:29:30.104 INFO:tasks.workunit.client.0.vm06.stdout:7/350: stat d5/l1e 0 2026-03-09T17:29:30.109 INFO:tasks.workunit.client.0.vm06.stdout:0/389: write d7/d11/d19/d1d/d39/f7d [269681,120791] 0 2026-03-09T17:29:30.115 
INFO:tasks.workunit.client.0.vm06.stdout:5/315: dwrite d4/f21 [0,4194304] 0 2026-03-09T17:29:30.117 INFO:tasks.workunit.client.0.vm06.stdout:5/316: write d4/d50/d35/f39 [1713318,91514] 0 2026-03-09T17:29:30.118 INFO:tasks.workunit.client.0.vm06.stdout:6/282: write d6/fb [4616365,84068] 0 2026-03-09T17:29:30.122 INFO:tasks.workunit.client.0.vm06.stdout:0/390: sync 2026-03-09T17:29:30.128 INFO:tasks.workunit.client.0.vm06.stdout:2/307: truncate d3/d4/d12/f35 481792 0 2026-03-09T17:29:30.129 INFO:tasks.workunit.client.0.vm06.stdout:7/351: fdatasync d5/f8 0 2026-03-09T17:29:30.131 INFO:tasks.workunit.client.0.vm06.stdout:1/381: mknod d11/d14/d1d/d1e/d2a/d34/c80 0 2026-03-09T17:29:30.132 INFO:tasks.workunit.client.0.vm06.stdout:4/336: link db/l3d db/df/l7e 0 2026-03-09T17:29:30.133 INFO:tasks.workunit.client.0.vm06.stdout:4/337: chown db/df/c46 144 1 2026-03-09T17:29:30.133 INFO:tasks.workunit.client.0.vm06.stdout:4/338: chown db/d1d/d21/d26/c32 1 1 2026-03-09T17:29:30.134 INFO:tasks.workunit.client.0.vm06.stdout:4/339: write db/df/f14 [9345702,78104] 0 2026-03-09T17:29:30.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:29 vm06.local ceph-mon[57307]: pgmap v148: 65 pgs: 65 active+clean; 731 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 MiB/s rd, 91 MiB/s wr, 249 op/s 2026-03-09T17:29:30.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:29 vm09.local ceph-mon[62061]: pgmap v148: 65 pgs: 65 active+clean; 731 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 MiB/s rd, 91 MiB/s wr, 249 op/s 2026-03-09T17:29:30.145 INFO:tasks.workunit.client.0.vm06.stdout:0/391: rename d7/d11/c1b to d7/d88/c8d 0 2026-03-09T17:29:30.148 INFO:tasks.workunit.client.0.vm06.stdout:8/325: mkdir d15/d16/d6d 0 2026-03-09T17:29:30.153 INFO:tasks.workunit.client.0.vm06.stdout:1/382: symlink d11/d14/d1d/l81 0 2026-03-09T17:29:30.159 INFO:tasks.workunit.client.0.vm06.stdout:4/340: symlink db/d1d/d21/d44/l7f 0 2026-03-09T17:29:30.161 
INFO:tasks.workunit.client.0.vm06.stdout:4/341: dread db/df/f36 [0,4194304] 0 2026-03-09T17:29:30.165 INFO:tasks.workunit.client.0.vm06.stdout:5/317: creat d4/d50/d18/f74 x:0 0 0 2026-03-09T17:29:30.176 INFO:tasks.workunit.client.0.vm06.stdout:3/334: getdents dd/d19/d25/d48 0 2026-03-09T17:29:30.180 INFO:tasks.workunit.client.0.vm06.stdout:4/342: creat db/d1d/d21/d25/f80 x:0 0 0 2026-03-09T17:29:30.181 INFO:tasks.workunit.client.0.vm06.stdout:5/318: mkdir d4/d50/d35/d75 0 2026-03-09T17:29:30.184 INFO:tasks.workunit.client.0.vm06.stdout:8/326: mknod d15/d16/c6e 0 2026-03-09T17:29:30.184 INFO:tasks.workunit.client.0.vm06.stdout:8/327: fdatasync d15/d16/d1e/f64 0 2026-03-09T17:29:30.194 INFO:tasks.workunit.client.0.vm06.stdout:3/335: creat dd/d5c/f66 x:0 0 0 2026-03-09T17:29:30.195 INFO:tasks.workunit.client.0.vm06.stdout:1/383: mknod d11/d14/d1c/c82 0 2026-03-09T17:29:30.198 INFO:tasks.workunit.client.0.vm06.stdout:4/343: fsync db/df/f2d 0 2026-03-09T17:29:30.199 INFO:tasks.workunit.client.0.vm06.stdout:8/328: creat d15/d39/f6f x:0 0 0 2026-03-09T17:29:30.200 INFO:tasks.workunit.client.0.vm06.stdout:8/329: fsync d15/d16/d1a/f22 0 2026-03-09T17:29:30.200 INFO:tasks.workunit.client.0.vm06.stdout:7/352: getdents d5/d12 0 2026-03-09T17:29:30.202 INFO:tasks.workunit.client.0.vm06.stdout:1/384: unlink d11/d14/d1c/d1f/d57/c79 0 2026-03-09T17:29:30.204 INFO:tasks.workunit.client.0.vm06.stdout:4/344: creat db/d1d/d21/d37/f81 x:0 0 0 2026-03-09T17:29:30.205 INFO:tasks.workunit.client.0.vm06.stdout:5/319: symlink d4/d50/d35/d40/d6f/l76 0 2026-03-09T17:29:30.207 INFO:tasks.workunit.client.0.vm06.stdout:3/336: dread f4 [4194304,4194304] 0 2026-03-09T17:29:30.208 INFO:tasks.workunit.client.0.vm06.stdout:5/320: dread d4/d50/d18/d3d/f54 [0,4194304] 0 2026-03-09T17:29:30.211 INFO:tasks.workunit.client.0.vm06.stdout:8/330: dread d15/d16/d19/d2b/f63 [0,4194304] 0 2026-03-09T17:29:30.214 INFO:tasks.workunit.client.0.vm06.stdout:7/353: rename d5/f4b to d5/d1f/d34/d3f/f5b 0 
2026-03-09T17:29:30.215 INFO:tasks.workunit.client.0.vm06.stdout:8/331: dwrite d15/d39/d3c/f5d [0,4194304] 0 2026-03-09T17:29:30.233 INFO:tasks.workunit.client.0.vm06.stdout:4/345: dread db/df/f30 [0,4194304] 0 2026-03-09T17:29:30.233 INFO:tasks.workunit.client.0.vm06.stdout:4/346: chown db/f15 3984 1 2026-03-09T17:29:30.235 INFO:tasks.workunit.client.0.vm06.stdout:3/337: mkdir dd/d1d/d2e/d67 0 2026-03-09T17:29:30.236 INFO:tasks.workunit.client.0.vm06.stdout:3/338: write dd/f5f [1036660,12387] 0 2026-03-09T17:29:30.237 INFO:tasks.workunit.client.0.vm06.stdout:5/321: creat d4/d22/f77 x:0 0 0 2026-03-09T17:29:30.240 INFO:tasks.workunit.client.0.vm06.stdout:5/322: dread d4/d22/d64/f70 [0,4194304] 0 2026-03-09T17:29:30.243 INFO:tasks.workunit.client.0.vm06.stdout:8/332: symlink d15/d39/d3c/l70 0 2026-03-09T17:29:30.247 INFO:tasks.workunit.client.0.vm06.stdout:7/354: dwrite d5/d1f/d34/d3f/f5b [0,4194304] 0 2026-03-09T17:29:30.247 INFO:tasks.workunit.client.0.vm06.stdout:5/323: creat d4/d22/d46/f78 x:0 0 0 2026-03-09T17:29:30.252 INFO:tasks.workunit.client.0.vm06.stdout:5/324: mkdir d4/d52/d79 0 2026-03-09T17:29:30.253 INFO:tasks.workunit.client.0.vm06.stdout:5/325: write d4/d50/d18/f4b [15936,93314] 0 2026-03-09T17:29:30.257 INFO:tasks.workunit.client.0.vm06.stdout:5/326: dwrite d4/d52/f6c [0,4194304] 0 2026-03-09T17:29:30.259 INFO:tasks.workunit.client.0.vm06.stdout:4/347: sync 2026-03-09T17:29:30.268 INFO:tasks.workunit.client.0.vm06.stdout:3/339: creat dd/f68 x:0 0 0 2026-03-09T17:29:30.277 INFO:tasks.workunit.client.0.vm06.stdout:7/355: unlink d5/d1f/d34/f47 0 2026-03-09T17:29:30.279 INFO:tasks.workunit.client.0.vm06.stdout:5/327: mknod d4/d50/d35/d40/c7a 0 2026-03-09T17:29:30.280 INFO:tasks.workunit.client.0.vm06.stdout:5/328: read d4/d50/d18/d3d/f44 [1858021,118409] 0 2026-03-09T17:29:30.285 INFO:tasks.workunit.client.0.vm06.stdout:5/329: dwrite d4/d50/f61 [0,4194304] 0 2026-03-09T17:29:30.292 INFO:tasks.workunit.client.0.vm06.stdout:3/340: write dd/d1d/d2e/f3a 
[1889689,68737] 0 2026-03-09T17:29:30.298 INFO:tasks.workunit.client.0.vm06.stdout:3/341: dwrite dd/d19/d2c/f37 [0,4194304] 0 2026-03-09T17:29:30.299 INFO:tasks.workunit.client.0.vm06.stdout:3/342: stat dd/d1d/f4b 0 2026-03-09T17:29:30.312 INFO:tasks.workunit.client.0.vm06.stdout:8/333: rename d15/d16/d1e/d5c to d15/d16/d19/d71 0 2026-03-09T17:29:30.312 INFO:tasks.workunit.client.0.vm06.stdout:8/334: chown d15/d39/f45 9 1 2026-03-09T17:29:30.318 INFO:tasks.workunit.client.0.vm06.stdout:8/335: dwrite d15/d16/d1e/f64 [0,4194304] 0 2026-03-09T17:29:30.346 INFO:tasks.workunit.client.0.vm06.stdout:5/330: truncate d4/d50/f14 1629137 0 2026-03-09T17:29:30.360 INFO:tasks.workunit.client.0.vm06.stdout:5/331: symlink d4/d50/d18/l7b 0 2026-03-09T17:29:30.362 INFO:tasks.workunit.client.0.vm06.stdout:3/343: mknod dd/c69 0 2026-03-09T17:29:30.380 INFO:tasks.workunit.client.0.vm06.stdout:5/332: dwrite d4/d50/d18/f31 [4194304,4194304] 0 2026-03-09T17:29:30.390 INFO:tasks.workunit.client.0.vm06.stdout:2/308: dwrite d3/d4/d12/d2b/f32 [0,4194304] 0 2026-03-09T17:29:30.393 INFO:tasks.workunit.client.0.vm06.stdout:9/367: write d3/d26/d6c/f3a [987135,52271] 0 2026-03-09T17:29:30.396 INFO:tasks.workunit.client.0.vm06.stdout:6/283: dwrite d6/f4a [4194304,4194304] 0 2026-03-09T17:29:30.399 INFO:tasks.workunit.client.0.vm06.stdout:7/356: rename d5/d1f/d34/l3c to d5/l5c 0 2026-03-09T17:29:30.408 INFO:tasks.workunit.client.0.vm06.stdout:5/333: mknod d4/d52/d55/c7c 0 2026-03-09T17:29:30.415 INFO:tasks.workunit.client.0.vm06.stdout:2/309: unlink d3/d4/d12/d2b/d36/l4c 0 2026-03-09T17:29:30.417 INFO:tasks.workunit.client.0.vm06.stdout:3/344: creat dd/d5b/d65/f6a x:0 0 0 2026-03-09T17:29:30.418 INFO:tasks.workunit.client.0.vm06.stdout:6/284: chown d6/d12/d17/f32 0 1 2026-03-09T17:29:30.419 INFO:tasks.workunit.client.0.vm06.stdout:6/285: write d6/d4f/f26 [2825632,121946] 0 2026-03-09T17:29:30.420 INFO:tasks.workunit.client.0.vm06.stdout:7/357: creat d5/dd/f5d x:0 0 0 2026-03-09T17:29:30.421 
INFO:tasks.workunit.client.0.vm06.stdout:7/358: write d5/d7/d2b/f42 [995916,83945] 0 2026-03-09T17:29:30.422 INFO:tasks.workunit.client.0.vm06.stdout:7/359: write d5/d7/d2b/f52 [130654,67583] 0 2026-03-09T17:29:30.422 INFO:tasks.workunit.client.0.vm06.stdout:7/360: fdatasync f0 0 2026-03-09T17:29:30.423 INFO:tasks.workunit.client.0.vm06.stdout:7/361: chown d5/d7/d2b 268638403 1 2026-03-09T17:29:30.425 INFO:tasks.workunit.client.0.vm06.stdout:5/334: creat d4/d22/d64/f7d x:0 0 0 2026-03-09T17:29:30.425 INFO:tasks.workunit.client.0.vm06.stdout:5/335: fdatasync d4/fb 0 2026-03-09T17:29:30.425 INFO:tasks.workunit.client.0.vm06.stdout:5/336: chown d4/f11 0 1 2026-03-09T17:29:30.428 INFO:tasks.workunit.client.0.vm06.stdout:9/368: symlink d3/d6d/l70 0 2026-03-09T17:29:30.431 INFO:tasks.workunit.client.0.vm06.stdout:6/286: write d6/f56 [4462835,106669] 0 2026-03-09T17:29:30.440 INFO:tasks.workunit.client.0.vm06.stdout:2/310: mknod d3/c62 0 2026-03-09T17:29:30.442 INFO:tasks.workunit.client.0.vm06.stdout:9/369: creat d3/d11/d65/f71 x:0 0 0 2026-03-09T17:29:30.445 INFO:tasks.workunit.client.0.vm06.stdout:0/392: write d7/d11/d19/d1d/d39/f4a [4318048,113767] 0 2026-03-09T17:29:30.445 INFO:tasks.workunit.client.0.vm06.stdout:0/393: readlink d7/d11/d5d/l84 0 2026-03-09T17:29:30.447 INFO:tasks.workunit.client.0.vm06.stdout:3/345: symlink dd/l6b 0 2026-03-09T17:29:30.452 INFO:tasks.workunit.client.0.vm06.stdout:6/287: dwrite d6/d4f/f33 [0,4194304] 0 2026-03-09T17:29:30.452 INFO:tasks.workunit.client.0.vm06.stdout:6/288: chown d6/c41 1 1 2026-03-09T17:29:30.452 INFO:tasks.workunit.client.0.vm06.stdout:6/289: read - d6/d12/d2d/f5e zero size 2026-03-09T17:29:30.459 INFO:tasks.workunit.client.0.vm06.stdout:7/362: link d5/d1f/d34/d3f/f5b d5/d1f/d34/f5e 0 2026-03-09T17:29:30.461 INFO:tasks.workunit.client.0.vm06.stdout:5/337: unlink d4/c8 0 2026-03-09T17:29:30.464 INFO:tasks.workunit.client.0.vm06.stdout:5/338: dread d4/d50/d18/f5c [0,4194304] 0 2026-03-09T17:29:30.467 
INFO:tasks.workunit.client.0.vm06.stdout:0/394: rename d7/f27 to d7/d11/d19/d23/f8e 0 2026-03-09T17:29:30.471 INFO:tasks.workunit.client.0.vm06.stdout:0/395: chown d7/d11/d19/d37/l3d 2164546 1 2026-03-09T17:29:30.471 INFO:tasks.workunit.client.0.vm06.stdout:3/346: unlink dd/d19/d25/f53 0 2026-03-09T17:29:30.472 INFO:tasks.workunit.client.0.vm06.stdout:6/290: chown d6/d12/d17/l2a 1527 1 2026-03-09T17:29:30.475 INFO:tasks.workunit.client.0.vm06.stdout:7/363: mkdir d5/d12/d5f 0 2026-03-09T17:29:30.479 INFO:tasks.workunit.client.0.vm06.stdout:3/347: dread f4 [0,4194304] 0 2026-03-09T17:29:30.482 INFO:tasks.workunit.client.0.vm06.stdout:3/348: dwrite dd/f4a [0,4194304] 0 2026-03-09T17:29:30.484 INFO:tasks.workunit.client.0.vm06.stdout:3/349: read dd/f4a [65908,4545] 0 2026-03-09T17:29:30.484 INFO:tasks.workunit.client.0.vm06.stdout:3/350: chown dd/d19/d25/f4f 874578 1 2026-03-09T17:29:30.490 INFO:tasks.workunit.client.0.vm06.stdout:2/311: mknod d3/d4/d12/d2b/d2d/c63 0 2026-03-09T17:29:30.495 INFO:tasks.workunit.client.0.vm06.stdout:9/370: link d3/d15/f1a d3/d15/d16/f72 0 2026-03-09T17:29:30.496 INFO:tasks.workunit.client.0.vm06.stdout:9/371: read - d3/d15/f5e zero size 2026-03-09T17:29:30.496 INFO:tasks.workunit.client.0.vm06.stdout:9/372: fdatasync d3/f1b 0 2026-03-09T17:29:30.497 INFO:tasks.workunit.client.0.vm06.stdout:9/373: dread - d3/d15/d48/f64 zero size 2026-03-09T17:29:30.500 INFO:tasks.workunit.client.0.vm06.stdout:9/374: dwrite d3/d15/d36/d4d/f60 [0,4194304] 0 2026-03-09T17:29:30.513 INFO:tasks.workunit.client.0.vm06.stdout:0/396: fsync d7/d11/d19/d3c/f55 0 2026-03-09T17:29:30.518 INFO:tasks.workunit.client.0.vm06.stdout:5/339: mkdir d4/d7e 0 2026-03-09T17:29:30.526 INFO:tasks.workunit.client.0.vm06.stdout:9/375: symlink d3/d11/d65/l73 0 2026-03-09T17:29:30.528 INFO:tasks.workunit.client.0.vm06.stdout:0/397: chown d7/d11/d19/d1d/d59/d5b/l81 6 1 2026-03-09T17:29:30.532 INFO:tasks.workunit.client.0.vm06.stdout:5/340: symlink d4/d50/d18/d3d/l7f 0 
2026-03-09T17:29:30.535 INFO:tasks.workunit.client.0.vm06.stdout:5/341: dwrite d4/d50/d18/f73 [0,4194304] 0 2026-03-09T17:29:30.546 INFO:tasks.workunit.client.0.vm06.stdout:2/312: fsync d3/d4/d12/d2b/d2d/f1b 0 2026-03-09T17:29:30.546 INFO:tasks.workunit.client.0.vm06.stdout:9/376: chown d3/d2c/l3f 1066906 1 2026-03-09T17:29:30.547 INFO:tasks.workunit.client.0.vm06.stdout:2/313: chown d3/d4/d12/f20 2723941 1 2026-03-09T17:29:30.554 INFO:tasks.workunit.client.0.vm06.stdout:4/348: rmdir db/d1d 39 2026-03-09T17:29:30.561 INFO:tasks.workunit.client.0.vm06.stdout:7/364: link d5/dd/f29 d5/f60 0 2026-03-09T17:29:30.561 INFO:tasks.workunit.client.0.vm06.stdout:7/365: chown d5/dd 10 1 2026-03-09T17:29:30.561 INFO:tasks.workunit.client.0.vm06.stdout:2/314: rename d3/d4/d38/d60 to d3/d4/d38/d64 0 2026-03-09T17:29:30.561 INFO:tasks.workunit.client.0.vm06.stdout:7/366: dwrite d5/d7/d2b/f42 [0,4194304] 0 2026-03-09T17:29:30.561 INFO:tasks.workunit.client.0.vm06.stdout:2/315: write d3/d4/d12/d2b/d36/d37/f41 [702834,51307] 0 2026-03-09T17:29:30.564 INFO:tasks.workunit.client.0.vm06.stdout:2/316: dread d3/f10 [0,4194304] 0 2026-03-09T17:29:30.565 INFO:tasks.workunit.client.0.vm06.stdout:2/317: read - d3/f5a zero size 2026-03-09T17:29:30.568 INFO:tasks.workunit.client.0.vm06.stdout:1/385: dwrite d11/d14/d1c/d1f/f68 [0,4194304] 0 2026-03-09T17:29:30.580 INFO:tasks.workunit.client.0.vm06.stdout:0/398: rmdir d7/d11/d5a 0 2026-03-09T17:29:30.583 INFO:tasks.workunit.client.0.vm06.stdout:0/399: dwrite d7/f2a [4194304,4194304] 0 2026-03-09T17:29:30.591 INFO:tasks.workunit.client.0.vm06.stdout:7/367: mknod d5/d12/c61 0 2026-03-09T17:29:30.595 INFO:tasks.workunit.client.0.vm06.stdout:5/342: link d4/f26 d4/d50/f80 0 2026-03-09T17:29:30.596 INFO:tasks.workunit.client.0.vm06.stdout:5/343: write d4/f11 [1070298,103520] 0 2026-03-09T17:29:30.597 INFO:tasks.workunit.client.0.vm06.stdout:5/344: write d4/d50/d18/f3e [4651256,95537] 0 2026-03-09T17:29:30.598 
INFO:tasks.workunit.client.0.vm06.stdout:5/345: write d4/d22/d46/f58 [939445,4742] 0 2026-03-09T17:29:30.598 INFO:tasks.workunit.client.0.vm06.stdout:5/346: stat d4/d50/d18/l7b 0 2026-03-09T17:29:30.608 INFO:tasks.workunit.client.0.vm06.stdout:9/377: rename d3/fb to d3/d15/f74 0 2026-03-09T17:29:30.610 INFO:tasks.workunit.client.0.vm06.stdout:0/400: chown d7/d11/d2d/f3a 95113654 1 2026-03-09T17:29:30.611 INFO:tasks.workunit.client.0.vm06.stdout:4/349: symlink db/l82 0 2026-03-09T17:29:30.612 INFO:tasks.workunit.client.0.vm06.stdout:4/350: chown db/df/c3e 7634267 1 2026-03-09T17:29:30.615 INFO:tasks.workunit.client.0.vm06.stdout:4/351: dwrite db/d59/d5f/d45/f4a [0,4194304] 0 2026-03-09T17:29:30.627 INFO:tasks.workunit.client.0.vm06.stdout:7/368: creat d5/d7/f62 x:0 0 0 2026-03-09T17:29:30.627 INFO:tasks.workunit.client.0.vm06.stdout:1/386: mknod d11/d14/d1c/c83 0 2026-03-09T17:29:30.630 INFO:tasks.workunit.client.0.vm06.stdout:5/347: creat d4/d50/d18/d3d/f81 x:0 0 0 2026-03-09T17:29:30.638 INFO:tasks.workunit.client.0.vm06.stdout:7/369: mknod d5/d1f/d34/c63 0 2026-03-09T17:29:30.641 INFO:tasks.workunit.client.0.vm06.stdout:7/370: dread d5/d7/d2b/f42 [0,4194304] 0 2026-03-09T17:29:30.641 INFO:tasks.workunit.client.0.vm06.stdout:7/371: chown d5/d12/f32 198942 1 2026-03-09T17:29:30.648 INFO:tasks.workunit.client.0.vm06.stdout:7/372: dwrite d5/f8 [0,4194304] 0 2026-03-09T17:29:30.659 INFO:tasks.workunit.client.0.vm06.stdout:5/348: dwrite d4/d22/d64/f70 [4194304,4194304] 0 2026-03-09T17:29:30.665 INFO:tasks.workunit.client.0.vm06.stdout:0/401: mknod d7/d11/d19/d8b/c8f 0 2026-03-09T17:29:30.666 INFO:tasks.workunit.client.0.vm06.stdout:5/349: dwrite d4/d50/d18/f3e [4194304,4194304] 0 2026-03-09T17:29:30.671 INFO:tasks.workunit.client.0.vm06.stdout:4/352: link db/df/f14 db/df/f83 0 2026-03-09T17:29:30.672 INFO:tasks.workunit.client.0.vm06.stdout:2/318: link d3/cd d3/d4/d12/d2b/d36/d37/c65 0 2026-03-09T17:29:30.682 INFO:tasks.workunit.client.0.vm06.stdout:7/373: mkdir 
d5/d12/d64 0 2026-03-09T17:29:30.686 INFO:tasks.workunit.client.0.vm06.stdout:5/350: creat d4/d22/d46/f82 x:0 0 0 2026-03-09T17:29:30.691 INFO:tasks.workunit.client.0.vm06.stdout:5/351: dwrite d4/d22/d46/f6e [0,4194304] 0 2026-03-09T17:29:30.708 INFO:tasks.workunit.client.0.vm06.stdout:4/353: unlink db/f23 0 2026-03-09T17:29:30.717 INFO:tasks.workunit.client.0.vm06.stdout:5/352: symlink d4/d50/d18/l83 0 2026-03-09T17:29:30.718 INFO:tasks.workunit.client.0.vm06.stdout:4/354: stat db/c1c 0 2026-03-09T17:29:30.722 INFO:tasks.workunit.client.0.vm06.stdout:0/402: dread d7/d11/f20 [0,4194304] 0 2026-03-09T17:29:30.723 INFO:tasks.workunit.client.0.vm06.stdout:0/403: truncate d7/f50 908387 0 2026-03-09T17:29:30.726 INFO:tasks.workunit.client.0.vm06.stdout:7/374: getdents d5/d1f/d34/d46/d51 0 2026-03-09T17:29:30.727 INFO:tasks.workunit.client.0.vm06.stdout:4/355: mkdir db/d59/d5f/d45/d84 0 2026-03-09T17:29:30.729 INFO:tasks.workunit.client.0.vm06.stdout:0/404: symlink d7/d11/d19/d23/l90 0 2026-03-09T17:29:30.730 INFO:tasks.workunit.client.0.vm06.stdout:0/405: dread - d7/d11/d19/d23/f60 zero size 2026-03-09T17:29:30.731 INFO:tasks.workunit.client.0.vm06.stdout:2/319: creat d3/d4/d12/f66 x:0 0 0 2026-03-09T17:29:30.741 INFO:tasks.workunit.client.0.vm06.stdout:2/320: creat d3/d4/d22/f67 x:0 0 0 2026-03-09T17:29:30.741 INFO:tasks.workunit.client.0.vm06.stdout:2/321: dread - d3/d4/d22/d43/f5f zero size 2026-03-09T17:29:30.742 INFO:tasks.workunit.client.0.vm06.stdout:2/322: dread - d3/d4/d22/f67 zero size 2026-03-09T17:29:30.745 INFO:tasks.workunit.client.0.vm06.stdout:0/406: dwrite d7/f56 [0,4194304] 0 2026-03-09T17:29:30.746 INFO:tasks.workunit.client.0.vm06.stdout:0/407: write d7/d11/d19/f57 [1311019,10919] 0 2026-03-09T17:29:30.747 INFO:tasks.workunit.client.0.vm06.stdout:0/408: readlink d7/d11/d19/d1d/d39/l7b 0 2026-03-09T17:29:30.758 INFO:tasks.workunit.client.0.vm06.stdout:0/409: symlink d7/d11/d2d/l91 0 2026-03-09T17:29:30.777 
INFO:tasks.workunit.client.0.vm06.stdout:2/323: mknod d3/c68 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/324: unlink d3/d4/d12/d34/l5d 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/325: stat d3/d4/d38 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/326: write d3/d4/d22/f67 [174312,118234] 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/327: symlink d3/d4/d12/d2b/d36/d37/l69 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/328: dread - d3/d4/d38/f58 zero size 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/329: truncate d3/d4/d22/f4b 502666 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/330: dwrite d3/d4/d12/f66 [0,4194304] 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/331: mkdir d3/d4/d38/d64/d6a 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/332: mknod d3/d4/d38/d64/d6a/c6b 0 2026-03-09T17:29:30.777 INFO:tasks.workunit.client.0.vm06.stdout:2/333: write d3/d4/d22/f28 [324984,79822] 0 2026-03-09T17:29:30.781 INFO:tasks.workunit.client.0.vm06.stdout:4/356: sync 2026-03-09T17:29:30.792 INFO:tasks.workunit.client.0.vm06.stdout:8/336: write d15/d16/d19/d2b/f63 [2113002,123521] 0 2026-03-09T17:29:30.798 INFO:tasks.workunit.client.0.vm06.stdout:8/337: rmdir d15/d31 39 2026-03-09T17:29:30.799 INFO:tasks.workunit.client.0.vm06.stdout:8/338: write d15/d39/f45 [2639421,122775] 0 2026-03-09T17:29:30.806 INFO:tasks.workunit.client.0.vm06.stdout:4/357: mkdir db/d1d/d21/d25/d4b/d85 0 2026-03-09T17:29:30.817 INFO:tasks.workunit.client.0.vm06.stdout:8/339: mknod d15/d39/d3c/c72 0 2026-03-09T17:29:30.829 INFO:tasks.workunit.client.0.vm06.stdout:8/340: truncate f12 4051324 0 2026-03-09T17:29:30.832 INFO:tasks.workunit.client.0.vm06.stdout:4/358: mknod db/d1d/d21/d26/d7a/c86 0 2026-03-09T17:29:30.834 INFO:tasks.workunit.client.0.vm06.stdout:4/359: chown db/f13 136 1 2026-03-09T17:29:30.848 
INFO:tasks.workunit.client.0.vm06.stdout:4/360: symlink db/d1d/d21/d37/d69/d78/l87 0 2026-03-09T17:29:30.860 INFO:tasks.workunit.client.0.vm06.stdout:8/341: link c3 d15/d16/d19/d3d/d5f/c73 0 2026-03-09T17:29:30.861 INFO:tasks.workunit.client.0.vm06.stdout:8/342: readlink d15/d16/d1a/l20 0 2026-03-09T17:29:30.864 INFO:tasks.workunit.client.0.vm06.stdout:4/361: mkdir db/d1d/d21/d88 0 2026-03-09T17:29:30.874 INFO:tasks.workunit.client.0.vm06.stdout:4/362: mkdir db/d1d/d21/d26/d89 0 2026-03-09T17:29:30.875 INFO:tasks.workunit.client.0.vm06.stdout:4/363: write db/f68 [287015,98185] 0 2026-03-09T17:29:30.875 INFO:tasks.workunit.client.0.vm06.stdout:4/364: stat db/d1d/d21/d44/l5a 0 2026-03-09T17:29:30.883 INFO:tasks.workunit.client.0.vm06.stdout:4/365: getdents db/d59/d5f/d6d 0 2026-03-09T17:29:30.890 INFO:tasks.workunit.client.0.vm06.stdout:4/366: mkdir db/d1d/d21/d44/d8a 0 2026-03-09T17:29:30.895 INFO:tasks.workunit.client.0.vm06.stdout:4/367: fdatasync db/df/f2a 0 2026-03-09T17:29:30.900 INFO:tasks.workunit.client.0.vm06.stdout:0/410: dwrite d7/d11/d19/d23/f8e [4194304,4194304] 0 2026-03-09T17:29:30.903 INFO:tasks.workunit.client.0.vm06.stdout:4/368: dwrite db/f13 [0,4194304] 0 2026-03-09T17:29:30.923 INFO:tasks.workunit.client.0.vm06.stdout:4/369: creat db/d1d/d21/d37/d69/f8b x:0 0 0 2026-03-09T17:29:30.935 INFO:tasks.workunit.client.0.vm06.stdout:0/411: creat d7/d11/d19/d1d/d87/f92 x:0 0 0 2026-03-09T17:29:30.935 INFO:tasks.workunit.client.0.vm06.stdout:0/412: stat d7/d11/d19/d1d/d80 0 2026-03-09T17:29:30.937 INFO:tasks.workunit.client.0.vm06.stdout:6/291: truncate d6/d4f/f44 3654408 0 2026-03-09T17:29:30.939 INFO:tasks.workunit.client.0.vm06.stdout:4/370: symlink db/d1d/d21/d37/l8c 0 2026-03-09T17:29:30.940 INFO:tasks.workunit.client.0.vm06.stdout:0/413: creat d7/d11/d5d/f93 x:0 0 0 2026-03-09T17:29:30.942 INFO:tasks.workunit.client.0.vm06.stdout:6/292: unlink d6/d12/d17/d27/l36 0 2026-03-09T17:29:30.944 INFO:tasks.workunit.client.0.vm06.stdout:4/371: write db/f17 
[12715689,7030] 0 2026-03-09T17:29:30.945 INFO:tasks.workunit.client.0.vm06.stdout:4/372: truncate db/d1d/d21/d26/f70 700500 0 2026-03-09T17:29:30.948 INFO:tasks.workunit.client.0.vm06.stdout:0/414: rename d7/d11/d5d/l84 to d7/d11/d5d/d64/l94 0 2026-03-09T17:29:30.949 INFO:tasks.workunit.client.0.vm06.stdout:0/415: read d7/f36 [2101869,75471] 0 2026-03-09T17:29:30.950 INFO:tasks.workunit.client.0.vm06.stdout:6/293: truncate d6/d4f/f3c 843141 0 2026-03-09T17:29:30.951 INFO:tasks.workunit.client.0.vm06.stdout:4/373: truncate db/f39 194828 0 2026-03-09T17:29:30.954 INFO:tasks.workunit.client.0.vm06.stdout:6/294: rmdir d6/d4f/d3e/d52 39 2026-03-09T17:29:30.956 INFO:tasks.workunit.client.0.vm06.stdout:3/351: truncate dd/d19/d2c/f37 1652784 0 2026-03-09T17:29:30.957 INFO:tasks.workunit.client.0.vm06.stdout:3/352: write dd/d19/d25/f4f [194668,17182] 0 2026-03-09T17:29:30.962 INFO:tasks.workunit.client.0.vm06.stdout:4/374: rmdir db/d1d/d21/d26/d7a 39 2026-03-09T17:29:30.964 INFO:tasks.workunit.client.0.vm06.stdout:6/295: symlink d6/d12/d17/d27/d40/l5f 0 2026-03-09T17:29:30.965 INFO:tasks.workunit.client.0.vm06.stdout:3/353: symlink dd/d19/d25/d2d/l6c 0 2026-03-09T17:29:30.965 INFO:tasks.workunit.client.0.vm06.stdout:3/354: stat dd/d19/d1e 0 2026-03-09T17:29:30.968 INFO:tasks.workunit.client.0.vm06.stdout:4/375: mknod db/d1d/c8d 0 2026-03-09T17:29:30.970 INFO:tasks.workunit.client.0.vm06.stdout:4/376: creat db/d59/d5f/d45/f8e x:0 0 0 2026-03-09T17:29:30.970 INFO:tasks.workunit.client.0.vm06.stdout:4/377: fsync db/f55 0 2026-03-09T17:29:30.978 INFO:tasks.workunit.client.0.vm06.stdout:4/378: mkdir db/d1d/d21/d37/d69/d78/d8f 0 2026-03-09T17:29:30.981 INFO:tasks.workunit.client.0.vm06.stdout:4/379: mkdir db/d59/d90 0 2026-03-09T17:29:30.981 INFO:tasks.workunit.client.0.vm06.stdout:4/380: chown f7 0 1 2026-03-09T17:29:30.983 INFO:tasks.workunit.client.0.vm06.stdout:4/381: mkdir db/d59/d5f/d45/d84/d91 0 2026-03-09T17:29:30.986 INFO:tasks.workunit.client.0.vm06.stdout:4/382: mknod 
db/d59/d5f/d5d/c92 0 2026-03-09T17:29:30.989 INFO:tasks.workunit.client.0.vm06.stdout:4/383: mknod db/d1d/d21/d37/d69/d78/c93 0 2026-03-09T17:29:30.990 INFO:tasks.workunit.client.0.vm06.stdout:3/355: sync 2026-03-09T17:29:30.992 INFO:tasks.workunit.client.0.vm06.stdout:4/384: dwrite db/d1d/d21/d37/f71 [0,4194304] 0 2026-03-09T17:29:31.002 INFO:tasks.workunit.client.0.vm06.stdout:4/385: readlink db/l2c 0 2026-03-09T17:29:31.003 INFO:tasks.workunit.client.0.vm06.stdout:4/386: chown db/df/f2d 42659 1 2026-03-09T17:29:31.050 INFO:tasks.workunit.client.0.vm06.stdout:5/353: getdents d4/d50 0 2026-03-09T17:29:31.051 INFO:tasks.workunit.client.0.vm06.stdout:5/354: creat d4/d52/d55/f84 x:0 0 0 2026-03-09T17:29:31.052 INFO:tasks.workunit.client.0.vm06.stdout:5/355: fsync d4/d22/d46/f6e 0 2026-03-09T17:29:31.068 INFO:tasks.workunit.client.0.vm06.stdout:9/378: write d3/d15/d16/f31 [1534965,93206] 0 2026-03-09T17:29:31.068 INFO:tasks.workunit.client.0.vm06.stdout:9/379: dread - d3/d11/d65/f66 zero size 2026-03-09T17:29:31.080 INFO:tasks.workunit.client.0.vm06.stdout:9/380: sync 2026-03-09T17:29:31.082 INFO:tasks.workunit.client.0.vm06.stdout:9/381: sync 2026-03-09T17:29:31.084 INFO:tasks.workunit.client.0.vm06.stdout:9/382: stat d3/d15/d16/c19 0 2026-03-09T17:29:31.086 INFO:tasks.workunit.client.0.vm06.stdout:1/387: truncate d11/d14/d1d/d1e/d2a/d34/f3b 4148349 0 2026-03-09T17:29:31.093 INFO:tasks.workunit.client.0.vm06.stdout:9/383: dread d3/d15/f74 [0,4194304] 0 2026-03-09T17:29:31.094 INFO:tasks.workunit.client.0.vm06.stdout:1/388: creat d11/d14/d1c/d1f/d57/d7b/f84 x:0 0 0 2026-03-09T17:29:31.110 INFO:tasks.workunit.client.0.vm06.stdout:7/375: write d5/dd/f1a [186477,36599] 0 2026-03-09T17:29:31.117 INFO:tasks.workunit.client.0.vm06.stdout:9/384: creat d3/d26/d6c/f75 x:0 0 0 2026-03-09T17:29:31.120 INFO:tasks.workunit.client.0.vm06.stdout:9/385: dwrite d3/d15/d36/d4c/f5a [4194304,4194304] 0 2026-03-09T17:29:31.134 INFO:tasks.workunit.client.0.vm06.stdout:7/376: fsync 
d5/d1f/d34/f5e 0 2026-03-09T17:29:31.143 INFO:tasks.workunit.client.0.vm06.stdout:2/334: dwrite d3/d4/d22/d43/f53 [0,4194304] 0 2026-03-09T17:29:31.145 INFO:tasks.workunit.client.0.vm06.stdout:2/335: chown d3/d4/d22/d43 49885232 1 2026-03-09T17:29:31.147 INFO:tasks.workunit.client.0.vm06.stdout:1/389: mknod d11/d14/c85 0 2026-03-09T17:29:31.148 INFO:tasks.workunit.client.0.vm06.stdout:1/390: dread - d11/d14/d1d/f7c zero size 2026-03-09T17:29:31.161 INFO:tasks.workunit.client.0.vm06.stdout:2/336: dread d3/d4/d12/f20 [0,4194304] 0 2026-03-09T17:29:31.161 INFO:tasks.workunit.client.0.vm06.stdout:2/337: read - d3/f5a zero size 2026-03-09T17:29:31.164 INFO:tasks.workunit.client.0.vm06.stdout:7/377: symlink d5/d12/d64/l65 0 2026-03-09T17:29:31.164 INFO:tasks.workunit.client.0.vm06.stdout:7/378: write d5/d7/d2b/f59 [718994,34196] 0 2026-03-09T17:29:31.167 INFO:tasks.workunit.client.0.vm06.stdout:2/338: stat d3/d4/cf 0 2026-03-09T17:29:31.167 INFO:tasks.workunit.client.0.vm06.stdout:2/339: readlink l1 0 2026-03-09T17:29:31.168 INFO:tasks.workunit.client.0.vm06.stdout:2/340: write d3/d4/f3c [2787459,22778] 0 2026-03-09T17:29:31.195 INFO:tasks.workunit.client.0.vm06.stdout:7/379: dread d5/f18 [0,4194304] 0 2026-03-09T17:29:31.221 INFO:tasks.workunit.client.0.vm06.stdout:9/386: getdents d3/d15/d36 0 2026-03-09T17:29:31.238 INFO:tasks.workunit.client.0.vm06.stdout:8/343: truncate d15/d16/d1a/f42 3692237 0 2026-03-09T17:29:31.239 INFO:tasks.workunit.client.0.vm06.stdout:8/344: write d15/d16/d1e/f64 [3312331,54571] 0 2026-03-09T17:29:31.261 INFO:tasks.workunit.client.0.vm06.stdout:7/380: creat d5/d12/d5f/f66 x:0 0 0 2026-03-09T17:29:31.261 INFO:tasks.workunit.client.0.vm06.stdout:7/381: readlink d5/d7/l2e 0 2026-03-09T17:29:31.261 INFO:tasks.workunit.client.0.vm06.stdout:7/382: readlink d5/d7/l25 0 2026-03-09T17:29:31.262 INFO:tasks.workunit.client.0.vm06.stdout:7/383: chown d5/d7/c24 18 1 2026-03-09T17:29:31.262 INFO:tasks.workunit.client.0.vm06.stdout:7/384: write d5/dd/f48 
[16681,112053] 0 2026-03-09T17:29:31.271 INFO:tasks.workunit.client.0.vm06.stdout:2/341: link d3/d4/d12/f42 d3/d44/f6c 0 2026-03-09T17:29:31.282 INFO:tasks.workunit.client.0.vm06.stdout:8/345: dread d15/d16/f3f [0,4194304] 0 2026-03-09T17:29:31.283 INFO:tasks.workunit.client.0.vm06.stdout:8/346: dread - d15/d16/f66 zero size 2026-03-09T17:29:31.297 INFO:tasks.workunit.client.0.vm06.stdout:6/296: dread d6/d4f/f44 [0,4194304] 0 2026-03-09T17:29:31.297 INFO:tasks.workunit.client.0.vm06.stdout:6/297: fdatasync d6/d47/d4d/f55 0 2026-03-09T17:29:31.303 INFO:tasks.workunit.client.0.vm06.stdout:2/342: rename d3/d4/d22/d43/f53 to d3/d4/d38/d64/d6a/f6d 0 2026-03-09T17:29:31.304 INFO:tasks.workunit.client.0.vm06.stdout:7/385: mknod d5/d1f/d34/d46/d51/c67 0 2026-03-09T17:29:31.319 INFO:tasks.workunit.client.0.vm06.stdout:6/298: write d6/fb [4047902,123184] 0 2026-03-09T17:29:31.320 INFO:tasks.workunit.client.0.vm06.stdout:6/299: fdatasync d6/d12/d17/d27/f37 0 2026-03-09T17:29:31.320 INFO:tasks.workunit.client.0.vm06.stdout:2/343: chown d3/l5 11024 1 2026-03-09T17:29:31.321 INFO:tasks.workunit.client.0.vm06.stdout:0/416: truncate d7/d11/d19/d23/f49 4149048 0 2026-03-09T17:29:31.334 INFO:tasks.workunit.client.0.vm06.stdout:6/300: dread - d6/d47/f49 zero size 2026-03-09T17:29:32.153 INFO:tasks.workunit.client.0.vm06.stdout:0/417: mknod d7/d11/d19/d1d/d59/c95 0 2026-03-09T17:29:32.158 INFO:tasks.workunit.client.0.vm06.stdout:0/418: mkdir d7/d11/d5d/d64/d96 0 2026-03-09T17:29:32.160 INFO:tasks.workunit.client.0.vm06.stdout:1/391: creat d11/d14/d1c/d1f/f86 x:0 0 0 2026-03-09T17:29:32.177 INFO:tasks.workunit.client.0.vm06.stdout:4/387: write db/df/f30 [74961,65544] 0 2026-03-09T17:29:32.177 INFO:tasks.workunit.client.0.vm06.stdout:3/356: dwrite dd/d19/d25/f56 [4194304,4194304] 0 2026-03-09T17:29:32.192 INFO:tasks.workunit.client.0.vm06.stdout:4/388: link db/d1d/d21/d37/d69/c7c db/d1d/d21/d37/c94 0 2026-03-09T17:29:32.193 INFO:tasks.workunit.client.0.vm06.stdout:4/389: write fa 
[265945,95325] 0 2026-03-09T17:29:32.199 INFO:tasks.workunit.client.0.vm06.stdout:4/390: symlink db/d1d/d21/d88/l95 0 2026-03-09T17:29:32.200 INFO:tasks.workunit.client.0.vm06.stdout:4/391: chown db/d1d/d21/d25/f53 633981 1 2026-03-09T17:29:32.201 INFO:tasks.workunit.client.0.vm06.stdout:4/392: creat db/d59/d5f/d6d/f96 x:0 0 0 2026-03-09T17:29:32.211 INFO:tasks.workunit.client.0.vm06.stdout:7/386: rename d5/d1f/l49 to d5/l68 0 2026-03-09T17:29:32.218 INFO:tasks.workunit.client.0.vm06.stdout:7/387: unlink d5/f60 0 2026-03-09T17:29:32.234 INFO:tasks.workunit.client.0.vm06.stdout:4/393: dread db/d1d/d21/d26/f70 [0,4194304] 0 2026-03-09T17:29:32.234 INFO:tasks.workunit.client.0.vm06.stdout:4/394: fsync db/fc 0 2026-03-09T17:29:32.247 INFO:tasks.workunit.client.0.vm06.stdout:4/395: rmdir db/d59/d5f/d45/d84 39 2026-03-09T17:29:32.285 INFO:tasks.workunit.client.0.vm06.stdout:1/392: read d11/d14/f59 [25957,98406] 0 2026-03-09T17:29:32.291 INFO:tasks.workunit.client.0.vm06.stdout:1/393: unlink d11/d14/d1d/d1e/d2a/d34/c80 0 2026-03-09T17:29:32.293 INFO:tasks.workunit.client.0.vm06.stdout:1/394: mknod d11/d14/d1c/d3a/c87 0 2026-03-09T17:29:32.364 INFO:tasks.workunit.client.0.vm06.stdout:1/395: sync 2026-03-09T17:29:32.367 INFO:tasks.workunit.client.0.vm06.stdout:1/396: read d11/d14/d1d/f31 [682641,88182] 0 2026-03-09T17:29:32.368 INFO:tasks.workunit.client.0.vm06.stdout:8/347: truncate d15/d16/d19/d71/f65 1309774 0 2026-03-09T17:29:32.371 INFO:tasks.workunit.client.0.vm06.stdout:1/397: unlink d11/d14/d1c/d1f/l6f 0 2026-03-09T17:29:32.372 INFO:tasks.workunit.client.0.vm06.stdout:1/398: truncate d11/d14/d1c/d1f/d57/d7b/f84 373626 0 2026-03-09T17:29:32.375 INFO:tasks.workunit.client.0.vm06.stdout:8/348: creat d15/d16/d6d/f74 x:0 0 0 2026-03-09T17:29:32.376 INFO:tasks.workunit.client.0.vm06.stdout:1/399: symlink d11/d14/d1d/d42/d46/l88 0 2026-03-09T17:29:32.378 INFO:tasks.workunit.client.0.vm06.stdout:8/349: symlink d15/d16/d19/d71/l75 0 2026-03-09T17:29:32.384 
INFO:tasks.workunit.client.0.vm06.stdout:1/400: dwrite d11/d14/d1d/f31 [0,4194304] 0 2026-03-09T17:29:32.397 INFO:tasks.workunit.client.0.vm06.stdout:1/401: dwrite d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:29:32.459 INFO:tasks.workunit.client.0.vm06.stdout:1/402: unlink d11/d14/d1d/d1e/c35 0 2026-03-09T17:29:32.461 INFO:tasks.workunit.client.0.vm06.stdout:1/403: mknod d11/d14/d1c/d5f/c89 0 2026-03-09T17:29:32.473 INFO:tasks.workunit.client.0.vm06.stdout:9/387: creat d3/d26/f76 x:0 0 0 2026-03-09T17:29:32.474 INFO:tasks.workunit.client.0.vm06.stdout:9/388: mknod d3/d15/d36/d4d/c77 0 2026-03-09T17:29:32.475 INFO:tasks.workunit.client.0.vm06.stdout:9/389: creat d3/d6d/f78 x:0 0 0 2026-03-09T17:29:32.487 INFO:tasks.workunit.client.0.vm06.stdout:1/404: unlink d11/d14/d1d/d1e/d2a/f74 0 2026-03-09T17:29:32.488 INFO:tasks.workunit.client.0.vm06.stdout:2/344: rename d3/d4/d12/d2b/d2d/c18 to d3/d4/d12/c6e 0 2026-03-09T17:29:32.493 INFO:tasks.workunit.client.0.vm06.stdout:3/357: rename dd/d19/d2c/f37 to dd/d19/d25/d2d/f6d 0 2026-03-09T17:29:32.493 INFO:tasks.workunit.client.0.vm06.stdout:3/358: chown dd/d19/d25/d48 50580024 1 2026-03-09T17:29:32.494 INFO:tasks.workunit.client.0.vm06.stdout:3/359: write dd/f22 [235965,2197] 0 2026-03-09T17:29:32.494 INFO:tasks.workunit.client.0.vm06.stdout:2/345: fdatasync d3/d4/d12/f35 0 2026-03-09T17:29:32.496 INFO:tasks.workunit.client.0.vm06.stdout:3/360: mkdir dd/d1d/d6e 0 2026-03-09T17:29:32.497 INFO:tasks.workunit.client.0.vm06.stdout:3/361: fdatasync dd/d19/d1e/f3f 0 2026-03-09T17:29:32.498 INFO:tasks.workunit.client.0.vm06.stdout:7/388: rename d5/l5c to d5/d12/d64/l69 0 2026-03-09T17:29:32.506 INFO:tasks.workunit.client.0.vm06.stdout:3/362: creat dd/d19/d28/f6f x:0 0 0 2026-03-09T17:29:32.509 INFO:tasks.workunit.client.0.vm06.stdout:3/363: dread dd/f38 [0,4194304] 0 2026-03-09T17:29:32.513 INFO:tasks.workunit.client.0.vm06.stdout:7/389: symlink d5/d12/l6a 0 2026-03-09T17:29:32.515 INFO:tasks.workunit.client.0.vm06.stdout:2/346: 
creat d3/d4/d12/d2b/d2d/f6f x:0 0 0 2026-03-09T17:29:32.517 INFO:tasks.workunit.client.0.vm06.stdout:3/364: dread - dd/d19/d1e/f41 zero size 2026-03-09T17:29:32.517 INFO:tasks.workunit.client.0.vm06.stdout:7/390: dwrite d5/f8 [0,4194304] 0 2026-03-09T17:29:32.524 INFO:tasks.workunit.client.0.vm06.stdout:3/365: dwrite dd/d1d/f34 [0,4194304] 0 2026-03-09T17:29:32.533 INFO:tasks.workunit.client.0.vm06.stdout:4/396: write db/d1d/f5b [2152089,19993] 0 2026-03-09T17:29:32.534 INFO:tasks.workunit.client.0.vm06.stdout:4/397: truncate db/d1d/d21/d37/d69/f8b 187908 0 2026-03-09T17:29:32.537 INFO:tasks.workunit.client.0.vm06.stdout:7/391: truncate d5/f18 4940425 0 2026-03-09T17:29:32.542 INFO:tasks.workunit.client.0.vm06.stdout:4/398: creat db/df/f97 x:0 0 0 2026-03-09T17:29:32.544 INFO:tasks.workunit.client.0.vm06.stdout:7/392: mkdir d5/d12/d64/d6b 0 2026-03-09T17:29:32.546 INFO:tasks.workunit.client.0.vm06.stdout:2/347: rename d3/f10 to d3/d4/f70 0 2026-03-09T17:29:32.552 INFO:tasks.workunit.client.0.vm06.stdout:2/348: dread - d3/d44/f6c zero size 2026-03-09T17:29:32.560 INFO:tasks.workunit.client.0.vm06.stdout:2/349: dread d3/d4/d12/f20 [0,4194304] 0 2026-03-09T17:29:32.561 INFO:tasks.workunit.client.0.vm06.stdout:2/350: mkdir d3/d4/d12/d71 0 2026-03-09T17:29:32.561 INFO:tasks.workunit.client.0.vm06.stdout:6/301: mknod d6/d12/d17/c60 0 2026-03-09T17:29:32.561 INFO:tasks.workunit.client.0.vm06.stdout:6/302: fsync d6/d12/d2d/f5e 0 2026-03-09T17:29:32.564 INFO:tasks.workunit.client.0.vm06.stdout:6/303: dwrite d6/d12/d53/f5b [0,4194304] 0 2026-03-09T17:29:32.570 INFO:tasks.workunit.client.0.vm06.stdout:6/304: creat d6/d47/f61 x:0 0 0 2026-03-09T17:29:32.573 INFO:tasks.workunit.client.0.vm06.stdout:6/305: chown d6/d12/d2d/c5d 1558799 1 2026-03-09T17:29:32.573 INFO:tasks.workunit.client.0.vm06.stdout:6/306: readlink d6/l1e 0 2026-03-09T17:29:32.573 INFO:tasks.workunit.client.0.vm06.stdout:6/307: write d6/d12/d17/d27/f37 [6852557,77984] 0 2026-03-09T17:29:32.576 
INFO:tasks.workunit.client.0.vm06.stdout:6/308: dwrite d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:32.579 INFO:tasks.workunit.client.0.vm06.stdout:2/351: getdents d3/d4/d12/d2b/d36/d37 0 2026-03-09T17:29:32.580 INFO:tasks.workunit.client.0.vm06.stdout:6/309: fsync d6/d12/d17/d27/f3d 0 2026-03-09T17:29:32.582 INFO:tasks.workunit.client.0.vm06.stdout:2/352: rename d3/d4/d12/d34 to d3/d4/d22/d72 0 2026-03-09T17:29:32.585 INFO:tasks.workunit.client.0.vm06.stdout:2/353: chown d3/d4/d12/d2b/d2d/l51 34573 1 2026-03-09T17:29:32.597 INFO:tasks.workunit.client.0.vm06.stdout:6/310: unlink d6/d12/d17/l28 0 2026-03-09T17:29:32.602 INFO:tasks.workunit.client.0.vm06.stdout:6/311: creat d6/d4f/d3e/f62 x:0 0 0 2026-03-09T17:29:32.604 INFO:tasks.workunit.client.0.vm06.stdout:6/312: mknod d6/d4f/d3e/d52/c63 0 2026-03-09T17:29:32.617 INFO:tasks.workunit.client.0.vm06.stdout:0/419: unlink d7/d11/d19/f21 0 2026-03-09T17:29:32.617 INFO:tasks.workunit.client.0.vm06.stdout:0/420: chown d7/d11/d19/l1a 10 1 2026-03-09T17:29:32.621 INFO:tasks.workunit.client.0.vm06.stdout:0/421: truncate d7/d11/d2d/f3a 1497818 0 2026-03-09T17:29:32.622 INFO:tasks.workunit.client.0.vm06.stdout:0/422: truncate d7/d11/d5d/d64/f6a 851544 0 2026-03-09T17:29:32.625 INFO:tasks.workunit.client.0.vm06.stdout:0/423: creat d7/d11/d19/d23/f97 x:0 0 0 2026-03-09T17:29:32.625 INFO:tasks.workunit.client.0.vm06.stdout:0/424: readlink d7/d11/d19/d1d/d59/d5b/l81 0 2026-03-09T17:29:32.630 INFO:tasks.workunit.client.0.vm06.stdout:0/425: symlink d7/d11/d2d/l98 0 2026-03-09T17:29:32.634 INFO:tasks.workunit.client.0.vm06.stdout:9/390: write d3/d15/d16/f72 [3932844,94394] 0 2026-03-09T17:29:32.645 INFO:tasks.workunit.client.0.vm06.stdout:5/356: write d4/d50/f14 [2281804,14769] 0 2026-03-09T17:29:32.647 INFO:tasks.workunit.client.0.vm06.stdout:1/405: dwrite d11/d14/d1c/d1f/f21 [0,4194304] 0 2026-03-09T17:29:32.661 INFO:tasks.workunit.client.0.vm06.stdout:5/357: dwrite d4/d50/f61 [4194304,4194304] 0 2026-03-09T17:29:32.677 
INFO:tasks.workunit.client.0.vm06.stdout:1/406: creat d11/d14/d1d/d1e/d2a/d34/d64/f8a x:0 0 0 2026-03-09T17:29:32.679 INFO:tasks.workunit.client.0.vm06.stdout:1/407: creat d11/d14/d1d/f8b x:0 0 0 2026-03-09T17:29:32.680 INFO:tasks.workunit.client.0.vm06.stdout:1/408: write d11/d14/d1c/d1f/f7f [632200,36032] 0 2026-03-09T17:29:32.684 INFO:tasks.workunit.client.0.vm06.stdout:5/358: getdents d4/d50/d35/d75 0 2026-03-09T17:29:32.685 INFO:tasks.workunit.client.0.vm06.stdout:7/393: rmdir d5 39 2026-03-09T17:29:32.687 INFO:tasks.workunit.client.0.vm06.stdout:3/366: dwrite dd/d19/d1e/f41 [0,4194304] 0 2026-03-09T17:29:32.692 INFO:tasks.workunit.client.0.vm06.stdout:3/367: fdatasync dd/f1a 0 2026-03-09T17:29:32.697 INFO:tasks.workunit.client.0.vm06.stdout:4/399: truncate db/d1d/d21/d37/f71 536001 0 2026-03-09T17:29:32.705 INFO:tasks.workunit.client.0.vm06.stdout:5/359: rename c2 to d4/d50/d18/d3d/c85 0 2026-03-09T17:29:32.708 INFO:tasks.workunit.client.0.vm06.stdout:3/368: dread dd/f10 [0,4194304] 0 2026-03-09T17:29:32.709 INFO:tasks.workunit.client.0.vm06.stdout:3/369: stat dd/d19/d2c 0 2026-03-09T17:29:32.709 INFO:tasks.workunit.client.0.vm06.stdout:4/400: unlink db/d59/d5f/d45/l77 0 2026-03-09T17:29:32.713 INFO:tasks.workunit.client.0.vm06.stdout:7/394: creat d5/d12/f6c x:0 0 0 2026-03-09T17:29:32.713 INFO:tasks.workunit.client.0.vm06.stdout:7/395: stat d5/d7/l25 0 2026-03-09T17:29:32.713 INFO:tasks.workunit.client.0.vm06.stdout:7/396: stat d5/d7/f58 0 2026-03-09T17:29:32.714 INFO:tasks.workunit.client.0.vm06.stdout:7/397: chown d5/d12/d5f/f66 10 1 2026-03-09T17:29:32.719 INFO:tasks.workunit.client.0.vm06.stdout:7/398: dwrite d5/dd/f5d [0,4194304] 0 2026-03-09T17:29:32.720 INFO:tasks.workunit.client.0.vm06.stdout:2/354: dwrite d3/f3b [0,4194304] 0 2026-03-09T17:29:32.720 INFO:tasks.workunit.client.0.vm06.stdout:7/399: chown d5/d1f/d34/c63 6341426 1 2026-03-09T17:29:32.722 INFO:tasks.workunit.client.0.vm06.stdout:7/400: read d5/dd/f48 [59737,9670] 0 
2026-03-09T17:29:32.722 INFO:tasks.workunit.client.0.vm06.stdout:8/350: link d15/d16/d19/d71/f65 d15/d16/d1a/d47/f76 0 2026-03-09T17:29:32.730 INFO:tasks.workunit.client.0.vm06.stdout:6/313: dwrite d6/d4f/f3a [0,4194304] 0 2026-03-09T17:29:32.734 INFO:tasks.workunit.client.0.vm06.stdout:7/401: dwrite d5/dd/f5d [0,4194304] 0 2026-03-09T17:29:32.736 INFO:tasks.workunit.client.0.vm06.stdout:7/402: write d5/d7/d2b/f52 [317104,97700] 0 2026-03-09T17:29:32.748 INFO:tasks.workunit.client.0.vm06.stdout:6/314: dread d6/f4a [0,4194304] 0 2026-03-09T17:29:32.750 INFO:tasks.workunit.client.0.vm06.stdout:5/360: symlink d4/d22/d64/l86 0 2026-03-09T17:29:32.755 INFO:tasks.workunit.client.0.vm06.stdout:3/370: unlink dd/d19/d1e/l62 0 2026-03-09T17:29:32.756 INFO:tasks.workunit.client.0.vm06.stdout:3/371: readlink dd/d19/d1e/l4d 0 2026-03-09T17:29:32.756 INFO:tasks.workunit.client.0.vm06.stdout:3/372: write dd/f68 [698099,62581] 0 2026-03-09T17:29:32.758 INFO:tasks.workunit.client.0.vm06.stdout:0/426: write d7/d11/f20 [4091090,85688] 0 2026-03-09T17:29:32.763 INFO:tasks.workunit.client.0.vm06.stdout:3/373: dwrite dd/f15 [0,4194304] 0 2026-03-09T17:29:32.778 INFO:tasks.workunit.client.0.vm06.stdout:9/391: truncate d3/d26/f33 1088292 0 2026-03-09T17:29:32.778 INFO:tasks.workunit.client.0.vm06.stdout:2/355: read - d3/d4/d12/f31 zero size 2026-03-09T17:29:32.778 INFO:tasks.workunit.client.0.vm06.stdout:1/409: dwrite d11/d14/d1d/d1e/d2a/d34/f5c [0,4194304] 0 2026-03-09T17:29:32.778 INFO:tasks.workunit.client.0.vm06.stdout:6/315: creat d6/d12/d53/f64 x:0 0 0 2026-03-09T17:29:32.778 INFO:tasks.workunit.client.0.vm06.stdout:5/361: symlink d4/d50/d35/d40/d6f/l87 0 2026-03-09T17:29:32.782 INFO:tasks.workunit.client.0.vm06.stdout:1/410: dread d11/d14/d1d/d42/f70 [0,4194304] 0 2026-03-09T17:29:32.782 INFO:tasks.workunit.client.0.vm06.stdout:5/362: dwrite d4/d50/d18/f73 [0,4194304] 0 2026-03-09T17:29:32.785 INFO:tasks.workunit.client.0.vm06.stdout:5/363: readlink d4/d50/d18/l34 0 
2026-03-09T17:29:32.788 INFO:tasks.workunit.client.0.vm06.stdout:4/401: creat db/d1d/d21/d25/d4b/d85/f98 x:0 0 0 2026-03-09T17:29:32.797 INFO:tasks.workunit.client.0.vm06.stdout:8/351: mkdir d15/d39/d67/d77 0 2026-03-09T17:29:32.802 INFO:tasks.workunit.client.0.vm06.stdout:1/411: dread d11/d14/d1d/f56 [0,4194304] 0 2026-03-09T17:29:32.806 INFO:tasks.workunit.client.0.vm06.stdout:0/427: mkdir d7/d11/d89/d99 0 2026-03-09T17:29:32.815 INFO:tasks.workunit.client.0.vm06.stdout:3/374: mkdir dd/d1d/d6e/d70 0 2026-03-09T17:29:32.815 INFO:tasks.workunit.client.0.vm06.stdout:9/392: mknod d3/c79 0 2026-03-09T17:29:32.815 INFO:tasks.workunit.client.0.vm06.stdout:8/352: mknod d15/d16/d19/d71/c78 0 2026-03-09T17:29:32.821 INFO:tasks.workunit.client.0.vm06.stdout:6/316: mkdir d6/d12/d17/d65 0 2026-03-09T17:29:32.830 INFO:tasks.workunit.client.0.vm06.stdout:0/428: rmdir d7/d11/d2d 39 2026-03-09T17:29:32.837 INFO:tasks.workunit.client.0.vm06.stdout:9/393: creat d3/d6d/f7a x:0 0 0 2026-03-09T17:29:32.841 INFO:tasks.workunit.client.0.vm06.stdout:6/317: mknod d6/d4f/d3e/d52/c66 0 2026-03-09T17:29:32.848 INFO:tasks.workunit.client.0.vm06.stdout:0/429: write d7/d11/d2d/f78 [1486447,55196] 0 2026-03-09T17:29:32.848 INFO:tasks.workunit.client.0.vm06.stdout:7/403: sync 2026-03-09T17:29:32.853 INFO:tasks.workunit.client.0.vm06.stdout:0/430: dread d7/d11/f75 [0,4194304] 0 2026-03-09T17:29:32.853 INFO:tasks.workunit.client.0.vm06.stdout:0/431: chown d7/d11/d19/d1d/d39/f7d 7 1 2026-03-09T17:29:32.863 INFO:tasks.workunit.client.0.vm06.stdout:4/402: rename db/d1d/d21/d37/f71 to db/d57/f99 0 2026-03-09T17:29:32.866 INFO:tasks.workunit.client.0.vm06.stdout:6/318: creat d6/d12/d17/d27/d40/f67 x:0 0 0 2026-03-09T17:29:32.866 INFO:tasks.workunit.client.0.vm06.stdout:6/319: write d6/d4f/f44 [4229185,92623] 0 2026-03-09T17:29:32.870 INFO:tasks.workunit.client.0.vm06.stdout:6/320: dwrite d6/d47/d4d/f55 [0,4194304] 0 2026-03-09T17:29:32.877 INFO:tasks.workunit.client.0.vm06.stdout:1/412: rmdir d11/d6b 0 
2026-03-09T17:29:32.877 INFO:tasks.workunit.client.0.vm06.stdout:1/413: dwrite d11/d14/d1d/d1e/d2a/d34/d64/f8a [0,4194304] 0 2026-03-09T17:29:32.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:32 vm06.local ceph-mon[57307]: pgmap v149: 65 pgs: 65 active+clean; 811 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 MiB/s rd, 98 MiB/s wr, 255 op/s 2026-03-09T17:29:32.892 INFO:tasks.workunit.client.0.vm06.stdout:7/404: rename d5/d7/l40 to d5/d12/d64/d6b/l6d 0 2026-03-09T17:29:32.893 INFO:tasks.workunit.client.0.vm06.stdout:7/405: write d5/dd/f3e [498171,50795] 0 2026-03-09T17:29:32.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:32 vm09.local ceph-mon[62061]: pgmap v149: 65 pgs: 65 active+clean; 811 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 MiB/s rd, 98 MiB/s wr, 255 op/s 2026-03-09T17:29:32.899 INFO:tasks.workunit.client.0.vm06.stdout:1/414: write d11/d14/d1d/d1e/f65 [2577541,9999] 0 2026-03-09T17:29:32.900 INFO:tasks.workunit.client.0.vm06.stdout:6/321: dread d6/d12/f22 [0,4194304] 0 2026-03-09T17:29:32.904 INFO:tasks.workunit.client.0.vm06.stdout:9/394: link d3/d11/c2b d3/d15/d48/c7b 0 2026-03-09T17:29:32.907 INFO:tasks.workunit.client.0.vm06.stdout:4/403: mknod db/d59/d5f/d45/d84/c9a 0 2026-03-09T17:29:32.908 INFO:tasks.workunit.client.0.vm06.stdout:1/415: mkdir d11/d14/d1d/d8c 0 2026-03-09T17:29:32.909 INFO:tasks.workunit.client.0.vm06.stdout:9/395: readlink d3/l4 0 2026-03-09T17:29:32.910 INFO:tasks.workunit.client.0.vm06.stdout:0/432: truncate d7/f76 3977629 0 2026-03-09T17:29:32.912 INFO:tasks.workunit.client.0.vm06.stdout:0/433: dread d7/f2a [4194304,4194304] 0 2026-03-09T17:29:32.914 INFO:tasks.workunit.client.0.vm06.stdout:7/406: unlink d5/l15 0 2026-03-09T17:29:32.914 INFO:tasks.workunit.client.0.vm06.stdout:9/396: creat d3/d11/d65/f7c x:0 0 0 2026-03-09T17:29:32.915 INFO:tasks.workunit.client.0.vm06.stdout:7/407: write d5/d12/f6c [594242,25808] 0 2026-03-09T17:29:32.915 INFO:tasks.workunit.client.0.vm06.stdout:9/397: write 
d3/d15/f58 [64238,43181] 0 2026-03-09T17:29:32.917 INFO:tasks.workunit.client.0.vm06.stdout:9/398: truncate d3/d11/d65/f7c 916903 0 2026-03-09T17:29:32.918 INFO:tasks.workunit.client.0.vm06.stdout:9/399: chown d3/d6d/f78 19 1 2026-03-09T17:29:32.918 INFO:tasks.workunit.client.0.vm06.stdout:9/400: readlink d3/d15/d16/l6b 0 2026-03-09T17:29:32.921 INFO:tasks.workunit.client.0.vm06.stdout:6/322: dread d6/d47/d4d/f50 [0,4194304] 0 2026-03-09T17:29:32.925 INFO:tasks.workunit.client.0.vm06.stdout:7/408: dread d5/d12/f2c [0,4194304] 0 2026-03-09T17:29:32.925 INFO:tasks.workunit.client.0.vm06.stdout:7/409: write d5/d7/f58 [560039,130214] 0 2026-03-09T17:29:32.936 INFO:tasks.workunit.client.0.vm06.stdout:6/323: dread d6/d4f/f26 [0,4194304] 0 2026-03-09T17:29:32.940 INFO:tasks.workunit.client.0.vm06.stdout:6/324: dwrite d6/d4f/f3a [0,4194304] 0 2026-03-09T17:29:32.949 INFO:tasks.workunit.client.0.vm06.stdout:9/401: readlink d3/d15/l24 0 2026-03-09T17:29:32.957 INFO:tasks.workunit.client.0.vm06.stdout:5/364: dwrite d4/d50/f80 [0,4194304] 0 2026-03-09T17:29:32.963 INFO:tasks.workunit.client.0.vm06.stdout:3/375: dwrite dd/d19/f2b [0,4194304] 0 2026-03-09T17:29:32.969 INFO:tasks.workunit.client.0.vm06.stdout:3/376: read dd/d1d/f4b [2111971,105999] 0 2026-03-09T17:29:32.972 INFO:tasks.workunit.client.0.vm06.stdout:7/410: creat d5/d1f/d34/d46/d51/f6e x:0 0 0 2026-03-09T17:29:32.973 INFO:tasks.workunit.client.0.vm06.stdout:8/353: write d15/d31/f33 [136554,74591] 0 2026-03-09T17:29:32.992 INFO:tasks.workunit.client.0.vm06.stdout:2/356: dread d3/d4/d22/d72/f54 [0,4194304] 0 2026-03-09T17:29:33.000 INFO:tasks.workunit.client.0.vm06.stdout:5/365: rename d4/d22/l4f to d4/d7e/l88 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:3/377: creat dd/d19/d25/d2d/f71 x:0 0 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:1/416: write d11/d14/f59 [424786,114249] 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:8/354: chown d15/d16/d19/d71/f65 
129160 1 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:9/402: getdents d3/d15/d36/d4c/d6a 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:3/378: read dd/d1d/f4b [2749958,37785] 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:1/417: dread d11/d14/d1d/d42/d46/f55 [0,4194304] 0 2026-03-09T17:29:33.023 INFO:tasks.workunit.client.0.vm06.stdout:9/403: creat d3/d15/d16/f7d x:0 0 0 2026-03-09T17:29:33.024 INFO:tasks.workunit.client.0.vm06.stdout:8/355: unlink d15/d16/d1a/f42 0 2026-03-09T17:29:33.025 INFO:tasks.workunit.client.0.vm06.stdout:8/356: chown d15/d16/d1e/d30/l38 745365 1 2026-03-09T17:29:33.026 INFO:tasks.workunit.client.0.vm06.stdout:6/325: getdents d6/d4f/d3e/d52 0 2026-03-09T17:29:33.026 INFO:tasks.workunit.client.0.vm06.stdout:9/404: chown d3/l7 55506 1 2026-03-09T17:29:33.028 INFO:tasks.workunit.client.0.vm06.stdout:5/366: rename d4/c68 to d4/d50/c89 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:3/379: link dd/d19/d25/f4f dd/d1d/d4e/f72 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:8/357: fsync d15/d16/d19/f26 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:6/326: read d6/f4a [1371237,53213] 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:8/358: dwrite d15/d16/d1e/f64 [4194304,4194304] 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:5/367: creat d4/d52/f8a x:0 0 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:1/418: creat d11/f8d x:0 0 0 2026-03-09T17:29:33.039 INFO:tasks.workunit.client.0.vm06.stdout:8/359: chown d15/d16/d19/d2b/f46 6978 1 2026-03-09T17:29:33.045 INFO:tasks.workunit.client.0.vm06.stdout:1/419: dwrite d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0 2026-03-09T17:29:33.049 INFO:tasks.workunit.client.0.vm06.stdout:6/327: symlink d6/d12/d17/d27/d40/l68 0 2026-03-09T17:29:33.050 INFO:tasks.workunit.client.0.vm06.stdout:3/380: creat dd/d1d/d6e/d70/f73 x:0 0 0 
2026-03-09T17:29:33.059 INFO:tasks.workunit.client.0.vm06.stdout:6/328: dwrite d6/f56 [4194304,4194304] 0 2026-03-09T17:29:33.063 INFO:tasks.workunit.client.0.vm06.stdout:6/329: dread d6/d4f/f44 [0,4194304] 0 2026-03-09T17:29:33.066 INFO:tasks.workunit.client.0.vm06.stdout:9/405: rename d3/d15/d36/c6e to d3/c7e 0 2026-03-09T17:29:33.066 INFO:tasks.workunit.client.0.vm06.stdout:6/330: write d6/f5c [207992,69789] 0 2026-03-09T17:29:33.067 INFO:tasks.workunit.client.0.vm06.stdout:7/411: sync 2026-03-09T17:29:33.068 INFO:tasks.workunit.client.0.vm06.stdout:7/412: fsync d5/d7/f58 0 2026-03-09T17:29:33.074 INFO:tasks.workunit.client.0.vm06.stdout:6/331: dwrite d6/d12/d53/f5b [0,4194304] 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:5/368: dread d4/d50/f24 [0,4194304] 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:3/381: creat dd/d1d/d4e/f74 x:0 0 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:3/382: chown dd/d19/d1e/f23 363146130 1 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:9/406: truncate d3/d11/f1f 5209132 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:7/413: creat d5/d12/d64/d6b/f6f x:0 0 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:9/407: dread d3/d15/d36/d4c/f5a [4194304,4194304] 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:9/408: readlink d3/d15/d16/l20 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:9/409: stat d3/d15/d36/d4c/f55 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:7/414: dwrite d5/d7/f58 [0,4194304] 0 2026-03-09T17:29:33.097 INFO:tasks.workunit.client.0.vm06.stdout:9/410: truncate d3/d15/d36/d4c/f55 1182894 0 2026-03-09T17:29:33.098 INFO:tasks.workunit.client.0.vm06.stdout:3/383: unlink dd/d19/d25/d2d/l35 0 2026-03-09T17:29:33.099 INFO:tasks.workunit.client.0.vm06.stdout:1/420: unlink d11/d14/d1d/d1e/d2a/d34/l3e 0 2026-03-09T17:29:33.099 
INFO:tasks.workunit.client.0.vm06.stdout:5/369: creat d4/d7e/f8b x:0 0 0 2026-03-09T17:29:33.100 INFO:tasks.workunit.client.0.vm06.stdout:3/384: stat dd/f51 0 2026-03-09T17:29:33.100 INFO:tasks.workunit.client.0.vm06.stdout:6/332: dread d6/d12/f1c [0,4194304] 0 2026-03-09T17:29:33.101 INFO:tasks.workunit.client.0.vm06.stdout:1/421: write d11/f13 [2948279,105065] 0 2026-03-09T17:29:33.101 INFO:tasks.workunit.client.0.vm06.stdout:6/333: write d6/f5c [386537,116708] 0 2026-03-09T17:29:33.105 INFO:tasks.workunit.client.0.vm06.stdout:3/385: rmdir dd/d19 39 2026-03-09T17:29:33.105 INFO:tasks.workunit.client.0.vm06.stdout:9/411: creat d3/d26/d6c/d68/f7f x:0 0 0 2026-03-09T17:29:33.107 INFO:tasks.workunit.client.0.vm06.stdout:5/370: rename d4/d22/d64/f65 to d4/d50/d18/f8c 0 2026-03-09T17:29:33.109 INFO:tasks.workunit.client.0.vm06.stdout:7/415: dwrite d5/d12/f32 [0,4194304] 0 2026-03-09T17:29:33.119 INFO:tasks.workunit.client.0.vm06.stdout:9/412: dwrite d3/d15/f46 [0,4194304] 0 2026-03-09T17:29:33.119 INFO:tasks.workunit.client.0.vm06.stdout:5/371: dwrite d4/f7 [4194304,4194304] 0 2026-03-09T17:29:33.122 INFO:tasks.workunit.client.0.vm06.stdout:6/334: rmdir d6/d47/d4d 39 2026-03-09T17:29:33.126 INFO:tasks.workunit.client.0.vm06.stdout:1/422: fsync d11/d14/d1d/d1e/d2a/f38 0 2026-03-09T17:29:33.126 INFO:tasks.workunit.client.0.vm06.stdout:5/372: dwrite d4/d50/d35/f6b [0,4194304] 0 2026-03-09T17:29:33.131 INFO:tasks.workunit.client.0.vm06.stdout:9/413: mkdir d3/d11/d65/d80 0 2026-03-09T17:29:33.131 INFO:tasks.workunit.client.0.vm06.stdout:7/416: dread d5/d1f/f56 [4194304,4194304] 0 2026-03-09T17:29:33.132 INFO:tasks.workunit.client.0.vm06.stdout:5/373: symlink d4/d50/d35/d40/l8d 0 2026-03-09T17:29:33.133 INFO:tasks.workunit.client.0.vm06.stdout:7/417: chown d5/d1f/d34/d46/d51/c67 7141069 1 2026-03-09T17:29:33.134 INFO:tasks.workunit.client.0.vm06.stdout:3/386: creat dd/f75 x:0 0 0 2026-03-09T17:29:33.137 INFO:tasks.workunit.client.0.vm06.stdout:9/414: dread d3/d15/f74 
[0,4194304] 0 2026-03-09T17:29:33.137 INFO:tasks.workunit.client.0.vm06.stdout:6/335: truncate d6/d4f/f3c 1053437 0 2026-03-09T17:29:33.138 INFO:tasks.workunit.client.0.vm06.stdout:5/374: creat d4/d50/d35/d40/d6f/f8e x:0 0 0 2026-03-09T17:29:33.141 INFO:tasks.workunit.client.0.vm06.stdout:9/415: chown d3/d15/d36/d4d/l63 64533089 1 2026-03-09T17:29:33.141 INFO:tasks.workunit.client.0.vm06.stdout:1/423: rename d11/d14/d1c/d5f/c89 to d11/d14/d1c/c8e 0 2026-03-09T17:29:33.143 INFO:tasks.workunit.client.0.vm06.stdout:6/336: write d6/d47/d4d/f55 [2349068,99768] 0 2026-03-09T17:29:33.146 INFO:tasks.workunit.client.0.vm06.stdout:9/416: creat d3/d2c/f81 x:0 0 0 2026-03-09T17:29:33.146 INFO:tasks.workunit.client.0.vm06.stdout:9/417: write d3/d26/d6c/f5b [61474,7714] 0 2026-03-09T17:29:33.148 INFO:tasks.workunit.client.0.vm06.stdout:9/418: readlink d3/d15/d16/l67 0 2026-03-09T17:29:33.154 INFO:tasks.workunit.client.0.vm06.stdout:5/375: rename d4/d50/d18/f31 to d4/d52/f8f 0 2026-03-09T17:29:33.169 INFO:tasks.workunit.client.0.vm06.stdout:1/424: truncate d11/d14/d1d/d1e/d2a/f50 1177117 0 2026-03-09T17:29:33.169 INFO:tasks.workunit.client.0.vm06.stdout:3/387: link c3 dd/d1d/d6e/d70/c76 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:6/337: mknod d6/d12/c69 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:7/418: link d5/d7/lb d5/d12/d64/d6b/l70 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:9/419: fsync d3/d15/d36/d4c/f5a 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:5/376: stat d4/d50/d18/d3d/f44 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:7/419: truncate d5/d1f/d34/d46/d51/f6e 566454 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:5/377: dread - d4/d50/d18/f74 zero size 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:9/420: rmdir d3/d26/d35 39 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:1/425: chown d11/d14/d1d/d42/c54 101 
1 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:3/388: dwrite dd/d19/d2c/f30 [0,4194304] 0 2026-03-09T17:29:33.170 INFO:tasks.workunit.client.0.vm06.stdout:3/389: chown dd/c18 350666700 1 2026-03-09T17:29:33.173 INFO:tasks.workunit.client.0.vm06.stdout:7/420: creat d5/f71 x:0 0 0 2026-03-09T17:29:33.176 INFO:tasks.workunit.client.0.vm06.stdout:7/421: symlink d5/d12/d64/l72 0 2026-03-09T17:29:33.186 INFO:tasks.workunit.client.0.vm06.stdout:3/390: rename dd/d19/d28/l2a to dd/d19/d25/d44/l77 0 2026-03-09T17:29:33.187 INFO:tasks.workunit.client.0.vm06.stdout:1/426: getdents d11/d14/d1d/d1e/d2a/d34/d64 0 2026-03-09T17:29:33.189 INFO:tasks.workunit.client.0.vm06.stdout:3/391: symlink dd/d19/d2c/l78 0 2026-03-09T17:29:33.191 INFO:tasks.workunit.client.0.vm06.stdout:3/392: dread dd/d19/d1e/f3f [0,4194304] 0 2026-03-09T17:29:33.199 INFO:tasks.workunit.client.0.vm06.stdout:3/393: read dd/d19/d2c/f30 [4132908,85693] 0 2026-03-09T17:29:33.208 INFO:tasks.workunit.client.0.vm06.stdout:3/394: dwrite dd/d5b/d65/f6a [0,4194304] 0 2026-03-09T17:29:33.208 INFO:tasks.workunit.client.0.vm06.stdout:3/395: chown dd/d1d/f4b 350 1 2026-03-09T17:29:33.209 INFO:tasks.workunit.client.0.vm06.stdout:3/396: creat dd/d19/d2c/f79 x:0 0 0 2026-03-09T17:29:33.210 INFO:tasks.workunit.client.0.vm06.stdout:3/397: write dd/d19/d2c/f30 [1540780,88623] 0 2026-03-09T17:29:33.212 INFO:tasks.workunit.client.0.vm06.stdout:3/398: mknod dd/d19/c7a 0 2026-03-09T17:29:33.212 INFO:tasks.workunit.client.0.vm06.stdout:3/399: write dd/d19/d1e/f3f [610337,88044] 0 2026-03-09T17:29:33.214 INFO:tasks.workunit.client.0.vm06.stdout:3/400: dread f4 [0,4194304] 0 2026-03-09T17:29:33.218 INFO:tasks.workunit.client.0.vm06.stdout:3/401: rename dd/d19/d25/l52 to dd/d5b/l7b 0 2026-03-09T17:29:33.281 INFO:tasks.workunit.client.0.vm06.stdout:0/434: truncate d7/d11/d19/f57 851759 0 2026-03-09T17:29:33.283 INFO:tasks.workunit.client.0.vm06.stdout:0/435: symlink d7/d11/d19/d1d/d39/l9a 0 2026-03-09T17:29:33.289 
INFO:tasks.workunit.client.0.vm06.stdout:0/436: symlink d7/d11/d19/d1d/l9b 0 2026-03-09T17:29:33.296 INFO:tasks.workunit.client.0.vm06.stdout:0/437: symlink d7/l9c 0 2026-03-09T17:29:33.296 INFO:tasks.workunit.client.0.vm06.stdout:0/438: stat d7/d11/d19/d1d/d59/d5b/c6c 0 2026-03-09T17:29:33.300 INFO:tasks.workunit.client.0.vm06.stdout:0/439: rename d7/d88/c8d to d7/d11/d19/d1d/d87/c9d 0 2026-03-09T17:29:33.320 INFO:tasks.workunit.client.0.vm06.stdout:0/440: rename d7/d11/d19/l7c to d7/d11/d19/d1d/d59/l9e 0 2026-03-09T17:29:33.320 INFO:tasks.workunit.client.0.vm06.stdout:0/441: write d7/d11/d5d/d64/f6b [145035,64592] 0 2026-03-09T17:29:33.320 INFO:tasks.workunit.client.0.vm06.stdout:0/442: mknod d7/d11/d19/d1d/d39/c9f 0 2026-03-09T17:29:33.320 INFO:tasks.workunit.client.0.vm06.stdout:0/443: rename d7/d11/c2e to d7/d11/d89/d99/ca0 0 2026-03-09T17:29:33.327 INFO:tasks.workunit.client.0.vm06.stdout:0/444: dread d7/f36 [0,4194304] 0 2026-03-09T17:29:33.328 INFO:tasks.workunit.client.0.vm06.stdout:0/445: chown d7/d11/f1c 98 1 2026-03-09T17:29:33.328 INFO:tasks.workunit.client.0.vm06.stdout:0/446: fdatasync d7/f50 0 2026-03-09T17:29:33.332 INFO:tasks.workunit.client.0.vm06.stdout:0/447: dwrite d7/d11/d19/d37/f6d [0,4194304] 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/448: symlink d7/d11/d19/d1d/d87/la1 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/449: rename d7/d11/d19/d37/l7e to d7/d11/d89/la2 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/450: creat d7/d11/fa3 x:0 0 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/451: stat d7/d11/f75 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/452: rename d7/d11/d19/d1d/d80 to d7/d11/d19/d8b/da4 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/453: rename d7/f46 to d7/d11/d89/fa5 0 2026-03-09T17:29:33.367 INFO:tasks.workunit.client.0.vm06.stdout:0/454: readlink d7/l4b 0 2026-03-09T17:29:33.367 
INFO:tasks.workunit.client.0.vm06.stdout:0/455: symlink d7/d11/d5d/d64/d96/la6 0 2026-03-09T17:29:33.368 INFO:tasks.workunit.client.0.vm06.stdout:5/378: sync 2026-03-09T17:29:33.368 INFO:tasks.workunit.client.0.vm06.stdout:7/422: sync 2026-03-09T17:29:33.368 INFO:tasks.workunit.client.0.vm06.stdout:1/427: sync 2026-03-09T17:29:33.371 INFO:tasks.workunit.client.0.vm06.stdout:5/379: creat d4/d22/f90 x:0 0 0 2026-03-09T17:29:33.381 INFO:tasks.workunit.client.0.vm06.stdout:1/428: unlink d11/d14/d1d/d1e/d2a/c2f 0 2026-03-09T17:29:33.381 INFO:tasks.workunit.client.0.vm06.stdout:1/429: readlink d11/d14/l62 0 2026-03-09T17:29:33.381 INFO:tasks.workunit.client.0.vm06.stdout:5/380: dwrite d4/d50/d18/f4b [0,4194304] 0 2026-03-09T17:29:33.381 INFO:tasks.workunit.client.0.vm06.stdout:7/423: symlink d5/d1f/d34/d3f/l73 0 2026-03-09T17:29:33.382 INFO:tasks.workunit.client.0.vm06.stdout:1/430: unlink d11/d14/d1d/d1e/d2a/d34/f5c 0 2026-03-09T17:29:33.384 INFO:tasks.workunit.client.0.vm06.stdout:1/431: dread d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:29:33.401 INFO:tasks.workunit.client.0.vm06.stdout:7/424: creat d5/d1f/f74 x:0 0 0 2026-03-09T17:29:33.402 INFO:tasks.workunit.client.0.vm06.stdout:7/425: write d5/f71 [344253,101658] 0 2026-03-09T17:29:33.406 INFO:tasks.workunit.client.0.vm06.stdout:1/432: creat d11/d14/d1d/f8f x:0 0 0 2026-03-09T17:29:33.407 INFO:tasks.workunit.client.0.vm06.stdout:5/381: mknod d4/d50/c91 0 2026-03-09T17:29:33.408 INFO:tasks.workunit.client.0.vm06.stdout:5/382: fsync d4/d22/d46/f82 0 2026-03-09T17:29:33.412 INFO:tasks.workunit.client.0.vm06.stdout:7/426: dwrite d5/f8 [0,4194304] 0 2026-03-09T17:29:33.428 INFO:tasks.workunit.client.0.vm06.stdout:7/427: chown d5/d1f/d34/d3f/l73 0 1 2026-03-09T17:29:33.428 INFO:tasks.workunit.client.0.vm06.stdout:7/428: dwrite d5/dd/f5d [0,4194304] 0 2026-03-09T17:29:33.429 INFO:tasks.workunit.client.0.vm06.stdout:7/429: stat d5/d1f/d34/d46/d51 0 2026-03-09T17:29:33.913 INFO:tasks.workunit.client.0.vm06.stdout:0/456: 
dread d7/fe [0,4194304] 0 2026-03-09T17:29:33.917 INFO:tasks.workunit.client.0.vm06.stdout:0/457: dwrite d7/d11/d19/d1d/d39/f51 [0,4194304] 0 2026-03-09T17:29:33.924 INFO:tasks.workunit.client.0.vm06.stdout:0/458: dwrite d7/d11/f1c [4194304,4194304] 0 2026-03-09T17:29:33.930 INFO:tasks.workunit.client.0.vm06.stdout:0/459: dwrite d7/d11/d5d/d64/f6b [0,4194304] 0 2026-03-09T17:29:33.937 INFO:tasks.workunit.client.0.vm06.stdout:0/460: fsync d7/f12 0 2026-03-09T17:29:33.937 INFO:tasks.workunit.client.0.vm06.stdout:0/461: dwrite d7/d11/d19/d1d/f4c [4194304,4194304] 0 2026-03-09T17:29:34.133 INFO:tasks.workunit.client.0.vm06.stdout:4/404: dwrite db/d57/f99 [0,4194304] 0 2026-03-09T17:29:34.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:33 vm09.local ceph-mon[62061]: pgmap v150: 65 pgs: 65 active+clean; 914 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 18 MiB/s rd, 109 MiB/s wr, 298 op/s 2026-03-09T17:29:34.146 INFO:tasks.workunit.client.0.vm06.stdout:4/405: symlink db/d57/l9b 0 2026-03-09T17:29:34.146 INFO:tasks.workunit.client.0.vm06.stdout:4/406: read db/d57/f99 [1960569,14819] 0 2026-03-09T17:29:34.146 INFO:tasks.workunit.client.0.vm06.stdout:4/407: fsync db/df/f97 0 2026-03-09T17:29:34.146 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:33 vm06.local ceph-mon[57307]: pgmap v150: 65 pgs: 65 active+clean; 914 MiB data, 3.6 GiB used, 116 GiB / 120 GiB avail; 18 MiB/s rd, 109 MiB/s wr, 298 op/s 2026-03-09T17:29:34.147 INFO:tasks.workunit.client.0.vm06.stdout:4/408: creat db/d59/d5f/d45/f9c x:0 0 0 2026-03-09T17:29:34.148 INFO:tasks.workunit.client.0.vm06.stdout:4/409: truncate db/d1d/d21/d25/f38 2941950 0 2026-03-09T17:29:34.149 INFO:tasks.workunit.client.0.vm06.stdout:4/410: chown db/df/c3e 2144 1 2026-03-09T17:29:34.150 INFO:tasks.workunit.client.0.vm06.stdout:4/411: unlink db/d1d/c48 0 2026-03-09T17:29:34.152 INFO:tasks.workunit.client.0.vm06.stdout:4/412: unlink db/df/f18 0 2026-03-09T17:29:34.155 INFO:tasks.workunit.client.0.vm06.stdout:4/413: 
symlink db/l9d 0 2026-03-09T17:29:34.156 INFO:tasks.workunit.client.0.vm06.stdout:4/414: unlink db/d57/f99 0 2026-03-09T17:29:34.157 INFO:tasks.workunit.client.0.vm06.stdout:4/415: write db/f55 [337057,77620] 0 2026-03-09T17:29:34.159 INFO:tasks.workunit.client.0.vm06.stdout:4/416: mknod db/d59/d5f/c9e 0 2026-03-09T17:29:34.160 INFO:tasks.workunit.client.0.vm06.stdout:4/417: write db/d1d/d21/d37/d69/f8b [1117193,26214] 0 2026-03-09T17:29:34.229 INFO:tasks.workunit.client.0.vm06.stdout:2/357: write d3/d4/d22/d72/f54 [723806,107569] 0 2026-03-09T17:29:34.238 INFO:tasks.workunit.client.0.vm06.stdout:2/358: creat d3/d4/d22/f73 x:0 0 0 2026-03-09T17:29:34.255 INFO:tasks.workunit.client.0.vm06.stdout:2/359: mknod d3/d4/d38/d64/c74 0 2026-03-09T17:29:34.316 INFO:tasks.workunit.client.0.vm06.stdout:8/360: dwrite d15/d16/f23 [0,4194304] 0 2026-03-09T17:29:34.320 INFO:tasks.workunit.client.0.vm06.stdout:8/361: dwrite d15/d16/d19/f61 [0,4194304] 0 2026-03-09T17:29:34.352 INFO:tasks.workunit.client.0.vm06.stdout:9/421: write d3/f27 [1723907,106543] 0 2026-03-09T17:29:34.354 INFO:tasks.workunit.client.0.vm06.stdout:6/338: truncate d6/d4f/f3a 193257 0 2026-03-09T17:29:34.355 INFO:tasks.workunit.client.0.vm06.stdout:9/422: symlink d3/d15/d16/l82 0 2026-03-09T17:29:34.356 INFO:tasks.workunit.client.0.vm06.stdout:9/423: truncate d3/d6d/f78 33716 0 2026-03-09T17:29:34.360 INFO:tasks.workunit.client.0.vm06.stdout:6/339: mknod d6/d12/d17/d27/c6a 0 2026-03-09T17:29:34.360 INFO:tasks.workunit.client.0.vm06.stdout:6/340: fdatasync d6/d4f/d3e/f62 0 2026-03-09T17:29:34.362 INFO:tasks.workunit.client.0.vm06.stdout:9/424: mkdir d3/d15/d36/d83 0 2026-03-09T17:29:34.363 INFO:tasks.workunit.client.0.vm06.stdout:9/425: chown d3/d15/d36/d4d/f60 6459 1 2026-03-09T17:29:34.364 INFO:tasks.workunit.client.0.vm06.stdout:6/341: fsync d6/d4f/f26 0 2026-03-09T17:29:34.375 INFO:tasks.workunit.client.0.vm06.stdout:4/418: fdatasync db/f55 0 2026-03-09T17:29:34.376 
INFO:tasks.workunit.client.0.vm06.stdout:6/342: creat d6/d12/d17/f6b x:0 0 0 2026-03-09T17:29:34.376 INFO:tasks.workunit.client.0.vm06.stdout:9/426: link d3/d11/c2b d3/d15/d36/c84 0 2026-03-09T17:29:34.381 INFO:tasks.workunit.client.0.vm06.stdout:6/343: creat d6/d12/d2d/f6c x:0 0 0 2026-03-09T17:29:34.383 INFO:tasks.workunit.client.0.vm06.stdout:9/427: mkdir d3/d6d/d85 0 2026-03-09T17:29:34.388 INFO:tasks.workunit.client.0.vm06.stdout:3/402: write dd/f1b [822158,117695] 0 2026-03-09T17:29:34.390 INFO:tasks.workunit.client.0.vm06.stdout:6/344: dwrite d6/d12/f22 [4194304,4194304] 0 2026-03-09T17:29:34.398 INFO:tasks.workunit.client.0.vm06.stdout:4/419: link db/d59/d5f/d45/f9c db/d1d/d21/f9f 0 2026-03-09T17:29:34.398 INFO:tasks.workunit.client.0.vm06.stdout:3/403: creat dd/d1d/d4e/f7c x:0 0 0 2026-03-09T17:29:34.404 INFO:tasks.workunit.client.0.vm06.stdout:3/404: dread dd/f10 [0,4194304] 0 2026-03-09T17:29:34.490 INFO:tasks.workunit.client.0.vm06.stdout:7/430: rmdir d5 39 2026-03-09T17:29:34.494 INFO:tasks.workunit.client.0.vm06.stdout:7/431: fdatasync d5/d7/d2b/f50 0 2026-03-09T17:29:34.499 INFO:tasks.workunit.client.0.vm06.stdout:5/383: write d4/d52/f5f [1266530,24021] 0 2026-03-09T17:29:34.500 INFO:tasks.workunit.client.0.vm06.stdout:7/432: getdents d5/d1f/d34 0 2026-03-09T17:29:34.505 INFO:tasks.workunit.client.0.vm06.stdout:7/433: dwrite d5/d1f/d34/d46/f55 [0,4194304] 0 2026-03-09T17:29:34.520 INFO:tasks.workunit.client.0.vm06.stdout:5/384: rename d4/l2b to d4/d22/d64/l92 0 2026-03-09T17:29:34.520 INFO:tasks.workunit.client.0.vm06.stdout:1/433: dwrite d11/d14/d1d/d1e/f47 [0,4194304] 0 2026-03-09T17:29:34.521 INFO:tasks.workunit.client.0.vm06.stdout:5/385: creat d4/d22/d46/f93 x:0 0 0 2026-03-09T17:29:34.524 INFO:tasks.workunit.client.0.vm06.stdout:1/434: dwrite d11/d14/d1c/d1f/f7f [0,4194304] 0 2026-03-09T17:29:34.526 INFO:tasks.workunit.client.0.vm06.stdout:1/435: write d11/d14/d1d/f7c [95534,94355] 0 2026-03-09T17:29:34.527 
INFO:tasks.workunit.client.0.vm06.stdout:1/436: chown d11/d14/d1c/d3a 3 1 2026-03-09T17:29:34.544 INFO:tasks.workunit.client.0.vm06.stdout:5/386: creat d4/d50/d35/f94 x:0 0 0 2026-03-09T17:29:34.546 INFO:tasks.workunit.client.0.vm06.stdout:1/437: creat d11/d14/d1d/f90 x:0 0 0 2026-03-09T17:29:34.557 INFO:tasks.workunit.client.0.vm06.stdout:5/387: mkdir d4/d50/d35/d40/d95 0 2026-03-09T17:29:34.557 INFO:tasks.workunit.client.0.vm06.stdout:7/434: getdents d5/d1f/d34/d46/d51 0 2026-03-09T17:29:34.557 INFO:tasks.workunit.client.0.vm06.stdout:5/388: mkdir d4/d50/d35/d40/d96 0 2026-03-09T17:29:34.557 INFO:tasks.workunit.client.0.vm06.stdout:5/389: read d4/d50/d18/d3d/f44 [1586498,32052] 0 2026-03-09T17:29:34.563 INFO:tasks.workunit.client.0.vm06.stdout:7/435: dread d5/d7/d2b/f50 [0,4194304] 0 2026-03-09T17:29:34.563 INFO:tasks.workunit.client.0.vm06.stdout:7/436: chown d5/d12/c61 1379663695 1 2026-03-09T17:29:34.564 INFO:tasks.workunit.client.0.vm06.stdout:1/438: getdents d11 0 2026-03-09T17:29:34.566 INFO:tasks.workunit.client.0.vm06.stdout:5/390: unlink d4/c63 0 2026-03-09T17:29:34.568 INFO:tasks.workunit.client.0.vm06.stdout:1/439: mknod d11/d14/d1d/c91 0 2026-03-09T17:29:34.571 INFO:tasks.workunit.client.0.vm06.stdout:5/391: creat d4/d50/d35/f97 x:0 0 0 2026-03-09T17:29:34.574 INFO:tasks.workunit.client.0.vm06.stdout:5/392: write d4/d50/f61 [2962053,93012] 0 2026-03-09T17:29:34.576 INFO:tasks.workunit.client.0.vm06.stdout:1/440: mkdir d11/d14/d1d/d42/d46/d92 0 2026-03-09T17:29:34.579 INFO:tasks.workunit.client.0.vm06.stdout:4/420: sync 2026-03-09T17:29:34.590 INFO:tasks.workunit.client.0.vm06.stdout:4/421: mkdir db/d1d/d21/d37/d69/d78/da0 0 2026-03-09T17:29:34.590 INFO:tasks.workunit.client.0.vm06.stdout:4/422: chown db/d59/d90 1064 1 2026-03-09T17:29:34.591 INFO:tasks.workunit.client.0.vm06.stdout:4/423: read - db/d59/d5f/d6d/f96 zero size 2026-03-09T17:29:34.591 INFO:tasks.workunit.client.0.vm06.stdout:7/437: getdents d5/d1f 0 2026-03-09T17:29:34.601 
INFO:tasks.workunit.client.0.vm06.stdout:4/424: read db/f15 [3941604,86150] 0 2026-03-09T17:29:34.606 INFO:tasks.workunit.client.0.vm06.stdout:7/438: creat d5/d7/f75 x:0 0 0 2026-03-09T17:29:34.608 INFO:tasks.workunit.client.0.vm06.stdout:1/441: dread d11/d14/d1d/d1e/d2a/d34/f60 [0,4194304] 0 2026-03-09T17:29:34.657 INFO:tasks.workunit.client.0.vm06.stdout:0/462: dwrite d7/d11/d2d/f2f [0,4194304] 0 2026-03-09T17:29:34.662 INFO:tasks.workunit.client.0.vm06.stdout:0/463: rename f5 to d7/d11/d19/d8b/da4/fa7 0 2026-03-09T17:29:34.666 INFO:tasks.workunit.client.0.vm06.stdout:0/464: mkdir d7/d11/d89/da8 0 2026-03-09T17:29:34.668 INFO:tasks.workunit.client.0.vm06.stdout:0/465: mknod d7/d11/d5d/d64/d96/ca9 0 2026-03-09T17:29:34.674 INFO:tasks.workunit.client.0.vm06.stdout:0/466: link d7/d11/d19/d1d/d59/l5f d7/d11/d19/d8b/laa 0 2026-03-09T17:29:34.675 INFO:tasks.workunit.client.0.vm06.stdout:0/467: chown d7/d11/d5d 3073581 1 2026-03-09T17:29:34.676 INFO:tasks.workunit.client.0.vm06.stdout:0/468: read d7/d11/d2d/f78 [4052367,57041] 0 2026-03-09T17:29:34.677 INFO:tasks.workunit.client.0.vm06.stdout:0/469: write d7/d11/d19/d1d/d39/f4a [2492636,91346] 0 2026-03-09T17:29:34.677 INFO:tasks.workunit.client.0.vm06.stdout:0/470: chown d7/d11/f30 1480533 1 2026-03-09T17:29:34.681 INFO:tasks.workunit.client.0.vm06.stdout:0/471: dwrite d7/d11/d2d/f2f [0,4194304] 0 2026-03-09T17:29:34.688 INFO:tasks.workunit.client.0.vm06.stdout:0/472: creat d7/d11/d19/d8b/da4/fab x:0 0 0 2026-03-09T17:29:34.692 INFO:tasks.workunit.client.0.vm06.stdout:0/473: read d7/fe [2916518,40803] 0 2026-03-09T17:29:34.696 INFO:tasks.workunit.client.0.vm06.stdout:0/474: link d7/c70 d7/d11/d89/d99/cac 0 2026-03-09T17:29:34.699 INFO:tasks.workunit.client.0.vm06.stdout:0/475: mknod d7/d11/d19/d8b/da4/d85/cad 0 2026-03-09T17:29:34.701 INFO:tasks.workunit.client.0.vm06.stdout:0/476: rename d7/d11/d19/d1d/c32 to d7/d11/d19/d8b/da4/cae 0 2026-03-09T17:29:34.733 INFO:tasks.workunit.client.0.vm06.stdout:5/393: truncate 
d4/d50/d18/f48 1871230 0 2026-03-09T17:29:34.735 INFO:tasks.workunit.client.0.vm06.stdout:5/394: mkdir d4/d50/d35/d40/d96/d98 0 2026-03-09T17:29:34.739 INFO:tasks.workunit.client.0.vm06.stdout:9/428: truncate d3/d11/f2a 3266160 0 2026-03-09T17:29:34.741 INFO:tasks.workunit.client.0.vm06.stdout:5/395: dread d4/d22/d46/f58 [0,4194304] 0 2026-03-09T17:29:34.741 INFO:tasks.workunit.client.0.vm06.stdout:5/396: dread - d4/d7e/f8b zero size 2026-03-09T17:29:34.742 INFO:tasks.workunit.client.0.vm06.stdout:0/477: sync 2026-03-09T17:29:34.742 INFO:tasks.workunit.client.0.vm06.stdout:5/397: write d4/f71 [22970,29985] 0 2026-03-09T17:29:34.748 INFO:tasks.workunit.client.0.vm06.stdout:9/429: link d3/c3e d3/d15/d16/c86 0 2026-03-09T17:29:34.749 INFO:tasks.workunit.client.0.vm06.stdout:5/398: mknod d4/d50/d35/d40/d95/c99 0 2026-03-09T17:29:34.749 INFO:tasks.workunit.client.0.vm06.stdout:9/430: chown d3/d11/d65/f7c 223 1 2026-03-09T17:29:34.755 INFO:tasks.workunit.client.0.vm06.stdout:9/431: unlink d3/d15/f58 0 2026-03-09T17:29:34.758 INFO:tasks.workunit.client.0.vm06.stdout:0/478: dread d7/d11/f30 [0,4194304] 0 2026-03-09T17:29:34.760 INFO:tasks.workunit.client.0.vm06.stdout:9/432: link d3/d15/d36/d4d/f60 d3/d11/f87 0 2026-03-09T17:29:34.763 INFO:tasks.workunit.client.0.vm06.stdout:0/479: dwrite d7/d11/d19/d1d/f8a [0,4194304] 0 2026-03-09T17:29:34.764 INFO:tasks.workunit.client.0.vm06.stdout:5/399: sync 2026-03-09T17:29:34.764 INFO:tasks.workunit.client.0.vm06.stdout:9/433: chown d3/c10 11332196 1 2026-03-09T17:29:34.765 INFO:tasks.workunit.client.0.vm06.stdout:5/400: chown d4/d50/d18/d3d/f44 0 1 2026-03-09T17:29:34.766 INFO:tasks.workunit.client.0.vm06.stdout:5/401: write d4/d50/d18/f4b [3578339,38677] 0 2026-03-09T17:29:34.768 INFO:tasks.workunit.client.0.vm06.stdout:9/434: mknod d3/d26/d6c/d68/c88 0 2026-03-09T17:29:34.770 INFO:tasks.workunit.client.0.vm06.stdout:0/480: dwrite d7/d11/d19/d1d/d39/f4a [4194304,4194304] 0 2026-03-09T17:29:34.781 
INFO:tasks.workunit.client.0.vm06.stdout:2/360: rmdir d3 39 2026-03-09T17:29:34.786 INFO:tasks.workunit.client.0.vm06.stdout:9/435: sync 2026-03-09T17:29:34.790 INFO:tasks.workunit.client.0.vm06.stdout:5/402: rmdir d4/d52/d79 0 2026-03-09T17:29:34.793 INFO:tasks.workunit.client.0.vm06.stdout:9/436: sync 2026-03-09T17:29:34.794 INFO:tasks.workunit.client.0.vm06.stdout:2/361: getdents d3/d4/d12/d71 0 2026-03-09T17:29:34.797 INFO:tasks.workunit.client.0.vm06.stdout:2/362: dread - d3/d4/d12/f2e zero size 2026-03-09T17:29:34.798 INFO:tasks.workunit.client.0.vm06.stdout:2/363: chown d3/d4/d12/d2b/d36/d37/c4e 45991 1 2026-03-09T17:29:34.803 INFO:tasks.workunit.client.0.vm06.stdout:9/437: creat d3/d6d/d85/f89 x:0 0 0 2026-03-09T17:29:34.806 INFO:tasks.workunit.client.0.vm06.stdout:2/364: sync 2026-03-09T17:29:34.825 INFO:tasks.workunit.client.0.vm06.stdout:8/362: dwrite f12 [0,4194304] 0 2026-03-09T17:29:34.836 INFO:tasks.workunit.client.0.vm06.stdout:8/363: mknod d15/d16/d19/d2b/c79 0 2026-03-09T17:29:34.839 INFO:tasks.workunit.client.0.vm06.stdout:8/364: symlink d15/d16/d6d/l7a 0 2026-03-09T17:29:34.840 INFO:tasks.workunit.client.0.vm06.stdout:8/365: creat d15/d39/f7b x:0 0 0 2026-03-09T17:29:34.844 INFO:tasks.workunit.client.0.vm06.stdout:6/345: write d6/d47/f49 [675175,69420] 0 2026-03-09T17:29:34.846 INFO:tasks.workunit.client.0.vm06.stdout:3/405: truncate dd/d19/f2b 3469670 0 2026-03-09T17:29:34.846 INFO:tasks.workunit.client.0.vm06.stdout:3/406: write f7 [11844924,101845] 0 2026-03-09T17:29:34.855 INFO:tasks.workunit.client.0.vm06.stdout:3/407: rmdir dd/d19/d25 39 2026-03-09T17:29:34.860 INFO:tasks.workunit.client.0.vm06.stdout:3/408: rmdir dd/d19/d25/d48 39 2026-03-09T17:29:34.862 INFO:tasks.workunit.client.0.vm06.stdout:3/409: unlink dd/d19/d1e/f23 0 2026-03-09T17:29:34.864 INFO:tasks.workunit.client.0.vm06.stdout:3/410: creat dd/d1d/d4e/f7d x:0 0 0 2026-03-09T17:29:34.868 INFO:tasks.workunit.client.0.vm06.stdout:3/411: creat dd/d1d/d2e/d67/f7e x:0 0 0 
2026-03-09T17:29:34.894 INFO:tasks.workunit.client.0.vm06.stdout:2/365: dread d3/d4/d12/f66 [0,4194304] 0 2026-03-09T17:29:34.897 INFO:tasks.workunit.client.0.vm06.stdout:4/425: dwrite db/d1d/d21/f42 [0,4194304] 0 2026-03-09T17:29:34.902 INFO:tasks.workunit.client.0.vm06.stdout:2/366: dwrite d3/d4/d38/f58 [0,4194304] 0 2026-03-09T17:29:34.910 INFO:tasks.workunit.client.0.vm06.stdout:7/439: dwrite d5/d12/f35 [4194304,4194304] 0 2026-03-09T17:29:34.926 INFO:tasks.workunit.client.0.vm06.stdout:2/367: link d3/d4/d12/d2b/d2d/c63 d3/d4/d12/d2b/d2d/c75 0 2026-03-09T17:29:34.929 INFO:tasks.workunit.client.0.vm06.stdout:4/426: sync 2026-03-09T17:29:34.937 INFO:tasks.workunit.client.0.vm06.stdout:4/427: creat db/d1d/d21/fa1 x:0 0 0 2026-03-09T17:29:34.938 INFO:tasks.workunit.client.0.vm06.stdout:4/428: write db/df/f83 [13532508,12158] 0 2026-03-09T17:29:34.942 INFO:tasks.workunit.client.0.vm06.stdout:0/481: rename d7/d11/d19/d1d/d59 to d7/d11/d2d/daf 0 2026-03-09T17:29:34.942 INFO:tasks.workunit.client.0.vm06.stdout:7/440: rename d5 to d5/d12/d5f/d76 22 2026-03-09T17:29:34.950 INFO:tasks.workunit.client.0.vm06.stdout:4/429: mknod db/d59/ca2 0 2026-03-09T17:29:34.950 INFO:tasks.workunit.client.0.vm06.stdout:0/482: rmdir d7 39 2026-03-09T17:29:34.950 INFO:tasks.workunit.client.0.vm06.stdout:7/441: creat d5/d12/d64/f77 x:0 0 0 2026-03-09T17:29:34.953 INFO:tasks.workunit.client.0.vm06.stdout:2/368: rename d3/d4/d12/d2b/d2d/c57 to d3/d4/d22/c76 0 2026-03-09T17:29:34.964 INFO:tasks.workunit.client.0.vm06.stdout:2/369: mkdir d3/d4/d22/d43/d77 0 2026-03-09T17:29:34.967 INFO:tasks.workunit.client.0.vm06.stdout:4/430: rmdir db/d1d/d21/d37/d69/d78/d8f 0 2026-03-09T17:29:34.970 INFO:tasks.workunit.client.0.vm06.stdout:2/370: truncate d3/d4/d12/f1e 3304967 0 2026-03-09T17:29:34.975 INFO:tasks.workunit.client.0.vm06.stdout:7/442: link d5/l1e d5/d1f/d34/l78 0 2026-03-09T17:29:34.983 INFO:tasks.workunit.client.0.vm06.stdout:7/443: mkdir d5/dd/d79 0 2026-03-09T17:29:34.991 
INFO:tasks.workunit.client.0.vm06.stdout:5/403: write d4/f5 [7797549,1463] 0 2026-03-09T17:29:34.991 INFO:tasks.workunit.client.0.vm06.stdout:5/404: readlink d4/d50/d35/d40/d6f/l87 0 2026-03-09T17:29:34.992 INFO:tasks.workunit.client.0.vm06.stdout:5/405: write d4/d22/d64/f7d [812248,72227] 0 2026-03-09T17:29:34.996 INFO:tasks.workunit.client.0.vm06.stdout:7/444: sync 2026-03-09T17:29:34.996 INFO:tasks.workunit.client.0.vm06.stdout:5/406: sync 2026-03-09T17:29:35.000 INFO:tasks.workunit.client.0.vm06.stdout:9/438: dwrite d3/f21 [4194304,4194304] 0 2026-03-09T17:29:35.001 INFO:tasks.workunit.client.0.vm06.stdout:9/439: stat d3/d15/d36/l53 0 2026-03-09T17:29:35.002 INFO:tasks.workunit.client.0.vm06.stdout:9/440: readlink d3/d15/d16/l20 0 2026-03-09T17:29:35.002 INFO:tasks.workunit.client.0.vm06.stdout:9/441: dread - d3/d15/d48/f64 zero size 2026-03-09T17:29:35.004 INFO:tasks.workunit.client.0.vm06.stdout:5/407: truncate d4/fd 484613 0 2026-03-09T17:29:35.004 INFO:tasks.workunit.client.0.vm06.stdout:1/442: dread d11/d14/f59 [0,4194304] 0 2026-03-09T17:29:35.007 INFO:tasks.workunit.client.0.vm06.stdout:7/445: rename d5/dd/c4d to d5/d1f/d34/d3f/c7a 0 2026-03-09T17:29:35.010 INFO:tasks.workunit.client.0.vm06.stdout:5/408: dwrite d4/d22/d64/f70 [4194304,4194304] 0 2026-03-09T17:29:35.026 INFO:tasks.workunit.client.0.vm06.stdout:1/443: creat d11/d14/d1c/d1f/d57/d7b/f93 x:0 0 0 2026-03-09T17:29:35.027 INFO:tasks.workunit.client.0.vm06.stdout:1/444: chown d11/d14/d1c/d1f/f21 348 1 2026-03-09T17:29:35.028 INFO:tasks.workunit.client.0.vm06.stdout:1/445: fsync d11/d14/d1d/d1e/d2a/d34/d58/f6a 0 2026-03-09T17:29:35.029 INFO:tasks.workunit.client.0.vm06.stdout:7/446: readlink d5/l23 0 2026-03-09T17:29:35.035 INFO:tasks.workunit.client.0.vm06.stdout:9/442: mkdir d3/d15/d36/d4c/d6a/d8a 0 2026-03-09T17:29:35.038 INFO:tasks.workunit.client.0.vm06.stdout:1/446: mkdir d11/d14/d1d/d94 0 2026-03-09T17:29:35.042 INFO:tasks.workunit.client.0.vm06.stdout:1/447: dwrite d11/d14/d1c/d1f/f7f 
[0,4194304] 0 2026-03-09T17:29:35.046 INFO:tasks.workunit.client.0.vm06.stdout:7/447: fsync d5/d12/f2c 0 2026-03-09T17:29:35.051 INFO:tasks.workunit.client.0.vm06.stdout:5/409: mknod d4/d50/c9a 0 2026-03-09T17:29:35.057 INFO:tasks.workunit.client.0.vm06.stdout:8/366: write d15/d16/d19/d2b/f63 [966091,74677] 0 2026-03-09T17:29:35.058 INFO:tasks.workunit.client.0.vm06.stdout:9/443: truncate d3/d15/f2e 79295 0 2026-03-09T17:29:35.069 INFO:tasks.workunit.client.0.vm06.stdout:7/448: rename d5/d7/f1d to d5/d1f/d34/d46/d51/f7b 0 2026-03-09T17:29:35.071 INFO:tasks.workunit.client.0.vm06.stdout:6/346: write d6/d12/d17/d27/f37 [6849522,103357] 0 2026-03-09T17:29:35.076 INFO:tasks.workunit.client.0.vm06.stdout:9/444: creat d3/d15/d36/d4c/d6a/f8b x:0 0 0 2026-03-09T17:29:35.078 INFO:tasks.workunit.client.0.vm06.stdout:8/367: dread d15/f3e [0,4194304] 0 2026-03-09T17:29:35.080 INFO:tasks.workunit.client.0.vm06.stdout:8/368: write d15/d16/f66 [185503,110440] 0 2026-03-09T17:29:35.082 INFO:tasks.workunit.client.0.vm06.stdout:9/445: dwrite d3/d15/d36/d4c/f5a [4194304,4194304] 0 2026-03-09T17:29:35.083 INFO:tasks.workunit.client.0.vm06.stdout:9/446: truncate d3/d26/d6c/f5b 310815 0 2026-03-09T17:29:35.097 INFO:tasks.workunit.client.0.vm06.stdout:1/448: link d11/d14/f17 d11/d14/d1d/d94/f95 0 2026-03-09T17:29:35.104 INFO:tasks.workunit.client.0.vm06.stdout:7/449: truncate d5/d7/d2b/f42 3799944 0 2026-03-09T17:29:35.110 INFO:tasks.workunit.client.0.vm06.stdout:8/369: unlink d15/d16/l37 0 2026-03-09T17:29:35.110 INFO:tasks.workunit.client.0.vm06.stdout:8/370: dread - d15/d16/d6d/f74 zero size 2026-03-09T17:29:35.116 INFO:tasks.workunit.client.0.vm06.stdout:7/450: creat d5/d1f/d34/d46/d51/f7c x:0 0 0 2026-03-09T17:29:35.117 INFO:tasks.workunit.client.0.vm06.stdout:7/451: truncate d5/d1f/f74 1041821 0 2026-03-09T17:29:35.118 INFO:tasks.workunit.client.0.vm06.stdout:3/412: dread dd/d1d/f45 [0,4194304] 0 2026-03-09T17:29:35.124 INFO:tasks.workunit.client.0.vm06.stdout:7/452: dwrite 
d5/d12/f6c [0,4194304] 0 2026-03-09T17:29:35.127 INFO:tasks.workunit.client.0.vm06.stdout:5/410: link d4/d50/c91 d4/d22/d64/c9b 0 2026-03-09T17:29:35.128 INFO:tasks.workunit.client.0.vm06.stdout:5/411: chown d4/d50/d18/d3d/l7f 6 1 2026-03-09T17:29:35.130 INFO:tasks.workunit.client.0.vm06.stdout:8/371: mkdir d15/d16/d1a/d7c 0 2026-03-09T17:29:35.133 INFO:tasks.workunit.client.0.vm06.stdout:1/449: mkdir d11/d14/d1d/d1e/d96 0 2026-03-09T17:29:35.139 INFO:tasks.workunit.client.0.vm06.stdout:7/453: read d5/f18 [4220896,62492] 0 2026-03-09T17:29:35.148 INFO:tasks.workunit.client.0.vm06.stdout:0/483: dwrite d7/d11/f75 [0,4194304] 0 2026-03-09T17:29:35.154 INFO:tasks.workunit.client.0.vm06.stdout:0/484: dwrite d7/d11/d19/d1d/f8a [4194304,4194304] 0 2026-03-09T17:29:35.157 INFO:tasks.workunit.client.0.vm06.stdout:1/450: dread d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0 2026-03-09T17:29:35.163 INFO:tasks.workunit.client.0.vm06.stdout:1/451: dwrite d11/d14/d1d/f90 [0,4194304] 0 2026-03-09T17:29:35.167 INFO:tasks.workunit.client.0.vm06.stdout:1/452: stat d11/d14/d1d/d42/d46 0 2026-03-09T17:29:35.178 INFO:tasks.workunit.client.0.vm06.stdout:7/454: creat d5/dd/f7d x:0 0 0 2026-03-09T17:29:35.191 INFO:tasks.workunit.client.0.vm06.stdout:7/455: chown d5/d1f/c37 20380 1 2026-03-09T17:29:35.191 INFO:tasks.workunit.client.0.vm06.stdout:0/485: unlink d7/d11/d2d/f78 0 2026-03-09T17:29:35.196 INFO:tasks.workunit.client.0.vm06.stdout:4/431: dwrite db/d1d/d21/d26/f70 [0,4194304] 0 2026-03-09T17:29:35.196 INFO:tasks.workunit.client.0.vm06.stdout:5/412: link d4/d22/d64/c9b d4/d50/d35/d40/c9c 0 2026-03-09T17:29:35.197 INFO:tasks.workunit.client.0.vm06.stdout:5/413: chown d4/d50/d35/c37 3535 1 2026-03-09T17:29:35.199 INFO:tasks.workunit.client.0.vm06.stdout:9/447: chown d3/d11/f2a 108 1 2026-03-09T17:29:35.200 INFO:tasks.workunit.client.0.vm06.stdout:1/453: symlink d11/d14/d1c/d5f/l97 0 2026-03-09T17:29:35.203 INFO:tasks.workunit.client.0.vm06.stdout:7/456: mkdir d5/d7/d7e 0 2026-03-09T17:29:35.209 
INFO:tasks.workunit.client.0.vm06.stdout:2/371: write d3/d4/f70 [594382,71691] 0 2026-03-09T17:29:35.210 INFO:tasks.workunit.client.0.vm06.stdout:2/372: stat d3/d44 0 2026-03-09T17:29:35.216 INFO:tasks.workunit.client.0.vm06.stdout:8/372: creat d15/f7d x:0 0 0 2026-03-09T17:29:35.223 INFO:tasks.workunit.client.0.vm06.stdout:5/414: dread d4/d50/f61 [0,4194304] 0 2026-03-09T17:29:35.223 INFO:tasks.workunit.client.0.vm06.stdout:5/415: readlink d4/d50/d18/l7b 0 2026-03-09T17:29:35.224 INFO:tasks.workunit.client.0.vm06.stdout:5/416: chown d4/d22/d46/f59 256970760 1 2026-03-09T17:29:35.268 INFO:tasks.workunit.client.0.vm06.stdout:6/347: truncate d6/d4f/f33 1047266 0 2026-03-09T17:29:35.276 INFO:tasks.workunit.client.0.vm06.stdout:4/432: rename db/df/f36 to db/d1d/d21/d44/d8a/fa3 0 2026-03-09T17:29:35.277 INFO:tasks.workunit.client.0.vm06.stdout:4/433: write db/d1d/d21/d25/f38 [3704127,102748] 0 2026-03-09T17:29:35.280 INFO:tasks.workunit.client.0.vm06.stdout:3/413: truncate dd/d5b/d65/f6a 3909549 0 2026-03-09T17:29:35.284 INFO:tasks.workunit.client.0.vm06.stdout:9/448: creat d3/d2c/f8c x:0 0 0 2026-03-09T17:29:35.284 INFO:tasks.workunit.client.0.vm06.stdout:9/449: dread - d3/d15/f5e zero size 2026-03-09T17:29:35.285 INFO:tasks.workunit.client.0.vm06.stdout:9/450: fsync d3/d11/d65/f7c 0 2026-03-09T17:29:35.285 INFO:tasks.workunit.client.0.vm06.stdout:9/451: readlink d3/d15/l1e 0 2026-03-09T17:29:35.294 INFO:tasks.workunit.client.0.vm06.stdout:6/348: mkdir d6/d47/d4d/d6d 0 2026-03-09T17:29:35.301 INFO:tasks.workunit.client.0.vm06.stdout:1/454: rename d11/d14/d1d/c3f to d11/d14/d1d/d1e/d2a/c98 0 2026-03-09T17:29:35.319 INFO:tasks.workunit.client.0.vm06.stdout:6/349: creat d6/d47/d4d/f6e x:0 0 0 2026-03-09T17:29:35.327 INFO:tasks.workunit.client.0.vm06.stdout:6/350: dread d6/d4f/f26 [0,4194304] 0 2026-03-09T17:29:35.338 INFO:tasks.workunit.client.0.vm06.stdout:1/455: dread d11/d14/d1c/d1f/f68 [0,4194304] 0 2026-03-09T17:29:35.345 
INFO:tasks.workunit.client.0.vm06.stdout:9/452: mknod d3/d15/d36/d4c/d6a/d8a/c8d 0 2026-03-09T17:29:35.348 INFO:tasks.workunit.client.0.vm06.stdout:9/453: fsync d3/d15/d16/f72 0 2026-03-09T17:29:35.349 INFO:tasks.workunit.client.0.vm06.stdout:9/454: read d3/d6d/f78 [26384,53249] 0 2026-03-09T17:29:35.350 INFO:tasks.workunit.client.0.vm06.stdout:9/455: dwrite d3/d26/f76 [0,4194304] 0 2026-03-09T17:29:35.352 INFO:tasks.workunit.client.0.vm06.stdout:9/456: stat d3/d15/d36/d4d/f60 0 2026-03-09T17:29:35.384 INFO:tasks.workunit.client.0.vm06.stdout:8/373: write d15/d16/f51 [3296923,48914] 0 2026-03-09T17:29:35.391 INFO:tasks.workunit.client.0.vm06.stdout:1/456: mkdir d11/d14/d1d/d1e/d2a/d99 0 2026-03-09T17:29:35.391 INFO:tasks.workunit.client.0.vm06.stdout:0/486: truncate d7/d11/d19/d1d/d39/f4a 5089078 0 2026-03-09T17:29:35.396 INFO:tasks.workunit.client.0.vm06.stdout:4/434: write db/d1d/d21/f9f [113401,115628] 0 2026-03-09T17:29:35.398 INFO:tasks.workunit.client.0.vm06.stdout:2/373: dread d3/d4/d12/f1e [0,4194304] 0 2026-03-09T17:29:35.401 INFO:tasks.workunit.client.0.vm06.stdout:7/457: dwrite d5/dd/f29 [0,4194304] 0 2026-03-09T17:29:35.402 INFO:tasks.workunit.client.0.vm06.stdout:7/458: stat d5/d1f/d34/d3f/f5b 0 2026-03-09T17:29:35.402 INFO:tasks.workunit.client.0.vm06.stdout:7/459: stat d5/d12/c4f 0 2026-03-09T17:29:35.402 INFO:tasks.workunit.client.0.vm06.stdout:5/417: link d4/d50/d18/f48 d4/d50/d18/f9d 0 2026-03-09T17:29:35.411 INFO:tasks.workunit.client.0.vm06.stdout:6/351: mkdir d6/d47/d6f 0 2026-03-09T17:29:35.413 INFO:tasks.workunit.client.0.vm06.stdout:3/414: getdents dd/d19/d25 0 2026-03-09T17:29:35.415 INFO:tasks.workunit.client.0.vm06.stdout:8/374: fsync d15/f3e 0 2026-03-09T17:29:35.428 INFO:tasks.workunit.client.0.vm06.stdout:8/375: chown f7 7 1 2026-03-09T17:29:35.429 INFO:tasks.workunit.client.0.vm06.stdout:0/487: read d7/d11/d19/d1d/f40 [194137,100454] 0 2026-03-09T17:29:35.429 INFO:tasks.workunit.client.0.vm06.stdout:0/488: dread d7/d11/d5d/d64/f6b 
[0,4194304] 0 2026-03-09T17:29:35.429 INFO:tasks.workunit.client.0.vm06.stdout:9/457: fsync d3/d26/d35/f6f 0 2026-03-09T17:29:35.429 INFO:tasks.workunit.client.0.vm06.stdout:9/458: truncate d3/d15/d16/f7d 931979 0 2026-03-09T17:29:35.429 INFO:tasks.workunit.client.0.vm06.stdout:4/435: readlink db/d1d/d21/d44/l6c 0 2026-03-09T17:29:35.430 INFO:tasks.workunit.client.0.vm06.stdout:5/418: creat d4/d50/d35/d40/d6f/f9e x:0 0 0 2026-03-09T17:29:35.431 INFO:tasks.workunit.client.0.vm06.stdout:6/352: mknod d6/d12/d17/d27/c70 0 2026-03-09T17:29:35.436 INFO:tasks.workunit.client.0.vm06.stdout:1/457: creat d11/d14/d1d/d42/d46/d92/f9a x:0 0 0 2026-03-09T17:29:35.437 INFO:tasks.workunit.client.0.vm06.stdout:1/458: write d11/d14/d1c/d1f/f7f [813842,26307] 0 2026-03-09T17:29:35.440 INFO:tasks.workunit.client.0.vm06.stdout:8/376: dread d15/d16/f3f [0,4194304] 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:8/377: write d15/d16/d1a/d47/f76 [1196920,29368] 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:7/460: mkdir d5/dd/d79/d7f 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:5/419: creat d4/d22/d64/f9f x:0 0 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:5/420: dwrite d4/d50/d18/f74 [0,4194304] 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:5/421: symlink d4/d22/d46/la0 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:1/459: unlink d11/d14/d1c/d1f/d57/c5e 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:8/378: link d15/d16/d1a/f22 d15/d16/d1a/d47/f7e 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:0/489: link d7/fe d7/d11/d89/d99/fb0 0 2026-03-09T17:29:35.473 INFO:tasks.workunit.client.0.vm06.stdout:9/459: link d3/c79 d3/d11/d65/c8e 0 2026-03-09T17:29:35.474 INFO:tasks.workunit.client.0.vm06.stdout:8/379: rename d15/d16/d19/c25 to d15/d16/d1e/d28/d5e/c7f 0 2026-03-09T17:29:35.474 INFO:tasks.workunit.client.0.vm06.stdout:0/490: dread 
d7/d11/d19/d1d/f40 [0,4194304] 0 2026-03-09T17:29:35.477 INFO:tasks.workunit.client.0.vm06.stdout:8/380: dread d15/d16/f23 [0,4194304] 0 2026-03-09T17:29:35.478 INFO:tasks.workunit.client.0.vm06.stdout:9/460: fdatasync d3/d26/f29 0 2026-03-09T17:29:35.479 INFO:tasks.workunit.client.0.vm06.stdout:9/461: read - d3/d26/d6c/d68/f7f zero size 2026-03-09T17:29:35.481 INFO:tasks.workunit.client.0.vm06.stdout:5/422: link d4/d52/f8a d4/d50/d35/d40/d95/fa1 0 2026-03-09T17:29:35.481 INFO:tasks.workunit.client.0.vm06.stdout:5/423: chown d4/d50 42595 1 2026-03-09T17:29:35.483 INFO:tasks.workunit.client.0.vm06.stdout:6/353: link d6/d12/d17/d27/c6a d6/d12/d17/c71 0 2026-03-09T17:29:35.489 INFO:tasks.workunit.client.0.vm06.stdout:9/462: mknod d3/d26/d6c/c8f 0 2026-03-09T17:29:35.490 INFO:tasks.workunit.client.0.vm06.stdout:9/463: write d3/d26/d6c/f75 [582027,117631] 0 2026-03-09T17:29:35.493 INFO:tasks.workunit.client.0.vm06.stdout:5/424: creat d4/d50/d18/fa2 x:0 0 0 2026-03-09T17:29:35.493 INFO:tasks.workunit.client.0.vm06.stdout:9/464: dread d3/f21 [4194304,4194304] 0 2026-03-09T17:29:35.510 INFO:tasks.workunit.client.0.vm06.stdout:0/491: link d7/d11/d19/d1d/f40 d7/fb1 0 2026-03-09T17:29:35.511 INFO:tasks.workunit.client.0.vm06.stdout:0/492: chown d7/d11/d19/f24 3415214 1 2026-03-09T17:29:35.514 INFO:tasks.workunit.client.0.vm06.stdout:8/381: getdents d15/d16/d1e/d28 0 2026-03-09T17:29:35.515 INFO:tasks.workunit.client.0.vm06.stdout:0/493: dwrite d7/d11/d19/d8b/da4/fab [0,4194304] 0 2026-03-09T17:29:35.517 INFO:tasks.workunit.client.0.vm06.stdout:8/382: creat d15/d16/d19/d71/f80 x:0 0 0 2026-03-09T17:29:35.519 INFO:tasks.workunit.client.0.vm06.stdout:8/383: dread - d15/d39/f6f zero size 2026-03-09T17:29:35.519 INFO:tasks.workunit.client.0.vm06.stdout:8/384: chown d15/d39/d67/d77 13716 1 2026-03-09T17:29:35.530 INFO:tasks.workunit.client.0.vm06.stdout:8/385: dwrite d15/d16/f3f [0,4194304] 0 2026-03-09T17:29:35.576 INFO:tasks.workunit.client.0.vm06.stdout:8/386: dread 
d15/d16/d19/f61 [0,4194304] 0 2026-03-09T17:29:35.576 INFO:tasks.workunit.client.0.vm06.stdout:8/387: read d15/d39/f40 [709584,117463] 0 2026-03-09T17:29:35.688 INFO:tasks.workunit.client.0.vm06.stdout:4/436: sync 2026-03-09T17:29:35.696 INFO:tasks.workunit.client.0.vm06.stdout:4/437: symlink db/d1d/d21/d44/d8a/la4 0 2026-03-09T17:29:35.697 INFO:tasks.workunit.client.0.vm06.stdout:4/438: creat db/d1d/d21/fa5 x:0 0 0 2026-03-09T17:29:35.700 INFO:tasks.workunit.client.0.vm06.stdout:4/439: getdents db/d1d/d21/d88 0 2026-03-09T17:29:35.708 INFO:tasks.workunit.client.0.vm06.stdout:4/440: truncate db/d59/f76 429109 0 2026-03-09T17:29:35.713 INFO:tasks.workunit.client.0.vm06.stdout:4/441: mknod db/d1d/d21/d88/ca6 0 2026-03-09T17:29:35.722 INFO:tasks.workunit.client.0.vm06.stdout:4/442: creat db/d1d/d21/d44/d8a/fa7 x:0 0 0 2026-03-09T17:29:35.727 INFO:tasks.workunit.client.0.vm06.stdout:4/443: dread - db/d59/d5f/d6d/f7b zero size 2026-03-09T17:29:35.729 INFO:tasks.workunit.client.0.vm06.stdout:2/374: write d3/d4/d38/f50 [2298244,76797] 0 2026-03-09T17:29:35.730 INFO:tasks.workunit.client.0.vm06.stdout:2/375: chown d3/d4/d12/d2b/f32 7716 1 2026-03-09T17:29:35.733 INFO:tasks.workunit.client.0.vm06.stdout:3/415: write dd/d19/d28/f32 [590082,111127] 0 2026-03-09T17:29:35.737 INFO:tasks.workunit.client.0.vm06.stdout:4/444: mknod db/d59/d5f/d6d/ca8 0 2026-03-09T17:29:35.740 INFO:tasks.workunit.client.0.vm06.stdout:2/376: mknod d3/d4/d12/d2b/d36/d37/c78 0 2026-03-09T17:29:35.740 INFO:tasks.workunit.client.0.vm06.stdout:2/377: readlink l1 0 2026-03-09T17:29:35.743 INFO:tasks.workunit.client.0.vm06.stdout:3/416: rename dd/l6b to dd/d19/d1e/l7f 0 2026-03-09T17:29:35.745 INFO:tasks.workunit.client.0.vm06.stdout:7/461: write d5/f18 [811993,44900] 0 2026-03-09T17:29:35.748 INFO:tasks.workunit.client.0.vm06.stdout:2/378: mknod d3/d4/d38/d64/c79 0 2026-03-09T17:29:35.751 INFO:tasks.workunit.client.0.vm06.stdout:2/379: dread d3/d4/d22/f2f [0,4194304] 0 2026-03-09T17:29:35.751 
INFO:tasks.workunit.client.0.vm06.stdout:3/417: mkdir dd/d19/d25/d44/d80 0 2026-03-09T17:29:35.752 INFO:tasks.workunit.client.0.vm06.stdout:3/418: truncate dd/f68 1218736 0 2026-03-09T17:29:35.763 INFO:tasks.workunit.client.0.vm06.stdout:1/460: dwrite d11/d14/d1c/d1f/f45 [0,4194304] 0 2026-03-09T17:29:35.766 INFO:tasks.workunit.client.0.vm06.stdout:7/462: rename d5/d12/d5f/f66 to d5/d1f/f80 0 2026-03-09T17:29:35.788 INFO:tasks.workunit.client.0.vm06.stdout:2/380: rename l1 to d3/d4/d38/d64/l7a 0 2026-03-09T17:29:35.796 INFO:tasks.workunit.client.0.vm06.stdout:5/425: write d4/d50/d18/f9d [2245918,93433] 0 2026-03-09T17:29:35.797 INFO:tasks.workunit.client.0.vm06.stdout:5/426: write d4/f7 [3714611,127387] 0 2026-03-09T17:29:35.797 INFO:tasks.workunit.client.0.vm06.stdout:5/427: stat d4/d22/c47 0 2026-03-09T17:29:35.799 INFO:tasks.workunit.client.0.vm06.stdout:9/465: write d3/d11/f1f [4206633,105539] 0 2026-03-09T17:29:35.799 INFO:tasks.workunit.client.0.vm06.stdout:9/466: rename d3 to d3/d15/d36/d83/d90 22 2026-03-09T17:29:35.799 INFO:tasks.workunit.client.0.vm06.stdout:9/467: dread - d3/d2c/f81 zero size 2026-03-09T17:29:35.801 INFO:tasks.workunit.client.0.vm06.stdout:6/354: truncate d6/d12/d17/f29 2824720 0 2026-03-09T17:29:35.815 INFO:tasks.workunit.client.0.vm06.stdout:0/494: dwrite d7/d11/d19/d23/f8e [0,4194304] 0 2026-03-09T17:29:35.826 INFO:tasks.workunit.client.0.vm06.stdout:3/419: getdents dd/d19/d28 0 2026-03-09T17:29:35.827 INFO:tasks.workunit.client.0.vm06.stdout:2/381: creat d3/d4/d12/f7b x:0 0 0 2026-03-09T17:29:35.832 INFO:tasks.workunit.client.0.vm06.stdout:2/382: dwrite d3/d4/d12/d2b/d36/d37/f41 [0,4194304] 0 2026-03-09T17:29:35.836 INFO:tasks.workunit.client.0.vm06.stdout:9/468: mknod d3/d26/c91 0 2026-03-09T17:29:35.836 INFO:tasks.workunit.client.0.vm06.stdout:6/355: creat d6/d12/d17/d65/f72 x:0 0 0 2026-03-09T17:29:35.837 INFO:tasks.workunit.client.0.vm06.stdout:9/469: dread d3/d6d/f78 [0,4194304] 0 2026-03-09T17:29:35.838 
INFO:tasks.workunit.client.0.vm06.stdout:3/420: dread dd/f1b [0,4194304] 0 2026-03-09T17:29:35.838 INFO:tasks.workunit.client.0.vm06.stdout:3/421: truncate dd/d5c/f66 761030 0 2026-03-09T17:29:35.851 INFO:tasks.workunit.client.0.vm06.stdout:8/388: write f7 [265583,92940] 0 2026-03-09T17:29:35.852 INFO:tasks.workunit.client.0.vm06.stdout:8/389: chown d15/d16/f66 0 1 2026-03-09T17:29:35.853 INFO:tasks.workunit.client.0.vm06.stdout:8/390: truncate d15/d16/d19/d2b/f63 1472392 0 2026-03-09T17:29:35.855 INFO:tasks.workunit.client.0.vm06.stdout:2/383: chown d3/c59 11060425 1 2026-03-09T17:29:35.869 INFO:tasks.workunit.client.0.vm06.stdout:5/428: creat d4/d50/fa3 x:0 0 0 2026-03-09T17:29:35.869 INFO:tasks.workunit.client.0.vm06.stdout:0/495: mkdir d7/d11/d89/da8/db2 0 2026-03-09T17:29:35.869 INFO:tasks.workunit.client.0.vm06.stdout:0/496: chown d7/d11/d19/l1a 1 1 2026-03-09T17:29:35.870 INFO:tasks.workunit.client.0.vm06.stdout:2/384: dwrite d3/d4/f70 [0,4194304] 0 2026-03-09T17:29:35.872 INFO:tasks.workunit.client.0.vm06.stdout:2/385: fsync d3/d4/d22/f4b 0 2026-03-09T17:29:35.884 INFO:tasks.workunit.client.0.vm06.stdout:6/356: mkdir d6/d4f/d73 0 2026-03-09T17:29:35.887 INFO:tasks.workunit.client.0.vm06.stdout:9/470: symlink d3/l92 0 2026-03-09T17:29:35.890 INFO:tasks.workunit.client.0.vm06.stdout:3/422: mkdir dd/d81 0 2026-03-09T17:29:35.891 INFO:tasks.workunit.client.0.vm06.stdout:9/471: dwrite d3/d11/f1f [4194304,4194304] 0 2026-03-09T17:29:35.898 INFO:tasks.workunit.client.0.vm06.stdout:8/391: creat d15/d16/d19/d3d/d5f/f81 x:0 0 0 2026-03-09T17:29:35.908 INFO:tasks.workunit.client.0.vm06.stdout:2/386: mkdir d3/d4/d46/d7c 0 2026-03-09T17:29:35.908 INFO:tasks.workunit.client.0.vm06.stdout:5/429: mkdir d4/da4 0 2026-03-09T17:29:35.908 INFO:tasks.workunit.client.0.vm06.stdout:6/357: unlink d6/d4f/d3e/d52/c63 0 2026-03-09T17:29:35.918 INFO:tasks.workunit.client.0.vm06.stdout:6/358: creat d6/d4f/d3e/d52/f74 x:0 0 0 2026-03-09T17:29:35.927 
INFO:tasks.workunit.client.0.vm06.stdout:6/359: rename d6/d12/d17/c71 to d6/d47/d4d/d6d/c75 0 2026-03-09T17:29:35.927 INFO:tasks.workunit.client.0.vm06.stdout:5/430: getdents d4/d50/d35/d40/d95 0 2026-03-09T17:29:35.927 INFO:tasks.workunit.client.0.vm06.stdout:5/431: chown d4/d50/d35/c53 0 1 2026-03-09T17:29:35.927 INFO:tasks.workunit.client.0.vm06.stdout:5/432: fsync d4/d50/f24 0 2026-03-09T17:29:35.928 INFO:tasks.workunit.client.0.vm06.stdout:5/433: dread d4/d22/d46/f58 [0,4194304] 0 2026-03-09T17:29:35.929 INFO:tasks.workunit.client.0.vm06.stdout:5/434: dread d4/f5e [0,4194304] 0 2026-03-09T17:29:35.930 INFO:tasks.workunit.client.0.vm06.stdout:5/435: fsync d4/d52/f6c 0 2026-03-09T17:29:35.932 INFO:tasks.workunit.client.0.vm06.stdout:6/360: unlink d6/f46 0 2026-03-09T17:29:35.934 INFO:tasks.workunit.client.0.vm06.stdout:5/436: creat d4/d52/fa5 x:0 0 0 2026-03-09T17:29:35.939 INFO:tasks.workunit.client.0.vm06.stdout:3/423: dread dd/d19/f2b [0,4194304] 0 2026-03-09T17:29:35.944 INFO:tasks.workunit.client.0.vm06.stdout:6/361: creat d6/d12/f76 x:0 0 0 2026-03-09T17:29:35.944 INFO:tasks.workunit.client.0.vm06.stdout:6/362: stat d6/d47/d4d/d6d 0 2026-03-09T17:29:35.946 INFO:tasks.workunit.client.0.vm06.stdout:5/437: rename d4/d50/d35/d75 to d4/d50/d18/da6 0 2026-03-09T17:29:35.950 INFO:tasks.workunit.client.0.vm06.stdout:0/497: sync 2026-03-09T17:29:35.950 INFO:tasks.workunit.client.0.vm06.stdout:3/424: symlink dd/d19/d25/l82 0 2026-03-09T17:29:35.950 INFO:tasks.workunit.client.0.vm06.stdout:2/387: sync 2026-03-09T17:29:35.950 INFO:tasks.workunit.client.0.vm06.stdout:3/425: chown dd/d5b 1821642 1 2026-03-09T17:29:35.951 INFO:tasks.workunit.client.0.vm06.stdout:2/388: fsync d3/d4/f70 0 2026-03-09T17:29:35.951 INFO:tasks.workunit.client.0.vm06.stdout:3/426: dread - dd/d19/d2c/f79 zero size 2026-03-09T17:29:35.952 INFO:tasks.workunit.client.0.vm06.stdout:3/427: chown dd/l17 394 1 2026-03-09T17:29:35.953 INFO:tasks.workunit.client.0.vm06.stdout:3/428: dread - 
dd/d1d/d2e/d67/f7e zero size 2026-03-09T17:29:35.954 INFO:tasks.workunit.client.0.vm06.stdout:2/389: dread d3/d4/d12/f35 [0,4194304] 0 2026-03-09T17:29:35.958 INFO:tasks.workunit.client.0.vm06.stdout:5/438: creat d4/d50/d18/d3d/fa7 x:0 0 0 2026-03-09T17:29:35.959 INFO:tasks.workunit.client.0.vm06.stdout:0/498: fdatasync d7/d11/d19/d23/f97 0 2026-03-09T17:29:35.963 INFO:tasks.workunit.client.0.vm06.stdout:0/499: dwrite d7/d11/d19/d1d/d39/f7d [0,4194304] 0 2026-03-09T17:29:35.965 INFO:tasks.workunit.client.0.vm06.stdout:0/500: readlink d7/d11/d2d/l98 0 2026-03-09T17:29:35.966 INFO:tasks.workunit.client.0.vm06.stdout:0/501: write d7/f56 [4582177,1364] 0 2026-03-09T17:29:35.967 INFO:tasks.workunit.client.0.vm06.stdout:0/502: read d7/d11/d19/d1d/d39/f7d [4005815,18874] 0 2026-03-09T17:29:35.971 INFO:tasks.workunit.client.0.vm06.stdout:4/445: dwrite db/d1d/d21/d25/d4b/f66 [0,4194304] 0 2026-03-09T17:29:35.974 INFO:tasks.workunit.client.0.vm06.stdout:4/446: dread - db/f6f zero size 2026-03-09T17:29:35.993 INFO:tasks.workunit.client.0.vm06.stdout:0/503: creat d7/d11/d19/d1d/fb3 x:0 0 0 2026-03-09T17:29:35.995 INFO:tasks.workunit.client.0.vm06.stdout:3/429: creat dd/d59/f83 x:0 0 0 2026-03-09T17:29:36.002 INFO:tasks.workunit.client.0.vm06.stdout:7/463: truncate d5/d12/f6c 3999689 0 2026-03-09T17:29:36.041 INFO:tasks.workunit.client.0.vm06.stdout:2/390: mknod d3/d4/d12/d2b/d2d/c7d 0 2026-03-09T17:29:36.057 INFO:tasks.workunit.client.0.vm06.stdout:0/504: dread d7/d11/d19/d23/f49 [0,4194304] 0 2026-03-09T17:29:36.060 INFO:tasks.workunit.client.0.vm06.stdout:7/464: creat d5/d12/d5f/f81 x:0 0 0 2026-03-09T17:29:36.067 INFO:tasks.workunit.client.0.vm06.stdout:2/391: dread d3/f29 [0,4194304] 0 2026-03-09T17:29:36.068 INFO:tasks.workunit.client.0.vm06.stdout:2/392: chown d3/d4/d12/d2b/d36/l45 1 1 2026-03-09T17:29:36.069 INFO:tasks.workunit.client.0.vm06.stdout:2/393: fsync d3/d4/d12/f7b 0 2026-03-09T17:29:36.076 INFO:tasks.workunit.client.0.vm06.stdout:3/430: link dd/f68 dd/d59/f84 
0 2026-03-09T17:29:36.082 INFO:tasks.workunit.client.0.vm06.stdout:9/472: dwrite d3/d15/d16/f5c [4194304,4194304] 0 2026-03-09T17:29:36.084 INFO:tasks.workunit.client.0.vm06.stdout:9/473: readlink d3/d15/d16/l82 0 2026-03-09T17:29:36.094 INFO:tasks.workunit.client.0.vm06.stdout:8/392: truncate d15/d16/d19/d3d/f6a 1283741 0 2026-03-09T17:29:36.097 INFO:tasks.workunit.client.0.vm06.stdout:8/393: dread d15/d16/f3f [0,4194304] 0 2026-03-09T17:29:36.103 INFO:tasks.workunit.client.0.vm06.stdout:2/394: creat d3/d4/d12/d2b/f7e x:0 0 0 2026-03-09T17:29:36.112 INFO:tasks.workunit.client.0.vm06.stdout:9/474: mknod d3/d26/d35/c93 0 2026-03-09T17:29:36.113 INFO:tasks.workunit.client.0.vm06.stdout:8/394: creat d15/d16/d19/d71/f82 x:0 0 0 2026-03-09T17:29:36.134 INFO:tasks.workunit.client.0.vm06.stdout:0/505: getdents d7/d11/d2d/daf/d5b 0 2026-03-09T17:29:36.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:35 vm06.local ceph-mon[57307]: pgmap v151: 65 pgs: 65 active+clean; 1003 MiB data, 3.8 GiB used, 116 GiB / 120 GiB avail; 20 MiB/s rd, 99 MiB/s wr, 270 op/s 2026-03-09T17:29:36.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:35 vm09.local ceph-mon[62061]: pgmap v151: 65 pgs: 65 active+clean; 1003 MiB data, 3.8 GiB used, 116 GiB / 120 GiB avail; 20 MiB/s rd, 99 MiB/s wr, 270 op/s 2026-03-09T17:29:36.151 INFO:tasks.workunit.client.0.vm06.stdout:0/506: link d7/fb1 d7/d11/d19/d37/fb4 0 2026-03-09T17:29:36.157 INFO:tasks.workunit.client.0.vm06.stdout:0/507: unlink d7/d11/d89/la2 0 2026-03-09T17:29:36.161 INFO:tasks.workunit.client.0.vm06.stdout:0/508: dwrite d7/d11/d19/d1d/d87/f92 [0,4194304] 0 2026-03-09T17:29:36.166 INFO:tasks.workunit.client.0.vm06.stdout:0/509: dwrite d7/d11/d19/d1d/d39/f51 [0,4194304] 0 2026-03-09T17:29:36.183 INFO:tasks.workunit.client.0.vm06.stdout:0/510: dread d7/d11/d2d/f3a [0,4194304] 0 2026-03-09T17:29:36.184 INFO:tasks.workunit.client.0.vm06.stdout:0/511: readlink d7/d11/d19/d1d/d87/la1 0 2026-03-09T17:29:36.186 
INFO:tasks.workunit.client.0.vm06.stdout:0/512: read d7/f2a [3724084,95938] 0 2026-03-09T17:29:36.197 INFO:tasks.workunit.client.0.vm06.stdout:6/363: rename d6/d12/d17/c60 to d6/d47/c77 0 2026-03-09T17:29:36.214 INFO:tasks.workunit.client.0.vm06.stdout:5/439: dwrite f0 [0,4194304] 0 2026-03-09T17:29:36.228 INFO:tasks.workunit.client.0.vm06.stdout:1/461: dwrite d11/d14/d1d/d1e/d2a/d34/f3b [0,4194304] 0 2026-03-09T17:29:36.242 INFO:tasks.workunit.client.0.vm06.stdout:2/395: rename d3/d4/d12/d2b/d36/d37/c4e to d3/d4/d12/d2b/c7f 0 2026-03-09T17:29:36.244 INFO:tasks.workunit.client.0.vm06.stdout:6/364: unlink d6/d12/c35 0 2026-03-09T17:29:36.249 INFO:tasks.workunit.client.0.vm06.stdout:1/462: dread d11/d14/d1d/d1e/d2a/d34/f3b [0,4194304] 0 2026-03-09T17:29:36.255 INFO:tasks.workunit.client.0.vm06.stdout:2/396: symlink d3/d4/d46/l80 0 2026-03-09T17:29:36.265 INFO:tasks.workunit.client.0.vm06.stdout:2/397: dread d3/d4/d12/d2b/d2d/f1b [0,4194304] 0 2026-03-09T17:29:36.271 INFO:tasks.workunit.client.0.vm06.stdout:7/465: dwrite d5/d1f/d34/f41 [0,4194304] 0 2026-03-09T17:29:36.272 INFO:tasks.workunit.client.0.vm06.stdout:7/466: chown d5/dd/f1a 45159 1 2026-03-09T17:29:36.273 INFO:tasks.workunit.client.0.vm06.stdout:7/467: chown d5/d7/d2b 7 1 2026-03-09T17:29:36.280 INFO:tasks.workunit.client.0.vm06.stdout:1/463: dread d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:29:36.295 INFO:tasks.workunit.client.0.vm06.stdout:7/468: mknod d5/d12/d5f/c82 0 2026-03-09T17:29:36.302 INFO:tasks.workunit.client.0.vm06.stdout:6/365: rename d6/d12/f31 to d6/d12/d17/f78 0 2026-03-09T17:29:36.306 INFO:tasks.workunit.client.0.vm06.stdout:7/469: write d5/d7/d2b/f59 [797009,73299] 0 2026-03-09T17:29:36.315 INFO:tasks.workunit.client.0.vm06.stdout:9/475: dwrite d3/d15/f23 [0,4194304] 0 2026-03-09T17:29:36.315 INFO:tasks.workunit.client.0.vm06.stdout:9/476: fsync d3/d15/f1a 0 2026-03-09T17:29:36.316 INFO:tasks.workunit.client.0.vm06.stdout:9/477: write d3/d15/f17 [6160164,5368] 0 
2026-03-09T17:29:36.317 INFO:tasks.workunit.client.0.vm06.stdout:9/478: chown d3/d15/d36/d4c/d6a 1972 1 2026-03-09T17:29:36.320 INFO:tasks.workunit.client.0.vm06.stdout:8/395: dwrite fe [0,4194304] 0 2026-03-09T17:29:36.326 INFO:tasks.workunit.client.0.vm06.stdout:7/470: unlink d5/d1f/f3a 0 2026-03-09T17:29:36.335 INFO:tasks.workunit.client.0.vm06.stdout:3/431: dwrite dd/d19/d25/f4f [0,4194304] 0 2026-03-09T17:29:36.336 INFO:tasks.workunit.client.0.vm06.stdout:3/432: chown dd/d1d/c27 89 1 2026-03-09T17:29:36.345 INFO:tasks.workunit.client.0.vm06.stdout:8/396: mkdir d15/d16/d19/d3d/d5f/d83 0 2026-03-09T17:29:36.349 INFO:tasks.workunit.client.0.vm06.stdout:7/471: write d5/d1f/d34/d46/d51/f7b [3408468,66085] 0 2026-03-09T17:29:36.351 INFO:tasks.workunit.client.0.vm06.stdout:1/464: getdents d11/d14/d1c/d1f/d57/d7b 0 2026-03-09T17:29:36.353 INFO:tasks.workunit.client.0.vm06.stdout:8/397: symlink d15/d16/d19/d3d/d5f/l84 0 2026-03-09T17:29:36.355 INFO:tasks.workunit.client.0.vm06.stdout:6/366: getdents d6 0 2026-03-09T17:29:36.355 INFO:tasks.workunit.client.0.vm06.stdout:6/367: truncate d6/d12/d53/f64 170594 0 2026-03-09T17:29:36.357 INFO:tasks.workunit.client.0.vm06.stdout:3/433: rmdir dd/d19/d25/d44 39 2026-03-09T17:29:36.358 INFO:tasks.workunit.client.0.vm06.stdout:1/465: fdatasync d11/d14/d1d/f56 0 2026-03-09T17:29:36.359 INFO:tasks.workunit.client.0.vm06.stdout:8/398: mkdir d15/d16/d19/d2b/d85 0 2026-03-09T17:29:36.359 INFO:tasks.workunit.client.0.vm06.stdout:6/368: symlink d6/d12/d17/d27/l79 0 2026-03-09T17:29:36.365 INFO:tasks.workunit.client.0.vm06.stdout:3/434: mknod dd/d81/c85 0 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:3/435: chown dd/d19/c7a 25 1 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:3/436: mknod dd/d1d/d4e/c86 0 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:9/479: dread d3/d26/f29 [0,4194304] 0 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:3/437: symlink dd/d1d/d4e/l87 0 
2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:9/480: mknod d3/d15/d48/c94 0 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:8/399: dread d15/d39/f4b [0,4194304] 0 2026-03-09T17:29:36.376 INFO:tasks.workunit.client.0.vm06.stdout:8/400: read - d15/d39/f7b zero size 2026-03-09T17:29:36.377 INFO:tasks.workunit.client.0.vm06.stdout:9/481: read d3/d15/d36/d4c/f5a [2752334,61470] 0 2026-03-09T17:29:36.382 INFO:tasks.workunit.client.0.vm06.stdout:8/401: mkdir d15/d39/d67/d86 0 2026-03-09T17:29:36.382 INFO:tasks.workunit.client.0.vm06.stdout:8/402: fdatasync d15/f7d 0 2026-03-09T17:29:36.386 INFO:tasks.workunit.client.0.vm06.stdout:8/403: dwrite d15/d16/d19/d71/f80 [0,4194304] 0 2026-03-09T17:29:36.389 INFO:tasks.workunit.client.0.vm06.stdout:9/482: fdatasync d3/f5f 0 2026-03-09T17:29:36.397 INFO:tasks.workunit.client.0.vm06.stdout:0/513: write d7/fb [3020776,95229] 0 2026-03-09T17:29:36.405 INFO:tasks.workunit.client.0.vm06.stdout:0/514: creat d7/d11/d19/d1d/fb5 x:0 0 0 2026-03-09T17:29:36.406 INFO:tasks.workunit.client.0.vm06.stdout:0/515: readlink d7/d11/d19/d1d/d39/l7b 0 2026-03-09T17:29:36.409 INFO:tasks.workunit.client.0.vm06.stdout:0/516: dwrite d7/d11/d19/d1d/f4c [0,4194304] 0 2026-03-09T17:29:36.411 INFO:tasks.workunit.client.0.vm06.stdout:8/404: creat d15/d16/f87 x:0 0 0 2026-03-09T17:29:36.414 INFO:tasks.workunit.client.0.vm06.stdout:0/517: read d7/d11/d19/d37/fb4 [68961,94574] 0 2026-03-09T17:29:36.416 INFO:tasks.workunit.client.0.vm06.stdout:0/518: mknod d7/d11/d19/cb6 0 2026-03-09T17:29:36.418 INFO:tasks.workunit.client.0.vm06.stdout:8/405: symlink d15/d16/d1a/d7c/l88 0 2026-03-09T17:29:36.418 INFO:tasks.workunit.client.0.vm06.stdout:8/406: readlink d15/d16/d6d/l7a 0 2026-03-09T17:29:36.419 INFO:tasks.workunit.client.0.vm06.stdout:8/407: dread - d15/d39/f6f zero size 2026-03-09T17:29:36.419 INFO:tasks.workunit.client.0.vm06.stdout:8/408: chown d15/d31/d58 9739 1 2026-03-09T17:29:36.420 
INFO:tasks.workunit.client.0.vm06.stdout:8/409: write d15/d16/d1a/d47/f76 [1988645,37662] 0 2026-03-09T17:29:36.423 INFO:tasks.workunit.client.0.vm06.stdout:8/410: creat d15/d16/d6d/f89 x:0 0 0 2026-03-09T17:29:36.424 INFO:tasks.workunit.client.0.vm06.stdout:0/519: truncate d7/fe 4781280 0 2026-03-09T17:29:36.426 INFO:tasks.workunit.client.0.vm06.stdout:8/411: symlink d15/l8a 0 2026-03-09T17:29:36.426 INFO:tasks.workunit.client.0.vm06.stdout:8/412: write d15/d16/f51 [1676625,53218] 0 2026-03-09T17:29:36.428 INFO:tasks.workunit.client.0.vm06.stdout:0/520: mkdir d7/d11/d19/d23/db7 0 2026-03-09T17:29:36.432 INFO:tasks.workunit.client.0.vm06.stdout:8/413: truncate d15/d16/d19/f4f 88452 0 2026-03-09T17:29:36.432 INFO:tasks.workunit.client.0.vm06.stdout:8/414: readlink d15/d16/d1e/d30/l60 0 2026-03-09T17:29:36.433 INFO:tasks.workunit.client.0.vm06.stdout:8/415: chown d15/d16/d1e/d30/c4a 161 1 2026-03-09T17:29:36.436 INFO:tasks.workunit.client.0.vm06.stdout:8/416: dwrite d15/d39/f6f [0,4194304] 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:0/521: fdatasync d7/d11/d19/d23/f49 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:4/447: creat db/d59/d5f/d45/fa9 x:0 0 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:4/448: chown db/d1d/d21/d26/c32 0 1 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:4/449: unlink db/d59/d5f/d6d/f96 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:0/522: dread d7/f14 [0,4194304] 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:8/417: creat d15/d39/d3c/d6c/f8b x:0 0 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:8/418: readlink d15/d16/d19/d71/l75 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:8/419: creat d15/d16/d1e/f8c x:0 0 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:8/420: chown d15/d39/d3c/f5d 48510357 1 2026-03-09T17:29:36.473 
INFO:tasks.workunit.client.0.vm06.stdout:8/421: dwrite d15/d16/d6d/f74 [0,4194304] 0 2026-03-09T17:29:36.473 INFO:tasks.workunit.client.0.vm06.stdout:8/422: readlink d15/d16/d19/d71/l75 0 2026-03-09T17:29:36.481 INFO:tasks.workunit.client.0.vm06.stdout:0/523: dread d7/f56 [0,4194304] 0 2026-03-09T17:29:36.485 INFO:tasks.workunit.client.0.vm06.stdout:0/524: dread d7/d11/d5d/d64/f6b [0,4194304] 0 2026-03-09T17:29:36.487 INFO:tasks.workunit.client.0.vm06.stdout:0/525: read d7/d11/f20 [2479062,90346] 0 2026-03-09T17:29:36.645 INFO:tasks.workunit.client.0.vm06.stdout:9/483: symlink d3/l95 0 2026-03-09T17:29:36.646 INFO:tasks.workunit.client.0.vm06.stdout:9/484: write d3/d6d/d85/f89 [784697,55816] 0 2026-03-09T17:29:36.647 INFO:tasks.workunit.client.0.vm06.stdout:9/485: chown d3/d15/d16/l20 1999257 1 2026-03-09T17:29:36.647 INFO:tasks.workunit.client.0.vm06.stdout:9/486: chown d3/d11/f59 9 1 2026-03-09T17:29:36.649 INFO:tasks.workunit.client.0.vm06.stdout:8/423: mknod d15/d16/d19/d3d/c8d 0 2026-03-09T17:29:36.649 INFO:tasks.workunit.client.0.vm06.stdout:8/424: dread - d15/d39/d3c/d6c/f8b zero size 2026-03-09T17:29:36.653 INFO:tasks.workunit.client.0.vm06.stdout:9/487: symlink d3/d26/d6c/d68/l96 0 2026-03-09T17:29:36.656 INFO:tasks.workunit.client.0.vm06.stdout:9/488: unlink d3/d26/d6c/d68/l96 0 2026-03-09T17:29:36.656 INFO:tasks.workunit.client.0.vm06.stdout:9/489: chown d3/f5f 716623 1 2026-03-09T17:29:36.657 INFO:tasks.workunit.client.0.vm06.stdout:9/490: chown d3/d15/f74 0 1 2026-03-09T17:29:36.660 INFO:tasks.workunit.client.0.vm06.stdout:6/369: getdents d6/d12/d17 0 2026-03-09T17:29:36.660 INFO:tasks.workunit.client.0.vm06.stdout:8/425: dread d15/d16/d1e/f64 [4194304,4194304] 0 2026-03-09T17:29:36.661 INFO:tasks.workunit.client.0.vm06.stdout:5/440: creat d4/d50/d18/fa8 x:0 0 0 2026-03-09T17:29:36.662 INFO:tasks.workunit.client.0.vm06.stdout:6/370: truncate d6/d4f/d3e/d52/f74 195930 0 2026-03-09T17:29:36.662 INFO:tasks.workunit.client.0.vm06.stdout:6/371: chown d6/f56 
714240 1 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:2/398: rename d3/d4/d38 to d3/d4/d22/d43/d77/d81 0 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:2/399: chown d3/d4/d22/f2f 12041 1 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:5/441: readlink d4/d50/l4c 0 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:6/372: creat d6/d12/d17/f7a x:0 0 0 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:7/472: rename d5/d12/d64/l65 to d5/dd/l83 0 2026-03-09T17:29:36.669 INFO:tasks.workunit.client.0.vm06.stdout:5/442: mknod d4/d50/d35/d40/d6f/ca9 0 2026-03-09T17:29:36.670 INFO:tasks.workunit.client.0.vm06.stdout:7/473: dread d5/f71 [0,4194304] 0 2026-03-09T17:29:36.672 INFO:tasks.workunit.client.0.vm06.stdout:4/450: rename db/df/c46 to db/d1d/d21/d37/d69/d78/da0/caa 0 2026-03-09T17:29:36.676 INFO:tasks.workunit.client.0.vm06.stdout:7/474: symlink d5/d7/d2b/l84 0 2026-03-09T17:29:36.684 INFO:tasks.workunit.client.0.vm06.stdout:7/475: chown d5/d7/d2b 3 1 2026-03-09T17:29:36.684 INFO:tasks.workunit.client.0.vm06.stdout:0/526: rename d7/d11/d2d/daf/d5b to d7/d11/d5d/db8 0 2026-03-09T17:29:36.684 INFO:tasks.workunit.client.0.vm06.stdout:9/491: rename d3/d15/d36/l52 to d3/d15/d48/l97 0 2026-03-09T17:29:36.684 INFO:tasks.workunit.client.0.vm06.stdout:0/527: mkdir d7/d11/d19/d3c/db9 0 2026-03-09T17:29:36.685 INFO:tasks.workunit.client.0.vm06.stdout:4/451: mkdir db/d1d/d21/d26/d89/dab 0 2026-03-09T17:29:36.689 INFO:tasks.workunit.client.0.vm06.stdout:8/426: rename d15/d39/d3c/c72 to d15/d16/d1e/d30/d55/c8e 0 2026-03-09T17:29:36.691 INFO:tasks.workunit.client.0.vm06.stdout:4/452: symlink db/d59/d5f/lac 0 2026-03-09T17:29:36.695 INFO:tasks.workunit.client.0.vm06.stdout:4/453: chown db/d59/d5f/d45/f61 5 1 2026-03-09T17:29:36.695 INFO:tasks.workunit.client.0.vm06.stdout:7/476: rmdir d5/d7/d7e 0 2026-03-09T17:29:36.698 INFO:tasks.workunit.client.0.vm06.stdout:7/477: dread d5/d7/d2b/f52 [0,4194304] 0 
2026-03-09T17:29:36.700 INFO:tasks.workunit.client.0.vm06.stdout:7/478: write d5/d12/d64/d6b/f6f [715824,83023] 0 2026-03-09T17:29:36.701 INFO:tasks.workunit.client.0.vm06.stdout:1/466: write d11/d14/d1d/d94/f95 [3586432,46917] 0 2026-03-09T17:29:36.708 INFO:tasks.workunit.client.0.vm06.stdout:4/454: symlink db/d1d/d21/d25/d4b/lad 0 2026-03-09T17:29:36.711 INFO:tasks.workunit.client.0.vm06.stdout:7/479: creat d5/d1f/d34/d46/d51/f85 x:0 0 0 2026-03-09T17:29:36.717 INFO:tasks.workunit.client.0.vm06.stdout:7/480: fdatasync d5/d1f/d34/f41 0 2026-03-09T17:29:36.717 INFO:tasks.workunit.client.0.vm06.stdout:2/400: rename d3/d4/d12/d2b/d36/d37/c65 to d3/d4/d22/c82 0 2026-03-09T17:29:36.717 INFO:tasks.workunit.client.0.vm06.stdout:2/401: stat d3/d4/d12/d2b 0 2026-03-09T17:29:36.720 INFO:tasks.workunit.client.0.vm06.stdout:8/427: mkdir d15/d16/d1e/d8f 0 2026-03-09T17:29:36.724 INFO:tasks.workunit.client.0.vm06.stdout:4/455: dread f6 [0,4194304] 0 2026-03-09T17:29:36.725 INFO:tasks.workunit.client.0.vm06.stdout:4/456: readlink db/d1d/d21/d44/l50 0 2026-03-09T17:29:36.729 INFO:tasks.workunit.client.0.vm06.stdout:3/438: truncate dd/d19/d1e/f41 2558506 0 2026-03-09T17:29:36.734 INFO:tasks.workunit.client.0.vm06.stdout:4/457: fsync db/d59/d5f/d45/f8e 0 2026-03-09T17:29:36.737 INFO:tasks.workunit.client.0.vm06.stdout:7/481: unlink d5/d12/f6c 0 2026-03-09T17:29:36.741 INFO:tasks.workunit.client.0.vm06.stdout:7/482: creat d5/d12/d64/d6b/f86 x:0 0 0 2026-03-09T17:29:36.742 INFO:tasks.workunit.client.0.vm06.stdout:4/458: mkdir db/d1d/d21/d26/d89/dab/dae 0 2026-03-09T17:29:36.746 INFO:tasks.workunit.client.0.vm06.stdout:3/439: rename dd/d5c/f60 to dd/d19/d25/d44/f88 0 2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:7/483: fsync d5/dd/f19 0 2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:7/484: write d5/d1f/d34/d46/d51/f6e [71556,115422] 0 2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:4/459: truncate db/d59/d5f/d45/f61 823386 0 
2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:4/460: dwrite db/fc [0,4194304] 0 2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:0/528: fsync d7/d11/d89/d99/fb0 0 2026-03-09T17:29:36.754 INFO:tasks.workunit.client.0.vm06.stdout:4/461: chown db/d1d/d21/d25/d4b/f66 1374 1 2026-03-09T17:29:36.780 INFO:tasks.workunit.client.0.vm06.stdout:7/485: rename d5/d7/d2b/f59 to d5/d7/f87 0 2026-03-09T17:29:36.789 INFO:tasks.workunit.client.0.vm06.stdout:0/529: mknod d7/d11/d89/cba 0 2026-03-09T17:29:36.789 INFO:tasks.workunit.client.0.vm06.stdout:0/530: chown d7/d11/d5d/c73 769005730 1 2026-03-09T17:29:36.793 INFO:tasks.workunit.client.0.vm06.stdout:7/486: mknod d5/d1f/d34/d46/c88 0 2026-03-09T17:29:36.797 INFO:tasks.workunit.client.0.vm06.stdout:7/487: write d5/dd/f7d [6745,3661] 0 2026-03-09T17:29:36.798 INFO:tasks.workunit.client.0.vm06.stdout:4/462: symlink db/d59/d90/laf 0 2026-03-09T17:29:36.801 INFO:tasks.workunit.client.0.vm06.stdout:0/531: rmdir d7/d11/d19/d8b/da4 39 2026-03-09T17:29:36.803 INFO:tasks.workunit.client.0.vm06.stdout:3/440: link dd/d19/c7a dd/d59/c89 0 2026-03-09T17:29:36.804 INFO:tasks.workunit.client.0.vm06.stdout:3/441: dread - dd/d59/f83 zero size 2026-03-09T17:29:36.807 INFO:tasks.workunit.client.0.vm06.stdout:7/488: creat d5/d1f/d34/d46/f89 x:0 0 0 2026-03-09T17:29:36.810 INFO:tasks.workunit.client.0.vm06.stdout:7/489: fdatasync d5/d1f/d34/d46/d51/f7b 0 2026-03-09T17:29:36.811 INFO:tasks.workunit.client.0.vm06.stdout:4/463: fdatasync db/d59/d5f/d6d/f7b 0 2026-03-09T17:29:36.812 INFO:tasks.workunit.client.0.vm06.stdout:6/373: dwrite d6/d12/d2d/f39 [0,4194304] 0 2026-03-09T17:29:36.816 INFO:tasks.workunit.client.0.vm06.stdout:6/374: read d6/f56 [4400064,36516] 0 2026-03-09T17:29:36.825 INFO:tasks.workunit.client.0.vm06.stdout:3/442: symlink dd/d59/l8a 0 2026-03-09T17:29:36.825 INFO:tasks.workunit.client.0.vm06.stdout:3/443: write dd/d19/d1e/f3f [178454,3502] 0 2026-03-09T17:29:36.831 
INFO:tasks.workunit.client.0.vm06.stdout:5/443: truncate d4/d50/d18/f48 375669 0 2026-03-09T17:29:36.832 INFO:tasks.workunit.client.0.vm06.stdout:5/444: fsync d4/f71 0 2026-03-09T17:29:36.839 INFO:tasks.workunit.client.0.vm06.stdout:4/464: read db/d1d/d21/d44/d8a/fa3 [125810,42607] 0 2026-03-09T17:29:36.842 INFO:tasks.workunit.client.0.vm06.stdout:0/532: symlink d7/d11/d89/da8/db2/lbb 0 2026-03-09T17:29:36.842 INFO:tasks.workunit.client.0.vm06.stdout:6/375: creat d6/f7b x:0 0 0 2026-03-09T17:29:36.846 INFO:tasks.workunit.client.0.vm06.stdout:9/492: dwrite d3/d11/f2a [0,4194304] 0 2026-03-09T17:29:36.855 INFO:tasks.workunit.client.0.vm06.stdout:7/490: unlink d5/c33 0 2026-03-09T17:29:36.862 INFO:tasks.workunit.client.0.vm06.stdout:1/467: dwrite d11/d14/d1c/d1f/f4c [0,4194304] 0 2026-03-09T17:29:36.868 INFO:tasks.workunit.client.0.vm06.stdout:0/533: write d7/d11/d19/d8b/da4/fab [4889455,58560] 0 2026-03-09T17:29:36.875 INFO:tasks.workunit.client.0.vm06.stdout:9/493: symlink d3/d26/d6c/d68/l98 0 2026-03-09T17:29:36.884 INFO:tasks.workunit.client.0.vm06.stdout:7/491: creat d5/d1f/f8a x:0 0 0 2026-03-09T17:29:36.887 INFO:tasks.workunit.client.0.vm06.stdout:2/402: write d3/d4/d12/d2b/f32 [158729,71121] 0 2026-03-09T17:29:36.888 INFO:tasks.workunit.client.0.vm06.stdout:7/492: dwrite d5/f8 [4194304,4194304] 0 2026-03-09T17:29:36.890 INFO:tasks.workunit.client.0.vm06.stdout:9/494: dread d3/d26/d6c/f5b [0,4194304] 0 2026-03-09T17:29:36.906 INFO:tasks.workunit.client.0.vm06.stdout:2/403: mknod d3/d4/d12/d2b/d36/d37/c83 0 2026-03-09T17:29:36.908 INFO:tasks.workunit.client.0.vm06.stdout:7/493: mkdir d5/d1f/d34/d3f/d8b 0 2026-03-09T17:29:36.908 INFO:tasks.workunit.client.0.vm06.stdout:7/494: dread - d5/d1f/f80 zero size 2026-03-09T17:29:36.909 INFO:tasks.workunit.client.0.vm06.stdout:9/495: fsync d3/f5f 0 2026-03-09T17:29:36.911 INFO:tasks.workunit.client.0.vm06.stdout:1/468: creat d11/d14/d1d/d1e/d96/f9b x:0 0 0 2026-03-09T17:29:36.914 
INFO:tasks.workunit.client.0.vm06.stdout:9/496: dread - d3/d2c/f8c zero size 2026-03-09T17:29:36.914 INFO:tasks.workunit.client.0.vm06.stdout:9/497: chown d3/f21 0 1 2026-03-09T17:29:36.914 INFO:tasks.workunit.client.0.vm06.stdout:7/495: dread d5/d12/f2c [0,4194304] 0 2026-03-09T17:29:36.914 INFO:tasks.workunit.client.0.vm06.stdout:2/404: write f2 [3492190,77140] 0 2026-03-09T17:29:36.919 INFO:tasks.workunit.client.0.vm06.stdout:7/496: dwrite d5/d12/d64/d6b/f86 [0,4194304] 0 2026-03-09T17:29:36.922 INFO:tasks.workunit.client.0.vm06.stdout:7/497: chown d5/d7/c31 8 1 2026-03-09T17:29:36.923 INFO:tasks.workunit.client.0.vm06.stdout:1/469: mknod d11/d14/d1d/d42/c9c 0 2026-03-09T17:29:36.924 INFO:tasks.workunit.client.0.vm06.stdout:1/470: chown d11/d14/d1c/d1f/d57/d7b/f93 258237 1 2026-03-09T17:29:36.925 INFO:tasks.workunit.client.0.vm06.stdout:9/498: dread d3/d26/d6c/f5b [0,4194304] 0 2026-03-09T17:29:36.927 INFO:tasks.workunit.client.0.vm06.stdout:2/405: fdatasync d3/d4/d22/d43/f5f 0 2026-03-09T17:29:36.927 INFO:tasks.workunit.client.0.vm06.stdout:9/499: write d3/d15/f1a [3674533,61819] 0 2026-03-09T17:29:36.929 INFO:tasks.workunit.client.0.vm06.stdout:7/498: dwrite d5/d1f/f74 [0,4194304] 0 2026-03-09T17:29:36.929 INFO:tasks.workunit.client.0.vm06.stdout:9/500: chown d3/d15/l24 5269542 1 2026-03-09T17:29:36.935 INFO:tasks.workunit.client.0.vm06.stdout:9/501: dread d3/d11/d65/f7c [0,4194304] 0 2026-03-09T17:29:36.936 INFO:tasks.workunit.client.0.vm06.stdout:9/502: chown d3/d15/d36/l53 887 1 2026-03-09T17:29:36.938 INFO:tasks.workunit.client.0.vm06.stdout:2/406: rmdir d3/d4/d46/d7c 0 2026-03-09T17:29:36.940 INFO:tasks.workunit.client.0.vm06.stdout:9/503: unlink d3/d15/d36/d4d/c5d 0 2026-03-09T17:29:36.943 INFO:tasks.workunit.client.0.vm06.stdout:7/499: dwrite d5/d1f/d34/f41 [0,4194304] 0 2026-03-09T17:29:36.946 INFO:tasks.workunit.client.0.vm06.stdout:2/407: rename d3/d4/d12/f1e to d3/d4/d22/d43/d77/d81/f84 0 2026-03-09T17:29:36.948 
INFO:tasks.workunit.client.0.vm06.stdout:2/408: chown d3/d4/d22/d72/l5b 312123888 1 2026-03-09T17:29:36.948 INFO:tasks.workunit.client.0.vm06.stdout:9/504: fdatasync d3/d11/f87 0 2026-03-09T17:29:36.949 INFO:tasks.workunit.client.0.vm06.stdout:9/505: chown d3/d26/l45 3 1 2026-03-09T17:29:36.949 INFO:tasks.workunit.client.0.vm06.stdout:7/500: dwrite d5/dd/f5d [0,4194304] 0 2026-03-09T17:29:36.954 INFO:tasks.workunit.client.0.vm06.stdout:7/501: creat d5/d1f/d34/f8c x:0 0 0 2026-03-09T17:29:36.955 INFO:tasks.workunit.client.0.vm06.stdout:7/502: read d5/d12/f32 [261957,22488] 0 2026-03-09T17:29:36.957 INFO:tasks.workunit.client.0.vm06.stdout:9/506: link d3/d15/d16/f31 d3/d26/d35/f99 0 2026-03-09T17:29:36.958 INFO:tasks.workunit.client.0.vm06.stdout:7/503: rename d5/d12/f35 to d5/d12/d64/f8d 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:9/507: mkdir d3/d6d/d9a 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/504: symlink d5/d12/d64/d6b/l8e 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/505: stat d5/d1f/f56 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/506: mknod d5/d1f/d34/c8f 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/507: stat d5/d12/l4a 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/508: symlink d5/d12/d64/l90 0 2026-03-09T17:29:36.969 INFO:tasks.workunit.client.0.vm06.stdout:7/509: mkdir d5/d1f/d34/d3f/d91 0 2026-03-09T17:29:36.983 INFO:tasks.workunit.client.0.vm06.stdout:2/409: dread d3/d4/d12/f66 [0,4194304] 0 2026-03-09T17:29:37.045 INFO:tasks.workunit.client.0.vm06.stdout:8/428: write d15/d16/d19/d3d/f6a [1877901,38666] 0 2026-03-09T17:29:37.045 INFO:tasks.workunit.client.0.vm06.stdout:8/429: chown d15/d39/d67 31162 1 2026-03-09T17:29:37.051 INFO:tasks.workunit.client.0.vm06.stdout:3/444: getdents dd/d59 0 2026-03-09T17:29:37.053 INFO:tasks.workunit.client.0.vm06.stdout:8/430: mknod d15/d16/d1e/d28/d5e/c90 0 
2026-03-09T17:29:37.066 INFO:tasks.workunit.client.0.vm06.stdout:8/431: mknod d15/d16/d1e/d28/c91 0 2026-03-09T17:29:37.066 INFO:tasks.workunit.client.0.vm06.stdout:8/432: readlink d15/d16/d1a/l20 0 2026-03-09T17:29:37.068 INFO:tasks.workunit.client.0.vm06.stdout:8/433: symlink d15/d16/d19/d3d/l92 0 2026-03-09T17:29:37.120 INFO:tasks.workunit.client.0.vm06.stdout:0/534: sync 2026-03-09T17:29:37.127 INFO:tasks.workunit.client.0.vm06.stdout:0/535: truncate d7/d11/d19/f68 23432 0 2026-03-09T17:29:37.127 INFO:tasks.workunit.client.0.vm06.stdout:0/536: creat d7/d11/d19/d8b/da4/d85/fbc x:0 0 0 2026-03-09T17:29:37.127 INFO:tasks.workunit.client.0.vm06.stdout:0/537: read d7/f56 [471415,53272] 0 2026-03-09T17:29:37.128 INFO:tasks.workunit.client.0.vm06.stdout:0/538: rename d7/d11/d5d/d64/d96 to d7/d11/d19/d23/db7/dbd 0 2026-03-09T17:29:37.129 INFO:tasks.workunit.client.0.vm06.stdout:0/539: creat d7/d88/fbe x:0 0 0 2026-03-09T17:29:37.134 INFO:tasks.workunit.client.0.vm06.stdout:0/540: rename d7/d11/d19/d23/f49 to d7/fbf 0 2026-03-09T17:29:37.166 INFO:tasks.workunit.client.0.vm06.stdout:6/376: truncate d6/d4f/f26 4321360 0 2026-03-09T17:29:37.169 INFO:tasks.workunit.client.0.vm06.stdout:4/465: dwrite db/f68 [0,4194304] 0 2026-03-09T17:29:37.175 INFO:tasks.workunit.client.0.vm06.stdout:5/445: write d4/fd [1460208,18068] 0 2026-03-09T17:29:37.178 INFO:tasks.workunit.client.0.vm06.stdout:6/377: mkdir d6/d4f/d3e/d52/d7c 0 2026-03-09T17:29:37.178 INFO:tasks.workunit.client.0.vm06.stdout:6/378: truncate d6/f7b 259665 0 2026-03-09T17:29:37.191 INFO:tasks.workunit.client.0.vm06.stdout:6/379: rename d6/d12/d17/d27/l79 to d6/d4f/d73/l7d 0 2026-03-09T17:29:37.195 INFO:tasks.workunit.client.0.vm06.stdout:6/380: fdatasync d6/d12/d17/d27/f3d 0 2026-03-09T17:29:37.199 INFO:tasks.workunit.client.0.vm06.stdout:6/381: fsync d6/d47/d4d/f6e 0 2026-03-09T17:29:37.216 INFO:tasks.workunit.client.0.vm06.stdout:1/471: sync 2026-03-09T17:29:37.222 INFO:tasks.workunit.client.0.vm06.stdout:1/472: rmdir 
d11/d14/d1c/d1f/d57 39 2026-03-09T17:29:37.226 INFO:tasks.workunit.client.0.vm06.stdout:1/473: dwrite d11/f8d [0,4194304] 0 2026-03-09T17:29:37.229 INFO:tasks.workunit.client.0.vm06.stdout:5/446: sync 2026-03-09T17:29:37.242 INFO:tasks.workunit.client.0.vm06.stdout:1/474: rename d11/d14/d1c/d1f/f86 to d11/d14/d1d/d4a/f9d 0 2026-03-09T17:29:37.254 INFO:tasks.workunit.client.0.vm06.stdout:5/447: read d4/d50/f1e [586907,124957] 0 2026-03-09T17:29:37.254 INFO:tasks.workunit.client.0.vm06.stdout:5/448: stat d4/d50/d35/c53 0 2026-03-09T17:29:37.256 INFO:tasks.workunit.client.0.vm06.stdout:9/508: write d3/d11/f87 [3054963,100375] 0 2026-03-09T17:29:37.261 INFO:tasks.workunit.client.0.vm06.stdout:7/510: dwrite d5/d7/f58 [4194304,4194304] 0 2026-03-09T17:29:37.277 INFO:tasks.workunit.client.0.vm06.stdout:2/410: dwrite d3/d4/d12/d2b/d2d/f2a [0,4194304] 0 2026-03-09T17:29:37.278 INFO:tasks.workunit.client.0.vm06.stdout:2/411: chown d3/c62 762195 1 2026-03-09T17:29:37.279 INFO:tasks.workunit.client.0.vm06.stdout:2/412: truncate d3/d4/d22/f73 468277 0 2026-03-09T17:29:37.283 INFO:tasks.workunit.client.0.vm06.stdout:5/449: symlink d4/d22/laa 0 2026-03-09T17:29:37.285 INFO:tasks.workunit.client.0.vm06.stdout:5/450: truncate d4/d50/d18/f8c 1007865 0 2026-03-09T17:29:37.287 INFO:tasks.workunit.client.0.vm06.stdout:9/509: unlink d3/l95 0 2026-03-09T17:29:37.290 INFO:tasks.workunit.client.0.vm06.stdout:9/510: stat d3/d15/d36/d4d/l63 0 2026-03-09T17:29:37.290 INFO:tasks.workunit.client.0.vm06.stdout:9/511: chown d3/d6d/d85/f89 114 1 2026-03-09T17:29:37.294 INFO:tasks.workunit.client.0.vm06.stdout:7/511: unlink d5/d1f/d34/d46/d51/f85 0 2026-03-09T17:29:37.298 INFO:tasks.workunit.client.0.vm06.stdout:7/512: dwrite d5/d12/d64/d6b/f86 [0,4194304] 0 2026-03-09T17:29:37.308 INFO:tasks.workunit.client.0.vm06.stdout:5/451: rename d4/d50/d35/f97 to d4/d7e/fab 0 2026-03-09T17:29:37.331 INFO:tasks.workunit.client.0.vm06.stdout:9/512: creat d3/d26/d6c/d68/f9b x:0 0 0 2026-03-09T17:29:37.331 
INFO:tasks.workunit.client.0.vm06.stdout:3/445: dwrite dd/f1b [0,4194304] 0 2026-03-09T17:29:37.331 INFO:tasks.workunit.client.0.vm06.stdout:8/434: truncate d15/d16/d1a/d47/f76 22948 0 2026-03-09T17:29:37.331 INFO:tasks.workunit.client.0.vm06.stdout:5/452: fdatasync d4/f5e 0 2026-03-09T17:29:37.331 INFO:tasks.workunit.client.0.vm06.stdout:3/446: stat dd/d59/c5e 0 2026-03-09T17:29:37.335 INFO:tasks.workunit.client.0.vm06.stdout:5/453: fsync d4/d22/f77 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:9/513: mkdir d3/d6d/d9a/d9c 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:3/447: rename dd/f68 to dd/d5c/f8b 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:5/454: mknod d4/d50/d35/d40/d95/cac 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:3/448: creat dd/d1d/f8c x:0 0 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:8/435: creat d15/d16/d19/f93 x:0 0 0 2026-03-09T17:29:37.347 INFO:tasks.workunit.client.0.vm06.stdout:8/436: truncate d15/d16/f87 47353 0 2026-03-09T17:29:37.351 INFO:tasks.workunit.client.0.vm06.stdout:8/437: unlink d15/d16/d6d/f74 0 2026-03-09T17:29:37.358 INFO:tasks.workunit.client.0.vm06.stdout:5/455: unlink d4/d50/f41 0 2026-03-09T17:29:37.358 INFO:tasks.workunit.client.0.vm06.stdout:1/475: dread d11/d14/d1d/d1e/d2a/f43 [0,4194304] 0 2026-03-09T17:29:37.358 INFO:tasks.workunit.client.0.vm06.stdout:1/476: dread d11/d14/d1c/d1f/f45 [0,4194304] 0 2026-03-09T17:29:37.358 INFO:tasks.workunit.client.0.vm06.stdout:1/477: write d11/d14/d1d/d1e/f47 [2687079,2889] 0 2026-03-09T17:29:37.363 INFO:tasks.workunit.client.0.vm06.stdout:8/438: dread d15/d16/d19/f61 [0,4194304] 0 2026-03-09T17:29:37.364 INFO:tasks.workunit.client.0.vm06.stdout:8/439: write d15/d16/d19/d3d/f6a [1019944,56805] 0 2026-03-09T17:29:37.365 INFO:tasks.workunit.client.0.vm06.stdout:8/440: fsync d15/d39/f7b 0 2026-03-09T17:29:37.366 INFO:tasks.workunit.client.0.vm06.stdout:7/513: dread d5/d12/d64/f8d 
[0,4194304] 0 2026-03-09T17:29:37.368 INFO:tasks.workunit.client.0.vm06.stdout:7/514: fdatasync d5/d1f/d34/d46/d51/f6e 0 2026-03-09T17:29:37.373 INFO:tasks.workunit.client.0.vm06.stdout:2/413: dread d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0 2026-03-09T17:29:37.378 INFO:tasks.workunit.client.0.vm06.stdout:9/514: sync 2026-03-09T17:29:37.381 INFO:tasks.workunit.client.0.vm06.stdout:2/414: dwrite d3/d4/d12/d2b/f7e [0,4194304] 0 2026-03-09T17:29:37.387 INFO:tasks.workunit.client.0.vm06.stdout:9/515: creat d3/d2c/f9d x:0 0 0 2026-03-09T17:29:37.389 INFO:tasks.workunit.client.0.vm06.stdout:1/478: symlink d11/d14/d1c/d1f/d57/d7b/l9e 0 2026-03-09T17:29:37.391 INFO:tasks.workunit.client.0.vm06.stdout:5/456: creat d4/d50/fad x:0 0 0 2026-03-09T17:29:37.392 INFO:tasks.workunit.client.0.vm06.stdout:9/516: rmdir d3/d11 39 2026-03-09T17:29:37.393 INFO:tasks.workunit.client.0.vm06.stdout:1/479: rmdir d11/d14/d1d/d4a 39 2026-03-09T17:29:37.394 INFO:tasks.workunit.client.0.vm06.stdout:8/441: rename d15/d16/d1e/d30/l60 to d15/d16/d1e/l94 0 2026-03-09T17:29:37.394 INFO:tasks.workunit.client.0.vm06.stdout:7/515: link d5/dd/f48 d5/d1f/d34/d46/d51/f92 0 2026-03-09T17:29:37.395 INFO:tasks.workunit.client.0.vm06.stdout:7/516: chown d5/d12/d64/f8d 7 1 2026-03-09T17:29:37.396 INFO:tasks.workunit.client.0.vm06.stdout:5/457: mknod d4/d52/cae 0 2026-03-09T17:29:37.397 INFO:tasks.workunit.client.0.vm06.stdout:9/517: creat d3/d6d/f9e x:0 0 0 2026-03-09T17:29:37.397 INFO:tasks.workunit.client.0.vm06.stdout:9/518: write d3/d15/f17 [6607404,87542] 0 2026-03-09T17:29:37.402 INFO:tasks.workunit.client.0.vm06.stdout:8/442: creat d15/d16/d1e/d28/d5e/f95 x:0 0 0 2026-03-09T17:29:37.406 INFO:tasks.workunit.client.0.vm06.stdout:7/517: creat d5/d12/f93 x:0 0 0 2026-03-09T17:29:37.407 INFO:tasks.workunit.client.0.vm06.stdout:5/458: symlink d4/d22/d64/laf 0 2026-03-09T17:29:37.408 INFO:tasks.workunit.client.0.vm06.stdout:5/459: fsync d4/d22/d64/f7d 0 2026-03-09T17:29:37.449 
INFO:tasks.workunit.client.0.vm06.stdout:0/541: dwrite d7/d11/d5d/d64/f6b [4194304,4194304] 0 2026-03-09T17:29:37.449 INFO:tasks.workunit.client.0.vm06.stdout:0/542: write d7/fb [4557359,93161] 0 2026-03-09T17:29:37.450 INFO:tasks.workunit.client.0.vm06.stdout:0/543: chown d7/d11/d19/l1a 2 1 2026-03-09T17:29:37.469 INFO:tasks.workunit.client.0.vm06.stdout:0/544: creat d7/d11/d19/d23/db7/dbd/fc0 x:0 0 0 2026-03-09T17:29:37.470 INFO:tasks.workunit.client.0.vm06.stdout:0/545: chown d7/d11/d19/d8b/da4/c86 1568751 1 2026-03-09T17:29:37.474 INFO:tasks.workunit.client.0.vm06.stdout:4/466: dwrite db/d1d/d21/d37/f81 [0,4194304] 0 2026-03-09T17:29:37.476 INFO:tasks.workunit.client.0.vm06.stdout:4/467: stat db/d1d/d21/d88/ca6 0 2026-03-09T17:29:37.477 INFO:tasks.workunit.client.0.vm06.stdout:4/468: readlink db/d1d/d21/d25/d4b/l5c 0 2026-03-09T17:29:37.478 INFO:tasks.workunit.client.0.vm06.stdout:4/469: truncate db/d1d/d21/fa5 467323 0 2026-03-09T17:29:37.478 INFO:tasks.workunit.client.0.vm06.stdout:4/470: fdatasync db/d59/d5f/d45/fa9 0 2026-03-09T17:29:37.479 INFO:tasks.workunit.client.0.vm06.stdout:4/471: dread - db/d59/d5f/d5d/f62 zero size 2026-03-09T17:29:37.491 INFO:tasks.workunit.client.0.vm06.stdout:0/546: mkdir d7/d11/d19/d23/db7/dbd/dc1 0 2026-03-09T17:29:37.492 INFO:tasks.workunit.client.0.vm06.stdout:0/547: readlink d7/d11/d5d/db8/l71 0 2026-03-09T17:29:37.495 INFO:tasks.workunit.client.0.vm06.stdout:0/548: read d7/fbf [2320722,11178] 0 2026-03-09T17:29:37.496 INFO:tasks.workunit.client.0.vm06.stdout:0/549: dread - d7/d11/d19/d23/db7/dbd/fc0 zero size 2026-03-09T17:29:37.501 INFO:tasks.workunit.client.0.vm06.stdout:6/382: dwrite d6/d12/d17/f29 [0,4194304] 0 2026-03-09T17:29:37.514 INFO:tasks.workunit.client.0.vm06.stdout:6/383: rename d6/f4a to d6/d12/d17/d27/f7e 0 2026-03-09T17:29:37.531 INFO:tasks.workunit.client.0.vm06.stdout:3/449: dwrite dd/d1d/f29 [0,4194304] 0 2026-03-09T17:29:37.544 INFO:tasks.workunit.client.0.vm06.stdout:3/450: mknod dd/d19/d25/d2d/c8d 0 
2026-03-09T17:29:37.544 INFO:tasks.workunit.client.0.vm06.stdout:3/451: write dd/d19/d25/d2d/f71 [198330,5131] 0 2026-03-09T17:29:37.549 INFO:tasks.workunit.client.0.vm06.stdout:3/452: mknod dd/d19/d25/d44/d80/c8e 0 2026-03-09T17:29:37.563 INFO:tasks.workunit.client.0.vm06.stdout:2/415: write d3/d4/f3c [548549,27038] 0 2026-03-09T17:29:37.572 INFO:tasks.workunit.client.0.vm06.stdout:2/416: creat d3/d4/d12/f85 x:0 0 0 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:2/417: rename d3/d4/d22/d72/c55 to d3/d4/d22/d43/d77/d81/d64/d6a/c86 0 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:2/418: rename d3/d44 to d3/d44/d87 22 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:2/419: fdatasync d3/d4/d22/d43/d77/d81/f58 0 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:2/420: rename d3/c59 to d3/d4/d12/d2b/d36/d37/c88 0 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:2/421: unlink d3/d4/d12/f7b 0 2026-03-09T17:29:37.586 INFO:tasks.workunit.client.0.vm06.stdout:1/480: write f7 [2583399,28428] 0 2026-03-09T17:29:37.587 INFO:tasks.workunit.client.0.vm06.stdout:9/519: write d3/d26/d6c/f3a [3844656,129902] 0 2026-03-09T17:29:37.590 INFO:tasks.workunit.client.0.vm06.stdout:8/443: write d15/d16/f24 [55975,83635] 0 2026-03-09T17:29:37.591 INFO:tasks.workunit.client.0.vm06.stdout:9/520: dwrite d3/d2c/f8c [0,4194304] 0 2026-03-09T17:29:37.597 INFO:tasks.workunit.client.0.vm06.stdout:9/521: read - d3/d15/d36/d4c/d6a/f8b zero size 2026-03-09T17:29:37.598 INFO:tasks.workunit.client.0.vm06.stdout:9/522: readlink d3/d15/d36/l53 0 2026-03-09T17:29:37.601 INFO:tasks.workunit.client.0.vm06.stdout:8/444: dread d15/d31/f33 [0,4194304] 0 2026-03-09T17:29:37.602 INFO:tasks.workunit.client.0.vm06.stdout:8/445: readlink d15/d16/d1a/l1d 0 2026-03-09T17:29:37.602 INFO:tasks.workunit.client.0.vm06.stdout:8/446: dread - d15/d16/d6d/f89 zero size 2026-03-09T17:29:37.603 INFO:tasks.workunit.client.0.vm06.stdout:8/447: dread 
d15/d31/f33 [0,4194304] 0 2026-03-09T17:29:37.604 INFO:tasks.workunit.client.0.vm06.stdout:8/448: stat d15/d16/d6d/f89 0 2026-03-09T17:29:37.604 INFO:tasks.workunit.client.0.vm06.stdout:8/449: readlink l1 0 2026-03-09T17:29:37.615 INFO:tasks.workunit.client.0.vm06.stdout:8/450: read d15/d16/f51 [266608,60582] 0 2026-03-09T17:29:37.615 INFO:tasks.workunit.client.0.vm06.stdout:2/422: unlink d3/d4/d12/d2b/d36/d37/c83 0 2026-03-09T17:29:37.615 INFO:tasks.workunit.client.0.vm06.stdout:1/481: truncate d11/d14/d1d/d1e/d2a/f43 5034303 0 2026-03-09T17:29:37.619 INFO:tasks.workunit.client.0.vm06.stdout:7/518: dwrite d5/d1f/d34/f54 [0,4194304] 0 2026-03-09T17:29:37.634 INFO:tasks.workunit.client.0.vm06.stdout:9/523: mkdir d3/d26/d35/d9f 0 2026-03-09T17:29:37.637 INFO:tasks.workunit.client.0.vm06.stdout:2/423: creat d3/d4/d12/d2b/f89 x:0 0 0 2026-03-09T17:29:37.637 INFO:tasks.workunit.client.0.vm06.stdout:2/424: dread - d3/d4/d12/f2e zero size 2026-03-09T17:29:37.638 INFO:tasks.workunit.client.0.vm06.stdout:2/425: write d3/d4/d12/d2b/d2d/f6f [672066,45972] 0 2026-03-09T17:29:37.641 INFO:tasks.workunit.client.0.vm06.stdout:1/482: mknod d11/d14/d1d/d42/c9f 0 2026-03-09T17:29:37.648 INFO:tasks.workunit.client.0.vm06.stdout:9/524: mknod d3/d26/ca0 0 2026-03-09T17:29:37.650 INFO:tasks.workunit.client.0.vm06.stdout:9/525: creat d3/d15/d36/d4d/fa1 x:0 0 0 2026-03-09T17:29:37.652 INFO:tasks.workunit.client.0.vm06.stdout:1/483: link d11/d14/d1c/d1f/d57/l66 d11/d14/d1d/d42/la0 0 2026-03-09T17:29:37.653 INFO:tasks.workunit.client.0.vm06.stdout:9/526: symlink d3/d15/d36/d4c/d6a/d8a/la2 0 2026-03-09T17:29:37.653 INFO:tasks.workunit.client.0.vm06.stdout:9/527: readlink d3/l7 0 2026-03-09T17:29:37.654 INFO:tasks.workunit.client.0.vm06.stdout:1/484: creat d11/d14/d1d/d1e/d2a/d34/d58/fa1 x:0 0 0 2026-03-09T17:29:37.657 INFO:tasks.workunit.client.0.vm06.stdout:1/485: dwrite d11/d14/d1d/f8f [0,4194304] 0 2026-03-09T17:29:37.666 INFO:tasks.workunit.client.0.vm06.stdout:4/472: sync 
2026-03-09T17:29:37.667 INFO:tasks.workunit.client.0.vm06.stdout:0/550: sync 2026-03-09T17:29:37.667 INFO:tasks.workunit.client.0.vm06.stdout:3/453: sync 2026-03-09T17:29:37.667 INFO:tasks.workunit.client.0.vm06.stdout:3/454: readlink dd/d19/l20 0 2026-03-09T17:29:37.678 INFO:tasks.workunit.client.0.vm06.stdout:9/528: rename d3/c8 to d3/d6d/d85/ca3 0 2026-03-09T17:29:37.682 INFO:tasks.workunit.client.0.vm06.stdout:4/473: mknod db/d1d/d21/d37/d69/d78/da0/cb0 0 2026-03-09T17:29:37.682 INFO:tasks.workunit.client.0.vm06.stdout:0/551: chown d7/d11/d19/d1d/c5c 224 1 2026-03-09T17:29:37.682 INFO:tasks.workunit.client.0.vm06.stdout:3/455: mknod dd/d81/c8f 0 2026-03-09T17:29:37.683 INFO:tasks.workunit.client.0.vm06.stdout:1/486: creat d11/d14/d1d/d4a/fa2 x:0 0 0 2026-03-09T17:29:37.684 INFO:tasks.workunit.client.0.vm06.stdout:1/487: readlink d11/d14/l77 0 2026-03-09T17:29:37.685 INFO:tasks.workunit.client.0.vm06.stdout:1/488: stat d11/d14/d1d/f4e 0 2026-03-09T17:29:37.694 INFO:tasks.workunit.client.0.vm06.stdout:9/529: creat d3/d15/d36/d4d/fa4 x:0 0 0 2026-03-09T17:29:37.696 INFO:tasks.workunit.client.0.vm06.stdout:0/552: mknod d7/d11/d5d/cc2 0 2026-03-09T17:29:37.697 INFO:tasks.workunit.client.0.vm06.stdout:4/474: chown db/d1d/d21/d37/d69/d78/da0/caa 9110982 1 2026-03-09T17:29:37.702 INFO:tasks.workunit.client.0.vm06.stdout:3/456: symlink dd/d19/d1e/l90 0 2026-03-09T17:29:37.704 INFO:tasks.workunit.client.0.vm06.stdout:9/530: rename d3/d15/d36/d4c/f5a to d3/d15/d36/d4d/fa5 0 2026-03-09T17:29:37.705 INFO:tasks.workunit.client.0.vm06.stdout:0/553: chown d7/d11/d19/f57 5364 1 2026-03-09T17:29:37.705 INFO:tasks.workunit.client.0.vm06.stdout:0/554: chown d7/d11/d19/d1d/f40 440285 1 2026-03-09T17:29:37.706 INFO:tasks.workunit.client.0.vm06.stdout:3/457: creat dd/d5c/f91 x:0 0 0 2026-03-09T17:29:37.707 INFO:tasks.workunit.client.0.vm06.stdout:1/489: truncate d11/d14/d1d/d1e/d2a/d34/f60 2424636 0 2026-03-09T17:29:37.709 INFO:tasks.workunit.client.0.vm06.stdout:4/475: rename 
db/df/f97 to db/d1d/d21/d26/d89/fb1 0 2026-03-09T17:29:37.711 INFO:tasks.workunit.client.0.vm06.stdout:0/555: dread - d7/d11/d5d/f93 zero size 2026-03-09T17:29:37.712 INFO:tasks.workunit.client.0.vm06.stdout:0/556: stat d7/d11/d19/d23/db7 0 2026-03-09T17:29:37.720 INFO:tasks.workunit.client.0.vm06.stdout:4/476: symlink db/d59/d5f/d6d/lb2 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:9/531: unlink d3/d15/d16/f31 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:0/557: truncate d7/d11/f29 3940520 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:4/477: creat db/d1d/d21/d26/d89/dab/fb3 x:0 0 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:0/558: creat d7/d11/d2d/fc3 x:0 0 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:4/478: rename db/d59/d5f/d45/d84 to db/d1d/d21/d37/d69/d78/db4 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:9/532: fsync d3/d26/d35/f99 0 2026-03-09T17:29:37.744 INFO:tasks.workunit.client.0.vm06.stdout:0/559: creat d7/d11/d19/d1d/d87/fc4 x:0 0 0 2026-03-09T17:29:37.787 INFO:tasks.workunit.client.0.vm06.stdout:9/533: sync 2026-03-09T17:29:37.788 INFO:tasks.workunit.client.0.vm06.stdout:9/534: chown d3/d15/d16/l20 51449 1 2026-03-09T17:29:37.793 INFO:tasks.workunit.client.0.vm06.stdout:5/460: dwrite d4/d50/f43 [0,4194304] 0 2026-03-09T17:29:37.802 INFO:tasks.workunit.client.0.vm06.stdout:5/461: rmdir d4/d50/d35 39 2026-03-09T17:29:37.808 INFO:tasks.workunit.client.0.vm06.stdout:5/462: dread - d4/d50/d35/d40/d6f/f9e zero size 2026-03-09T17:29:37.808 INFO:tasks.workunit.client.0.vm06.stdout:5/463: rename d4/d50/d35/d40/d95/c99 to d4/d50/d35/d40/d6f/cb0 0 2026-03-09T17:29:37.808 INFO:tasks.workunit.client.0.vm06.stdout:5/464: fsync d4/d22/f77 0 2026-03-09T17:29:37.810 INFO:tasks.workunit.client.0.vm06.stdout:5/465: symlink d4/d22/d46/lb1 0 2026-03-09T17:29:37.812 INFO:tasks.workunit.client.0.vm06.stdout:5/466: mkdir d4/d50/db2 0 
2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.907+0000 7f311dbc6700 1 -- 192.168.123.106:0/288882734 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3118071db0 msgr2=0x7f31180721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.907+0000 7f311dbc6700 1 --2- 192.168.123.106:0/288882734 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3118071db0 0x7f31180721c0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3108009a60 tx=0x7f3108009d70 comp rx=0 tx=0).stop 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 -- 192.168.123.106:0/288882734 shutdown_connections 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 --2- 192.168.123.106:0/288882734 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3118107d50 0x7f31181081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 --2- 192.168.123.106:0/288882734 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3118071db0 0x7f31180721c0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 -- 192.168.123.106:0/288882734 >> 192.168.123.106:0/288882734 conn(0x7f311806d3e0 msgr2=0x7f311806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 -- 192.168.123.106:0/288882734 shutdown_connections 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.908+0000 7f311dbc6700 1 -- 192.168.123.106:0/288882734 wait 
complete. 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.909+0000 7f311dbc6700 1 Processor -- start 2026-03-09T17:29:37.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.909+0000 7f311dbc6700 1 -- start start 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.909+0000 7f311dbc6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3118107d50 0x7f31181a4c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.909+0000 7f311dbc6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.910+0000 7f311dbc6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31181a5660 con 0x7f3118107d50 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.910+0000 7f311dbc6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31181a57d0 con 0x7f31181a51c0 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.910+0000 7f3116ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.910+0000 7f3116ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I 
am v2:192.168.123.106:47210/0 (socket says 192.168.123.106:47210) 2026-03-09T17:29:37.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.910+0000 7f3116ffd700 1 -- 192.168.123.106:0/1695312579 learned_addr learned my addr 192.168.123.106:0/1695312579 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:37.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.911+0000 7f3116ffd700 1 -- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3118107d50 msgr2=0x7f31181a4c80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:37.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.911+0000 7f3116ffd700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3118107d50 0x7f31181a4c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:37.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.911+0000 7f3116ffd700 1 -- 192.168.123.106:0/1695312579 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3108009710 con 0x7f31181a51c0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.912+0000 7f3116ffd700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f311000e3f0 tx=0x7f311000e7b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.913+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f31100090d0 con 0x7f31181a51c0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.913+0000 7f3114ff9700 1 
-- 192.168.123.106:0/1695312579 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f311000f040 con 0x7f31181a51c0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.913+0000 7f311dbc6700 1 -- 192.168.123.106:0/1695312579 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f31181aa760 con 0x7f31181a51c0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.913+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3110014700 con 0x7f31181a51c0 2026-03-09T17:29:37.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.913+0000 7f311dbc6700 1 -- 192.168.123.106:0/1695312579 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31181aac80 con 0x7f31181a51c0 2026-03-09T17:29:37.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.914+0000 7f311dbc6700 1 -- 192.168.123.106:0/1695312579 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3118066e40 con 0x7f31181a51c0 2026-03-09T17:29:37.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.916+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3110009230 con 0x7f31181a51c0 2026-03-09T17:29:37.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.916+0000 7f3114ff9700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 0x7f310006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:37.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.916+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mon.1 
v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f311008b500 con 0x7f31181a51c0 2026-03-09T17:29:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.919+0000 7f31177fe700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 0x7f310006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:37.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.921+0000 7f31177fe700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 0x7f310006eb30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f31181a5b60 tx=0x7f310800b540 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:37.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:37.922+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3110055d40 con 0x7f31181a51c0 2026-03-09T17:29:37.934 INFO:tasks.workunit.client.0.vm06.stdout:6/384: write d6/fb [1798677,32756] 0 2026-03-09T17:29:37.948 INFO:tasks.workunit.client.0.vm06.stdout:8/451: chown d15/d16/d1a/d47/f76 231563846 1 2026-03-09T17:29:37.951 INFO:tasks.workunit.client.0.vm06.stdout:6/385: dread d6/f5c [0,4194304] 0 2026-03-09T17:29:37.955 INFO:tasks.workunit.client.0.vm06.stdout:6/386: dwrite d6/d12/d53/f5b [0,4194304] 0 2026-03-09T17:29:37.974 INFO:tasks.workunit.client.0.vm06.stdout:6/387: write d6/d12/d17/d27/f7e [7986097,15823] 0 2026-03-09T17:29:37.982 INFO:tasks.workunit.client.0.vm06.stdout:6/388: rename d6/l1e to d6/d12/l7f 0 2026-03-09T17:29:38.049 INFO:tasks.workunit.client.0.vm06.stdout:7/519: dwrite d5/d12/f2c [0,4194304] 0 2026-03-09T17:29:38.063 
INFO:tasks.workunit.client.0.vm06.stdout:7/520: getdents d5/d7 0 2026-03-09T17:29:38.067 INFO:tasks.workunit.client.0.vm06.stdout:2/426: write d3/d4/f1f [1201877,82820] 0 2026-03-09T17:29:38.079 INFO:tasks.workunit.client.0.vm06.stdout:2/427: unlink d3/d4/d22/f73 0 2026-03-09T17:29:38.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.078+0000 7f311dbc6700 1 -- 192.168.123.106:0/1695312579 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f311810c280 con 0x7f310006c680 2026-03-09T17:29:38.082 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.080+0000 7f3114ff9700 1 -- 192.168.123.106:0/1695312579 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f311810c280 con 0x7f310006c680 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 msgr2=0x7f310006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 0x7f310006eb30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f31181a5b60 tx=0x7f310800b540 comp rx=0 tx=0).stop 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 msgr2=0x7f31181aa220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 --2- 192.168.123.106:0/1695312579 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f311000e3f0 tx=0x7f311000e7b0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 shutdown_connections 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f310006c680 0x7f310006eb30 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3118107d50 0x7f31181a4c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 --2- 192.168.123.106:0/1695312579 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f31181a51c0 0x7f31181aa220 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 >> 192.168.123.106:0/1695312579 conn(0x7f311806d3e0 msgr2=0x7f3118070670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.086 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 shutdown_connections 2026-03-09T17:29:38.086 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.083+0000 7f30fe7fc700 1 -- 192.168.123.106:0/1695312579 wait complete. 
2026-03-09T17:29:38.096 INFO:tasks.workunit.client.0.vm06.stdout:2/428: rename d3/cd to d3/d4/d22/d72/c8a 0 2026-03-09T17:29:38.096 INFO:tasks.workunit.client.0.vm06.stdout:2/429: chown d3/f3b 47449 1 2026-03-09T17:29:38.098 INFO:tasks.workunit.client.0.vm06.stdout:2/430: dread d3/d4/d12/f66 [0,4194304] 0 2026-03-09T17:29:38.104 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:29:38.109 INFO:tasks.workunit.client.0.vm06.stdout:2/431: mknod d3/c8b 0 2026-03-09T17:29:38.117 INFO:tasks.workunit.client.0.vm06.stdout:2/432: creat d3/d4/d12/d71/f8c x:0 0 0 2026-03-09T17:29:38.123 INFO:tasks.workunit.client.0.vm06.stdout:9/535: unlink d3/d15/d36/d4d/fa5 0 2026-03-09T17:29:38.124 INFO:tasks.workunit.client.0.vm06.stdout:2/433: rename d3/d4/c1d to d3/d4/d12/d2b/d36/d37/c8d 0 2026-03-09T17:29:38.127 INFO:tasks.workunit.client.0.vm06.stdout:9/536: unlink d3/d26/f57 0 2026-03-09T17:29:38.127 INFO:tasks.workunit.client.0.vm06.stdout:9/537: dread - d3/d15/d36/d4d/fa4 zero size 2026-03-09T17:29:38.130 INFO:tasks.workunit.client.0.vm06.stdout:2/434: rename d3/d4/d22/d43/c4a to d3/d4/d22/c8e 0 2026-03-09T17:29:38.131 INFO:tasks.workunit.client.0.vm06.stdout:2/435: readlink d3/d4/d12/d2b/d36/l45 0 2026-03-09T17:29:38.139 INFO:tasks.workunit.client.0.vm06.stdout:9/538: write d3/d11/f87 [8375,1608] 0 2026-03-09T17:29:38.143 INFO:tasks.workunit.client.0.vm06.stdout:9/539: dwrite d3/d15/d36/d4d/f61 [4194304,4194304] 0 2026-03-09T17:29:38.144 INFO:tasks.workunit.client.0.vm06.stdout:9/540: write d3/d15/d36/d4d/fa1 [744797,115275] 0 2026-03-09T17:29:38.152 INFO:tasks.workunit.client.0.vm06.stdout:2/436: mkdir d3/d4/d22/d72/d8f 0 2026-03-09T17:29:38.153 INFO:tasks.workunit.client.0.vm06.stdout:2/437: write d3/d4/d12/d71/f8c [190465,32266] 0 2026-03-09T17:29:38.155 INFO:tasks.workunit.client.0.vm06.stdout:3/458: write dd/d19/d28/f58 [259898,67749] 0 2026-03-09T17:29:38.171 INFO:tasks.workunit.client.0.vm06.stdout:3/459: creat dd/d5b/d65/f92 x:0 0 0 2026-03-09T17:29:38.171 
INFO:tasks.workunit.client.0.vm06.stdout:3/460: read - dd/d1d/d2e/d67/f7e zero size 2026-03-09T17:29:38.175 INFO:tasks.workunit.client.0.vm06.stdout:9/541: mknod d3/d15/ca6 0 2026-03-09T17:29:38.186 INFO:tasks.workunit.client.0.vm06.stdout:2/438: dread d3/d4/d22/d43/d77/d81/f58 [0,4194304] 0 2026-03-09T17:29:38.188 INFO:tasks.workunit.client.0.vm06.stdout:1/490: write d11/d14/d1d/d1e/d2a/f40 [184162,11646] 0 2026-03-09T17:29:38.200 INFO:tasks.workunit.client.0.vm06.stdout:2/439: fdatasync d3/d4/d22/f28 0 2026-03-09T17:29:38.206 INFO:tasks.workunit.client.0.vm06.stdout:1/491: creat d11/d14/d1d/d42/d46/d92/fa3 x:0 0 0 2026-03-09T17:29:38.206 INFO:tasks.workunit.client.0.vm06.stdout:4/479: dwrite db/d59/d5f/d45/f4a [0,4194304] 0 2026-03-09T17:29:38.216 INFO:tasks.workunit.client.0.vm06.stdout:9/542: sync 2026-03-09T17:29:38.217 INFO:tasks.workunit.client.0.vm06.stdout:0/560: dwrite d7/d11/d19/d8b/da4/fa7 [0,4194304] 0 2026-03-09T17:29:38.218 INFO:tasks.workunit.client.0.vm06.stdout:9/543: stat d3/d15/d48/f64 0 2026-03-09T17:29:38.226 INFO:tasks.workunit.client.0.vm06.stdout:3/461: mkdir dd/d19/d25/d48/d93 0 2026-03-09T17:29:38.232 INFO:tasks.workunit.client.0.vm06.stdout:4/480: dread db/df/f2d [0,4194304] 0 2026-03-09T17:29:38.232 INFO:tasks.workunit.client.0.vm06.stdout:4/481: chown db/d1d/d21/d37/d69/f75 7 1 2026-03-09T17:29:38.233 INFO:tasks.workunit.client.0.vm06.stdout:4/482: chown db/d1d/d21/d26/d89/dab/dae 7581453 1 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 -- 192.168.123.106:0/3905417930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 msgr2=0x7f73481006d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 --2- 192.168.123.106:0/3905417930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 0x7f73481006d0 secure :-1 s=READY 
pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7f7344009a60 tx=0x7f7344009d70 comp rx=0 tx=0).stop 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 -- 192.168.123.106:0/3905417930 shutdown_connections 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 --2- 192.168.123.106:0/3905417930 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348101910 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 --2- 192.168.123.106:0/3905417930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 0x7f73481006d0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 -- 192.168.123.106:0/3905417930 >> 192.168.123.106:0/3905417930 conn(0x7f73480fb890 msgr2=0x7f73480fdca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 -- 192.168.123.106:0/3905417930 shutdown_connections 2026-03-09T17:29:38.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 -- 192.168.123.106:0/3905417930 wait complete. 
2026-03-09T17:29:38.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.255+0000 7f734e168700 1 Processor -- start 2026-03-09T17:29:38.258 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734e168700 1 -- start start 2026-03-09T17:29:38.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734e168700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 0x7f73481939a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734e168700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734e168700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7348194500 con 0x7f73481002c0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734e168700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7348194640 con 0x7f73481014c0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.256+0000 7f734c965700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.257+0000 7f734c965700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:47216/0 (socket says 192.168.123.106:47216) 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.257+0000 7f734c965700 1 -- 192.168.123.106:0/604491811 learned_addr learned my addr 192.168.123.106:0/604491811 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.257+0000 7f734c965700 1 -- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 msgr2=0x7f73481939a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.257+0000 7f734c965700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 0x7f73481939a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.257+0000 7f734c965700 1 -- 192.168.123.106:0/604491811 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7344009710 con 0x7f73481014c0 2026-03-09T17:29:38.260 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:37 vm06.local ceph-mon[57307]: pgmap v152: 65 pgs: 65 active+clean; 1.1 GiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 24 MiB/s rd, 94 MiB/s wr, 321 op/s 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.258+0000 7f734c965700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f733800ea30 tx=0x7f733800edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.258+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 
v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f733800cc40 con 0x7f73481014c0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.258+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73481990f0 con 0x7f73481014c0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.258+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7348199640 con 0x7f73481014c0 2026-03-09T17:29:38.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.258+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f733800cda0 con 0x7f73481014c0 2026-03-09T17:29:38.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.259+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7338010430 con 0x7f73481014c0 2026-03-09T17:29:38.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.259+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f732c005320 con 0x7f73481014c0 2026-03-09T17:29:38.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.260+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7338005000 con 0x7f73481014c0 2026-03-09T17:29:38.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.261+0000 7f733e7fc700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 0x7f7334072fb0 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.261+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7338014070 con 0x7f73481014c0 2026-03-09T17:29:38.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.262+0000 7f734d166700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 0x7f7334072fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.263+0000 7f734d166700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 0x7f7334072fb0 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f7344003820 tx=0x7f7344005b40 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:38.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.264+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f733805a290 con 0x7f73481014c0 2026-03-09T17:29:38.304 INFO:tasks.workunit.client.0.vm06.stdout:5/467: getdents d4/d50/d35/d40/d6f 0 2026-03-09T17:29:38.330 INFO:tasks.workunit.client.0.vm06.stdout:1/492: write d11/d14/d1d/d1e/d2a/d34/f3b [1401781,117453] 0 2026-03-09T17:29:38.333 INFO:tasks.workunit.client.0.vm06.stdout:1/493: dread d11/d14/d1d/f8f [0,4194304] 0 2026-03-09T17:29:38.334 INFO:tasks.workunit.client.0.vm06.stdout:1/494: write d11/d14/d1c/d1f/f68 [1970095,82836] 0 2026-03-09T17:29:38.335 INFO:tasks.workunit.client.0.vm06.stdout:8/452: write d15/d16/d1a/f29 
[214228,7556] 0 2026-03-09T17:29:38.343 INFO:tasks.workunit.client.0.vm06.stdout:3/462: dread dd/d19/d25/d44/f88 [0,4194304] 0 2026-03-09T17:29:38.351 INFO:tasks.workunit.client.0.vm06.stdout:4/483: truncate db/d1d/f22 3203727 0 2026-03-09T17:29:38.357 INFO:tasks.workunit.client.0.vm06.stdout:7/521: write d5/d1f/d34/f5e [4430929,128488] 0 2026-03-09T17:29:38.357 INFO:tasks.workunit.client.0.vm06.stdout:7/522: fdatasync d5/d12/f2c 0 2026-03-09T17:29:38.358 INFO:tasks.workunit.client.0.vm06.stdout:7/523: write d5/d1f/d34/d46/f55 [1490241,53506] 0 2026-03-09T17:29:38.366 INFO:tasks.workunit.client.0.vm06.stdout:8/453: dwrite d15/d39/f4b [0,4194304] 0 2026-03-09T17:29:38.392 INFO:tasks.workunit.client.0.vm06.stdout:0/561: symlink d7/d11/d19/d23/db7/dbd/dc1/lc5 0 2026-03-09T17:29:38.392 INFO:tasks.workunit.client.0.vm06.stdout:4/484: rmdir db/d57 39 2026-03-09T17:29:38.394 INFO:tasks.workunit.client.0.vm06.stdout:5/468: symlink d4/d50/lb3 0 2026-03-09T17:29:38.395 INFO:tasks.workunit.client.0.vm06.stdout:5/469: chown d4/d50/d18/d3d/l7f 4186502 1 2026-03-09T17:29:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:37 vm09.local ceph-mon[62061]: pgmap v152: 65 pgs: 65 active+clean; 1.1 GiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 24 MiB/s rd, 94 MiB/s wr, 321 op/s 2026-03-09T17:29:38.407 INFO:tasks.workunit.client.0.vm06.stdout:8/454: creat d15/d16/d19/d71/f96 x:0 0 0 2026-03-09T17:29:38.409 INFO:tasks.workunit.client.0.vm06.stdout:1/495: truncate d11/d14/d1d/d1e/d2a/f50 1975345 0 2026-03-09T17:29:38.420 INFO:tasks.workunit.client.0.vm06.stdout:6/389: rmdir d6/d47/d4d 39 2026-03-09T17:29:38.423 INFO:tasks.workunit.client.0.vm06.stdout:6/390: dwrite d6/d12/f76 [0,4194304] 0 2026-03-09T17:29:38.438 INFO:tasks.workunit.client.0.vm06.stdout:1/496: dread d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:29:38.441 INFO:tasks.workunit.client.0.vm06.stdout:3/463: rename dd/d19/d25/d44/l77 to dd/d19/d2c/l94 0 2026-03-09T17:29:38.441 
INFO:tasks.workunit.client.0.vm06.stdout:3/464: rename dd/d1d/d2e to dd/d1d/d2e/d95 22 2026-03-09T17:29:38.453 INFO:tasks.workunit.client.0.vm06.stdout:1/497: dread d11/d14/d1d/f56 [0,4194304] 0 2026-03-09T17:29:38.454 INFO:tasks.workunit.client.0.vm06.stdout:1/498: fdatasync d11/d14/d1d/d1e/d2a/f40 0 2026-03-09T17:29:38.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.463+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f732c000bf0 con 0x7f7334070b00 2026-03-09T17:29:38.467 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.464+0000 7f733e7fc700 1 -- 192.168.123.106:0/604491811 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f732c000bf0 con 0x7f7334070b00 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 msgr2=0x7f7334072fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 0x7f7334072fb0 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f7344003820 tx=0x7f7344005b40 comp rx=0 tx=0).stop 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 msgr2=0x7f7348193ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 --2- 192.168.123.106:0/604491811 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f733800ea30 tx=0x7f733800edf0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 shutdown_connections 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7334070b00 0x7f7334072fb0 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f73481002c0 0x7f73481939a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.469 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 --2- 192.168.123.106:0/604491811 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f73481014c0 0x7f7348193ee0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 >> 192.168.123.106:0/604491811 conn(0x7f73480fb890 msgr2=0x7f73481046f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 shutdown_connections 2026-03-09T17:29:38.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.467+0000 7f734e168700 1 -- 192.168.123.106:0/604491811 wait complete. 
2026-03-09T17:29:38.480 INFO:tasks.workunit.client.0.vm06.stdout:5/470: mkdir d4/d52/db4 0 2026-03-09T17:29:38.481 INFO:tasks.workunit.client.0.vm06.stdout:2/440: getdents d3 0 2026-03-09T17:29:38.482 INFO:tasks.workunit.client.0.vm06.stdout:2/441: stat d3/d4/d22/d43/d77/d81/f58 0 2026-03-09T17:29:38.483 INFO:tasks.workunit.client.0.vm06.stdout:2/442: read d3/d4/d12/d71/f8c [128905,46713] 0 2026-03-09T17:29:38.483 INFO:tasks.workunit.client.0.vm06.stdout:2/443: readlink d3/d4/d12/l5c 0 2026-03-09T17:29:38.511 INFO:tasks.workunit.client.0.vm06.stdout:6/391: dwrite d6/d12/d17/f78 [4194304,4194304] 0 2026-03-09T17:29:38.515 INFO:tasks.workunit.client.0.vm06.stdout:6/392: read d6/d47/f49 [455,37868] 0 2026-03-09T17:29:38.517 INFO:tasks.workunit.client.0.vm06.stdout:6/393: chown d6/d12/d17/d27/f37 117 1 2026-03-09T17:29:38.517 INFO:tasks.workunit.client.0.vm06.stdout:6/394: fdatasync d6/d12/d2d/f39 0 2026-03-09T17:29:38.525 INFO:tasks.workunit.client.0.vm06.stdout:7/524: truncate d5/d1f/d34/d46/f55 820572 0 2026-03-09T17:29:38.527 INFO:tasks.workunit.client.0.vm06.stdout:4/485: symlink db/d59/d5f/d45/lb5 0 2026-03-09T17:29:38.527 INFO:tasks.workunit.client.0.vm06.stdout:4/486: stat db/d59/d5f/d6d/f7b 0 2026-03-09T17:29:38.533 INFO:tasks.workunit.client.0.vm06.stdout:8/455: write d15/d39/d3c/f5d [4176013,77692] 0 2026-03-09T17:29:38.539 INFO:tasks.workunit.client.0.vm06.stdout:0/562: dwrite d7/d11/d19/f68 [0,4194304] 0 2026-03-09T17:29:38.563 INFO:tasks.workunit.client.0.vm06.stdout:9/544: dwrite d3/d26/d35/f99 [0,4194304] 0 2026-03-09T17:29:38.574 INFO:tasks.workunit.client.0.vm06.stdout:5/471: symlink d4/d7e/lb5 0 2026-03-09T17:29:38.596 INFO:tasks.workunit.client.0.vm06.stdout:3/465: dread dd/d5b/d65/f6a [0,4194304] 0 2026-03-09T17:29:38.601 INFO:tasks.workunit.client.0.vm06.stdout:1/499: mknod d11/d14/d1d/ca4 0 2026-03-09T17:29:38.601 INFO:tasks.workunit.client.0.vm06.stdout:6/395: dread - d6/d12/d17/f6b zero size 2026-03-09T17:29:38.601 
INFO:tasks.workunit.client.0.vm06.stdout:7/525: mknod d5/d12/d64/d6b/c94 0 2026-03-09T17:29:38.605 INFO:tasks.workunit.client.0.vm06.stdout:7/526: write d5/dd/f7d [760437,39760] 0 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 -- 192.168.123.106:0/596365629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740071a60 msgr2=0x7f0740071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 --2- 192.168.123.106:0/596365629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740071a60 0x7f0740071e70 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f073c009b00 tx=0x7f073c009e10 comp rx=0 tx=0).stop 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 -- 192.168.123.106:0/596365629 shutdown_connections 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 --2- 192.168.123.106:0/596365629 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0740072440 0x7f074010be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 --2- 192.168.123.106:0/596365629 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740071a60 0x7f0740071e70 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 -- 192.168.123.106:0/596365629 >> 192.168.123.106:0/596365629 conn(0x7f074006d1a0 msgr2=0x7f074006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 -- 
192.168.123.106:0/596365629 shutdown_connections 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.603+0000 7f074710e700 1 -- 192.168.123.106:0/596365629 wait complete. 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 Processor -- start 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 -- start start 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0740072440 0x7f07401169e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0740117420 con 0x7f0740072440 2026-03-09T17:29:38.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.604+0000 7f074710e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0740117590 con 0x7f0740116f20 2026-03-09T17:29:38.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47248/0 (socket says 192.168.123.106:47248) 2026-03-09T17:29:38.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 -- 192.168.123.106:0/1482913552 learned_addr learned my addr 192.168.123.106:0/1482913552 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:38.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 -- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0740072440 msgr2=0x7f07401169e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0740072440 0x7f07401169e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.605+0000 7f074590b700 1 -- 192.168.123.106:0/1482913552 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f073c0097e0 con 0x7f0740116f20 2026-03-09T17:29:38.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.606+0000 7f074590b700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f073800bf40 tx=0x7f073800bf70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:38.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.606+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 
v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f073800cb40 con 0x7f0740116f20 2026-03-09T17:29:38.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.606+0000 7f074710e700 1 -- 192.168.123.106:0/1482913552 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f07401b2d00 con 0x7f0740116f20 2026-03-09T17:29:38.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.606+0000 7f074710e700 1 -- 192.168.123.106:0/1482913552 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f07401b3220 con 0x7f0740116f20 2026-03-09T17:29:38.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.607+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f073800cca0 con 0x7f0740116f20 2026-03-09T17:29:38.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.607+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0738012720 con 0x7f0740116f20 2026-03-09T17:29:38.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.608+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f07380129c0 con 0x7f0740116f20 2026-03-09T17:29:38.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.609+0000 7f07377fe700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 0x7f072c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.609+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 
5276+0+0 (secure 0 0 0) 0x7f073808c1a0 con 0x7f0740116f20 2026-03-09T17:29:38.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.609+0000 7f074710e700 1 -- 192.168.123.106:0/1482913552 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0724005320 con 0x7f0740116f20 2026-03-09T17:29:38.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.610+0000 7f074610c700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 0x7f072c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.610+0000 7f074610c700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 0x7f072c06eb80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f073c00b5c0 tx=0x7f073c009f90 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:38.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.612+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f073804e720 con 0x7f0740116f20 2026-03-09T17:29:38.618 INFO:tasks.workunit.client.0.vm06.stdout:0/563: read - d7/d11/d19/d37/f4f zero size 2026-03-09T17:29:38.629 INFO:tasks.workunit.client.0.vm06.stdout:9/545: symlink d3/d15/d16/la7 0 2026-03-09T17:29:38.637 INFO:tasks.workunit.client.0.vm06.stdout:6/396: sync 2026-03-09T17:29:38.643 INFO:tasks.workunit.client.0.vm06.stdout:2/444: unlink d3/d4/d12/c26 0 2026-03-09T17:29:38.646 INFO:tasks.workunit.client.0.vm06.stdout:1/500: creat d11/d14/d1d/d4a/fa5 x:0 0 0 2026-03-09T17:29:38.647 
INFO:tasks.workunit.client.0.vm06.stdout:8/456: mkdir d15/d39/d67/d77/d97 0 2026-03-09T17:29:38.648 INFO:tasks.workunit.client.0.vm06.stdout:8/457: write d15/d16/d19/d71/f80 [319057,11404] 0 2026-03-09T17:29:38.655 INFO:tasks.workunit.client.0.vm06.stdout:5/472: mknod d4/d52/db4/cb6 0 2026-03-09T17:29:38.658 INFO:tasks.workunit.client.0.vm06.stdout:9/546: rmdir d3/d15/d16 39 2026-03-09T17:29:38.661 INFO:tasks.workunit.client.0.vm06.stdout:4/487: truncate db/df/f2a 748627 0 2026-03-09T17:29:38.663 INFO:tasks.workunit.client.0.vm06.stdout:2/445: rmdir d3/d4/d12/d2b/d2d 39 2026-03-09T17:29:38.666 INFO:tasks.workunit.client.0.vm06.stdout:2/446: dwrite d3/d4/d12/d2b/f89 [0,4194304] 0 2026-03-09T17:29:38.667 INFO:tasks.workunit.client.0.vm06.stdout:2/447: write d3/d4/d12/d2b/f32 [577970,125699] 0 2026-03-09T17:29:38.676 INFO:tasks.workunit.client.0.vm06.stdout:1/501: rmdir d11/d14/d1d/d1e/d2a/d34/d64 39 2026-03-09T17:29:38.678 INFO:tasks.workunit.client.0.vm06.stdout:7/527: symlink d5/dd/d79/d7f/l95 0 2026-03-09T17:29:38.683 INFO:tasks.workunit.client.0.vm06.stdout:0/564: getdents d7/d11/d19/d3c/db9 0 2026-03-09T17:29:38.687 INFO:tasks.workunit.client.0.vm06.stdout:3/466: truncate dd/d19/d25/d2d/f6d 702019 0 2026-03-09T17:29:38.691 INFO:tasks.workunit.client.0.vm06.stdout:5/473: mknod d4/d50/d35/d40/d96/cb7 0 2026-03-09T17:29:38.692 INFO:tasks.workunit.client.0.vm06.stdout:9/547: chown d3/d15/d16/l67 937225 1 2026-03-09T17:29:38.693 INFO:tasks.workunit.client.0.vm06.stdout:9/548: fdatasync d3/d2c/f8c 0 2026-03-09T17:29:38.693 INFO:tasks.workunit.client.0.vm06.stdout:9/549: read - d3/d2c/f81 zero size 2026-03-09T17:29:38.695 INFO:tasks.workunit.client.0.vm06.stdout:6/397: mkdir d6/d4f/d3e/d52/d80 0 2026-03-09T17:29:38.696 INFO:tasks.workunit.client.0.vm06.stdout:4/488: mkdir db/d1d/d21/d37/d69/d78/da0/db6 0 2026-03-09T17:29:38.703 INFO:tasks.workunit.client.0.vm06.stdout:0/565: sync 2026-03-09T17:29:38.707 INFO:tasks.workunit.client.0.vm06.stdout:0/566: sync 
2026-03-09T17:29:38.708 INFO:tasks.workunit.client.0.vm06.stdout:0/567: read d7/d11/d19/f57 [632487,95237] 0 2026-03-09T17:29:38.716 INFO:tasks.workunit.client.0.vm06.stdout:2/448: dwrite d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0 2026-03-09T17:29:38.721 INFO:tasks.workunit.client.0.vm06.stdout:2/449: read d3/d4/d22/d72/f54 [781766,60427] 0 2026-03-09T17:29:38.747 INFO:tasks.workunit.client.0.vm06.stdout:9/550: mkdir d3/d15/d48/da8 0 2026-03-09T17:29:38.755 INFO:tasks.workunit.client.0.vm06.stdout:0/568: creat d7/d11/d5d/db8/fc6 x:0 0 0 2026-03-09T17:29:38.763 INFO:tasks.workunit.client.0.vm06.stdout:5/474: write d4/d50/d18/d3d/f54 [496839,78031] 0 2026-03-09T17:29:38.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.762+0000 7f074710e700 1 -- 192.168.123.106:0/1482913552 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f0724000bf0 con 0x7f072c06c6d0 2026-03-09T17:29:38.768 INFO:tasks.workunit.client.0.vm06.stdout:5/475: dwrite d4/d7e/f8b [0,4194304] 0 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.781+0000 7f07377fe700 1 -- 192.168.123.106:0/1482913552 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3216 (secure 0 0 0) 0x7f0724000bf0 con 0x7f072c06c6d0 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (3m) 2m ago 4m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (4m) 2m ago 4m 8078k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (4m) 2m ago 4m 8136k - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:29:38.784 
INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (4m) 2m ago 4m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (4m) 2m ago 4m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (3m) 2m ago 4m 84.7M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (2m) 2m ago 2m 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (2m) 2m ago 2m 19.5M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (2m) 2m ago 2m 14.7M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (2m) 2m ago 2m 13.4M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (5m) 2m ago 5m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (4m) 2m ago 4m 443M - 18.2.0 dc2bc1663786 e6525bf5de20 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (5m) 2m ago 5m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (4m) 2m ago 4m 47.4M 2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 2m ago 4m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (4m) 2m ago 4m 14.9M - 1.5.0 0da6a335fe13 
364ad5f4aa86 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (3m) 2m ago 3m 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (3m) 2m ago 3m 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (3m) 2m ago 3m 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (3m) 2m ago 3m 43.8M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (3m) 2m ago 3m 43.8M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (2m) 2m ago 2m 42.5M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:29:38.784 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (3m) 2m ago 4m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 msgr2=0x7f072c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 0x7f072c06eb80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f073c00b5c0 tx=0x7f073c009f90 comp rx=0 tx=0).stop 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 192.168.123.106:0/1482913552 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 msgr2=0x7f07401b2760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f073800bf40 tx=0x7f073800bf70 comp rx=0 tx=0).stop 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 192.168.123.106:0/1482913552 shutdown_connections 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f072c06c6d0 0x7f072c06eb80 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0740072440 0x7f07401169e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 --2- 192.168.123.106:0/1482913552 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0740116f20 0x7f07401b2760 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 192.168.123.106:0/1482913552 >> 192.168.123.106:0/1482913552 conn(0x7f074006d1a0 msgr2=0x7f074010b210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 192.168.123.106:0/1482913552 shutdown_connections 2026-03-09T17:29:38.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.784+0000 7f07357fa700 1 -- 
192.168.123.106:0/1482913552 wait complete. 2026-03-09T17:29:38.790 INFO:tasks.workunit.client.0.vm06.stdout:8/458: link d15/d16/d19/d71/f65 d15/d16/d1e/d28/d5e/f98 0 2026-03-09T17:29:38.792 INFO:tasks.workunit.client.0.vm06.stdout:2/450: creat d3/d4/d12/d2b/d36/d37/f90 x:0 0 0 2026-03-09T17:29:38.792 INFO:tasks.workunit.client.0.vm06.stdout:3/467: mknod dd/c96 0 2026-03-09T17:29:38.792 INFO:tasks.workunit.client.0.vm06.stdout:9/551: creat d3/d15/d36/d4c/d6a/fa9 x:0 0 0 2026-03-09T17:29:38.792 INFO:tasks.workunit.client.0.vm06.stdout:6/398: creat d6/d12/d17/d65/f81 x:0 0 0 2026-03-09T17:29:38.796 INFO:tasks.workunit.client.0.vm06.stdout:9/552: dwrite d3/d15/d36/d4c/d6a/fa9 [0,4194304] 0 2026-03-09T17:29:38.813 INFO:tasks.workunit.client.0.vm06.stdout:7/528: truncate d5/d1f/d34/f41 3884591 0 2026-03-09T17:29:38.814 INFO:tasks.workunit.client.0.vm06.stdout:5/476: rename d4/d50/d18/da6 to d4/d50/d35/d40/d95/db8 0 2026-03-09T17:29:38.814 INFO:tasks.workunit.client.0.vm06.stdout:3/468: mkdir dd/d81/d97 0 2026-03-09T17:29:38.814 INFO:tasks.workunit.client.0.vm06.stdout:1/502: write d11/d14/d1d/d1e/d2a/f38 [3951776,29289] 0 2026-03-09T17:29:38.816 INFO:tasks.workunit.client.0.vm06.stdout:3/469: read dd/d19/d25/d2d/f71 [20163,122992] 0 2026-03-09T17:29:38.817 INFO:tasks.workunit.client.0.vm06.stdout:3/470: truncate dd/d1d/d4e/f7d 470162 0 2026-03-09T17:29:38.818 INFO:tasks.workunit.client.0.vm06.stdout:1/503: read d11/f18 [1825534,61019] 0 2026-03-09T17:29:38.823 INFO:tasks.workunit.client.0.vm06.stdout:8/459: mkdir d15/d39/d67/d77/d99 0 2026-03-09T17:29:38.823 INFO:tasks.workunit.client.0.vm06.stdout:6/399: unlink d6/d12/d17/d65/f81 0 2026-03-09T17:29:38.824 INFO:tasks.workunit.client.0.vm06.stdout:6/400: chown d6/d12/d17/d27/d40/l5f 2834772 1 2026-03-09T17:29:38.824 INFO:tasks.workunit.client.0.vm06.stdout:8/460: read d15/d16/d1a/f29 [90430,43339] 0 2026-03-09T17:29:38.828 INFO:tasks.workunit.client.0.vm06.stdout:3/471: sync 2026-03-09T17:29:38.829 
INFO:tasks.workunit.client.0.vm06.stdout:3/472: chown dd/d59 4990 1 2026-03-09T17:29:38.840 INFO:tasks.workunit.client.0.vm06.stdout:0/569: fsync d7/f76 0 2026-03-09T17:29:38.840 INFO:tasks.workunit.client.0.vm06.stdout:9/553: creat d3/d11/d65/faa x:0 0 0 2026-03-09T17:29:38.841 INFO:tasks.workunit.client.0.vm06.stdout:5/477: creat d4/d22/d46/fb9 x:0 0 0 2026-03-09T17:29:38.842 INFO:tasks.workunit.client.0.vm06.stdout:9/554: write d3/d15/d16/f5c [6426752,21154] 0 2026-03-09T17:29:38.847 INFO:tasks.workunit.client.0.vm06.stdout:7/529: rename d5/dd/d79/d7f/l95 to d5/dd/d79/d7f/l96 0 2026-03-09T17:29:38.847 INFO:tasks.workunit.client.0.vm06.stdout:7/530: fsync d5/d1f/d34/d46/f89 0 2026-03-09T17:29:38.864 INFO:tasks.workunit.client.0.vm06.stdout:4/489: getdents db/d59/d5f/d5d 0 2026-03-09T17:29:38.867 INFO:tasks.workunit.client.0.vm06.stdout:1/504: dread d11/d14/d1d/d1e/f47 [0,4194304] 0 2026-03-09T17:29:38.870 INFO:tasks.workunit.client.0.vm06.stdout:2/451: write d3/d4/d12/f15 [508284,79825] 0 2026-03-09T17:29:38.883 INFO:tasks.workunit.client.0.vm06.stdout:0/570: rename d7/d11/d2d/c38 to d7/d11/d19/d1d/cc7 0 2026-03-09T17:29:38.883 INFO:tasks.workunit.client.0.vm06.stdout:0/571: fsync d7/d11/d2d/f2f 0 2026-03-09T17:29:38.889 INFO:tasks.workunit.client.0.vm06.stdout:9/555: chown d3/d11/d65/c8e 1882163 1 2026-03-09T17:29:38.893 INFO:tasks.workunit.client.0.vm06.stdout:7/531: creat d5/dd/d79/f97 x:0 0 0 2026-03-09T17:29:38.899 INFO:tasks.workunit.client.0.vm06.stdout:4/490: truncate db/d1d/d21/f6e 385095 0 2026-03-09T17:29:38.900 INFO:tasks.workunit.client.0.vm06.stdout:1/505: fsync d11/d14/d1d/d1e/f65 0 2026-03-09T17:29:38.901 INFO:tasks.workunit.client.0.vm06.stdout:1/506: write d11/f8d [4173228,124345] 0 2026-03-09T17:29:38.906 INFO:tasks.workunit.client.0.vm06.stdout:2/452: rmdir d3 39 2026-03-09T17:29:38.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.906+0000 7f92c759e700 1 -- 192.168.123.106:0/4050308876 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0097620 msgr2=0x7f92c0097a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.906+0000 7f92c759e700 1 --2- 192.168.123.106:0/4050308876 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0097620 0x7f92c0097a30 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7f92b8009b00 tx=0x7f92b8009e10 comp rx=0 tx=0).stop 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.907+0000 7f92c759e700 1 -- 192.168.123.106:0/4050308876 shutdown_connections 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.907+0000 7f92c759e700 1 --2- 192.168.123.106:0/4050308876 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0098820 0x7f92c0098c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.907+0000 7f92c759e700 1 --2- 192.168.123.106:0/4050308876 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0097620 0x7f92c0097a30 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.907+0000 7f92c759e700 1 -- 192.168.123.106:0/4050308876 >> 192.168.123.106:0/4050308876 conn(0x7f92c0092bb0 msgr2=0x7f92c0095000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:38.910 INFO:tasks.workunit.client.0.vm06.stdout:8/461: truncate d15/d16/d1a/d47/f7e 592132 0 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.908+0000 7f92c759e700 1 -- 192.168.123.106:0/4050308876 shutdown_connections 2026-03-09T17:29:38.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.908+0000 7f92c759e700 1 -- 192.168.123.106:0/4050308876 wait complete. 
2026-03-09T17:29:38.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 Processor -- start 2026-03-09T17:29:38.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 -- start start 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 0x7f92c012cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 0x7f92c012d400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c012da20 con 0x7f92c0098820 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.909+0000 7f92c759e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c012db60 con 0x7f92c0097620 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c659c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 0x7f92c012cec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c659c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 0x7f92c012cec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:47272/0 (socket says 192.168.123.106:47272) 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c659c700 1 -- 192.168.123.106:0/4139756502 learned_addr learned my addr 192.168.123.106:0/4139756502 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c5d9b700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 0x7f92c012d400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c5d9b700 1 -- 192.168.123.106:0/4139756502 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 msgr2=0x7f92c012cec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c5d9b700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 0x7f92c012cec0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c5d9b700 1 -- 192.168.123.106:0/4139756502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92b80097e0 con 0x7f92c0098820 2026-03-09T17:29:38.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92c5d9b700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 0x7f92c012d400 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f92bc00eab0 tx=0x7f92bc00edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:29:38.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.910+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92bc00cb20 con 0x7f92c0098820 2026-03-09T17:29:38.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.911+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f92bc00cc80 con 0x7f92c0098820 2026-03-09T17:29:38.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.911+0000 7f92c759e700 1 -- 192.168.123.106:0/4139756502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92c0132610 con 0x7f92c0098820 2026-03-09T17:29:38.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.911+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92bc018860 con 0x7f92c0098820 2026-03-09T17:29:38.914 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.911+0000 7f92c759e700 1 -- 192.168.123.106:0/4139756502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92c0132b60 con 0x7f92c0098820 2026-03-09T17:29:38.914 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.912+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f92bc0189c0 con 0x7f92c0098820 2026-03-09T17:29:38.914 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.912+0000 7f92c759e700 1 -- 192.168.123.106:0/4139756502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f92c0005750 con 0x7f92c0098820 2026-03-09T17:29:38.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.913+0000 
7f92b77fe700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 0x7f92b006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:38.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.913+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f92bc014070 con 0x7f92c0098820 2026-03-09T17:29:38.915 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.913+0000 7f92c659c700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 0x7f92b006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:38.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.914+0000 7f92c659c700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 0x7f92b006ec50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f92b8005200 tx=0x7f92b801a040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:38.918 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:38.916+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f92bc092050 con 0x7f92c0098820 2026-03-09T17:29:38.919 INFO:tasks.workunit.client.0.vm06.stdout:3/473: mknod dd/d19/c98 0 2026-03-09T17:29:38.919 INFO:tasks.workunit.client.0.vm06.stdout:3/474: readlink dd/d19/l39 0 2026-03-09T17:29:38.920 INFO:tasks.workunit.client.0.vm06.stdout:3/475: chown dd/d1d/d4e/f5a 1805 1 2026-03-09T17:29:38.950 INFO:tasks.workunit.client.0.vm06.stdout:5/478: mknod d4/d50/d35/d40/d96/d98/cba 0 
2026-03-09T17:29:38.967 INFO:tasks.workunit.client.0.vm06.stdout:7/532: rename d5/d1f/d34/f8c to d5/dd/d79/d7f/f98 0 2026-03-09T17:29:38.968 INFO:tasks.workunit.client.0.vm06.stdout:7/533: chown d5/d12/d64/f8d 740 1 2026-03-09T17:29:38.968 INFO:tasks.workunit.client.0.vm06.stdout:5/479: dread d4/f5 [4194304,4194304] 0 2026-03-09T17:29:38.988 INFO:tasks.workunit.client.0.vm06.stdout:6/401: dwrite d6/d4f/f3a [0,4194304] 0 2026-03-09T17:29:38.990 INFO:tasks.workunit.client.0.vm06.stdout:6/402: truncate d6/d4f/d3e/f62 749473 0 2026-03-09T17:29:38.991 INFO:tasks.workunit.client.0.vm06.stdout:6/403: write d6/d12/d53/f5b [1444224,39893] 0 2026-03-09T17:29:38.999 INFO:tasks.workunit.client.0.vm06.stdout:0/572: write d7/d11/d19/d37/fb4 [710996,69503] 0 2026-03-09T17:29:39.006 INFO:tasks.workunit.client.0.vm06.stdout:4/491: write db/df/f4d [492546,120477] 0 2026-03-09T17:29:39.033 INFO:tasks.workunit.client.0.vm06.stdout:9/556: dwrite d3/d26/f33 [0,4194304] 0 2026-03-09T17:29:39.035 INFO:tasks.workunit.client.0.vm06.stdout:7/534: creat d5/d7/f99 x:0 0 0 2026-03-09T17:29:39.036 INFO:tasks.workunit.client.0.vm06.stdout:7/535: fdatasync d5/d1f/d34/d46/d51/f7c 0 2026-03-09T17:29:39.040 INFO:tasks.workunit.client.0.vm06.stdout:5/480: rmdir d4/d50/d18/d3d 39 2026-03-09T17:29:39.056 INFO:tasks.workunit.client.0.vm06.stdout:0/573: creat d7/d11/d19/d8b/da4/d85/fc8 x:0 0 0 2026-03-09T17:29:39.057 INFO:tasks.workunit.client.0.vm06.stdout:4/492: creat db/d1d/d21/d44/fb7 x:0 0 0 2026-03-09T17:29:39.073 INFO:tasks.workunit.client.0.vm06.stdout:7/536: mknod d5/d1f/d34/d3f/c9a 0 2026-03-09T17:29:39.082 INFO:tasks.workunit.client.0.vm06.stdout:9/557: write d3/d15/f74 [4687677,88919] 0 2026-03-09T17:29:39.086 INFO:tasks.workunit.client.0.vm06.stdout:6/404: write d6/d4f/f3c [1792680,64193] 0 2026-03-09T17:29:39.089 INFO:tasks.workunit.client.0.vm06.stdout:6/405: dwrite d6/d12/d17/d65/f72 [0,4194304] 0 2026-03-09T17:29:39.091 INFO:tasks.workunit.client.0.vm06.stdout:6/406: fdatasync 
d6/d12/d2d/f6c 0 2026-03-09T17:29:39.106 INFO:tasks.workunit.client.0.vm06.stdout:3/476: link dd/d1d/d4e/f74 dd/d1d/f99 0 2026-03-09T17:29:39.114 INFO:tasks.workunit.client.0.vm06.stdout:1/507: rename d11/d14/d1d/d42/d46/d92/f9a to d11/d14/fa6 0 2026-03-09T17:29:39.118 INFO:tasks.workunit.client.0.vm06.stdout:0/574: dwrite d7/d11/d5d/f93 [0,4194304] 0 2026-03-09T17:29:39.119 INFO:tasks.workunit.client.0.vm06.stdout:1/508: dread d11/d14/f59 [0,4194304] 0 2026-03-09T17:29:39.120 INFO:tasks.workunit.client.0.vm06.stdout:1/509: chown d11/d14/d1c/d1f/f4c 19182 1 2026-03-09T17:29:39.121 INFO:tasks.workunit.client.0.vm06.stdout:0/575: write d7/d11/d19/d1d/d39/f51 [3247911,25503] 0 2026-03-09T17:29:39.126 INFO:tasks.workunit.client.0.vm06.stdout:4/493: dwrite db/f15 [0,4194304] 0 2026-03-09T17:29:39.135 INFO:tasks.workunit.client.0.vm06.stdout:1/510: dread d11/d14/d1d/d1e/d2a/f38 [0,4194304] 0 2026-03-09T17:29:39.136 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.134+0000 7f92c759e700 1 -- 192.168.123.106:0/4139756502 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f92c0133030 con 0x7f92c0098820 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.135+0000 7f92b77fe700 1 -- 192.168.123.106:0/4139756502 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f92bc05add0 con 0x7f92c0098820 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 
18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:29:39.138 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:29:39.142 INFO:tasks.workunit.client.0.vm06.stdout:5/481: mkdir d4/dbb 0 2026-03-09T17:29:39.151 INFO:tasks.workunit.client.0.vm06.stdout:8/462: truncate d15/d16/d1a/f22 1598106 0 2026-03-09T17:29:39.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.141+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 msgr2=0x7f92b006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 0x7f92b006ec50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f92b8005200 tx=0x7f92b801a040 comp rx=0 tx=0).stop 2026-03-09T17:29:39.152 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 msgr2=0x7f92c012d400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 0x7f92c012d400 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f92bc00eab0 tx=0x7f92bc00edc0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 shutdown_connections 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92c0097620 0x7f92c012cec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f92b006c7a0 0x7f92b006ec50 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 --2- 192.168.123.106:0/4139756502 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92c0098820 0x7f92c012d400 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 >> 192.168.123.106:0/4139756502 conn(0x7f92c0092bb0 msgr2=0x7f92c009ba50 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 shutdown_connections 2026-03-09T17:29:39.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.142+0000 7f92b57fa700 1 -- 192.168.123.106:0/4139756502 wait complete. 2026-03-09T17:29:39.152 INFO:tasks.workunit.client.0.vm06.stdout:2/453: link d3/f29 d3/f91 0 2026-03-09T17:29:39.152 INFO:tasks.workunit.client.0.vm06.stdout:7/537: dwrite d5/d1f/d34/d46/f55 [0,4194304] 0 2026-03-09T17:29:39.164 INFO:tasks.workunit.client.0.vm06.stdout:1/511: creat d11/d14/d1d/d4a/fa7 x:0 0 0 2026-03-09T17:29:39.165 INFO:tasks.workunit.client.0.vm06.stdout:1/512: fsync d11/d14/d1d/d1e/d2a/f40 0 2026-03-09T17:29:39.169 INFO:tasks.workunit.client.0.vm06.stdout:9/558: creat d3/d6d/d9a/d9c/fab x:0 0 0 2026-03-09T17:29:39.169 INFO:tasks.workunit.client.0.vm06.stdout:6/407: symlink d6/d4f/d3e/l82 0 2026-03-09T17:29:39.169 INFO:tasks.workunit.client.0.vm06.stdout:7/538: dread - d5/dd/f22 zero size 2026-03-09T17:29:39.177 INFO:tasks.workunit.client.0.vm06.stdout:8/463: creat d15/d16/d19/d2b/d85/f9a x:0 0 0 2026-03-09T17:29:39.178 INFO:tasks.workunit.client.0.vm06.stdout:8/464: chown d15/d31/f33 18086692 1 2026-03-09T17:29:39.178 INFO:tasks.workunit.client.0.vm06.stdout:6/408: dread d6/d12/d2d/f39 [0,4194304] 0 2026-03-09T17:29:39.181 INFO:tasks.workunit.client.0.vm06.stdout:9/559: symlink d3/d15/d36/d4d/lac 0 2026-03-09T17:29:39.182 INFO:tasks.workunit.client.0.vm06.stdout:9/560: chown d3/d15/d36/d4d/f62 143349 1 2026-03-09T17:29:39.182 INFO:tasks.workunit.client.0.vm06.stdout:9/561: chown d3/d11/d65/f7c 83262 1 2026-03-09T17:29:39.198 INFO:tasks.workunit.client.0.vm06.stdout:9/562: truncate d3/d11/d65/f66 173234 0 2026-03-09T17:29:39.205 INFO:tasks.workunit.client.0.vm06.stdout:4/494: link db/d1d/d21/d25/l41 db/d1d/d21/d37/d69/d78/db4/lb8 0 2026-03-09T17:29:39.209 INFO:tasks.workunit.client.0.vm06.stdout:3/477: dwrite dd/d19/f2b [0,4194304] 
0 2026-03-09T17:29:39.226 INFO:tasks.workunit.client.0.vm06.stdout:0/576: write d7/f12 [4227819,6441] 0 2026-03-09T17:29:39.228 INFO:tasks.workunit.client.0.vm06.stdout:5/482: write d4/d50/f1d [2674986,21340] 0 2026-03-09T17:29:39.228 INFO:tasks.workunit.client.0.vm06.stdout:7/539: write d5/dd/f3e [90017,74158] 0 2026-03-09T17:29:39.237 INFO:tasks.workunit.client.0.vm06.stdout:8/465: dwrite d15/d16/d19/f61 [0,4194304] 0 2026-03-09T17:29:39.251 INFO:tasks.workunit.client.0.vm06.stdout:5/483: dread d4/d22/f5d [0,4194304] 0 2026-03-09T17:29:39.269 INFO:tasks.workunit.client.0.vm06.stdout:4/495: mknod db/d1d/d21/d26/d89/cb9 0 2026-03-09T17:29:39.269 INFO:tasks.workunit.client.0.vm06.stdout:4/496: readlink db/d1d/d21/l4c 0 2026-03-09T17:29:39.272 INFO:tasks.workunit.client.0.vm06.stdout:1/513: getdents d11/d14/d1d/d42 0 2026-03-09T17:29:39.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- 192.168.123.106:0/3154300246 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb4072360 msgr2=0x7f7cb40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 --2- 192.168.123.106:0/3154300246 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb4072360 0x7f7cb40770e0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f7cac009230 tx=0x7f7cac009260 comp rx=0 tx=0).stop 2026-03-09T17:29:39.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- 192.168.123.106:0/3154300246 shutdown_connections 2026-03-09T17:29:39.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 --2- 192.168.123.106:0/3154300246 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb4072360 0x7f7cb40770e0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.273 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 --2- 192.168.123.106:0/3154300246 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 0x7f7cb4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- 192.168.123.106:0/3154300246 >> 192.168.123.106:0/3154300246 conn(0x7f7cb406d1a0 msgr2=0x7f7cb406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- 192.168.123.106:0/3154300246 shutdown_connections 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- 192.168.123.106:0/3154300246 wait complete. 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 Processor -- start 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- start start 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 0x7f7cb40824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cb4083e50 con 0x7f7cb4071980 2026-03-09T17:29:39.274 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.271+0000 7f7cbab45700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cb412dd80 con 0x7f7cb40829e0 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47276/0 (socket says 192.168.123.106:47276) 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 -- 192.168.123.106:0/3937072729 learned_addr learned my addr 192.168.123.106:0/3937072729 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:39.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:38 vm06.local ceph-mon[57307]: from='client.24449 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:39.274 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:38 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb88e1700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 0x7f7cb40824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 -- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 msgr2=0x7f7cb40824a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 0x7f7cb40824a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 -- 192.168.123.106:0/3937072729 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7cac008ee0 con 0x7f7cb40829e0 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb3fff700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f7cac004740 tx=0x7f7cac004820 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7cac01d070 con 0x7f7cb40829e0 2026-03-09T17:29:39.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 7f7cbab45700 1 -- 192.168.123.106:0/3937072729 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7cb412dfa0 con 0x7f7cb40829e0 2026-03-09T17:29:39.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.272+0000 
7f7cbab45700 1 -- 192.168.123.106:0/3937072729 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7cb412e490 con 0x7f7cb40829e0 2026-03-09T17:29:39.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.273+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7cac00ece0 con 0x7f7cb40829e0 2026-03-09T17:29:39.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.273+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7cac016750 con 0x7f7cb40829e0 2026-03-09T17:29:39.276 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.274+0000 7f7cbab45700 1 -- 192.168.123.106:0/3937072729 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7ca0005320 con 0x7f7cb40829e0 2026-03-09T17:29:39.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.275+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7cac00e800 con 0x7f7cb40829e0 2026-03-09T17:29:39.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.275+0000 7f7cb1ffb700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 0x7f7c9c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.275+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7cac012070 con 0x7f7cb40829e0 2026-03-09T17:29:39.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.278+0000 7f7cb88e1700 1 --2- 
192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 0x7f7c9c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.281 INFO:tasks.workunit.client.0.vm06.stdout:3/478: chown dd/c42 119957795 1 2026-03-09T17:29:39.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.280+0000 7f7cb88e1700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 0x7f7c9c06eb80 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f7ca400b3c0 tx=0x7f7ca400d040 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.280+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7cac05b6a0 con 0x7f7cb40829e0 2026-03-09T17:29:39.285 INFO:tasks.workunit.client.0.vm06.stdout:0/577: creat d7/d11/d5d/d64/fc9 x:0 0 0 2026-03-09T17:29:39.286 INFO:tasks.workunit.client.0.vm06.stdout:0/578: readlink d7/d11/d19/d1d/d87/la1 0 2026-03-09T17:29:39.288 INFO:tasks.workunit.client.0.vm06.stdout:7/540: write d5/f71 [536979,84049] 0 2026-03-09T17:29:39.289 INFO:tasks.workunit.client.0.vm06.stdout:7/541: chown d5/d7/f99 16476527 1 2026-03-09T17:29:39.289 INFO:tasks.workunit.client.0.vm06.stdout:7/542: stat d5/d7/f99 0 2026-03-09T17:29:39.289 INFO:tasks.workunit.client.0.vm06.stdout:7/543: stat d5/d12/l6a 0 2026-03-09T17:29:39.290 INFO:tasks.workunit.client.0.vm06.stdout:7/544: write d5/d12/d64/f77 [29157,66433] 0 2026-03-09T17:29:39.321 INFO:tasks.workunit.client.0.vm06.stdout:5/484: dwrite d4/d50/f80 [0,4194304] 0 2026-03-09T17:29:39.326 INFO:tasks.workunit.client.0.vm06.stdout:9/563: mkdir d3/dad 0 2026-03-09T17:29:39.327 
INFO:tasks.workunit.client.0.vm06.stdout:9/564: write d3/d15/d16/f72 [2449124,103341] 0 2026-03-09T17:29:39.347 INFO:tasks.workunit.client.0.vm06.stdout:3/479: creat dd/d1d/d6e/f9a x:0 0 0 2026-03-09T17:29:39.353 INFO:tasks.workunit.client.0.vm06.stdout:0/579: fdatasync d7/d11/d19/d37/f4f 0 2026-03-09T17:29:39.358 INFO:tasks.workunit.client.0.vm06.stdout:7/545: symlink d5/d7/d2b/l9b 0 2026-03-09T17:29:39.372 INFO:tasks.workunit.client.0.vm06.stdout:5/485: symlink d4/d52/lbc 0 2026-03-09T17:29:39.380 INFO:tasks.workunit.client.0.vm06.stdout:1/514: write d11/d14/d1c/d1f/f45 [2897769,129482] 0 2026-03-09T17:29:39.382 INFO:tasks.workunit.client.0.vm06.stdout:8/466: dwrite d15/d16/d19/f4f [0,4194304] 0 2026-03-09T17:29:39.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:38 vm09.local ceph-mon[62061]: from='client.24449 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:39.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:38 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:29:39.410 INFO:tasks.workunit.client.0.vm06.stdout:3/480: mkdir dd/d19/d25/d2d/d9b 0 2026-03-09T17:29:39.411 INFO:tasks.workunit.client.0.vm06.stdout:3/481: dread - dd/d1d/d6e/f9a zero size 2026-03-09T17:29:39.415 INFO:tasks.workunit.client.0.vm06.stdout:0/580: truncate d7/d11/f13 5745835 0 2026-03-09T17:29:39.415 INFO:tasks.workunit.client.0.vm06.stdout:0/581: fsync d7/d11/d2d/f2f 0 2026-03-09T17:29:39.416 INFO:tasks.workunit.client.0.vm06.stdout:0/582: chown d7/d11/d19/d1d/d87/fc4 15968489 1 2026-03-09T17:29:39.421 INFO:tasks.workunit.client.0.vm06.stdout:4/497: truncate db/d1d/d21/f42 2795142 0 2026-03-09T17:29:39.427 INFO:tasks.workunit.client.0.vm06.stdout:6/409: link d6/c24 d6/d47/d4d/d6d/c83 0 2026-03-09T17:29:39.428 INFO:tasks.workunit.client.0.vm06.stdout:6/410: chown d6/d4f/d3e/f62 343 1 
2026-03-09T17:29:39.430 INFO:tasks.workunit.client.0.vm06.stdout:5/486: truncate d4/f17 1146704 0 2026-03-09T17:29:39.433 INFO:tasks.workunit.client.0.vm06.stdout:9/565: rename d3/c42 to d3/d11/cae 0 2026-03-09T17:29:39.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.435+0000 7f7cbab45700 1 -- 192.168.123.106:0/3937072729 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f7ca0005cc0 con 0x7f7cb40829e0 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.437+0000 7f7cb1ffb700 1 -- 192.168.123.106:0/3937072729 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7f7cac0085f0 con 0x7f7cb40829e0 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:29:39.439 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:29:39.440 
INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:29:39.440 
INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:29:39.440 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 msgr2=0x7f7c9c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 0x7f7c9c06eb80 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f7ca400b3c0 tx=0x7f7ca400d040 comp rx=0 tx=0).stop 2026-03-09T17:29:39.442 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 msgr2=0x7f7cb4082e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f7cac004740 tx=0x7f7cac004820 comp rx=0 tx=0).stop 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 shutdown_connections 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f7c9c06c6d0 0x7f7c9c06eb80 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7cb4071980 0x7f7cb40824a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 --2- 192.168.123.106:0/3937072729 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7cb40829e0 0x7f7cb4082e50 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 >> 192.168.123.106:0/3937072729 conn(0x7f7cb406d1a0 msgr2=0x7f7cb4076470 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 shutdown_connections 2026-03-09T17:29:39.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.440+0000 7f7c9b7fe700 1 -- 192.168.123.106:0/3937072729 wait complete. 2026-03-09T17:29:39.443 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:29:39.455 INFO:tasks.workunit.client.0.vm06.stdout:8/467: fdatasync d15/d16/d1a/f1b 0 2026-03-09T17:29:39.455 INFO:tasks.workunit.client.0.vm06.stdout:8/468: readlink d15/d16/d1a/l20 0 2026-03-09T17:29:39.456 INFO:tasks.workunit.client.0.vm06.stdout:8/469: dread - d15/d16/d19/d71/f96 zero size 2026-03-09T17:29:39.460 INFO:tasks.workunit.client.0.vm06.stdout:8/470: dwrite d15/d16/d19/f93 [0,4194304] 0 2026-03-09T17:29:39.461 INFO:tasks.workunit.client.0.vm06.stdout:8/471: write fe [4486221,57969] 0 2026-03-09T17:29:39.463 INFO:tasks.workunit.client.0.vm06.stdout:2/454: creat d3/d4/d12/f92 x:0 0 0 2026-03-09T17:29:39.465 INFO:tasks.workunit.client.0.vm06.stdout:3/482: symlink dd/d19/d25/d48/l9c 0 2026-03-09T17:29:39.469 INFO:tasks.workunit.client.0.vm06.stdout:0/583: mkdir d7/d11/d2d/dca 0 2026-03-09T17:29:39.484 INFO:tasks.workunit.client.0.vm06.stdout:7/546: symlink d5/d1f/d34/d3f/d91/l9c 0 2026-03-09T17:29:39.488 INFO:tasks.workunit.client.0.vm06.stdout:6/411: truncate d6/d47/d4d/f55 4021638 0 2026-03-09T17:29:39.490 INFO:tasks.workunit.client.0.vm06.stdout:6/412: read d6/d12/d17/d27/f37 [322013,78855] 0 2026-03-09T17:29:39.494 INFO:tasks.workunit.client.0.vm06.stdout:9/566: creat d3/d6d/d85/faf x:0 0 0 2026-03-09T17:29:39.495 INFO:tasks.workunit.client.0.vm06.stdout:9/567: chown d3/f4b 123167053 1 2026-03-09T17:29:39.498 INFO:tasks.workunit.client.0.vm06.stdout:1/515: symlink d11/d14/d1d/d1e/d2a/la8 0 2026-03-09T17:29:39.517 INFO:tasks.workunit.client.0.vm06.stdout:2/455: stat d3/d4/d12/d2b/d2d/l51 0 2026-03-09T17:29:39.517 
INFO:tasks.workunit.client.0.vm06.stdout:2/456: stat d3/d4 0 2026-03-09T17:29:39.521 INFO:tasks.workunit.client.0.vm06.stdout:3/483: readlink dd/d19/d2c/l94 0 2026-03-09T17:29:39.526 INFO:tasks.workunit.client.0.vm06.stdout:4/498: mkdir db/d1d/d21/d26/d89/dab/dae/dba 0 2026-03-09T17:29:39.534 INFO:tasks.workunit.client.0.vm06.stdout:4/499: dread db/d1d/d21/d25/f38 [0,4194304] 0 2026-03-09T17:29:39.537 INFO:tasks.workunit.client.0.vm06.stdout:4/500: dwrite db/d59/d5f/d45/fa9 [0,4194304] 0 2026-03-09T17:29:39.546 INFO:tasks.workunit.client.0.vm06.stdout:7/547: rmdir d5/dd 39 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- 192.168.123.106:0/4200020680 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594072360 msgr2=0x7f05940770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 --2- 192.168.123.106:0/4200020680 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594072360 0x7f05940770e0 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f058c009230 tx=0x7f058c009260 comp rx=0 tx=0).stop 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- 192.168.123.106:0/4200020680 shutdown_connections 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 --2- 192.168.123.106:0/4200020680 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594072360 0x7f05940770e0 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 --2- 192.168.123.106:0/4200020680 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594071980 0x7f0594071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- 192.168.123.106:0/4200020680 >> 192.168.123.106:0/4200020680 conn(0x7f059406d1a0 msgr2=0x7f059406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- 192.168.123.106:0/4200020680 shutdown_connections 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- 192.168.123.106:0/4200020680 wait complete. 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 Processor -- start 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.548+0000 7f059c361700 1 -- start start 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059c361700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059c361700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594082b00 0x7f0594082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.553 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059c361700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f059412dd80 con 0x7f0594071980 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059c361700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f059412def0 con 0x7f0594082b00 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 
7f059a0fd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059a0fd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:39924/0 (socket says 192.168.123.106:39924) 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059a0fd700 1 -- 192.168.123.106:0/1724692899 learned_addr learned my addr 192.168.123.106:0/1724692899 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f05998fc700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594082b00 0x7f0594082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059a0fd700 1 -- 192.168.123.106:0/1724692899 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594082b00 msgr2=0x7f0594082f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 7f059a0fd700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594082b00 0x7f0594082f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.549+0000 
7f059a0fd700 1 -- 192.168.123.106:0/1724692899 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f058c008ee0 con 0x7f0594071980 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.550+0000 7f059a0fd700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f059000da50 tx=0x7f059000de10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.550+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f059000cc40 con 0x7f0594071980 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.550+0000 7f059c361700 1 -- 192.168.123.106:0/1724692899 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f059412e170 con 0x7f0594071980 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.550+0000 7f059c361700 1 -- 192.168.123.106:0/1724692899 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f059412e6c0 con 0x7f0594071980 2026-03-09T17:29:39.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.550+0000 7f059c361700 1 -- 192.168.123.106:0/1724692899 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f059407c8b0 con 0x7f0594071980 2026-03-09T17:29:39.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.556+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f059000cda0 con 0x7f0594071980 2026-03-09T17:29:39.559 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.556+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f059001d7a0 con 0x7f0594071980 2026-03-09T17:29:39.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.556+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f059001d980 con 0x7f0594071980 2026-03-09T17:29:39.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.557+0000 7f058b7fe700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 0x7f058006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.559 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.557+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f059008ce70 con 0x7f0594071980 2026-03-09T17:29:39.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.557+0000 7f05998fc700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 0x7f058006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.558+0000 7f05998fc700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 0x7f058006ec50 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f058c0062a0 tx=0x7f058c0061f0 comp rx=0 tx=0).ready entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.560+0000 
7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0590057670 con 0x7f0594071980 2026-03-09T17:29:39.565 INFO:tasks.workunit.client.0.vm06.stdout:6/413: write d6/fb [2806866,114781] 0 2026-03-09T17:29:39.575 INFO:tasks.workunit.client.0.vm06.stdout:1/516: creat d11/fa9 x:0 0 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:2/457: fdatasync d3/d4/d12/d2b/d2d/f48 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:8/472: mkdir d15/d31/d58/d9b 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:3/484: symlink dd/d19/d25/d2d/l9d 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:0/584: mkdir d7/d11/d19/d8b/dcb 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:7/548: dwrite d5/d1f/f80 [0,4194304] 0 2026-03-09T17:29:39.617 INFO:tasks.workunit.client.0.vm06.stdout:7/549: dwrite d5/d1f/f56 [0,4194304] 0 2026-03-09T17:29:39.618 INFO:tasks.workunit.client.0.vm06.stdout:0/585: dread d7/d11/d19/d23/f8e [4194304,4194304] 0 2026-03-09T17:29:39.642 INFO:tasks.workunit.client.0.vm06.stdout:4/501: dread db/d1d/d21/d25/d4b/f4e [0,4194304] 0 2026-03-09T17:29:39.646 INFO:tasks.workunit.client.0.vm06.stdout:6/414: unlink c3 0 2026-03-09T17:29:39.671 INFO:tasks.workunit.client.0.vm06.stdout:7/550: unlink d5/d12/d64/l90 0 2026-03-09T17:29:39.680 INFO:tasks.workunit.client.0.vm06.stdout:4/502: mknod db/d1d/d21/d37/d69/d78/cbb 0 2026-03-09T17:29:39.683 INFO:tasks.workunit.client.0.vm06.stdout:4/503: dwrite db/d1d/d21/d26/f70 [0,4194304] 0 2026-03-09T17:29:39.697 INFO:tasks.workunit.client.0.vm06.stdout:5/487: dwrite d4/d50/d18/f9d [0,4194304] 0 2026-03-09T17:29:39.709 INFO:tasks.workunit.client.0.vm06.stdout:3/485: dwrite dd/f38 [0,4194304] 0 2026-03-09T17:29:39.711 INFO:tasks.workunit.client.0.vm06.stdout:3/486: write dd/d1d/d2e/d67/f7e [541199,53467] 0 
2026-03-09T17:29:39.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.715+0000 7f059c361700 1 -- 192.168.123.106:0/1724692899 --> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0594061190 con 0x7f058006c7a0 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "0/23 daemons upgraded", 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm09", 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:29:39.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.718+0000 7f058b7fe700 1 -- 192.168.123.106:0/1724692899 <== mgr.14221 v2:192.168.123.106:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f0594061190 con 0x7f058006c7a0 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 msgr2=0x7f058006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.724 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 0x7f058006ec50 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f058c0062a0 tx=0x7f058c0061f0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 msgr2=0x7f05940825c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f059000da50 tx=0x7f059000de10 comp rx=0 tx=0).stop 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 shutdown_connections 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7f058006c7a0 0x7f058006ec50 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0594071980 0x7f05940825c0 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 --2- 192.168.123.106:0/1724692899 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0594082b00 0x7f0594082f70 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 >> 192.168.123.106:0/1724692899 conn(0x7f059406d1a0 msgr2=0x7f05940764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 shutdown_connections 2026-03-09T17:29:39.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.721+0000 7f05897fa700 1 -- 192.168.123.106:0/1724692899 wait complete. 2026-03-09T17:29:39.729 INFO:tasks.workunit.client.0.vm06.stdout:2/458: unlink d3/d4/d12/d2b/d36/d37/c88 0 2026-03-09T17:29:39.734 INFO:tasks.workunit.client.0.vm06.stdout:8/473: truncate d15/d16/d1a/f22 1196345 0 2026-03-09T17:29:39.742 INFO:tasks.workunit.client.0.vm06.stdout:0/586: symlink d7/d11/d19/d23/lcc 0 2026-03-09T17:29:39.752 INFO:tasks.workunit.client.0.vm06.stdout:5/488: truncate d4/d22/d46/f6e 991415 0 2026-03-09T17:29:39.759 INFO:tasks.workunit.client.0.vm06.stdout:1/517: dwrite d11/d14/d1d/d1e/d2a/d34/d58/fa1 [0,4194304] 0 2026-03-09T17:29:39.775 INFO:tasks.workunit.client.0.vm06.stdout:4/504: write db/d59/d5f/d5d/f62 [540919,69553] 0 2026-03-09T17:29:39.781 INFO:tasks.workunit.client.0.vm06.stdout:6/415: truncate d6/fb 987715 0 2026-03-09T17:29:39.789 INFO:tasks.workunit.client.0.vm06.stdout:9/568: getdents d3/d11 0 2026-03-09T17:29:39.802 INFO:tasks.workunit.client.0.vm06.stdout:3/487: dwrite dd/f26 [0,4194304] 0 2026-03-09T17:29:39.820 INFO:tasks.workunit.client.0.vm06.stdout:8/474: creat d15/d16/d1a/d47/f9c x:0 0 0 2026-03-09T17:29:39.824 INFO:tasks.workunit.client.0.vm06.stdout:7/551: creat d5/dd/f9d x:0 0 0 2026-03-09T17:29:39.828 INFO:tasks.workunit.client.0.vm06.stdout:0/587: rmdir d7/d11/d19/d1d/d87 39 2026-03-09T17:29:39.829 INFO:tasks.workunit.client.0.vm06.stdout:0/588: chown d7/d11/d5d/db8/l81 
528753 1 2026-03-09T17:29:39.837 INFO:tasks.workunit.client.0.vm06.stdout:5/489: read d4/f11 [567226,109003] 0 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.834+0000 7fb62940c700 1 -- 192.168.123.106:0/492988215 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 msgr2=0x7fb6241028b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.834+0000 7fb62940c700 1 --2- 192.168.123.106:0/492988215 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb6241028b0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fb614009b00 tx=0x7fb614009e10 comp rx=0 tx=0).stop 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 -- 192.168.123.106:0/492988215 shutdown_connections 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 --2- 192.168.123.106:0/492988215 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 0x7fb624103af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 --2- 192.168.123.106:0/492988215 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb6241028b0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 -- 192.168.123.106:0/492988215 >> 192.168.123.106:0/492988215 conn(0x7fb6240fda30 msgr2=0x7fb6240ffe80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:39.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 -- 192.168.123.106:0/492988215 shutdown_connections 2026-03-09T17:29:39.837 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.835+0000 7fb62940c700 1 -- 192.168.123.106:0/492988215 wait complete. 2026-03-09T17:29:39.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.836+0000 7fb62940c700 1 Processor -- start 2026-03-09T17:29:39.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.836+0000 7fb62940c700 1 -- start start 2026-03-09T17:29:39.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb62940c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb62940c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 0x7fb624198350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb622ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb622ffd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:47294/0 (socket says 192.168.123.106:47294) 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb622ffd700 1 -- 192.168.123.106:0/3124491648 learned_addr learned my addr 192.168.123.106:0/3124491648 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:29:39.839 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb624198970 con 0x7fb6241036a0 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb624198ab0 con 0x7fb6241024a0 2026-03-09T17:29:39.839 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb6227fc700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 0x7fb624198350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.837+0000 7fb622ffd700 1 -- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 msgr2=0x7fb624198350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:39.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.838+0000 7fb622ffd700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 0x7fb624198350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:39.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.838+0000 7fb622ffd700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6140097e0 con 0x7fb6241024a0 2026-03-09T17:29:39.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.838+0000 7fb6227fc700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fb6241036a0 0x7fb624198350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:29:39.840 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.838+0000 7fb622ffd700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb6140052d0 tx=0x7fb614004a60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.840+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb61401d070 con 0x7fb6241024a0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.840+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb61400bc50 con 0x7fb6241024a0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.840+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb61400f7c0 con 0x7fb6241024a0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.840+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb62419d500 con 0x7fb6241024a0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.840+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb62419d9f0 con 0x7fb6241024a0 2026-03-09T17:29:39.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.841+0000 7fb62940c700 1 
-- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb62404ea50 con 0x7fb6241024a0 2026-03-09T17:29:39.844 INFO:tasks.workunit.client.0.vm06.stdout:1/518: rename d11/d14/d1d/f8b to d11/d14/d1d/d42/d46/faa 0 2026-03-09T17:29:39.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.843+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb614022ae0 con 0x7fb6241024a0 2026-03-09T17:29:39.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.843+0000 7fb60bfff700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 0x7fb60c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:29:39.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.843+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb61408dc50 con 0x7fb6241024a0 2026-03-09T17:29:39.845 INFO:tasks.workunit.client.0.vm06.stdout:6/416: creat d6/d4f/d3e/d52/f84 x:0 0 0 2026-03-09T17:29:39.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.844+0000 7fb6227fc700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 0x7fb60c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:29:39.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.844+0000 7fb6227fc700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 0x7fb60c06ec50 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fb618009fd0 tx=0x7fb618009380 comp rx=0 tx=0).ready 
entity=mgr.14221 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:29:39.847 INFO:tasks.workunit.client.0.vm06.stdout:9/569: fdatasync d3/d26/f28 0 2026-03-09T17:29:39.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:39.845+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb61408de60 con 0x7fb6241024a0 2026-03-09T17:29:39.853 INFO:tasks.workunit.client.0.vm06.stdout:3/488: read - dd/d19/d25/d48/f4c zero size 2026-03-09T17:29:39.858 INFO:tasks.workunit.client.0.vm06.stdout:7/552: symlink d5/d12/d5f/l9e 0 2026-03-09T17:29:39.861 INFO:tasks.workunit.client.0.vm06.stdout:0/589: mknod d7/d11/d19/d23/db7/ccd 0 2026-03-09T17:29:39.865 INFO:tasks.workunit.client.0.vm06.stdout:0/590: dwrite d7/d11/d2d/f2f [0,4194304] 0 2026-03-09T17:29:39.867 INFO:tasks.workunit.client.0.vm06.stdout:4/505: sync 2026-03-09T17:29:39.868 INFO:tasks.workunit.client.0.vm06.stdout:0/591: stat d7/d11/c15 0 2026-03-09T17:29:39.868 INFO:tasks.workunit.client.0.vm06.stdout:7/553: sync 2026-03-09T17:29:39.880 INFO:tasks.workunit.client.0.vm06.stdout:7/554: dread d5/dd/f7d [0,4194304] 0 2026-03-09T17:29:39.880 INFO:tasks.workunit.client.0.vm06.stdout:7/555: chown d5/f71 13799679 1 2026-03-09T17:29:39.892 INFO:tasks.workunit.client.0.vm06.stdout:5/490: mknod d4/d50/d35/d40/d95/cbd 0 2026-03-09T17:29:39.893 INFO:tasks.workunit.client.0.vm06.stdout:5/491: readlink d4/d22/d64/l86 0 2026-03-09T17:29:39.893 INFO:tasks.workunit.client.0.vm06.stdout:5/492: truncate d4/d52/f8a 801655 0 2026-03-09T17:29:39.894 INFO:tasks.workunit.client.0.vm06.stdout:7/556: sync 2026-03-09T17:29:39.897 INFO:tasks.workunit.client.0.vm06.stdout:7/557: dread d5/d12/d64/f77 [0,4194304] 0 2026-03-09T17:29:39.897 INFO:tasks.workunit.client.0.vm06.stdout:7/558: fsync d5/d1f/d34/d46/d51/f7b 0 2026-03-09T17:29:39.901 INFO:tasks.workunit.client.0.vm06.stdout:1/519: 
fsync d11/d14/d1d/d42/f52 0 2026-03-09T17:29:39.905 INFO:tasks.workunit.client.0.vm06.stdout:2/459: dwrite d3/d4/d12/d2b/d36/d37/f41 [4194304,4194304] 0 2026-03-09T17:29:39.918 INFO:tasks.workunit.client.0.vm06.stdout:8/475: dwrite d15/d16/d1e/d30/f3b [0,4194304] 0 2026-03-09T17:29:39.923 INFO:tasks.workunit.client.0.vm06.stdout:8/476: dwrite d15/d16/d19/d2b/d85/f9a [0,4194304] 0 2026-03-09T17:29:39.930 INFO:tasks.workunit.client.0.vm06.stdout:8/477: fdatasync d15/d16/d19/f61 0 2026-03-09T17:29:39.938 INFO:tasks.workunit.client.0.vm06.stdout:3/489: symlink dd/d19/d2c/l9e 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:5/493: mkdir d4/d22/dbe 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:6/417: fdatasync d6/f5c 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:5/494: fdatasync d4/d22/d64/f9f 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:9/570: dread d3/d15/d36/d4c/f55 [0,4194304] 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:1/520: unlink d11/d14/f59 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:9/571: write d3/d15/f23 [3829619,44148] 0 2026-03-09T17:29:39.963 INFO:tasks.workunit.client.0.vm06.stdout:2/460: mknod d3/d4/d12/d2b/d36/c93 0 2026-03-09T17:29:39.965 INFO:tasks.workunit.client.0.vm06.stdout:2/461: dread d3/d4/f3c [0,4194304] 0 2026-03-09T17:29:39.975 INFO:tasks.workunit.client.0.vm06.stdout:8/478: dread d15/d39/f45 [0,4194304] 0 2026-03-09T17:29:39.977 INFO:tasks.workunit.client.0.vm06.stdout:8/479: stat d15/d16/d1a/l1d 0 2026-03-09T17:29:39.980 INFO:tasks.workunit.client.0.vm06.stdout:4/506: symlink db/lbc 0 2026-03-09T17:29:39.984 INFO:tasks.workunit.client.0.vm06.stdout:3/490: creat dd/d1d/f9f x:0 0 0 2026-03-09T17:29:39.997 INFO:tasks.workunit.client.0.vm06.stdout:5/495: read d4/d50/d18/f4a [2687489,108599] 0 2026-03-09T17:29:39.997 INFO:tasks.workunit.client.0.vm06.stdout:5/496: readlink d4/d50/d18/l7b 0 2026-03-09T17:29:40.014 
INFO:tasks.workunit.client.0.vm06.stdout:1/521: rmdir d11/d14/d1d/d1e 39 2026-03-09T17:29:40.029 INFO:tasks.workunit.client.0.vm06.stdout:8/480: chown d15/d31/c53 113450123 1 2026-03-09T17:29:40.029 INFO:tasks.workunit.client.0.vm06.stdout:8/481: chown d15/d16/l1c 1748605 1 2026-03-09T17:29:40.037 INFO:tasks.workunit.client.0.vm06.stdout:4/507: fdatasync f6 0 2026-03-09T17:29:40.037 INFO:tasks.workunit.client.0.vm06.stdout:4/508: fdatasync db/f6f 0 2026-03-09T17:29:40.041 INFO:tasks.workunit.client.0.vm06.stdout:2/462: dread d3/f3b [0,4194304] 0 2026-03-09T17:29:40.046 INFO:tasks.workunit.client.0.vm06.stdout:0/592: creat d7/fce x:0 0 0 2026-03-09T17:29:40.050 INFO:tasks.workunit.client.0.vm06.stdout:3/491: unlink dd/d1d/f8c 0 2026-03-09T17:29:40.053 INFO:tasks.workunit.client.0.vm06.stdout:7/559: link d5/d7/c31 d5/d7/c9f 0 2026-03-09T17:29:40.056 INFO:tasks.workunit.client.0.vm06.stdout:6/418: mkdir d6/d12/d17/d85 0 2026-03-09T17:29:40.071 INFO:tasks.workunit.client.0.vm06.stdout:6/419: sync 2026-03-09T17:29:40.072 INFO:tasks.workunit.client.0.vm06.stdout:4/509: creat db/d1d/d21/d37/fbd x:0 0 0 2026-03-09T17:29:40.073 INFO:tasks.workunit.client.0.vm06.stdout:4/510: chown db/df/l3f 0 1 2026-03-09T17:29:40.074 INFO:tasks.workunit.client.0.vm06.stdout:6/420: dwrite d6/d12/d53/f5b [0,4194304] 0 2026-03-09T17:29:40.081 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.078+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb62419dcd0 con 0x7fb6241024a0 2026-03-09T17:29:40.081 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.079+0000 7fb60bfff700 1 -- 192.168.123.106:0/3124491648 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb614027070 con 0x7fb6241024a0 2026-03-09T17:29:40.081 
INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.082+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 msgr2=0x7fb60c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.082+0000 7fb62940c700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 0x7fb60c06ec50 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fb618009fd0 tx=0x7fb618009380 comp rx=0 tx=0).stop 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.082+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 msgr2=0x7fb624197e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.082+0000 7fb62940c700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb6140052d0 tx=0x7fb614004a60 comp rx=0 tx=0).stop 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.083+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 shutdown_connections 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.083+0000 7fb62940c700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb6241024a0 0x7fb624197e10 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:40.085 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.083+0000 7fb62940c700 1 --2- 192.168.123.106:0/3124491648 >> 
[v2:192.168.123.106:6800/2,v1:192.168.123.106:6801/2] conn(0x7fb60c06c7a0 0x7fb60c06ec50 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:40.086 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.083+0000 7fb62940c700 1 --2- 192.168.123.106:0/3124491648 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb6241036a0 0x7fb624198350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:29:40.086 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.083+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 >> 192.168.123.106:0/3124491648 conn(0x7fb6240fda30 msgr2=0x7fb6241068d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:29:40.087 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.084+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 shutdown_connections 2026-03-09T17:29:40.087 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:29:40.085+0000 7fb62940c700 1 -- 192.168.123.106:0/3124491648 wait complete. 
2026-03-09T17:29:40.091 INFO:tasks.workunit.client.0.vm06.stdout:3/492: creat dd/d1d/d6e/fa0 x:0 0 0 2026-03-09T17:29:40.102 INFO:tasks.workunit.client.0.vm06.stdout:1/522: creat d11/d14/d1d/d1e/d96/fab x:0 0 0 2026-03-09T17:29:40.116 INFO:tasks.workunit.client.0.vm06.stdout:5/497: dwrite d4/d22/d46/f6e [0,4194304] 0 2026-03-09T17:29:40.135 INFO:tasks.workunit.client.0.vm06.stdout:3/493: mkdir dd/d59/da1 0 2026-03-09T17:29:40.135 INFO:tasks.workunit.client.0.vm06.stdout:3/494: readlink dd/d19/d2c/l9e 0 2026-03-09T17:29:40.137 INFO:tasks.workunit.client.0.vm06.stdout:1/523: creat d11/d14/d1c/d1f/d57/fac x:0 0 0 2026-03-09T17:29:40.138 INFO:tasks.workunit.client.0.vm06.stdout:9/572: rename d3/d11/f14 to d3/d26/d35/fb0 0 2026-03-09T17:29:40.139 INFO:tasks.workunit.client.0.vm06.stdout:8/482: creat d15/f9d x:0 0 0 2026-03-09T17:29:40.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:40 vm06.local ceph-mon[57307]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:40.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:40 vm06.local ceph-mon[57307]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:40.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:40 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/4139756502' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:29:40.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:40 vm06.local ceph-mon[57307]: pgmap v153: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 116 GiB / 120 GiB avail; 22 MiB/s rd, 79 MiB/s wr, 288 op/s 2026-03-09T17:29:40.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:40 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/3937072729' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:29:40.143 INFO:tasks.workunit.client.0.vm06.stdout:4/511: truncate db/f39 783337 0 2026-03-09T17:29:40.145 INFO:tasks.workunit.client.0.vm06.stdout:6/421: link d6/d47/d4d/f6e d6/d12/d53/f86 0 2026-03-09T17:29:40.149 INFO:tasks.workunit.client.0.vm06.stdout:0/593: rename d7/d11/d19/d1d/d39/l7b to d7/d11/d19/d3c/lcf 0 2026-03-09T17:29:40.157 INFO:tasks.workunit.client.0.vm06.stdout:4/512: creat db/d1d/d21/d25/d4b/d85/fbe x:0 0 0 2026-03-09T17:29:40.161 INFO:tasks.workunit.client.0.vm06.stdout:2/463: dwrite d3/d4/f11 [0,4194304] 0 2026-03-09T17:29:40.161 INFO:tasks.workunit.client.0.vm06.stdout:3/495: mknod dd/d81/d97/ca2 0 2026-03-09T17:29:40.180 INFO:tasks.workunit.client.0.vm06.stdout:9/573: dwrite d3/d15/f46 [0,4194304] 0 2026-03-09T17:29:40.184 INFO:tasks.workunit.client.0.vm06.stdout:7/560: rename d5/d7/d2b/f52 to d5/dd/fa0 0 2026-03-09T17:29:40.184 INFO:tasks.workunit.client.0.vm06.stdout:7/561: stat d5/d1f 0 2026-03-09T17:29:40.196 INFO:tasks.workunit.client.0.vm06.stdout:8/483: link f7 d15/d16/d1a/d7c/f9e 0 2026-03-09T17:29:40.196 INFO:tasks.workunit.client.0.vm06.stdout:8/484: chown d15/d16/d1e/d28/d5e/c90 265995422 1 2026-03-09T17:29:40.199 INFO:tasks.workunit.client.0.vm06.stdout:8/485: chown d15/d39/d67/d77 7 1 2026-03-09T17:29:40.201 INFO:tasks.workunit.client.0.vm06.stdout:9/574: dread d3/d26/f76 [0,4194304] 0 2026-03-09T17:29:40.201 INFO:tasks.workunit.client.0.vm06.stdout:9/575: fsync d3/d6d/d9a/d9c/fab 0 2026-03-09T17:29:40.204 INFO:tasks.workunit.client.0.vm06.stdout:4/513: truncate db/d1d/d21/d44/d8a/fa3 1206397 0 2026-03-09T17:29:40.206 INFO:tasks.workunit.client.0.vm06.stdout:3/496: mkdir dd/d81/da3 0 2026-03-09T17:29:40.206 INFO:tasks.workunit.client.0.vm06.stdout:2/464: creat d3/d4/d46/f94 x:0 0 0 2026-03-09T17:29:40.206 INFO:tasks.workunit.client.0.vm06.stdout:3/497: stat dd/d1d/d6e 0 2026-03-09T17:29:40.206 
INFO:tasks.workunit.client.0.vm06.stdout:2/465: dread - d3/f5a zero size 2026-03-09T17:29:40.207 INFO:tasks.workunit.client.0.vm06.stdout:0/594: dread d7/d11/d19/d1d/d87/f92 [0,4194304] 0 2026-03-09T17:29:40.209 INFO:tasks.workunit.client.0.vm06.stdout:0/595: read d7/d11/d19/d1d/d39/f7d [528887,95767] 0 2026-03-09T17:29:40.209 INFO:tasks.workunit.client.0.vm06.stdout:0/596: chown d7/d11/d2d/fc3 542796600 1 2026-03-09T17:29:40.211 INFO:tasks.workunit.client.0.vm06.stdout:1/524: link d11/d14/d1d/d1e/d96/f9b d11/d69/fad 0 2026-03-09T17:29:40.219 INFO:tasks.workunit.client.0.vm06.stdout:0/597: dread d7/d11/d19/d8b/da4/fab [0,4194304] 0 2026-03-09T17:29:40.222 INFO:tasks.workunit.client.0.vm06.stdout:0/598: dread d7/d11/d19/f57 [0,4194304] 0 2026-03-09T17:29:40.223 INFO:tasks.workunit.client.0.vm06.stdout:0/599: write d7/d11/d19/d1d/f40 [576317,36763] 0 2026-03-09T17:29:40.225 INFO:tasks.workunit.client.0.vm06.stdout:7/562: creat d5/d7/d2b/fa1 x:0 0 0 2026-03-09T17:29:40.242 INFO:tasks.workunit.client.0.vm06.stdout:8/486: symlink d15/d16/d1a/l9f 0 2026-03-09T17:29:40.242 INFO:tasks.workunit.client.0.vm06.stdout:7/563: sync 2026-03-09T17:29:40.246 INFO:tasks.workunit.client.0.vm06.stdout:5/498: getdents d4/d22/d64 0 2026-03-09T17:29:40.246 INFO:tasks.workunit.client.0.vm06.stdout:5/499: chown d4/d50/d18/f5c 2913784 1 2026-03-09T17:29:40.247 INFO:tasks.workunit.client.0.vm06.stdout:7/564: dwrite d5/dd/f3e [0,4194304] 0 2026-03-09T17:29:40.248 INFO:tasks.workunit.client.0.vm06.stdout:4/514: mknod db/d1d/d21/d25/d4b/d85/cbf 0 2026-03-09T17:29:40.259 INFO:tasks.workunit.client.0.vm06.stdout:6/422: rename d6/fb to d6/d12/d53/f87 0 2026-03-09T17:29:40.264 INFO:tasks.workunit.client.0.vm06.stdout:9/576: creat d3/d15/d36/d83/fb1 x:0 0 0 2026-03-09T17:29:40.266 INFO:tasks.workunit.client.0.vm06.stdout:9/577: dread d3/d15/d36/d4c/d6a/fa9 [0,4194304] 0 2026-03-09T17:29:40.271 INFO:tasks.workunit.client.0.vm06.stdout:9/578: dwrite d3/d15/f17 [4194304,4194304] 0 
2026-03-09T17:29:40.271 INFO:tasks.workunit.client.0.vm06.stdout:5/500: symlink d4/d50/d35/lbf 0 2026-03-09T17:29:40.283 INFO:tasks.workunit.client.0.vm06.stdout:4/515: creat db/d1d/d21/d25/fc0 x:0 0 0 2026-03-09T17:29:40.287 INFO:tasks.workunit.client.0.vm06.stdout:7/565: creat d5/d12/fa2 x:0 0 0 2026-03-09T17:29:40.287 INFO:tasks.workunit.client.0.vm06.stdout:4/516: dwrite db/d1d/d21/d25/fc0 [0,4194304] 0 2026-03-09T17:29:40.289 INFO:tasks.workunit.client.0.vm06.stdout:7/566: write d5/d1f/d34/d46/f89 [303439,64070] 0 2026-03-09T17:29:40.295 INFO:tasks.workunit.client.0.vm06.stdout:2/466: creat d3/d4/d22/d72/d8f/f95 x:0 0 0 2026-03-09T17:29:40.309 INFO:tasks.workunit.client.0.vm06.stdout:3/498: dwrite dd/d5b/d65/f6a [0,4194304] 0 2026-03-09T17:29:40.313 INFO:tasks.workunit.client.0.vm06.stdout:6/423: fsync d6/d12/f76 0 2026-03-09T17:29:40.315 INFO:tasks.workunit.client.0.vm06.stdout:3/499: dwrite dd/d19/d1e/f3f [0,4194304] 0 2026-03-09T17:29:40.329 INFO:tasks.workunit.client.0.vm06.stdout:0/600: symlink d7/d11/d2d/dca/ld0 0 2026-03-09T17:29:40.329 INFO:tasks.workunit.client.0.vm06.stdout:0/601: write d7/d11/d5d/db8/fc6 [585254,14265] 0 2026-03-09T17:29:40.337 INFO:tasks.workunit.client.0.vm06.stdout:8/487: write d15/d16/d1a/d47/f76 [152034,8999] 0 2026-03-09T17:29:40.337 INFO:tasks.workunit.client.0.vm06.stdout:8/488: chown d15/l8a 18 1 2026-03-09T17:29:40.343 INFO:tasks.workunit.client.0.vm06.stdout:4/517: rmdir db/d59/d5f/d5d 39 2026-03-09T17:29:40.352 INFO:tasks.workunit.client.0.vm06.stdout:7/567: rename d5/d1f/d34/c63 to d5/d7/d2b/ca3 0 2026-03-09T17:29:40.352 INFO:tasks.workunit.client.0.vm06.stdout:4/518: dwrite db/d1d/d21/d44/fb7 [0,4194304] 0 2026-03-09T17:29:40.352 INFO:tasks.workunit.client.0.vm06.stdout:2/467: creat d3/d4/d22/d43/d77/d81/d64/d6a/f96 x:0 0 0 2026-03-09T17:29:40.352 INFO:tasks.workunit.client.0.vm06.stdout:1/525: creat d11/d14/d1d/d1e/d2a/fae x:0 0 0 2026-03-09T17:29:40.356 INFO:tasks.workunit.client.0.vm06.stdout:6/424: fdatasync 
d6/d4f/d3e/d52/f74 0 2026-03-09T17:29:40.356 INFO:tasks.workunit.client.0.vm06.stdout:6/425: chown d6/d12/d17/d27 73 1 2026-03-09T17:29:40.357 INFO:tasks.workunit.client.0.vm06.stdout:6/426: chown d6/d12/d2d 3 1 2026-03-09T17:29:40.358 INFO:tasks.workunit.client.0.vm06.stdout:0/602: truncate d7/d11/d19/d37/f6d 1458211 0 2026-03-09T17:29:40.358 INFO:tasks.workunit.client.0.vm06.stdout:0/603: read - d7/d88/fbe zero size 2026-03-09T17:29:40.359 INFO:tasks.workunit.client.0.vm06.stdout:0/604: chown d7/d11/f1c 5 1 2026-03-09T17:29:40.360 INFO:tasks.workunit.client.0.vm06.stdout:9/579: creat d3/dad/fb2 x:0 0 0 2026-03-09T17:29:40.361 INFO:tasks.workunit.client.0.vm06.stdout:5/501: creat d4/d50/d18/d3d/fc0 x:0 0 0 2026-03-09T17:29:40.364 INFO:tasks.workunit.client.0.vm06.stdout:3/500: rename dd/d19/d25/d2d/f71 to dd/d59/da1/fa4 0 2026-03-09T17:29:40.368 INFO:tasks.workunit.client.0.vm06.stdout:2/468: creat d3/d4/d12/d71/f97 x:0 0 0 2026-03-09T17:29:40.369 INFO:tasks.workunit.client.0.vm06.stdout:2/469: write f2 [2172317,94655] 0 2026-03-09T17:29:40.377 INFO:tasks.workunit.client.0.vm06.stdout:4/519: dread db/d1d/d21/d37/d69/f8b [0,4194304] 0 2026-03-09T17:29:40.381 INFO:tasks.workunit.client.0.vm06.stdout:4/520: dwrite db/df/f4d [0,4194304] 0 2026-03-09T17:29:40.384 INFO:tasks.workunit.client.0.vm06.stdout:4/521: chown db/d1d/d21/d37/d69/d78/c93 84 1 2026-03-09T17:29:40.385 INFO:tasks.workunit.client.0.vm06.stdout:0/605: creat d7/d11/d19/d23/db7/fd1 x:0 0 0 2026-03-09T17:29:40.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:40 vm09.local ceph-mon[62061]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:40.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:40 vm09.local ceph-mon[62061]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:40.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:29:40 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/4139756502' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:29:40.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:40 vm09.local ceph-mon[62061]: pgmap v153: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 116 GiB / 120 GiB avail; 22 MiB/s rd, 79 MiB/s wr, 288 op/s 2026-03-09T17:29:40.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:40 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/3937072729' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:29:40.408 INFO:tasks.workunit.client.0.vm06.stdout:5/502: creat d4/d50/d35/d40/fc1 x:0 0 0 2026-03-09T17:29:40.414 INFO:tasks.workunit.client.0.vm06.stdout:8/489: rename d15/d16/f54 to d15/d39/d67/d77/fa0 0 2026-03-09T17:29:40.414 INFO:tasks.workunit.client.0.vm06.stdout:8/490: chown d15/d16/d1a/d7c 194352 1 2026-03-09T17:29:40.417 INFO:tasks.workunit.client.0.vm06.stdout:5/503: dread d4/d50/d18/d3d/f54 [0,4194304] 0 2026-03-09T17:29:40.434 INFO:tasks.workunit.client.0.vm06.stdout:1/526: mkdir d11/d14/d1c/d1f/daf 0 2026-03-09T17:29:40.439 INFO:tasks.workunit.client.0.vm06.stdout:0/606: creat d7/d11/d5d/d64/fd2 x:0 0 0 2026-03-09T17:29:40.448 INFO:tasks.workunit.client.0.vm06.stdout:3/501: rename dd/d1d/d4e/l54 to dd/d19/d25/d44/d80/la5 0 2026-03-09T17:29:40.451 INFO:tasks.workunit.client.0.vm06.stdout:5/504: fsync d4/d22/f45 0 2026-03-09T17:29:40.455 INFO:tasks.workunit.client.0.vm06.stdout:0/607: rmdir d7/d11/d2d 39 2026-03-09T17:29:40.461 INFO:tasks.workunit.client.0.vm06.stdout:8/491: dread d15/d16/d1a/d47/f7e [0,4194304] 0 2026-03-09T17:29:40.468 INFO:tasks.workunit.client.0.vm06.stdout:7/568: dwrite d5/d12/d64/f77 [0,4194304] 0 2026-03-09T17:29:40.480 INFO:tasks.workunit.client.0.vm06.stdout:3/502: unlink dd/c69 0 2026-03-09T17:29:40.480 INFO:tasks.workunit.client.0.vm06.stdout:3/503: stat dd/d19/d28/l3b 0 2026-03-09T17:29:40.481 
INFO:tasks.workunit.client.0.vm06.stdout:4/522: write db/d1d/f3a [1923724,99034] 0 2026-03-09T17:29:40.484 INFO:tasks.workunit.client.0.vm06.stdout:5/505: fsync d4/d52/f8f 0 2026-03-09T17:29:40.492 INFO:tasks.workunit.client.0.vm06.stdout:2/470: dwrite d3/d44/f6c [0,4194304] 0 2026-03-09T17:29:40.494 INFO:tasks.workunit.client.0.vm06.stdout:7/569: symlink d5/dd/d79/la4 0 2026-03-09T17:29:40.495 INFO:tasks.workunit.client.0.vm06.stdout:7/570: write d5/d12/f2c [1805113,124852] 0 2026-03-09T17:29:40.496 INFO:tasks.workunit.client.0.vm06.stdout:9/580: symlink d3/d26/d35/d9f/lb3 0 2026-03-09T17:29:40.504 INFO:tasks.workunit.client.0.vm06.stdout:9/581: dwrite d3/d11/f87 [4194304,4194304] 0 2026-03-09T17:29:40.505 INFO:tasks.workunit.client.0.vm06.stdout:7/571: dwrite d5/d1f/d34/d46/f55 [4194304,4194304] 0 2026-03-09T17:29:40.518 INFO:tasks.workunit.client.0.vm06.stdout:4/523: sync 2026-03-09T17:29:40.518 INFO:tasks.workunit.client.0.vm06.stdout:3/504: symlink dd/d59/la6 0 2026-03-09T17:29:40.518 INFO:tasks.workunit.client.0.vm06.stdout:4/524: chown db/d1d/d21/d37/fbd 299 1 2026-03-09T17:29:40.522 INFO:tasks.workunit.client.0.vm06.stdout:1/527: mkdir d11/d14/d1d/d1e/d2a/d99/db0 0 2026-03-09T17:29:40.526 INFO:tasks.workunit.client.0.vm06.stdout:0/608: unlink d7/d11/d2d/l61 0 2026-03-09T17:29:40.530 INFO:tasks.workunit.client.0.vm06.stdout:7/572: fsync d5/dd/f7d 0 2026-03-09T17:29:40.531 INFO:tasks.workunit.client.0.vm06.stdout:7/573: chown d5/d1f/d34/d46/f4e 146908 1 2026-03-09T17:29:40.532 INFO:tasks.workunit.client.0.vm06.stdout:7/574: write d5/d1f/d34/f5e [5359855,86409] 0 2026-03-09T17:29:40.535 INFO:tasks.workunit.client.0.vm06.stdout:3/505: creat dd/d5b/fa7 x:0 0 0 2026-03-09T17:29:40.546 INFO:tasks.workunit.client.0.vm06.stdout:0/609: creat d7/d11/d2d/daf/fd3 x:0 0 0 2026-03-09T17:29:40.551 INFO:tasks.workunit.client.0.vm06.stdout:6/427: rename d6/f56 to d6/d47/f88 0 2026-03-09T17:29:40.557 INFO:tasks.workunit.client.0.vm06.stdout:3/506: symlink dd/d1d/d2e/d67/la8 0 
2026-03-09T17:29:40.559 INFO:tasks.workunit.client.0.vm06.stdout:0/610: fsync d7/f50 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:8/492: getdents d15/d16/d1a/d47 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:2/471: rename d3/d4/d12/d2b/d36/d37/c78 to d3/d4/d22/d72/d8f/c98 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:7/575: truncate d5/dd/f48 683076 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:7/576: write d5/d1f/d34/d46/f55 [1104646,116781] 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:3/507: creat dd/d81/fa9 x:0 0 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:0/611: mknod d7/d11/d5d/d64/cd4 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:2/472: fsync d3/f3b 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:2/473: write d3/d4/d12/d2b/d2d/f6f [803045,52563] 0 2026-03-09T17:29:40.571 INFO:tasks.workunit.client.0.vm06.stdout:7/577: mkdir d5/d1f/d34/d46/d51/da5 0 2026-03-09T17:29:40.574 INFO:tasks.workunit.client.0.vm06.stdout:4/525: dread db/d1d/f3a [0,4194304] 0 2026-03-09T17:29:40.574 INFO:tasks.workunit.client.0.vm06.stdout:3/508: symlink dd/d19/d28/laa 0 2026-03-09T17:29:40.575 INFO:tasks.workunit.client.0.vm06.stdout:3/509: truncate dd/d5c/f66 1099134 0 2026-03-09T17:29:40.576 INFO:tasks.workunit.client.0.vm06.stdout:3/510: write dd/d1d/d4e/f7d [53772,43836] 0 2026-03-09T17:29:40.577 INFO:tasks.workunit.client.0.vm06.stdout:0/612: creat d7/d11/d2d/daf/fd5 x:0 0 0 2026-03-09T17:29:40.582 INFO:tasks.workunit.client.0.vm06.stdout:8/493: mknod d15/d16/d1e/d8f/ca1 0 2026-03-09T17:29:40.591 INFO:tasks.workunit.client.0.vm06.stdout:5/506: dwrite d4/fb [0,4194304] 0 2026-03-09T17:29:40.592 INFO:tasks.workunit.client.0.vm06.stdout:5/507: chown d4/d50/d18/f4a 6 1 2026-03-09T17:29:40.593 INFO:tasks.workunit.client.0.vm06.stdout:5/508: chown d4/dbb 519 1 2026-03-09T17:29:40.594 
INFO:tasks.workunit.client.0.vm06.stdout:5/509: chown d4/d50/d35/d40/d96/d98/cba 123138888 1 2026-03-09T17:29:40.601 INFO:tasks.workunit.client.0.vm06.stdout:2/474: rmdir d3/d4/d22/d43/d77/d81/d64 39 2026-03-09T17:29:40.603 INFO:tasks.workunit.client.0.vm06.stdout:7/578: rmdir d5/d12 39 2026-03-09T17:29:40.604 INFO:tasks.workunit.client.0.vm06.stdout:7/579: fsync d5/d1f/d34/f54 0 2026-03-09T17:29:40.605 INFO:tasks.workunit.client.0.vm06.stdout:4/526: mkdir db/d1d/d21/d44/dc1 0 2026-03-09T17:29:40.607 INFO:tasks.workunit.client.0.vm06.stdout:3/511: creat dd/d19/d28/fab x:0 0 0 2026-03-09T17:29:40.608 INFO:tasks.workunit.client.0.vm06.stdout:0/613: unlink d7/fb 0 2026-03-09T17:29:40.609 INFO:tasks.workunit.client.0.vm06.stdout:8/494: mknod d15/d16/d1a/d47/ca2 0 2026-03-09T17:29:40.611 INFO:tasks.workunit.client.0.vm06.stdout:1/528: rename d11/d14/d1c/d1f/d57/d7b/l9e to d11/d14/d1d/lb1 0 2026-03-09T17:29:40.612 INFO:tasks.workunit.client.0.vm06.stdout:1/529: read - d11/d14/d1d/d1e/d96/fab zero size 2026-03-09T17:29:40.620 INFO:tasks.workunit.client.0.vm06.stdout:9/582: dwrite d3/d15/d48/f64 [0,4194304] 0 2026-03-09T17:29:40.623 INFO:tasks.workunit.client.0.vm06.stdout:4/527: rmdir db/d1d/d21/d25/d4b 39 2026-03-09T17:29:40.624 INFO:tasks.workunit.client.0.vm06.stdout:4/528: chown db/d1d/d21/d26/f70 4 1 2026-03-09T17:29:40.625 INFO:tasks.workunit.client.0.vm06.stdout:4/529: write db/d59/d5f/d45/fa9 [2135149,92605] 0 2026-03-09T17:29:40.627 INFO:tasks.workunit.client.0.vm06.stdout:3/512: truncate dd/d1d/f34 3407501 0 2026-03-09T17:29:40.638 INFO:tasks.workunit.client.0.vm06.stdout:0/614: unlink d7/d11/c47 0 2026-03-09T17:29:40.638 INFO:tasks.workunit.client.0.vm06.stdout:0/615: dread - d7/d11/d19/d1d/fb3 zero size 2026-03-09T17:29:40.639 INFO:tasks.workunit.client.0.vm06.stdout:0/616: chown d7/l63 11 1 2026-03-09T17:29:40.641 INFO:tasks.workunit.client.0.vm06.stdout:8/495: mknod d15/d16/d1e/d28/ca3 0 2026-03-09T17:29:40.650 INFO:tasks.workunit.client.0.vm06.stdout:7/580: 
link d5/d1f/f8a d5/dd/fa6 0 2026-03-09T17:29:40.651 INFO:tasks.workunit.client.0.vm06.stdout:7/581: chown d5/d7/f58 1661126 1 2026-03-09T17:29:40.656 INFO:tasks.workunit.client.0.vm06.stdout:9/583: creat d3/d6d/d9a/fb4 x:0 0 0 2026-03-09T17:29:40.656 INFO:tasks.workunit.client.0.vm06.stdout:9/584: write d3/d15/f1a [550632,45070] 0 2026-03-09T17:29:40.665 INFO:tasks.workunit.client.0.vm06.stdout:4/530: read db/d1d/d21/d37/f54 [1433198,91353] 0 2026-03-09T17:29:40.671 INFO:tasks.workunit.client.0.vm06.stdout:6/428: truncate d6/d12/d17/d27/f3d 888340 0 2026-03-09T17:29:40.671 INFO:tasks.workunit.client.0.vm06.stdout:6/429: fsync d6/d12/d53/f64 0 2026-03-09T17:29:40.675 INFO:tasks.workunit.client.0.vm06.stdout:0/617: write d7/d11/f20 [3416976,73964] 0 2026-03-09T17:29:40.692 INFO:tasks.workunit.client.0.vm06.stdout:5/510: dwrite d4/d22/d46/f82 [0,4194304] 0 2026-03-09T17:29:40.694 INFO:tasks.workunit.client.0.vm06.stdout:4/531: dread db/d1d/d21/f67 [0,4194304] 0 2026-03-09T17:29:40.695 INFO:tasks.workunit.client.0.vm06.stdout:5/511: read d4/d50/d18/f4a [1635055,60175] 0 2026-03-09T17:29:40.695 INFO:tasks.workunit.client.0.vm06.stdout:5/512: write d4/f26 [650111,110330] 0 2026-03-09T17:29:40.703 INFO:tasks.workunit.client.0.vm06.stdout:2/475: dwrite d3/d4/d12/d71/f8c [0,4194304] 0 2026-03-09T17:29:40.715 INFO:tasks.workunit.client.0.vm06.stdout:2/476: dread d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0 2026-03-09T17:29:40.726 INFO:tasks.workunit.client.0.vm06.stdout:0/618: mknod d7/d11/d89/d99/cd6 0 2026-03-09T17:29:40.727 INFO:tasks.workunit.client.0.vm06.stdout:6/430: rmdir d6/d12/d17 39 2026-03-09T17:29:40.732 INFO:tasks.workunit.client.0.vm06.stdout:7/582: dwrite d5/d1f/d34/d46/f4e [0,4194304] 0 2026-03-09T17:29:40.734 INFO:tasks.workunit.client.0.vm06.stdout:7/583: write d5/f71 [1467275,76120] 0 2026-03-09T17:29:40.735 INFO:tasks.workunit.client.0.vm06.stdout:4/532: stat db/d1d/f1f 0 2026-03-09T17:29:40.738 INFO:tasks.workunit.client.0.vm06.stdout:5/513: truncate 
d4/d50/d18/f4b 3198246 0 2026-03-09T17:29:40.739 INFO:tasks.workunit.client.0.vm06.stdout:5/514: chown d4/d22/d46/lb1 21 1 2026-03-09T17:29:40.744 INFO:tasks.workunit.client.0.vm06.stdout:3/513: link dd/d1d/f45 dd/d81/fac 0 2026-03-09T17:29:40.754 INFO:tasks.workunit.client.0.vm06.stdout:0/619: rmdir d7/d11/d19/d23/db7/dbd/dc1 39 2026-03-09T17:29:40.754 INFO:tasks.workunit.client.0.vm06.stdout:8/496: rmdir d15/d16/d19/d2b/d43 0 2026-03-09T17:29:40.754 INFO:tasks.workunit.client.0.vm06.stdout:8/497: write d15/d39/f4b [1669334,20444] 0 2026-03-09T17:29:40.760 INFO:tasks.workunit.client.0.vm06.stdout:9/585: creat d3/d26/d35/fb5 x:0 0 0 2026-03-09T17:29:40.760 INFO:tasks.workunit.client.0.vm06.stdout:2/477: sync 2026-03-09T17:29:40.763 INFO:tasks.workunit.client.0.vm06.stdout:7/584: symlink d5/dd/d79/la7 0 2026-03-09T17:29:40.768 INFO:tasks.workunit.client.0.vm06.stdout:5/515: write d4/d7e/fab [686428,122391] 0 2026-03-09T17:29:40.770 INFO:tasks.workunit.client.0.vm06.stdout:3/514: dread dd/d1d/f4b [0,4194304] 0 2026-03-09T17:29:40.774 INFO:tasks.workunit.client.0.vm06.stdout:0/620: fsync d7/d11/d19/d37/f4f 0 2026-03-09T17:29:40.777 INFO:tasks.workunit.client.0.vm06.stdout:6/431: dread - d6/d4f/d3e/f51 zero size 2026-03-09T17:29:40.781 INFO:tasks.workunit.client.0.vm06.stdout:1/530: link d11/d14/d1d/d1e/d2a/c98 d11/d14/d1d/d1e/cb2 0 2026-03-09T17:29:40.783 INFO:tasks.workunit.client.0.vm06.stdout:7/585: fdatasync d5/d1f/d34/d46/f4e 0 2026-03-09T17:29:40.790 INFO:tasks.workunit.client.0.vm06.stdout:9/586: creat d3/d6d/d85/fb6 x:0 0 0 2026-03-09T17:29:40.791 INFO:tasks.workunit.client.0.vm06.stdout:9/587: write d3/d15/f17 [6543478,53885] 0 2026-03-09T17:29:40.793 INFO:tasks.workunit.client.0.vm06.stdout:7/586: dread d5/f8 [4194304,4194304] 0 2026-03-09T17:29:40.798 INFO:tasks.workunit.client.0.vm06.stdout:4/533: creat db/d59/d5f/d5d/fc2 x:0 0 0 2026-03-09T17:29:40.802 INFO:tasks.workunit.client.0.vm06.stdout:3/515: creat dd/d19/d2c/fad x:0 0 0 2026-03-09T17:29:40.806 
INFO:tasks.workunit.client.0.vm06.stdout:0/621: dread - d7/d11/fa3 zero size 2026-03-09T17:29:40.824 INFO:tasks.workunit.client.0.vm06.stdout:8/498: write d15/d16/f66 [1179823,13490] 0 2026-03-09T17:29:40.827 INFO:tasks.workunit.client.0.vm06.stdout:8/499: dwrite d15/d16/d19/d3d/f4c [4194304,4194304] 0 2026-03-09T17:29:40.828 INFO:tasks.workunit.client.0.vm06.stdout:1/531: fdatasync d11/d14/d1d/d1e/d2a/f43 0 2026-03-09T17:29:40.829 INFO:tasks.workunit.client.0.vm06.stdout:1/532: chown d11/d14/d1d/f56 98 1 2026-03-09T17:29:40.831 INFO:tasks.workunit.client.0.vm06.stdout:8/500: dwrite d15/d39/f7b [0,4194304] 0 2026-03-09T17:29:40.837 INFO:tasks.workunit.client.0.vm06.stdout:5/516: write d4/d50/f1e [6518335,95766] 0 2026-03-09T17:29:40.838 INFO:tasks.workunit.client.0.vm06.stdout:5/517: stat d4/d22/d64/f7d 0 2026-03-09T17:29:40.841 INFO:tasks.workunit.client.0.vm06.stdout:5/518: dwrite d4/d7e/fab [0,4194304] 0 2026-03-09T17:29:40.859 INFO:tasks.workunit.client.0.vm06.stdout:7/587: link d5/d1f/d34/d3f/c9a d5/dd/d79/ca8 0 2026-03-09T17:29:40.863 INFO:tasks.workunit.client.0.vm06.stdout:2/478: rename d3/d4/d22/c82 to d3/d4/d12/d2b/c99 0 2026-03-09T17:29:40.866 INFO:tasks.workunit.client.0.vm06.stdout:5/519: mkdir d4/d52/db4/dc2 0 2026-03-09T17:29:40.871 INFO:tasks.workunit.client.0.vm06.stdout:9/588: getdents d3/d11/d65 0 2026-03-09T17:29:40.884 INFO:tasks.workunit.client.0.vm06.stdout:9/589: dwrite d3/d2c/f9d [0,4194304] 0 2026-03-09T17:29:40.885 INFO:tasks.workunit.client.0.vm06.stdout:4/534: getdents db/d1d/d21/d37/d69 0 2026-03-09T17:29:40.885 INFO:tasks.workunit.client.0.vm06.stdout:5/520: dread - d4/d50/d18/d3d/f81 zero size 2026-03-09T17:29:40.885 INFO:tasks.workunit.client.0.vm06.stdout:5/521: dread d4/d52/f8a [0,4194304] 0 2026-03-09T17:29:40.885 INFO:tasks.workunit.client.0.vm06.stdout:5/522: fsync d4/f7 0 2026-03-09T17:29:40.885 INFO:tasks.workunit.client.0.vm06.stdout:6/432: rename d6/d47/d4d/f55 to d6/d4f/d3e/d52/f89 0 2026-03-09T17:29:40.886 
INFO:tasks.workunit.client.0.vm06.stdout:0/622: write d7/d11/fa3 [770478,59146] 0 2026-03-09T17:29:40.887 INFO:tasks.workunit.client.0.vm06.stdout:9/590: creat d3/d15/d48/fb7 x:0 0 0 2026-03-09T17:29:40.889 INFO:tasks.workunit.client.0.vm06.stdout:4/535: mkdir db/d1d/d21/d88/dc3 0 2026-03-09T17:29:40.893 INFO:tasks.workunit.client.0.vm06.stdout:8/501: rename d15/d16/d1e/d8f to d15/d16/d1e/d28/da4 0 2026-03-09T17:29:40.895 INFO:tasks.workunit.client.0.vm06.stdout:8/502: dread d15/d39/f40 [0,4194304] 0 2026-03-09T17:29:40.899 INFO:tasks.workunit.client.0.vm06.stdout:8/503: readlink d15/d16/d1e/d30/l38 0 2026-03-09T17:29:40.899 INFO:tasks.workunit.client.0.vm06.stdout:6/433: chown d6/d12/d17/f7a 8082 1 2026-03-09T17:29:40.899 INFO:tasks.workunit.client.0.vm06.stdout:4/536: mkdir db/d1d/dc4 0 2026-03-09T17:29:40.899 INFO:tasks.workunit.client.0.vm06.stdout:2/479: rename d3/d4/d12/f20 to d3/d4/d22/d43/d77/d81/d64/f9a 0 2026-03-09T17:29:40.902 INFO:tasks.workunit.client.0.vm06.stdout:7/588: sync 2026-03-09T17:29:40.903 INFO:tasks.workunit.client.0.vm06.stdout:8/504: unlink d15/d16/d19/d71/c78 0 2026-03-09T17:29:40.905 INFO:tasks.workunit.client.0.vm06.stdout:4/537: symlink db/d1d/d21/d26/d89/lc5 0 2026-03-09T17:29:40.907 INFO:tasks.workunit.client.0.vm06.stdout:0/623: rename d7/d11/d5d/db8/l5e to d7/d11/d2d/dca/ld7 0 2026-03-09T17:29:40.911 INFO:tasks.workunit.client.0.vm06.stdout:8/505: creat d15/d16/d1a/d47/fa5 x:0 0 0 2026-03-09T17:29:40.912 INFO:tasks.workunit.client.0.vm06.stdout:8/506: write d15/d16/d19/d71/f96 [47831,98627] 0 2026-03-09T17:29:40.913 INFO:tasks.workunit.client.0.vm06.stdout:8/507: write d15/d39/d3c/d6c/f8b [664867,54836] 0 2026-03-09T17:29:40.923 INFO:tasks.workunit.client.0.vm06.stdout:9/591: rename d3/l7 to d3/d6d/d9a/d9c/lb8 0 2026-03-09T17:29:40.924 INFO:tasks.workunit.client.0.vm06.stdout:9/592: stat d3/d2c/l40 0 2026-03-09T17:29:40.924 INFO:tasks.workunit.client.0.vm06.stdout:9/593: readlink d3/d26/d6c/d68/l69 0 2026-03-09T17:29:40.925 
INFO:tasks.workunit.client.0.vm06.stdout:9/594: chown d3/d15/ca6 1550416006 1 2026-03-09T17:29:40.929 INFO:tasks.workunit.client.0.vm06.stdout:9/595: dwrite d3/d15/f1a [0,4194304] 0 2026-03-09T17:29:40.949 INFO:tasks.workunit.client.0.vm06.stdout:1/533: write d11/d14/d1d/f73 [1927676,8987] 0 2026-03-09T17:29:40.953 INFO:tasks.workunit.client.0.vm06.stdout:3/516: write dd/d1d/f34 [4092097,42106] 0 2026-03-09T17:29:40.959 INFO:tasks.workunit.client.0.vm06.stdout:0/624: mkdir d7/d11/d19/d3c/db9/dd8 0 2026-03-09T17:29:40.963 INFO:tasks.workunit.client.0.vm06.stdout:8/508: mkdir d15/da6 0 2026-03-09T17:29:40.964 INFO:tasks.workunit.client.0.vm06.stdout:5/523: write d4/f21 [3146909,101063] 0 2026-03-09T17:29:40.964 INFO:tasks.workunit.client.0.vm06.stdout:5/524: chown d4/d52/lbc 10460 1 2026-03-09T17:29:40.972 INFO:tasks.workunit.client.0.vm06.stdout:4/538: link db/d1d/d21/d37/d69/f8b db/d1d/d21/d37/fc6 0 2026-03-09T17:29:40.975 INFO:tasks.workunit.client.0.vm06.stdout:5/525: dread d4/d50/d18/f74 [0,4194304] 0 2026-03-09T17:29:40.977 INFO:tasks.workunit.client.0.vm06.stdout:7/589: write d5/d7/f62 [850270,18546] 0 2026-03-09T17:29:40.983 INFO:tasks.workunit.client.0.vm06.stdout:6/434: dwrite d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:40.989 INFO:tasks.workunit.client.0.vm06.stdout:0/625: creat d7/d11/d19/d23/db7/fd9 x:0 0 0 2026-03-09T17:29:40.990 INFO:tasks.workunit.client.0.vm06.stdout:8/509: symlink d15/d39/d67/d77/la7 0 2026-03-09T17:29:40.990 INFO:tasks.workunit.client.0.vm06.stdout:5/526: sync 2026-03-09T17:29:41.001 INFO:tasks.workunit.client.0.vm06.stdout:0/626: dread d7/d11/d19/d1d/d39/f7d [0,4194304] 0 2026-03-09T17:29:41.003 INFO:tasks.workunit.client.0.vm06.stdout:1/534: write d11/d14/d1d/d42/d46/f55 [430495,4015] 0 2026-03-09T17:29:41.013 INFO:tasks.workunit.client.0.vm06.stdout:0/627: sync 2026-03-09T17:29:41.039 INFO:tasks.workunit.client.0.vm06.stdout:3/517: mkdir dd/d81/da3/dae 0 2026-03-09T17:29:41.039 INFO:tasks.workunit.client.0.vm06.stdout:3/518: 
write dd/d1d/f34 [917095,119764] 0 2026-03-09T17:29:41.040 INFO:tasks.workunit.client.0.vm06.stdout:3/519: write dd/d1d/d6e/fa0 [31874,47125] 0 2026-03-09T17:29:41.043 INFO:tasks.workunit.client.0.vm06.stdout:2/480: rename d3/d4/d12/d2b/d36/d37/f41 to d3/d4/d12/d2b/d2d/f9b 0 2026-03-09T17:29:41.045 INFO:tasks.workunit.client.0.vm06.stdout:8/510: mknod d15/d16/d19/d71/ca8 0 2026-03-09T17:29:41.050 INFO:tasks.workunit.client.0.vm06.stdout:4/539: creat db/d57/fc7 x:0 0 0 2026-03-09T17:29:41.051 INFO:tasks.workunit.client.0.vm06.stdout:1/535: fdatasync d11/d14/d1c/d1f/f21 0 2026-03-09T17:29:41.061 INFO:tasks.workunit.client.0.vm06.stdout:1/536: truncate d11/d14/d1d/f31 650665 0 2026-03-09T17:29:41.076 INFO:tasks.workunit.client.0.vm06.stdout:6/435: rename d6/d47/d6f to d6/d47/d8a 0 2026-03-09T17:29:41.086 INFO:tasks.workunit.client.0.vm06.stdout:6/436: creat d6/d47/d4d/f8b x:0 0 0 2026-03-09T17:29:41.088 INFO:tasks.workunit.client.0.vm06.stdout:2/481: creat d3/d4/f9c x:0 0 0 2026-03-09T17:29:41.090 INFO:tasks.workunit.client.0.vm06.stdout:1/537: dread d11/d14/d1d/d1e/d2a/f38 [0,4194304] 0 2026-03-09T17:29:41.093 INFO:tasks.workunit.client.0.vm06.stdout:6/437: mkdir d6/d4f/d3e/d52/d8c 0 2026-03-09T17:29:41.096 INFO:tasks.workunit.client.0.vm06.stdout:2/482: chown d3/d4/d12/d2b/d36/d37/c8d 237673 1 2026-03-09T17:29:41.097 INFO:tasks.workunit.client.0.vm06.stdout:2/483: truncate d3/d4/d12/f92 463996 0 2026-03-09T17:29:41.098 INFO:tasks.workunit.client.0.vm06.stdout:2/484: fsync d3/d4/d12/f42 0 2026-03-09T17:29:41.098 INFO:tasks.workunit.client.0.vm06.stdout:2/485: read d3/d4/f11 [2025907,28032] 0 2026-03-09T17:29:41.099 INFO:tasks.workunit.client.0.vm06.stdout:2/486: fdatasync d3/d4/d12/f92 0 2026-03-09T17:29:41.102 INFO:tasks.workunit.client.0.vm06.stdout:8/511: link d15/d16/f3f d15/d16/d1e/fa9 0 2026-03-09T17:29:41.104 INFO:tasks.workunit.client.0.vm06.stdout:1/538: dread d11/d14/f17 [4194304,4194304] 0 2026-03-09T17:29:41.108 
INFO:tasks.workunit.client.0.vm06.stdout:0/628: creat d7/d11/d19/d1d/fda x:0 0 0 2026-03-09T17:29:41.112 INFO:tasks.workunit.client.0.vm06.stdout:1/539: dread d11/d14/d1d/d1e/f47 [0,4194304] 0 2026-03-09T17:29:41.114 INFO:tasks.workunit.client.0.vm06.stdout:2/487: unlink d3/d4/d22/d43/d77/d81/f58 0 2026-03-09T17:29:41.115 INFO:tasks.workunit.client.0.vm06.stdout:2/488: fdatasync d3/d4/d12/d2b/d2d/f2a 0 2026-03-09T17:29:41.116 INFO:tasks.workunit.client.0.vm06.stdout:8/512: fsync d15/d16/d1a/d47/f7e 0 2026-03-09T17:29:41.119 INFO:tasks.workunit.client.0.vm06.stdout:0/629: chown d7/d11/f29 688463934 1 2026-03-09T17:29:41.122 INFO:tasks.workunit.client.0.vm06.stdout:0/630: dwrite d7/d11/d5d/db8/fc6 [0,4194304] 0 2026-03-09T17:29:41.133 INFO:tasks.workunit.client.0.vm06.stdout:6/438: link d6/d47/d4d/d6d/c75 d6/d47/d4d/d6d/c8d 0 2026-03-09T17:29:41.136 INFO:tasks.workunit.client.0.vm06.stdout:2/489: creat d3/d4/d12/d2b/d2d/f9d x:0 0 0 2026-03-09T17:29:41.137 INFO:tasks.workunit.client.0.vm06.stdout:2/490: write d3/d4/d46/f94 [678991,16094] 0 2026-03-09T17:29:41.142 INFO:tasks.workunit.client.0.vm06.stdout:6/439: unlink d6/d4f/d3e/d52/l57 0 2026-03-09T17:29:41.142 INFO:tasks.workunit.client.0.vm06.stdout:0/631: link d7/d11/d19/f57 d7/d88/fdb 0 2026-03-09T17:29:41.143 INFO:tasks.workunit.client.0.vm06.stdout:0/632: dread - d7/d11/d19/d1d/fda zero size 2026-03-09T17:29:41.147 INFO:tasks.workunit.client.0.vm06.stdout:6/440: symlink d6/d4f/d3e/d52/d8c/l8e 0 2026-03-09T17:29:41.183 INFO:tasks.workunit.client.0.vm06.stdout:0/633: symlink d7/d11/ldc 0 2026-03-09T17:29:41.183 INFO:tasks.workunit.client.0.vm06.stdout:0/634: chown d7/d11/d5d/d64/f7f 165229592 1 2026-03-09T17:29:41.183 INFO:tasks.workunit.client.0.vm06.stdout:6/441: mkdir d6/d12/d53/d8f 0 2026-03-09T17:29:41.184 INFO:tasks.workunit.client.0.vm06.stdout:0/635: mkdir d7/d11/d19/d3c/db9/ddd 0 2026-03-09T17:29:41.184 INFO:tasks.workunit.client.0.vm06.stdout:0/636: fsync d7/d11/f30 0 2026-03-09T17:29:41.184 
INFO:tasks.workunit.client.0.vm06.stdout:0/637: chown d7/d11/d19/d1d/d39/l6e 13 1 2026-03-09T17:29:41.184 INFO:tasks.workunit.client.0.vm06.stdout:0/638: fdatasync d7/d11/f75 0 2026-03-09T17:29:41.184 INFO:tasks.workunit.client.0.vm06.stdout:0/639: creat d7/d11/fde x:0 0 0 2026-03-09T17:29:41.184 INFO:tasks.workunit.client.0.vm06.stdout:2/491: dread d3/d4/d22/d72/f54 [0,4194304] 0 2026-03-09T17:29:41.249 INFO:tasks.workunit.client.0.vm06.stdout:6/442: sync 2026-03-09T17:29:41.257 INFO:tasks.workunit.client.0.vm06.stdout:6/443: unlink d6/d12/d17/d27/c3b 0 2026-03-09T17:29:41.258 INFO:tasks.workunit.client.0.vm06.stdout:6/444: symlink d6/d4f/l90 0 2026-03-09T17:29:41.260 INFO:tasks.workunit.client.0.vm06.stdout:6/445: mkdir d6/d12/d53/d91 0 2026-03-09T17:29:41.263 INFO:tasks.workunit.client.0.vm06.stdout:6/446: unlink d6/d4f/d3e/d52/f74 0 2026-03-09T17:29:41.264 INFO:tasks.workunit.client.0.vm06.stdout:6/447: write d6/d12/d17/f6b [550828,66858] 0 2026-03-09T17:29:41.270 INFO:tasks.workunit.client.0.vm06.stdout:6/448: symlink d6/d47/d4d/d6d/l92 0 2026-03-09T17:29:41.270 INFO:tasks.workunit.client.0.vm06.stdout:6/449: write d6/f7b [215411,5610] 0 2026-03-09T17:29:41.274 INFO:tasks.workunit.client.0.vm06.stdout:6/450: symlink d6/d47/d4d/d6d/l93 0 2026-03-09T17:29:41.275 INFO:tasks.workunit.client.0.vm06.stdout:6/451: symlink d6/d12/d17/d27/d40/l94 0 2026-03-09T17:29:41.318 INFO:tasks.workunit.client.0.vm06.stdout:5/527: dwrite d4/d50/d18/f4a [0,4194304] 0 2026-03-09T17:29:41.323 INFO:tasks.workunit.client.0.vm06.stdout:7/590: dwrite d5/d12/f32 [0,4194304] 0 2026-03-09T17:29:41.327 INFO:tasks.workunit.client.0.vm06.stdout:6/452: dread d6/d4f/f25 [0,4194304] 0 2026-03-09T17:29:41.337 INFO:tasks.workunit.client.0.vm06.stdout:5/528: symlink d4/d52/d55/lc3 0 2026-03-09T17:29:41.342 INFO:tasks.workunit.client.0.vm06.stdout:5/529: symlink d4/d50/d35/d40/d6f/lc4 0 2026-03-09T17:29:41.350 INFO:tasks.workunit.client.0.vm06.stdout:7/591: getdents d5/d1f/d34/d46/d51 0 
2026-03-09T17:29:41.356 INFO:tasks.workunit.client.0.vm06.stdout:6/453: sync 2026-03-09T17:29:41.358 INFO:tasks.workunit.client.0.vm06.stdout:6/454: mkdir d6/d4f/d3e/d52/d95 0 2026-03-09T17:29:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:41 vm06.local ceph-mon[57307]: from='client.14660 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:41 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/3124491648' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:29:41.397 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:41 vm09.local ceph-mon[62061]: from='client.14660 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:29:41.397 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:41 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/3124491648' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:29:41.403 INFO:tasks.workunit.client.0.vm06.stdout:3/520: dwrite dd/d59/da1/fa4 [0,4194304] 0 2026-03-09T17:29:41.416 INFO:tasks.workunit.client.0.vm06.stdout:0/640: rmdir d7/d11/d19/d8b/dcb 0 2026-03-09T17:29:41.416 INFO:tasks.workunit.client.0.vm06.stdout:0/641: fdatasync d7/d11/d19/d8b/da4/fa7 0 2026-03-09T17:29:41.421 INFO:tasks.workunit.client.0.vm06.stdout:4/540: dwrite db/d1d/d21/d25/d4b/f66 [0,4194304] 0 2026-03-09T17:29:41.424 INFO:tasks.workunit.client.0.vm06.stdout:4/541: stat db/d1d/d21/d37/l47 0 2026-03-09T17:29:41.431 INFO:tasks.workunit.client.0.vm06.stdout:4/542: dwrite db/d1d/d21/d25/d4b/f66 [0,4194304] 0 2026-03-09T17:29:41.464 INFO:tasks.workunit.client.0.vm06.stdout:4/543: dread db/d59/d5f/d45/f61 [0,4194304] 0 2026-03-09T17:29:41.464 INFO:tasks.workunit.client.0.vm06.stdout:4/544: mknod db/d1d/d21/d37/d69/d78/da0/cc8 0 2026-03-09T17:29:41.464 
INFO:tasks.workunit.client.0.vm06.stdout:4/545: mknod db/d1d/d21/d26/d89/dab/dae/cc9 0 2026-03-09T17:29:41.464 INFO:tasks.workunit.client.0.vm06.stdout:4/546: unlink db/d1d/f22 0 2026-03-09T17:29:41.464 INFO:tasks.workunit.client.0.vm06.stdout:4/547: creat db/d1d/d21/d37/d69/d78/da0/db6/fca x:0 0 0 2026-03-09T17:29:41.464 INFO:tasks.workunit.client.0.vm06.stdout:4/548: unlink db/l2c 0 2026-03-09T17:29:41.468 INFO:tasks.workunit.client.0.vm06.stdout:3/521: dread dd/d19/f2b [0,4194304] 0 2026-03-09T17:29:41.708 INFO:tasks.workunit.client.0.vm06.stdout:0/642: sync 2026-03-09T17:29:41.708 INFO:tasks.workunit.client.0.vm06.stdout:3/522: sync 2026-03-09T17:29:41.716 INFO:tasks.workunit.client.0.vm06.stdout:3/523: dread - dd/d19/d28/f6f zero size 2026-03-09T17:29:41.716 INFO:tasks.workunit.client.0.vm06.stdout:3/524: readlink dd/d19/l20 0 2026-03-09T17:29:41.720 INFO:tasks.workunit.client.0.vm06.stdout:3/525: creat dd/d59/da1/faf x:0 0 0 2026-03-09T17:29:41.720 INFO:tasks.workunit.client.0.vm06.stdout:3/526: stat dd/lf 0 2026-03-09T17:29:41.721 INFO:tasks.workunit.client.0.vm06.stdout:3/527: fdatasync dd/d1d/f99 0 2026-03-09T17:29:41.726 INFO:tasks.workunit.client.0.vm06.stdout:3/528: truncate dd/d19/d1e/f41 117325 0 2026-03-09T17:29:41.730 INFO:tasks.workunit.client.0.vm06.stdout:3/529: mknod dd/d1d/d2e/cb0 0 2026-03-09T17:29:41.737 INFO:tasks.workunit.client.0.vm06.stdout:3/530: fdatasync dd/f75 0 2026-03-09T17:29:41.738 INFO:tasks.workunit.client.0.vm06.stdout:3/531: dread - dd/d59/da1/faf zero size 2026-03-09T17:29:41.739 INFO:tasks.workunit.client.0.vm06.stdout:3/532: write dd/f38 [332723,35114] 0 2026-03-09T17:29:41.749 INFO:tasks.workunit.client.0.vm06.stdout:8/513: write d15/d16/d1a/d7c/f9e [1455439,2277] 0 2026-03-09T17:29:41.759 INFO:tasks.workunit.client.0.vm06.stdout:8/514: symlink d15/d39/d67/laa 0 2026-03-09T17:29:41.764 INFO:tasks.workunit.client.0.vm06.stdout:8/515: mknod d15/d39/d67/d77/cab 0 2026-03-09T17:29:41.774 
INFO:tasks.workunit.client.0.vm06.stdout:3/533: sync 2026-03-09T17:29:41.776 INFO:tasks.workunit.client.0.vm06.stdout:2/492: truncate d3/d4/d22/d43/d77/d81/f84 583752 0 2026-03-09T17:29:41.778 INFO:tasks.workunit.client.0.vm06.stdout:2/493: rmdir d3/d44 39 2026-03-09T17:29:41.808 INFO:tasks.workunit.client.0.vm06.stdout:9/596: rename d3/d6d/d85 to d3/d15/d48/da8/db9 0 2026-03-09T17:29:41.809 INFO:tasks.workunit.client.0.vm06.stdout:9/597: chown d3/d15/f17 16182680 1 2026-03-09T17:29:41.809 INFO:tasks.workunit.client.0.vm06.stdout:9/598: chown d3/d11/d65/c8e 57 1 2026-03-09T17:29:41.812 INFO:tasks.workunit.client.0.vm06.stdout:7/592: rename f0 to d5/d1f/d34/d46/fa9 0 2026-03-09T17:29:41.817 INFO:tasks.workunit.client.0.vm06.stdout:7/593: rmdir d5/d1f/d34/d46/d51 39 2026-03-09T17:29:41.819 INFO:tasks.workunit.client.0.vm06.stdout:6/455: rename d6/d12/d17/d27 to d6/d47/d96 0 2026-03-09T17:29:41.819 INFO:tasks.workunit.client.0.vm06.stdout:3/534: rename dd to dd/d19/d28/db1 22 2026-03-09T17:29:41.822 INFO:tasks.workunit.client.0.vm06.stdout:9/599: dread d3/f27 [0,4194304] 0 2026-03-09T17:29:41.827 INFO:tasks.workunit.client.0.vm06.stdout:7/594: dwrite d5/f16 [0,4194304] 0 2026-03-09T17:29:41.837 INFO:tasks.workunit.client.0.vm06.stdout:9/600: chown d3/c3e 12278133 1 2026-03-09T17:29:41.851 INFO:tasks.workunit.client.0.vm06.stdout:9/601: rmdir d3/d6d/d9a/d9c 39 2026-03-09T17:29:41.854 INFO:tasks.workunit.client.0.vm06.stdout:7/595: creat d5/d1f/d34/d3f/d8b/faa x:0 0 0 2026-03-09T17:29:41.854 INFO:tasks.workunit.client.0.vm06.stdout:7/596: stat d5/dd/f22 0 2026-03-09T17:29:41.858 INFO:tasks.workunit.client.0.vm06.stdout:5/530: write d4/d22/d64/f70 [1931296,90845] 0 2026-03-09T17:29:41.858 INFO:tasks.workunit.client.0.vm06.stdout:5/531: write d4/d50/f1d [1484869,17403] 0 2026-03-09T17:29:41.863 INFO:tasks.workunit.client.0.vm06.stdout:5/532: write d4/d7e/f8b [2215951,100816] 0 2026-03-09T17:29:41.877 INFO:tasks.workunit.client.0.vm06.stdout:7/597: link 
d5/d1f/d34/d46/d51/c67 d5/d1f/d34/d3f/cab 0 2026-03-09T17:29:41.879 INFO:tasks.workunit.client.0.vm06.stdout:7/598: dread d5/d12/d64/f77 [0,4194304] 0 2026-03-09T17:29:41.883 INFO:tasks.workunit.client.0.vm06.stdout:7/599: mkdir d5/d7/dac 0 2026-03-09T17:29:41.887 INFO:tasks.workunit.client.0.vm06.stdout:7/600: mkdir d5/d12/dad 0 2026-03-09T17:29:41.887 INFO:tasks.workunit.client.0.vm06.stdout:7/601: write d5/d1f/d34/d46/d51/f7c [875792,69334] 0 2026-03-09T17:29:41.889 INFO:tasks.workunit.client.0.vm06.stdout:7/602: chown d5/cc 0 1 2026-03-09T17:29:41.892 INFO:tasks.workunit.client.0.vm06.stdout:7/603: mkdir d5/d1f/dae 0 2026-03-09T17:29:41.894 INFO:tasks.workunit.client.0.vm06.stdout:7/604: rename d5/d1f/d34/c8f to d5/d1f/d34/d3f/d8b/caf 0 2026-03-09T17:29:41.897 INFO:tasks.workunit.client.0.vm06.stdout:7/605: mknod d5/d1f/d34/d3f/cb0 0 2026-03-09T17:29:41.899 INFO:tasks.workunit.client.0.vm06.stdout:7/606: readlink d5/d12/d64/l69 0 2026-03-09T17:29:41.902 INFO:tasks.workunit.client.0.vm06.stdout:7/607: write d5/f16 [54818,61051] 0 2026-03-09T17:29:41.902 INFO:tasks.workunit.client.0.vm06.stdout:7/608: fdatasync d5/d7/f62 0 2026-03-09T17:29:41.906 INFO:tasks.workunit.client.0.vm06.stdout:7/609: dread d5/d12/d64/f77 [0,4194304] 0 2026-03-09T17:29:41.911 INFO:tasks.workunit.client.0.vm06.stdout:7/610: getdents d5/d1f/dae 0 2026-03-09T17:29:41.915 INFO:tasks.workunit.client.0.vm06.stdout:7/611: creat d5/d12/d5f/fb1 x:0 0 0 2026-03-09T17:29:41.919 INFO:tasks.workunit.client.0.vm06.stdout:7/612: rename d5/d7/f58 to d5/d12/d5f/fb2 0 2026-03-09T17:29:41.923 INFO:tasks.workunit.client.0.vm06.stdout:1/540: truncate d11/d14/d1d/f31 191360 0 2026-03-09T17:29:41.925 INFO:tasks.workunit.client.0.vm06.stdout:1/541: mknod d11/d14/d1d/d8c/cb3 0 2026-03-09T17:29:41.928 INFO:tasks.workunit.client.0.vm06.stdout:1/542: dwrite d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0 2026-03-09T17:29:41.932 INFO:tasks.workunit.client.0.vm06.stdout:1/543: readlink d11/d14/d1d/d1e/d2a/l63 0 
2026-03-09T17:29:41.935 INFO:tasks.workunit.client.0.vm06.stdout:1/544: rename d11/d14/l16 to d11/d14/d1c/d1f/lb4 0 2026-03-09T17:29:41.937 INFO:tasks.workunit.client.0.vm06.stdout:1/545: mknod d11/cb5 0 2026-03-09T17:29:41.966 INFO:tasks.workunit.client.0.vm06.stdout:4/549: dwrite db/d1d/f1f [0,4194304] 0 2026-03-09T17:29:42.007 INFO:tasks.workunit.client.0.vm06.stdout:0/643: dwrite d7/fe [0,4194304] 0 2026-03-09T17:29:42.008 INFO:tasks.workunit.client.0.vm06.stdout:0/644: chown d7/d11/d19/d1d 1828360 1 2026-03-09T17:29:42.023 INFO:tasks.workunit.client.0.vm06.stdout:0/645: creat d7/d11/d2d/fdf x:0 0 0 2026-03-09T17:29:42.025 INFO:tasks.workunit.client.0.vm06.stdout:4/550: dread f7 [0,4194304] 0 2026-03-09T17:29:42.025 INFO:tasks.workunit.client.0.vm06.stdout:4/551: readlink db/d59/d5f/d6d/lb2 0 2026-03-09T17:29:42.045 INFO:tasks.workunit.client.0.vm06.stdout:0/646: symlink d7/d11/d19/d23/db7/dbd/dc1/le0 0 2026-03-09T17:29:42.045 INFO:tasks.workunit.client.0.vm06.stdout:0/647: fsync d7/d11/fde 0 2026-03-09T17:29:42.067 INFO:tasks.workunit.client.0.vm06.stdout:0/648: symlink d7/d11/d19/d37/le1 0 2026-03-09T17:29:42.069 INFO:tasks.workunit.client.0.vm06.stdout:8/516: write d15/d16/f51 [4714231,61120] 0 2026-03-09T17:29:42.081 INFO:tasks.workunit.client.0.vm06.stdout:0/649: dread d7/d11/d5d/d64/f6a [0,4194304] 0 2026-03-09T17:29:42.088 INFO:tasks.workunit.client.0.vm06.stdout:8/517: mkdir d15/d39/d67/d77/d97/dac 0 2026-03-09T17:29:42.095 INFO:tasks.workunit.client.0.vm06.stdout:2/494: dwrite d3/d4/d22/d43/d77/d81/d64/f9a [4194304,4194304] 0 2026-03-09T17:29:42.099 INFO:tasks.workunit.client.0.vm06.stdout:8/518: chown d15/d16/d1e/fa9 19062 1 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:8/519: fsync d15/d16/d1a/f29 0 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:8/520: creat d15/d39/d67/d77/d97/fad x:0 0 0 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:3/535: dwrite dd/d19/d2c/f79 [0,4194304] 0 
2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:8/521: mknod d15/d31/d58/d9b/cae 0 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:9/602: write d3/d15/d16/f7d [983330,14634] 0 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:8/522: rmdir d15/d39/d67 39 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:5/533: dwrite d4/d50/d18/f3c [4194304,4194304] 0 2026-03-09T17:29:42.137 INFO:tasks.workunit.client.0.vm06.stdout:3/536: truncate dd/d19/d25/d2d/f55 943388 0 2026-03-09T17:29:42.142 INFO:tasks.workunit.client.0.vm06.stdout:3/537: creat dd/d1d/d2e/fb2 x:0 0 0 2026-03-09T17:29:42.144 INFO:tasks.workunit.client.0.vm06.stdout:5/534: rename d4/f5 to d4/da4/fc5 0 2026-03-09T17:29:42.146 INFO:tasks.workunit.client.0.vm06.stdout:9/603: symlink d3/d26/lba 0 2026-03-09T17:29:42.147 INFO:tasks.workunit.client.0.vm06.stdout:8/523: link d15/d16/d1e/d28/d5e/f98 d15/d16/d1a/d47/faf 0 2026-03-09T17:29:42.150 INFO:tasks.workunit.client.0.vm06.stdout:7/613: write d5/d7/d2b/f42 [2941011,65296] 0 2026-03-09T17:29:42.151 INFO:tasks.workunit.client.0.vm06.stdout:7/614: readlink d5/d12/d64/l69 0 2026-03-09T17:29:42.152 INFO:tasks.workunit.client.0.vm06.stdout:7/615: write d5/d7/d2b/f42 [1373593,73266] 0 2026-03-09T17:29:42.153 INFO:tasks.workunit.client.0.vm06.stdout:7/616: write d5/d1f/d34/d46/f4e [1852810,109566] 0 2026-03-09T17:29:42.155 INFO:tasks.workunit.client.0.vm06.stdout:3/538: symlink dd/d19/d25/d48/lb3 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:5/535: read d4/d50/f14 [1412861,51701] 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:8/524: dread - d15/d39/d67/d77/d97/fad zero size 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:8/525: write d15/d39/d3c/d6c/f8b [286517,93537] 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:3/539: rename f4 to dd/d1d/d2e/fb4 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:3/540: chown 
dd/f5f 180229 1 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:3/541: readlink lb 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:5/536: creat d4/d50/d35/fc6 x:0 0 0 2026-03-09T17:29:42.245 INFO:tasks.workunit.client.0.vm06.stdout:5/537: chown d4/d52/d55/lc3 17287 1 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:9/604: creat d3/d15/fbb x:0 0 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:5/538: creat d4/d50/d35/d40/d6f/fc7 x:0 0 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:3/542: creat dd/d81/da3/dae/fb5 x:0 0 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:3/543: fsync dd/d5c/f66 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:7/617: getdents d5/d1f/d34/d3f/d8b 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:3/544: dread dd/d5b/d65/f6a [0,4194304] 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:7/618: creat d5/dd/d79/fb3 x:0 0 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:7/619: chown d5/d1f/d34/d46/f4e 729389 1 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:7/620: creat d5/d7/fb4 x:0 0 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:7/621: write d5/d12/d64/d6b/f6f [940816,90148] 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:3/545: truncate dd/d19/d25/f56 7792589 0 2026-03-09T17:29:42.246 INFO:tasks.workunit.client.0.vm06.stdout:3/546: dread dd/f10 [0,4194304] 0 2026-03-09T17:29:42.281 INFO:tasks.workunit.client.0.vm06.stdout:7/622: dread d5/d1f/d34/d46/fa9 [0,4194304] 0 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/623: chown d5/d7/c31 1508276 1 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/624: rename d5/dd/l83 to d5/d1f/d34/d46/lb5 0 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/625: mknod d5/d1f/dae/cb6 0 2026-03-09T17:29:42.323 
INFO:tasks.workunit.client.0.vm06.stdout:7/626: rename d5/d1f/d34/l78 to d5/dd/lb7 0 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/627: fdatasync d5/f8 0 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/628: creat d5/d12/d64/d6b/fb8 x:0 0 0 2026-03-09T17:29:42.323 INFO:tasks.workunit.client.0.vm06.stdout:7/629: unlink d5/d1f/d34/d46/d51/f7b 0 2026-03-09T17:29:42.333 INFO:tasks.workunit.client.0.vm06.stdout:9/605: dread d3/d26/d6c/f5b [0,4194304] 0 2026-03-09T17:29:42.336 INFO:tasks.workunit.client.0.vm06.stdout:9/606: symlink d3/d15/d36/d4c/d6a/d8a/lbc 0 2026-03-09T17:29:42.337 INFO:tasks.workunit.client.0.vm06.stdout:9/607: write d3/d15/d36/d83/fb1 [604407,78589] 0 2026-03-09T17:29:42.338 INFO:tasks.workunit.client.0.vm06.stdout:5/539: dread f0 [0,4194304] 0 2026-03-09T17:29:42.470 INFO:tasks.workunit.client.0.vm06.stdout:9/608: dread d3/f1b [0,4194304] 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:0/650: sync 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/609: fdatasync d3/d26/f28 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/610: dwrite d3/d6d/d9a/d9c/fab [0,4194304] 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/611: fsync d3/d15/d36/d4d/f62 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/612: unlink d3/d15/ca6 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/613: chown d3/d15 1062912122 1 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/614: rename d3/d15/d16/c22 to d3/d6d/cbd 0 2026-03-09T17:29:42.498 INFO:tasks.workunit.client.0.vm06.stdout:9/615: fdatasync d3/d26/d35/fb5 0 2026-03-09T17:29:42.499 INFO:tasks.workunit.client.0.vm06.stdout:9/616: mknod d3/d26/d6c/d68/cbe 0 2026-03-09T17:29:42.505 INFO:tasks.workunit.client.0.vm06.stdout:9/617: fdatasync d3/d15/f2e 0 2026-03-09T17:29:42.619 INFO:tasks.workunit.client.0.vm06.stdout:1/546: dwrite d11/d14/d1c/d1f/f7f 
[4194304,4194304] 0 2026-03-09T17:29:42.624 INFO:tasks.workunit.client.0.vm06.stdout:1/547: symlink d11/d14/d1c/d1f/lb6 0 2026-03-09T17:29:42.641 INFO:tasks.workunit.client.0.vm06.stdout:0/651: sync 2026-03-09T17:29:42.645 INFO:tasks.workunit.client.0.vm06.stdout:0/652: rename l4 to d7/d11/d2d/dca/le2 0 2026-03-09T17:29:42.837 INFO:tasks.workunit.client.0.vm06.stdout:4/552: dwrite db/d1d/d21/d44/d8a/fa3 [0,4194304] 0 2026-03-09T17:29:42.837 INFO:tasks.workunit.client.0.vm06.stdout:4/553: dread - db/d1d/d21/d25/d4b/d85/fbe zero size 2026-03-09T17:29:42.847 INFO:tasks.workunit.client.0.vm06.stdout:4/554: write db/df/f2d [4475732,114545] 0 2026-03-09T17:29:42.848 INFO:tasks.workunit.client.0.vm06.stdout:4/555: chown db/l9d 169473336 1 2026-03-09T17:29:42.864 INFO:tasks.workunit.client.0.vm06.stdout:4/556: dread db/d1d/d21/d37/fc6 [0,4194304] 0 2026-03-09T17:29:42.865 INFO:tasks.workunit.client.0.vm06.stdout:4/557: stat db/d1d/f3a 0 2026-03-09T17:29:42.865 INFO:tasks.workunit.client.0.vm06.stdout:2/495: dwrite d3/d4/d12/f35 [0,4194304] 0 2026-03-09T17:29:42.870 INFO:tasks.workunit.client.0.vm06.stdout:2/496: symlink d3/l9e 0 2026-03-09T17:29:42.871 INFO:tasks.workunit.client.0.vm06.stdout:2/497: readlink d3/d4/d12/d2b/d36/l45 0 2026-03-09T17:29:42.873 INFO:tasks.workunit.client.0.vm06.stdout:4/558: link db/c1b db/d57/ccb 0 2026-03-09T17:29:42.874 INFO:tasks.workunit.client.0.vm06.stdout:4/559: chown db/d1d/d21/d25/d4b 28303087 1 2026-03-09T17:29:42.876 INFO:tasks.workunit.client.0.vm06.stdout:4/560: mkdir db/d1d/d21/d26/d89/dab/dae/dcc 0 2026-03-09T17:29:42.878 INFO:tasks.workunit.client.0.vm06.stdout:4/561: write db/f68 [3294464,80377] 0 2026-03-09T17:29:42.881 INFO:tasks.workunit.client.0.vm06.stdout:4/562: symlink db/d1d/d21/d25/d4b/d85/lcd 0 2026-03-09T17:29:42.884 INFO:tasks.workunit.client.0.vm06.stdout:4/563: creat db/d1d/d21/d26/d89/dab/dae/dcc/fce x:0 0 0 2026-03-09T17:29:42.886 INFO:tasks.workunit.client.0.vm06.stdout:4/564: write db/fc [5001452,129872] 0 
2026-03-09T17:29:42.888 INFO:tasks.workunit.client.0.vm06.stdout:4/565: stat db/df/l79 0 2026-03-09T17:29:42.917 INFO:tasks.workunit.client.0.vm06.stdout:6/456: dwrite d6/d47/f88 [4194304,4194304] 0 2026-03-09T17:29:42.961 INFO:tasks.workunit.client.0.vm06.stdout:6/457: sync 2026-03-09T17:29:42.967 INFO:tasks.workunit.client.0.vm06.stdout:6/458: rename d6/d47/d4d/f8b to d6/f97 0 2026-03-09T17:29:42.975 INFO:tasks.workunit.client.0.vm06.stdout:6/459: sync 2026-03-09T17:29:42.995 INFO:tasks.workunit.client.0.vm06.stdout:8/526: dwrite d15/d39/f45 [0,4194304] 0 2026-03-09T17:29:43.011 INFO:tasks.workunit.client.0.vm06.stdout:8/527: dread d15/d39/f6f [0,4194304] 0 2026-03-09T17:29:43.015 INFO:tasks.workunit.client.0.vm06.stdout:8/528: rename d15/d16/d1a/l1d to d15/d39/d67/d86/lb0 0 2026-03-09T17:29:43.017 INFO:tasks.workunit.client.0.vm06.stdout:8/529: creat d15/d31/d58/d9b/fb1 x:0 0 0 2026-03-09T17:29:43.020 INFO:tasks.workunit.client.0.vm06.stdout:8/530: dread d15/d16/d1e/d30/f3b [4194304,4194304] 0 2026-03-09T17:29:43.046 INFO:tasks.workunit.client.0.vm06.stdout:3/547: dwrite dd/f10 [4194304,4194304] 0 2026-03-09T17:29:43.059 INFO:tasks.workunit.client.0.vm06.stdout:3/548: mknod dd/d19/d25/d48/d93/cb6 0 2026-03-09T17:29:43.079 INFO:tasks.workunit.client.0.vm06.stdout:7/630: dwrite d5/dd/f19 [0,4194304] 0 2026-03-09T17:29:43.093 INFO:tasks.workunit.client.0.vm06.stdout:7/631: link d5/d1f/d34/d46/d51/f92 d5/d1f/d34/d3f/d91/fb9 0 2026-03-09T17:29:43.094 INFO:tasks.workunit.client.0.vm06.stdout:7/632: symlink d5/d12/d64/lba 0 2026-03-09T17:29:43.096 INFO:tasks.workunit.client.0.vm06.stdout:7/633: creat d5/d1f/d34/d3f/fbb x:0 0 0 2026-03-09T17:29:43.098 INFO:tasks.workunit.client.0.vm06.stdout:7/634: link d5/d12/d64/d6b/l70 d5/d7/dac/lbc 0 2026-03-09T17:29:43.100 INFO:tasks.workunit.client.0.vm06.stdout:7/635: write d5/d12/f32 [4096073,37477] 0 2026-03-09T17:29:43.106 INFO:tasks.workunit.client.0.vm06.stdout:5/540: dwrite d4/d50/f14 [0,4194304] 0 2026-03-09T17:29:43.114 
INFO:tasks.workunit.client.0.vm06.stdout:7/636: readlink d5/d7/lb 0 2026-03-09T17:29:43.118 INFO:tasks.workunit.client.0.vm06.stdout:5/541: creat d4/d52/fc8 x:0 0 0 2026-03-09T17:29:43.118 INFO:tasks.workunit.client.0.vm06.stdout:7/637: dwrite d5/d1f/d34/f54 [4194304,4194304] 0 2026-03-09T17:29:43.121 INFO:tasks.workunit.client.0.vm06.stdout:7/638: dread d5/d1f/d34/f54 [4194304,4194304] 0 2026-03-09T17:29:43.127 INFO:tasks.workunit.client.0.vm06.stdout:7/639: rename d5/d1f/d34/d46/d51/da5 to d5/d7/d2b/dbd 0 2026-03-09T17:29:43.132 INFO:tasks.workunit.client.0.vm06.stdout:7/640: unlink d5/d1f/f80 0 2026-03-09T17:29:43.135 INFO:tasks.workunit.client.0.vm06.stdout:7/641: chown d5/d7/dac/lbc 1 1 2026-03-09T17:29:43.135 INFO:tasks.workunit.client.0.vm06.stdout:7/642: fsync d5/d1f/d34/d46/d51/f7c 0 2026-03-09T17:29:43.139 INFO:tasks.workunit.client.0.vm06.stdout:7/643: rmdir d5/d12/d5f 39 2026-03-09T17:29:43.140 INFO:tasks.workunit.client.0.vm06.stdout:7/644: write d5/dd/d79/fb3 [342940,51200] 0 2026-03-09T17:29:43.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:42 vm06.local ceph-mon[57307]: pgmap v154: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 22 MiB/s rd, 77 MiB/s wr, 303 op/s 2026-03-09T17:29:43.143 INFO:tasks.workunit.client.0.vm06.stdout:7/645: creat d5/d7/d2b/dbd/fbe x:0 0 0 2026-03-09T17:29:43.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:42 vm09.local ceph-mon[62061]: pgmap v154: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 22 MiB/s rd, 77 MiB/s wr, 303 op/s 2026-03-09T17:29:43.223 INFO:tasks.workunit.client.0.vm06.stdout:9/618: truncate d3/d26/d35/f99 1488041 0 2026-03-09T17:29:43.242 INFO:tasks.workunit.client.0.vm06.stdout:9/619: read d3/d11/f1f [3717246,11412] 0 2026-03-09T17:29:43.243 INFO:tasks.workunit.client.0.vm06.stdout:9/620: mknod d3/d15/d36/d4c/d6a/d8a/cbf 0 2026-03-09T17:29:43.246 INFO:tasks.workunit.client.0.vm06.stdout:9/621: rename d3/d26/d6c/d68/cbe to 
d3/d26/d6c/d68/cc0 0 2026-03-09T17:29:43.246 INFO:tasks.workunit.client.0.vm06.stdout:1/548: write d11/f18 [1005068,93795] 0 2026-03-09T17:29:43.247 INFO:tasks.workunit.client.0.vm06.stdout:1/549: stat d11/d14/d1c/d1f/f4c 0 2026-03-09T17:29:43.250 INFO:tasks.workunit.client.0.vm06.stdout:9/622: symlink d3/dad/lc1 0 2026-03-09T17:29:43.253 INFO:tasks.workunit.client.0.vm06.stdout:9/623: mknod d3/dad/cc2 0 2026-03-09T17:29:43.259 INFO:tasks.workunit.client.0.vm06.stdout:0/653: write d7/d11/f29 [2731247,37261] 0 2026-03-09T17:29:43.261 INFO:tasks.workunit.client.0.vm06.stdout:0/654: rmdir d7/d11/d19/d8b 39 2026-03-09T17:29:43.262 INFO:tasks.workunit.client.0.vm06.stdout:9/624: dread d3/d26/d6c/f75 [0,4194304] 0 2026-03-09T17:29:43.264 INFO:tasks.workunit.client.0.vm06.stdout:0/655: rmdir d7/d11/d2d/dca 39 2026-03-09T17:29:43.265 INFO:tasks.workunit.client.0.vm06.stdout:9/625: dread d3/d15/d16/f72 [0,4194304] 0 2026-03-09T17:29:43.268 INFO:tasks.workunit.client.0.vm06.stdout:0/656: write d7/d11/d19/d23/db7/dbd/fc0 [271359,67206] 0 2026-03-09T17:29:43.279 INFO:tasks.workunit.client.0.vm06.stdout:0/657: rmdir d7/d11/d19 39 2026-03-09T17:29:43.279 INFO:tasks.workunit.client.0.vm06.stdout:0/658: symlink d7/d11/d19/d8b/le3 0 2026-03-09T17:29:43.279 INFO:tasks.workunit.client.0.vm06.stdout:0/659: write d7/d11/d19/d23/db7/dbd/fc0 [981141,48963] 0 2026-03-09T17:29:43.285 INFO:tasks.workunit.client.0.vm06.stdout:1/550: sync 2026-03-09T17:29:43.290 INFO:tasks.workunit.client.0.vm06.stdout:1/551: mkdir d11/d14/d1c/d3a/db7 0 2026-03-09T17:29:43.291 INFO:tasks.workunit.client.0.vm06.stdout:1/552: fdatasync d11/d14/d1d/d1e/d96/fab 0 2026-03-09T17:29:43.296 INFO:tasks.workunit.client.0.vm06.stdout:1/553: mknod d11/d14/d1c/d1f/d57/cb8 0 2026-03-09T17:29:43.300 INFO:tasks.workunit.client.0.vm06.stdout:1/554: dwrite d11/d14/d1d/d42/f52 [4194304,4194304] 0 2026-03-09T17:29:43.304 INFO:tasks.workunit.client.0.vm06.stdout:1/555: truncate d11/d69/fad 629284 0 2026-03-09T17:29:43.311 
INFO:tasks.workunit.client.0.vm06.stdout:1/556: creat d11/d14/d1d/d1e/d2a/d34/d58/fb9 x:0 0 0 2026-03-09T17:29:43.317 INFO:tasks.workunit.client.0.vm06.stdout:2/498: getdents d3 0 2026-03-09T17:29:43.321 INFO:tasks.workunit.client.0.vm06.stdout:1/557: dread d11/d14/d1d/f8f [0,4194304] 0 2026-03-09T17:29:43.328 INFO:tasks.workunit.client.0.vm06.stdout:4/566: dwrite db/d1d/d21/fa1 [0,4194304] 0 2026-03-09T17:29:43.328 INFO:tasks.workunit.client.0.vm06.stdout:1/558: creat d11/d14/d1d/d1e/d2a/fba x:0 0 0 2026-03-09T17:29:43.332 INFO:tasks.workunit.client.0.vm06.stdout:1/559: dwrite d11/d14/d1d/d1e/d2a/fae [0,4194304] 0 2026-03-09T17:29:43.333 INFO:tasks.workunit.client.0.vm06.stdout:1/560: dread - d11/d14/d1d/d42/d46/d92/fa3 zero size 2026-03-09T17:29:43.334 INFO:tasks.workunit.client.0.vm06.stdout:1/561: chown d11/d14/d1c/d5f/c75 1628809 1 2026-03-09T17:29:43.336 INFO:tasks.workunit.client.0.vm06.stdout:4/567: mknod db/d1d/d21/d25/d4b/d85/ccf 0 2026-03-09T17:29:43.337 INFO:tasks.workunit.client.0.vm06.stdout:4/568: chown db/d1d/d21/d26/d89/dab/dae 22824486 1 2026-03-09T17:29:43.339 INFO:tasks.workunit.client.0.vm06.stdout:4/569: creat db/d1d/d21/d26/d89/dab/dae/dcc/fd0 x:0 0 0 2026-03-09T17:29:43.355 INFO:tasks.workunit.client.0.vm06.stdout:6/460: chown d6/d47/f49 462 1 2026-03-09T17:29:43.356 INFO:tasks.workunit.client.0.vm06.stdout:6/461: chown d6/d4f/d3e/d52 25 1 2026-03-09T17:29:43.364 INFO:tasks.workunit.client.0.vm06.stdout:8/531: dwrite d15/d16/d19/d71/f82 [0,4194304] 0 2026-03-09T17:29:43.370 INFO:tasks.workunit.client.0.vm06.stdout:8/532: creat d15/fb2 x:0 0 0 2026-03-09T17:29:43.370 INFO:tasks.workunit.client.0.vm06.stdout:4/570: sync 2026-03-09T17:29:43.374 INFO:tasks.workunit.client.0.vm06.stdout:8/533: dwrite d15/d31/d58/d9b/fb1 [0,4194304] 0 2026-03-09T17:29:43.381 INFO:tasks.workunit.client.0.vm06.stdout:3/549: dwrite dd/d19/d25/d44/f88 [0,4194304] 0 2026-03-09T17:29:43.387 INFO:tasks.workunit.client.0.vm06.stdout:3/550: creat dd/d5b/d65/fb7 x:0 0 0 
2026-03-09T17:29:43.396 INFO:tasks.workunit.client.0.vm06.stdout:4/571: rename db/d1d/d21/d25/l41 to db/d59/d5f/d45/ld1 0 2026-03-09T17:29:43.399 INFO:tasks.workunit.client.0.vm06.stdout:8/534: rename d15/d16/d1a/d47/ca2 to d15/d16/d19/d71/cb3 0 2026-03-09T17:29:43.404 INFO:tasks.workunit.client.0.vm06.stdout:8/535: getdents d15/d16/d1e 0 2026-03-09T17:29:43.423 INFO:tasks.workunit.client.0.vm06.stdout:8/536: dread f12 [0,4194304] 0 2026-03-09T17:29:43.428 INFO:tasks.workunit.client.0.vm06.stdout:8/537: creat d15/d16/fb4 x:0 0 0 2026-03-09T17:29:43.435 INFO:tasks.workunit.client.0.vm06.stdout:5/542: write d4/f71 [173384,22880] 0 2026-03-09T17:29:43.453 INFO:tasks.workunit.client.0.vm06.stdout:7/646: write d5/dd/fa0 [785335,106694] 0 2026-03-09T17:29:43.454 INFO:tasks.workunit.client.0.vm06.stdout:7/647: write d5/d7/d2b/dbd/fbe [861389,107841] 0 2026-03-09T17:29:43.459 INFO:tasks.workunit.client.0.vm06.stdout:7/648: stat d5/dd/lb7 0 2026-03-09T17:29:43.462 INFO:tasks.workunit.client.0.vm06.stdout:7/649: dread d5/f71 [0,4194304] 0 2026-03-09T17:29:43.465 INFO:tasks.workunit.client.0.vm06.stdout:7/650: symlink d5/dd/d79/d7f/lbf 0 2026-03-09T17:29:43.466 INFO:tasks.workunit.client.0.vm06.stdout:7/651: dread - d5/dd/f9d zero size 2026-03-09T17:29:43.488 INFO:tasks.workunit.client.0.vm06.stdout:9/626: write d3/d11/d65/f71 [398196,61948] 0 2026-03-09T17:29:43.489 INFO:tasks.workunit.client.0.vm06.stdout:9/627: readlink d3/d15/l1e 0 2026-03-09T17:29:43.494 INFO:tasks.workunit.client.0.vm06.stdout:0/660: dwrite d7/d11/f1c [4194304,4194304] 0 2026-03-09T17:29:43.495 INFO:tasks.workunit.client.0.vm06.stdout:9/628: mkdir d3/d15/d36/d4c/d6a/d8a/dc3 0 2026-03-09T17:29:43.498 INFO:tasks.workunit.client.0.vm06.stdout:9/629: symlink d3/d2c/lc4 0 2026-03-09T17:29:43.501 INFO:tasks.workunit.client.0.vm06.stdout:9/630: mknod d3/d15/d36/d4c/d6a/d8a/cc5 0 2026-03-09T17:29:43.502 INFO:tasks.workunit.client.0.vm06.stdout:0/661: unlink d7/d11/d19/d1d/cc7 0 2026-03-09T17:29:43.505 
INFO:tasks.workunit.client.0.vm06.stdout:9/631: chown d3/l4 687 1 2026-03-09T17:29:43.509 INFO:tasks.workunit.client.0.vm06.stdout:9/632: read - d3/d15/d36/d4d/fa4 zero size 2026-03-09T17:29:43.512 INFO:tasks.workunit.client.0.vm06.stdout:0/662: unlink d7/d11/d19/d8b/da4/cae 0 2026-03-09T17:29:43.516 INFO:tasks.workunit.client.0.vm06.stdout:9/633: fsync d3/f1b 0 2026-03-09T17:29:43.518 INFO:tasks.workunit.client.0.vm06.stdout:2/499: write d3/f29 [124753,80886] 0 2026-03-09T17:29:43.520 INFO:tasks.workunit.client.0.vm06.stdout:0/663: mkdir d7/d11/d19/d3c/db9/ddd/de4 0 2026-03-09T17:29:43.525 INFO:tasks.workunit.client.0.vm06.stdout:9/634: link d3/d26/f33 d3/d15/d36/d83/fc6 0 2026-03-09T17:29:43.528 INFO:tasks.workunit.client.0.vm06.stdout:9/635: dread d3/f27 [0,4194304] 0 2026-03-09T17:29:43.532 INFO:tasks.workunit.client.0.vm06.stdout:1/562: write d11/f13 [9200309,58783] 0 2026-03-09T17:29:43.539 INFO:tasks.workunit.client.0.vm06.stdout:0/664: creat d7/d11/d19/fe5 x:0 0 0 2026-03-09T17:29:43.544 INFO:tasks.workunit.client.0.vm06.stdout:6/462: dwrite d6/d12/d2d/f5e [0,4194304] 0 2026-03-09T17:29:43.546 INFO:tasks.workunit.client.0.vm06.stdout:2/500: sync 2026-03-09T17:29:43.547 INFO:tasks.workunit.client.0.vm06.stdout:2/501: dread - d3/d4/d12/d2b/d2d/f9d zero size 2026-03-09T17:29:43.550 INFO:tasks.workunit.client.0.vm06.stdout:0/665: symlink d7/d11/d19/le6 0 2026-03-09T17:29:43.551 INFO:tasks.workunit.client.0.vm06.stdout:0/666: fsync d7/d11/f29 0 2026-03-09T17:29:43.551 INFO:tasks.workunit.client.0.vm06.stdout:0/667: chown d7/d11/d2d/daf 1 1 2026-03-09T17:29:43.558 INFO:tasks.workunit.client.0.vm06.stdout:6/463: rename d6/d12/d17/l34 to d6/d4f/d73/l98 0 2026-03-09T17:29:43.568 INFO:tasks.workunit.client.0.vm06.stdout:2/502: mkdir d3/d4/d12/d2b/d9f 0 2026-03-09T17:29:43.568 INFO:tasks.workunit.client.0.vm06.stdout:2/503: chown d3/d4/d12/d2b/f32 482230 1 2026-03-09T17:29:43.568 INFO:tasks.workunit.client.0.vm06.stdout:6/464: chown d6/c41 150927 1 
2026-03-09T17:29:43.568 INFO:tasks.workunit.client.0.vm06.stdout:6/465: fdatasync d6/d12/f76 0 2026-03-09T17:29:43.568 INFO:tasks.workunit.client.0.vm06.stdout:2/504: creat d3/d4/d46/fa0 x:0 0 0 2026-03-09T17:29:43.569 INFO:tasks.workunit.client.0.vm06.stdout:3/551: write dd/d81/fac [2308452,46154] 0 2026-03-09T17:29:43.570 INFO:tasks.workunit.client.0.vm06.stdout:3/552: chown dd/d81/fac 308042089 1 2026-03-09T17:29:43.571 INFO:tasks.workunit.client.0.vm06.stdout:2/505: symlink d3/d4/d22/d72/la1 0 2026-03-09T17:29:43.575 INFO:tasks.workunit.client.0.vm06.stdout:0/668: rename d7/d11/d19/d3c/f55 to d7/d11/d2d/fe7 0 2026-03-09T17:29:43.577 INFO:tasks.workunit.client.0.vm06.stdout:3/553: sync 2026-03-09T17:29:43.580 INFO:tasks.workunit.client.0.vm06.stdout:0/669: rename d7/d11/f20 to d7/d11/d19/d3c/fe8 0 2026-03-09T17:29:43.582 INFO:tasks.workunit.client.0.vm06.stdout:3/554: mkdir dd/d19/d1e/db8 0 2026-03-09T17:29:43.588 INFO:tasks.workunit.client.0.vm06.stdout:3/555: rename dd/d1d/f45 to dd/d19/d25/d48/d93/fb9 0 2026-03-09T17:29:43.590 INFO:tasks.workunit.client.0.vm06.stdout:4/572: write db/d59/d5f/d6d/f7b [274358,85048] 0 2026-03-09T17:29:43.593 INFO:tasks.workunit.client.0.vm06.stdout:2/506: link d3/d4/d22/d72/l5b d3/d4/d12/d2b/d9f/la2 0 2026-03-09T17:29:43.594 INFO:tasks.workunit.client.0.vm06.stdout:2/507: chown d3/d4/d12/f15 1182 1 2026-03-09T17:29:43.599 INFO:tasks.workunit.client.0.vm06.stdout:4/573: write db/df/f4d [1526154,56407] 0 2026-03-09T17:29:43.604 INFO:tasks.workunit.client.0.vm06.stdout:2/508: readlink d3/d4/d12/d2b/d9f/la2 0 2026-03-09T17:29:43.605 INFO:tasks.workunit.client.0.vm06.stdout:6/466: dread d6/d12/d17/f32 [4194304,4194304] 0 2026-03-09T17:29:43.606 INFO:tasks.workunit.client.0.vm06.stdout:0/670: link d7/d11/d5d/db8/l81 d7/d11/d19/d1d/d87/le9 0 2026-03-09T17:29:43.611 INFO:tasks.workunit.client.0.vm06.stdout:2/509: rmdir d3/d4/d22/d72/d8f 39 2026-03-09T17:29:43.612 INFO:tasks.workunit.client.0.vm06.stdout:4/574: dread db/d1d/d21/fa5 
[0,4194304] 0 2026-03-09T17:29:43.613 INFO:tasks.workunit.client.0.vm06.stdout:0/671: sync 2026-03-09T17:29:43.613 INFO:tasks.workunit.client.0.vm06.stdout:0/672: dread - d7/d11/d19/fe5 zero size 2026-03-09T17:29:43.617 INFO:tasks.workunit.client.0.vm06.stdout:6/467: mknod d6/d12/d2d/c99 0 2026-03-09T17:29:43.618 INFO:tasks.workunit.client.0.vm06.stdout:2/510: creat d3/d4/d22/d43/d77/d81/d64/d6a/fa3 x:0 0 0 2026-03-09T17:29:43.620 INFO:tasks.workunit.client.0.vm06.stdout:4/575: creat db/d1d/d21/d88/fd2 x:0 0 0 2026-03-09T17:29:43.623 INFO:tasks.workunit.client.0.vm06.stdout:0/673: mkdir d7/d11/d89/da8/db2/dea 0 2026-03-09T17:29:43.624 INFO:tasks.workunit.client.0.vm06.stdout:6/468: mkdir d6/d47/d4d/d9a 0 2026-03-09T17:29:43.626 INFO:tasks.workunit.client.0.vm06.stdout:2/511: truncate d3/d4/f11 4811467 0 2026-03-09T17:29:43.627 INFO:tasks.workunit.client.0.vm06.stdout:2/512: fdatasync d3/d4/d22/d43/d77/d81/d64/f9a 0 2026-03-09T17:29:43.628 INFO:tasks.workunit.client.0.vm06.stdout:4/576: creat db/d1d/fd3 x:0 0 0 2026-03-09T17:29:43.629 INFO:tasks.workunit.client.0.vm06.stdout:4/577: chown db/d1d/d21/d37/d69/d78/c7d 3505 1 2026-03-09T17:29:43.629 INFO:tasks.workunit.client.0.vm06.stdout:4/578: write db/fc [5138256,63781] 0 2026-03-09T17:29:43.630 INFO:tasks.workunit.client.0.vm06.stdout:4/579: read db/f17 [11355141,1619] 0 2026-03-09T17:29:43.641 INFO:tasks.workunit.client.0.vm06.stdout:4/580: dread db/f13 [0,4194304] 0 2026-03-09T17:29:43.644 INFO:tasks.workunit.client.0.vm06.stdout:8/538: write d15/d16/f3f [1242025,16105] 0 2026-03-09T17:29:43.647 INFO:tasks.workunit.client.0.vm06.stdout:5/543: dwrite d4/f49 [0,4194304] 0 2026-03-09T17:29:43.659 INFO:tasks.workunit.client.0.vm06.stdout:2/513: fsync d3/d4/d22/d43/f5f 0 2026-03-09T17:29:43.663 INFO:tasks.workunit.client.0.vm06.stdout:7/652: dwrite d5/dd/fa6 [0,4194304] 0 2026-03-09T17:29:43.668 INFO:tasks.workunit.client.0.vm06.stdout:7/653: chown d5/d12/fa2 0 1 2026-03-09T17:29:43.668 
INFO:tasks.workunit.client.0.vm06.stdout:4/581: truncate db/d1d/d21/fa5 548321 0 2026-03-09T17:29:43.668 INFO:tasks.workunit.client.0.vm06.stdout:6/469: mknod d6/d4f/d3e/c9b 0 2026-03-09T17:29:43.668 INFO:tasks.workunit.client.0.vm06.stdout:5/544: truncate d4/d22/f90 655933 0 2026-03-09T17:29:43.677 INFO:tasks.workunit.client.0.vm06.stdout:7/654: rmdir d5/d1f/d34/d3f/d8b 39 2026-03-09T17:29:43.681 INFO:tasks.workunit.client.0.vm06.stdout:9/636: dwrite d3/d11/f2a [0,4194304] 0 2026-03-09T17:29:43.689 INFO:tasks.workunit.client.0.vm06.stdout:2/514: symlink d3/d4/d12/d2b/la4 0 2026-03-09T17:29:43.689 INFO:tasks.workunit.client.0.vm06.stdout:1/563: dwrite d11/d69/fad [0,4194304] 0 2026-03-09T17:29:43.691 INFO:tasks.workunit.client.0.vm06.stdout:2/515: read d3/d4/d12/d2b/f89 [3231840,8552] 0 2026-03-09T17:29:43.705 INFO:tasks.workunit.client.0.vm06.stdout:9/637: sync 2026-03-09T17:29:43.709 INFO:tasks.workunit.client.0.vm06.stdout:6/470: creat d6/d12/d17/d85/f9c x:0 0 0 2026-03-09T17:29:43.710 INFO:tasks.workunit.client.0.vm06.stdout:6/471: truncate d6/d4f/d3e/d52/f84 11857 0 2026-03-09T17:29:43.712 INFO:tasks.workunit.client.0.vm06.stdout:7/655: dread d5/d7/d2b/f42 [0,4194304] 0 2026-03-09T17:29:43.713 INFO:tasks.workunit.client.0.vm06.stdout:5/545: symlink d4/d50/lc9 0 2026-03-09T17:29:43.717 INFO:tasks.workunit.client.0.vm06.stdout:1/564: symlink d11/d14/d1d/d1e/d96/lbb 0 2026-03-09T17:29:43.727 INFO:tasks.workunit.client.0.vm06.stdout:2/516: write d3/d4/d12/d2b/d2d/f9b [6811760,52220] 0 2026-03-09T17:29:43.727 INFO:tasks.workunit.client.0.vm06.stdout:7/656: rmdir d5 39 2026-03-09T17:29:43.728 INFO:tasks.workunit.client.0.vm06.stdout:3/556: dwrite dd/d19/d28/f32 [0,4194304] 0 2026-03-09T17:29:43.743 INFO:tasks.workunit.client.0.vm06.stdout:3/557: dread dd/f26 [0,4194304] 0 2026-03-09T17:29:43.744 INFO:tasks.workunit.client.0.vm06.stdout:3/558: write dd/d19/d28/f32 [823844,87922] 0 2026-03-09T17:29:43.748 INFO:tasks.workunit.client.0.vm06.stdout:7/657: chown 
d5/d12/d64/f8d 4 1 2026-03-09T17:29:43.750 INFO:tasks.workunit.client.0.vm06.stdout:0/674: dwrite d7/d11/d19/d37/f4f [0,4194304] 0 2026-03-09T17:29:43.765 INFO:tasks.workunit.client.0.vm06.stdout:8/539: write d15/f3e [3506200,4064] 0 2026-03-09T17:29:43.765 INFO:tasks.workunit.client.0.vm06.stdout:3/559: creat dd/d19/d25/d48/fba x:0 0 0 2026-03-09T17:29:43.765 INFO:tasks.workunit.client.0.vm06.stdout:3/560: chown dd/d19/d2c 0 1 2026-03-09T17:29:43.770 INFO:tasks.workunit.client.0.vm06.stdout:8/540: read d15/d16/d1e/d30/f3b [8005120,59636] 0 2026-03-09T17:29:43.774 INFO:tasks.workunit.client.0.vm06.stdout:7/658: symlink d5/lc0 0 2026-03-09T17:29:43.780 INFO:tasks.workunit.client.0.vm06.stdout:7/659: unlink d5/d7/c24 0 2026-03-09T17:29:43.786 INFO:tasks.workunit.client.0.vm06.stdout:7/660: dwrite d5/f16 [0,4194304] 0 2026-03-09T17:29:43.794 INFO:tasks.workunit.client.0.vm06.stdout:0/675: link d7/d11/d19/d23/lcc d7/d11/d19/d23/db7/dbd/leb 0 2026-03-09T17:29:43.794 INFO:tasks.workunit.client.0.vm06.stdout:0/676: chown d7/f56 39585 1 2026-03-09T17:29:43.802 INFO:tasks.workunit.client.0.vm06.stdout:7/661: link d5/dd/lb7 d5/d1f/d34/d46/d51/lc1 0 2026-03-09T17:29:43.807 INFO:tasks.workunit.client.0.vm06.stdout:5/546: truncate d4/d22/f90 1364050 0 2026-03-09T17:29:43.816 INFO:tasks.workunit.client.0.vm06.stdout:4/582: dwrite db/d1d/f5b [0,4194304] 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:7/662: symlink d5/lc2 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:9/638: write d3/d15/d36/d4d/fa4 [578900,35884] 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:2/517: write d3/d4/d22/f28 [566482,64349] 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:1/565: write d11/d14/d1d/d1e/d2a/d34/f60 [1306448,40142] 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:6/472: dwrite d6/d12/d17/f78 [4194304,4194304] 0 2026-03-09T17:29:43.832 INFO:tasks.workunit.client.0.vm06.stdout:4/583: stat db/df/c3e 
0 2026-03-09T17:29:43.834 INFO:tasks.workunit.client.0.vm06.stdout:1/566: fsync d11/d14/d1d/f8f 0 2026-03-09T17:29:43.836 INFO:tasks.workunit.client.0.vm06.stdout:8/541: dread d15/d16/d19/d2b/d85/f9a [0,4194304] 0 2026-03-09T17:29:43.841 INFO:tasks.workunit.client.0.vm06.stdout:4/584: mkdir db/d57/dd4 0 2026-03-09T17:29:43.845 INFO:tasks.workunit.client.0.vm06.stdout:8/542: truncate d15/d31/f33 1073574 0 2026-03-09T17:29:43.846 INFO:tasks.workunit.client.0.vm06.stdout:8/543: chown d15/d16/d19/d3d/f4c 21623 1 2026-03-09T17:29:43.850 INFO:tasks.workunit.client.0.vm06.stdout:8/544: dwrite d15/d16/fb4 [0,4194304] 0 2026-03-09T17:29:43.856 INFO:tasks.workunit.client.0.vm06.stdout:2/518: unlink d3/d4/lb 0 2026-03-09T17:29:43.857 INFO:tasks.workunit.client.0.vm06.stdout:9/639: truncate d3/d15/d36/d83/fc6 2711947 0 2026-03-09T17:29:43.859 INFO:tasks.workunit.client.0.vm06.stdout:6/473: fdatasync d6/d4f/f26 0 2026-03-09T17:29:43.867 INFO:tasks.workunit.client.0.vm06.stdout:5/547: rename d4/d7e to d4/dca 0 2026-03-09T17:29:43.869 INFO:tasks.workunit.client.0.vm06.stdout:5/548: write d4/d50/d35/d40/d6f/fc7 [59023,10852] 0 2026-03-09T17:29:43.872 INFO:tasks.workunit.client.0.vm06.stdout:4/585: fsync f7 0 2026-03-09T17:29:43.873 INFO:tasks.workunit.client.0.vm06.stdout:4/586: chown db/d1d/d21/d37/d69/d78/db4/c9a 56775240 1 2026-03-09T17:29:43.876 INFO:tasks.workunit.client.0.vm06.stdout:8/545: mknod d15/d16/d19/d2b/cb5 0 2026-03-09T17:29:43.877 INFO:tasks.workunit.client.0.vm06.stdout:2/519: rename d3/d44 to d3/d4/d46/da5 0 2026-03-09T17:29:43.878 INFO:tasks.workunit.client.0.vm06.stdout:5/549: truncate d4/d50/d18/fa8 809067 0 2026-03-09T17:29:43.879 INFO:tasks.workunit.client.0.vm06.stdout:5/550: fdatasync d4/d52/f8f 0 2026-03-09T17:29:43.880 INFO:tasks.workunit.client.0.vm06.stdout:5/551: write d4/d50/d18/f48 [744614,22710] 0 2026-03-09T17:29:43.880 INFO:tasks.workunit.client.0.vm06.stdout:5/552: stat d4/d22/d64/f9f 0 2026-03-09T17:29:43.881 
INFO:tasks.workunit.client.0.vm06.stdout:7/663: link d5/lc2 d5/lc3 0 2026-03-09T17:29:43.882 INFO:tasks.workunit.client.0.vm06.stdout:4/587: unlink db/d1d/d21/d26/d89/dab/fb3 0 2026-03-09T17:29:43.885 INFO:tasks.workunit.client.0.vm06.stdout:7/664: dwrite d5/d7/d2b/fa1 [0,4194304] 0 2026-03-09T17:29:43.887 INFO:tasks.workunit.client.0.vm06.stdout:4/588: dread db/d1d/f5b [0,4194304] 0 2026-03-09T17:29:43.889 INFO:tasks.workunit.client.0.vm06.stdout:6/474: sync 2026-03-09T17:29:43.889 INFO:tasks.workunit.client.0.vm06.stdout:8/546: sync 2026-03-09T17:29:43.891 INFO:tasks.workunit.client.0.vm06.stdout:9/640: symlink d3/lc7 0 2026-03-09T17:29:43.893 INFO:tasks.workunit.client.0.vm06.stdout:5/553: creat d4/d22/fcb x:0 0 0 2026-03-09T17:29:43.897 INFO:tasks.workunit.client.0.vm06.stdout:6/475: symlink d6/d47/d96/l9d 0 2026-03-09T17:29:43.897 INFO:tasks.workunit.client.0.vm06.stdout:8/547: creat d15/d39/d67/d77/d97/fb6 x:0 0 0 2026-03-09T17:29:43.900 INFO:tasks.workunit.client.0.vm06.stdout:5/554: rename d4/d52/f5f to d4/d22/d64/fcc 0 2026-03-09T17:29:43.904 INFO:tasks.workunit.client.0.vm06.stdout:8/548: rmdir d15/d31/d58 39 2026-03-09T17:29:43.913 INFO:tasks.workunit.client.0.vm06.stdout:5/555: dread d4/d22/d46/f6e [0,4194304] 0 2026-03-09T17:29:43.922 INFO:tasks.workunit.client.0.vm06.stdout:3/561: dwrite dd/d1d/d4e/f7c [0,4194304] 0 2026-03-09T17:29:43.922 INFO:tasks.workunit.client.0.vm06.stdout:3/562: readlink dd/l17 0 2026-03-09T17:29:43.922 INFO:tasks.workunit.client.0.vm06.stdout:3/563: dwrite dd/d19/d28/fab [0,4194304] 0 2026-03-09T17:29:43.939 INFO:tasks.workunit.client.0.vm06.stdout:3/564: getdents dd/d19/d2c 0 2026-03-09T17:29:43.946 INFO:tasks.workunit.client.0.vm06.stdout:4/589: dread db/d1d/d21/d37/f81 [0,4194304] 0 2026-03-09T17:29:43.947 INFO:tasks.workunit.client.0.vm06.stdout:4/590: dread - db/d59/d5f/d45/f8e zero size 2026-03-09T17:29:43.947 INFO:tasks.workunit.client.0.vm06.stdout:4/591: chown db/d1d/d21/d88/fd2 0 1 2026-03-09T17:29:43.952 
INFO:tasks.workunit.client.0.vm06.stdout:4/592: link db/c1b db/cd5 0 2026-03-09T17:29:43.990 INFO:tasks.workunit.client.0.vm06.stdout:4/593: write db/d1d/f3a [717034,113309] 0 2026-03-09T17:29:43.990 INFO:tasks.workunit.client.0.vm06.stdout:4/594: mknod db/d1d/d21/d25/cd6 0 2026-03-09T17:29:43.990 INFO:tasks.workunit.client.0.vm06.stdout:4/595: creat db/d1d/d21/d25/d4b/fd7 x:0 0 0 2026-03-09T17:29:43.990 INFO:tasks.workunit.client.0.vm06.stdout:4/596: chown db/d1d/d21/d37/d69/d78/da0/caa 226 1 2026-03-09T17:29:44.021 INFO:tasks.workunit.client.0.vm06.stdout:8/549: sync 2026-03-09T17:29:44.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:43 vm06.local ceph-mon[57307]: pgmap v155: 65 pgs: 65 active+clean; 1.3 GiB data, 4.9 GiB used, 115 GiB / 120 GiB avail; 25 MiB/s rd, 84 MiB/s wr, 329 op/s 2026-03-09T17:29:44.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:43 vm09.local ceph-mon[62061]: pgmap v155: 65 pgs: 65 active+clean; 1.3 GiB data, 4.9 GiB used, 115 GiB / 120 GiB avail; 25 MiB/s rd, 84 MiB/s wr, 329 op/s 2026-03-09T17:29:44.148 INFO:tasks.workunit.client.0.vm06.stdout:0/677: write d7/d11/d5d/d64/f6b [5269011,92362] 0 2026-03-09T17:29:44.161 INFO:tasks.workunit.client.0.vm06.stdout:1/567: truncate d11/d14/d1d/d42/f52 1578440 0 2026-03-09T17:29:44.166 INFO:tasks.workunit.client.0.vm06.stdout:0/678: creat d7/d11/d19/d1d/fec x:0 0 0 2026-03-09T17:29:44.167 INFO:tasks.workunit.client.0.vm06.stdout:0/679: write d7/d11/d19/d1d/fda [231099,79471] 0 2026-03-09T17:29:44.172 INFO:tasks.workunit.client.0.vm06.stdout:1/568: mkdir d11/d14/d1c/dbc 0 2026-03-09T17:29:44.181 INFO:tasks.workunit.client.0.vm06.stdout:0/680: creat d7/d11/d19/d3c/db9/dd8/fed x:0 0 0 2026-03-09T17:29:44.185 INFO:tasks.workunit.client.0.vm06.stdout:9/641: dwrite d3/d15/d36/d4c/d6a/f8b [0,4194304] 0 2026-03-09T17:29:44.189 INFO:tasks.workunit.client.0.vm06.stdout:9/642: dwrite d3/d15/d36/d4d/fa4 [0,4194304] 0 2026-03-09T17:29:44.201 INFO:tasks.workunit.client.0.vm06.stdout:0/681: 
creat d7/d11/d19/d3c/db9/dd8/fee x:0 0 0 2026-03-09T17:29:44.208 INFO:tasks.workunit.client.0.vm06.stdout:9/643: fsync d3/d26/d35/f99 0 2026-03-09T17:29:44.210 INFO:tasks.workunit.client.0.vm06.stdout:8/550: dread d15/d16/d1a/d47/f76 [0,4194304] 0 2026-03-09T17:29:44.213 INFO:tasks.workunit.client.0.vm06.stdout:5/556: stat d4/d22/d64/fcc 0 2026-03-09T17:29:44.213 INFO:tasks.workunit.client.0.vm06.stdout:9/644: dread d3/d26/d6c/f3a [0,4194304] 0 2026-03-09T17:29:44.216 INFO:tasks.workunit.client.0.vm06.stdout:7/665: write d5/d12/d5f/f81 [643865,84337] 0 2026-03-09T17:29:44.216 INFO:tasks.workunit.client.0.vm06.stdout:7/666: write d5/d1f/d34/f5e [5568587,54242] 0 2026-03-09T17:29:44.221 INFO:tasks.workunit.client.0.vm06.stdout:6/476: truncate d6/d12/d17/d65/f72 3617641 0 2026-03-09T17:29:44.236 INFO:tasks.workunit.client.0.vm06.stdout:0/682: fdatasync d7/d11/d19/d23/f60 0 2026-03-09T17:29:44.236 INFO:tasks.workunit.client.0.vm06.stdout:2/520: link d3/d4/d22/d72/d8f/c98 d3/d4/d12/d2b/d36/ca6 0 2026-03-09T17:29:44.239 INFO:tasks.workunit.client.0.vm06.stdout:5/557: chown d4/dca/l88 616149540 1 2026-03-09T17:29:44.241 INFO:tasks.workunit.client.0.vm06.stdout:9/645: symlink d3/d15/d48/da8/lc8 0 2026-03-09T17:29:44.246 INFO:tasks.workunit.client.0.vm06.stdout:6/477: unlink d6/d12/l7f 0 2026-03-09T17:29:44.246 INFO:tasks.workunit.client.0.vm06.stdout:9/646: truncate d3/d15/d48/da8/db9/f89 796057 0 2026-03-09T17:29:44.247 INFO:tasks.workunit.client.0.vm06.stdout:2/521: mkdir d3/d4/d12/da7 0 2026-03-09T17:29:44.252 INFO:tasks.workunit.client.0.vm06.stdout:9/647: symlink d3/d11/d65/lc9 0 2026-03-09T17:29:44.253 INFO:tasks.workunit.client.0.vm06.stdout:9/648: chown d3/d15/l24 4420367 1 2026-03-09T17:29:44.255 INFO:tasks.workunit.client.0.vm06.stdout:6/478: creat d6/d12/d53/d8f/f9e x:0 0 0 2026-03-09T17:29:44.256 INFO:tasks.workunit.client.0.vm06.stdout:6/479: dread - d6/d12/d53/f86 zero size 2026-03-09T17:29:44.256 INFO:tasks.workunit.client.0.vm06.stdout:0/683: getdents 
d7/d11/d2d/dca 0 2026-03-09T17:29:44.259 INFO:tasks.workunit.client.0.vm06.stdout:9/649: mknod d3/d15/cca 0 2026-03-09T17:29:44.268 INFO:tasks.workunit.client.0.vm06.stdout:9/650: rmdir d3/d26/d35 39 2026-03-09T17:29:44.280 INFO:tasks.workunit.client.0.vm06.stdout:0/684: getdents d7/d11/d19/d1d/d39 0 2026-03-09T17:29:44.317 INFO:tasks.workunit.client.0.vm06.stdout:9/651: sync 2026-03-09T17:29:44.317 INFO:tasks.workunit.client.0.vm06.stdout:9/652: write d3/d6d/d9a/fb4 [246700,47431] 0 2026-03-09T17:29:44.321 INFO:tasks.workunit.client.0.vm06.stdout:9/653: mkdir d3/d26/dcb 0 2026-03-09T17:29:44.323 INFO:tasks.workunit.client.0.vm06.stdout:9/654: fdatasync d3/d15/f5e 0 2026-03-09T17:29:44.325 INFO:tasks.workunit.client.0.vm06.stdout:9/655: stat d3/d6d/cbd 0 2026-03-09T17:29:44.327 INFO:tasks.workunit.client.0.vm06.stdout:9/656: mknod d3/d15/d36/ccc 0 2026-03-09T17:29:44.341 INFO:tasks.workunit.client.0.vm06.stdout:3/565: write dd/f15 [85809,12516] 0 2026-03-09T17:29:44.344 INFO:tasks.workunit.client.0.vm06.stdout:3/566: creat dd/d81/da3/dae/fbb x:0 0 0 2026-03-09T17:29:44.353 INFO:tasks.workunit.client.0.vm06.stdout:3/567: dread dd/f4a [0,4194304] 0 2026-03-09T17:29:44.354 INFO:tasks.workunit.client.0.vm06.stdout:3/568: write dd/d59/da1/fa4 [1540604,114614] 0 2026-03-09T17:29:44.359 INFO:tasks.workunit.client.0.vm06.stdout:3/569: creat dd/d81/da3/fbc x:0 0 0 2026-03-09T17:29:44.362 INFO:tasks.workunit.client.0.vm06.stdout:3/570: creat dd/d1d/d6e/d70/fbd x:0 0 0 2026-03-09T17:29:44.362 INFO:tasks.workunit.client.0.vm06.stdout:3/571: read - dd/d1d/d6e/d70/fbd zero size 2026-03-09T17:29:44.363 INFO:tasks.workunit.client.0.vm06.stdout:3/572: truncate dd/d81/da3/dae/fbb 807314 0 2026-03-09T17:29:44.367 INFO:tasks.workunit.client.0.vm06.stdout:3/573: dwrite dd/d19/d28/fab [0,4194304] 0 2026-03-09T17:29:44.373 INFO:tasks.workunit.client.0.vm06.stdout:4/597: dwrite db/df/f30 [4194304,4194304] 0 2026-03-09T17:29:44.374 INFO:tasks.workunit.client.0.vm06.stdout:4/598: chown 
db/d59/d90 1 1
2026-03-09T17:29:44.388 INFO:tasks.workunit.client.0.vm06.stdout:3/574: read dd/f1b [175546,25977] 0
2026-03-09T17:29:44.393 INFO:tasks.workunit.client.0.vm06.stdout:3/575: unlink dd/d19/d25/c50 0
2026-03-09T17:29:44.397 INFO:tasks.workunit.client.0.vm06.stdout:3/576: unlink dd/d19/d25/d48/fba 0
2026-03-09T17:29:44.405 INFO:tasks.workunit.client.0.vm06.stdout:3/577: rename dd/d19/d28/l3b to dd/d19/d1e/db8/lbe 0
2026-03-09T17:29:44.409 INFO:tasks.workunit.client.0.vm06.stdout:3/578: truncate f7 2313414 0
2026-03-09T17:29:44.429 INFO:tasks.workunit.client.0.vm06.stdout:3/579: dread dd/d1d/f4b [0,4194304] 0
2026-03-09T17:29:44.450 INFO:tasks.workunit.client.0.vm06.stdout:1/569: dwrite d11/d14/d1c/d1f/d57/d7b/f84 [0,4194304] 0
2026-03-09T17:29:44.450 INFO:tasks.workunit.client.0.vm06.stdout:3/580: sync
2026-03-09T17:29:44.463 INFO:tasks.workunit.client.0.vm06.stdout:3/581: truncate dd/d1d/d4e/f72 4910655 0
2026-03-09T17:29:44.465 INFO:tasks.workunit.client.0.vm06.stdout:1/570: dwrite d11/d14/fa6 [0,4194304] 0
2026-03-09T17:29:44.465 INFO:tasks.workunit.client.0.vm06.stdout:1/571: chown f10 0 1
2026-03-09T17:29:44.465 INFO:tasks.workunit.client.0.vm06.stdout:1/572: readlink d11/d14/d1c/d5f/l97 0
2026-03-09T17:29:44.468 INFO:tasks.workunit.client.0.vm06.stdout:3/582: creat dd/d81/fbf x:0 0 0
2026-03-09T17:29:44.469 INFO:tasks.workunit.client.0.vm06.stdout:3/583: dread - dd/d1d/d6e/d70/fbd zero size
2026-03-09T17:29:44.469 INFO:tasks.workunit.client.0.vm06.stdout:3/584: stat dd/d1d/d6e/fa0 0
2026-03-09T17:29:44.479 INFO:tasks.workunit.client.0.vm06.stdout:1/573: mknod d11/d14/d1d/d1e/d2a/d34/d58/cbd 0
2026-03-09T17:29:44.480 INFO:tasks.workunit.client.0.vm06.stdout:1/574: write d11/d14/fa6 [1144573,36416] 0
2026-03-09T17:29:44.487 INFO:tasks.workunit.client.0.vm06.stdout:3/585: mknod dd/d19/d25/d2d/cc0 0
2026-03-09T17:29:44.494 INFO:tasks.workunit.client.0.vm06.stdout:3/586: readlink dd/d19/d1e/l7f 0
2026-03-09T17:29:44.498 INFO:tasks.workunit.client.0.vm06.stdout:3/587: dwrite dd/d1d/d6e/fa0 [0,4194304] 0
2026-03-09T17:29:44.520 INFO:tasks.workunit.client.0.vm06.stdout:4/599: dread db/d1d/d21/d25/d4b/f4e [0,4194304] 0
2026-03-09T17:29:44.539 INFO:tasks.workunit.client.0.vm06.stdout:4/600: creat db/d1d/d21/d44/dc1/fd8 x:0 0 0
2026-03-09T17:29:44.543 INFO:tasks.workunit.client.0.vm06.stdout:4/601: creat db/d59/d5f/d6d/fd9 x:0 0 0
2026-03-09T17:29:44.550 INFO:tasks.workunit.client.0.vm06.stdout:3/588: link dd/d1d/d6e/d70/c76 dd/d19/cc1 0
2026-03-09T17:29:44.551 INFO:tasks.workunit.client.0.vm06.stdout:3/589: truncate dd/d81/fa9 855103 0
2026-03-09T17:29:44.553 INFO:tasks.workunit.client.0.vm06.stdout:4/602: creat db/d1d/d21/d26/d7a/fda x:0 0 0
2026-03-09T17:29:44.554 INFO:tasks.workunit.client.0.vm06.stdout:4/603: dread - db/d1d/d21/d26/d89/dab/dae/dcc/fce zero size
2026-03-09T17:29:44.559 INFO:tasks.workunit.client.0.vm06.stdout:8/551: dwrite d15/d16/d1a/d47/f7e [0,4194304] 0
2026-03-09T17:29:44.560 INFO:tasks.workunit.client.0.vm06.stdout:8/552: stat d15/d39/d67/d77/d97/fb6 0
2026-03-09T17:29:44.561 INFO:tasks.workunit.client.0.vm06.stdout:4/604: mkdir db/d59/d5f/d6d/ddb 0
2026-03-09T17:29:44.570 INFO:tasks.workunit.client.0.vm06.stdout:7/667: write d5/d1f/d34/f41 [4436067,16810] 0
2026-03-09T17:29:44.579 INFO:tasks.workunit.client.0.vm06.stdout:5/558: dwrite d4/d50/d18/fa8 [0,4194304] 0
2026-03-09T17:29:44.583 INFO:tasks.workunit.client.0.vm06.stdout:7/668: fsync d5/d12/f93 0
2026-03-09T17:29:44.591 INFO:tasks.workunit.client.0.vm06.stdout:2/522: dwrite d3/d4/d12/f66 [0,4194304] 0
2026-03-09T17:29:44.594 INFO:tasks.workunit.client.0.vm06.stdout:5/559: dread d4/d22/d46/f82 [0,4194304] 0
2026-03-09T17:29:44.599 INFO:tasks.workunit.client.0.vm06.stdout:2/523: dwrite d3/d4/d22/d43/d77/d81/d64/d6a/fa3 [0,4194304] 0
2026-03-09T17:29:44.603 INFO:tasks.workunit.client.0.vm06.stdout:2/524: truncate d3/d4/d12/d71/f97 982854 0
2026-03-09T17:29:44.610 INFO:tasks.workunit.client.0.vm06.stdout:7/669: dread - d5/d1f/d34/d3f/d8b/faa zero size
2026-03-09T17:29:44.614 INFO:tasks.workunit.client.0.vm06.stdout:6/480: dwrite d6/d12/d53/f87 [0,4194304] 0
2026-03-09T17:29:44.620 INFO:tasks.workunit.client.0.vm06.stdout:2/525: creat d3/d4/d46/da5/fa8 x:0 0 0
2026-03-09T17:29:44.621 INFO:tasks.workunit.client.0.vm06.stdout:0/685: write d7/d11/d19/f57 [1787432,118680] 0
2026-03-09T17:29:44.638 INFO:tasks.workunit.client.0.vm06.stdout:2/526: rmdir d3/d4/d12/d2b/d9f 39
2026-03-09T17:29:44.640 INFO:tasks.workunit.client.0.vm06.stdout:0/686: rmdir d7/d11/d19/d8b 39
2026-03-09T17:29:44.642 INFO:tasks.workunit.client.0.vm06.stdout:7/670: creat d5/d12/dad/fc4 x:0 0 0
2026-03-09T17:29:44.642 INFO:tasks.workunit.client.0.vm06.stdout:7/671: chown d5/d7/d2b 11761 1
2026-03-09T17:29:44.643 INFO:tasks.workunit.client.0.vm06.stdout:7/672: chown d5/d1f/d34/f54 32461862 1
2026-03-09T17:29:44.651 INFO:tasks.workunit.client.0.vm06.stdout:9/657: dwrite d3/d26/d6c/f5b [0,4194304] 0
2026-03-09T17:29:44.651 INFO:tasks.workunit.client.0.vm06.stdout:9/658: chown d3/d15/f74 1612 1
2026-03-09T17:29:44.658 INFO:tasks.workunit.client.0.vm06.stdout:9/659: dwrite d3/d11/d65/f71 [0,4194304] 0
2026-03-09T17:29:44.660 INFO:tasks.workunit.client.0.vm06.stdout:9/660: readlink d3/d15/d36/d4c/d6a/d8a/la2 0
2026-03-09T17:29:44.671 INFO:tasks.workunit.client.0.vm06.stdout:7/673: dread d5/d12/f32 [0,4194304] 0
2026-03-09T17:29:44.673 INFO:tasks.workunit.client.0.vm06.stdout:2/527: mknod d3/d4/d46/da5/ca9 0
2026-03-09T17:29:44.677 INFO:tasks.workunit.client.0.vm06.stdout:9/661: mkdir d3/d6d/d9a/d9c/dcd 0
2026-03-09T17:29:44.680 INFO:tasks.workunit.client.0.vm06.stdout:9/662: stat d3/d15/l30 0
2026-03-09T17:29:44.706 INFO:tasks.workunit.client.0.vm06.stdout:1/575: getdents d11/d14/d1d/d1e/d2a/d34/d58 0
2026-03-09T17:29:44.708 INFO:tasks.workunit.client.0.vm06.stdout:1/576: creat d11/d14/d1c/d3a/fbe x:0 0 0
2026-03-09T17:29:44.709 INFO:tasks.workunit.client.0.vm06.stdout:3/590: getdents dd/d19/d25/d2d 0
2026-03-09T17:29:44.720 INFO:tasks.workunit.client.0.vm06.stdout:3/591: link dd/d19/d25/d2d/f55 dd/fc2 0
2026-03-09T17:29:44.725 INFO:tasks.workunit.client.0.vm06.stdout:3/592: dwrite dd/d59/da1/faf [0,4194304] 0
2026-03-09T17:29:44.729 INFO:tasks.workunit.client.0.vm06.stdout:3/593: symlink dd/d19/d25/d44/lc3 0
2026-03-09T17:29:44.746 INFO:tasks.workunit.client.0.vm06.stdout:8/553: write d15/d16/f23 [3208445,84609] 0
2026-03-09T17:29:44.747 INFO:tasks.workunit.client.0.vm06.stdout:8/554: fdatasync d15/d16/d1a/d7c/f9e 0
2026-03-09T17:29:44.751 INFO:tasks.workunit.client.0.vm06.stdout:4/605: dwrite db/d1d/d21/d37/f81 [0,4194304] 0
2026-03-09T17:29:44.755 INFO:tasks.workunit.client.0.vm06.stdout:3/594: dread dd/f22 [0,4194304] 0
2026-03-09T17:29:44.758 INFO:tasks.workunit.client.0.vm06.stdout:4/606: dread db/d1d/d21/d25/d4b/f4e [0,4194304] 0
2026-03-09T17:29:44.758 INFO:tasks.workunit.client.0.vm06.stdout:4/607: chown db/f55 124555 1
2026-03-09T17:29:44.769 INFO:tasks.workunit.client.0.vm06.stdout:5/560: dwrite d4/d50/d18/f3e [8388608,4194304] 0
2026-03-09T17:29:44.791 INFO:tasks.workunit.client.0.vm06.stdout:6/481: dwrite d6/d47/f49 [0,4194304] 0
2026-03-09T17:29:44.792 INFO:tasks.workunit.client.0.vm06.stdout:8/555: link d15/d16/d1e/c41 d15/d16/d1e/d28/da4/cb7 0
2026-03-09T17:29:44.792 INFO:tasks.workunit.client.0.vm06.stdout:8/556: write d15/d16/d1a/d47/fa5 [183891,18523] 0
2026-03-09T17:29:44.799 INFO:tasks.workunit.client.0.vm06.stdout:7/674: rename d5/d12 to d5/dd/dc5 0
2026-03-09T17:29:44.806 INFO:tasks.workunit.client.0.vm06.stdout:8/557: chown d15/d16/d19/d3d/d5f/c73 7 1
2026-03-09T17:29:44.809 INFO:tasks.workunit.client.0.vm06.stdout:0/687: rename d7/d11/d19/d8b/da4/d85/fbc to d7/d11/d89/d99/fef 0
2026-03-09T17:29:44.812 INFO:tasks.workunit.client.0.vm06.stdout:9/663: write d3/d15/d36/d4c/f55 [1866435,63523] 0
2026-03-09T17:29:44.827 INFO:tasks.workunit.client.0.vm06.stdout:4/608: link db/d59/d5f/d45/l63 db/d1d/d21/d26/ldc 0
2026-03-09T17:29:44.827 INFO:tasks.workunit.client.0.vm06.stdout:2/528: rename d3/d4/d22/d43 to d3/d4/d12/d71/daa 0
2026-03-09T17:29:44.827 INFO:tasks.workunit.client.0.vm06.stdout:2/529: dwrite d3/d4/d12/f42 [0,4194304] 0
2026-03-09T17:29:44.851 INFO:tasks.workunit.client.0.vm06.stdout:1/577: rename d11/d14/d1d/d1e/d2a/fae to d11/d14/d1c/d3a/fbf 0
2026-03-09T17:29:44.851 INFO:tasks.workunit.client.0.vm06.stdout:1/578: stat d11/d14/d1c/d1f/d57/d7b 0
2026-03-09T17:29:44.854 INFO:tasks.workunit.client.0.vm06.stdout:2/530: creat d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab x:0 0 0
2026-03-09T17:29:44.866 INFO:tasks.workunit.client.0.vm06.stdout:2/531: mknod d3/cac 0
2026-03-09T17:29:44.872 INFO:tasks.workunit.client.0.vm06.stdout:2/532: fsync d3/d4/d12/d71/daa/d77/d81/f84 0
2026-03-09T17:29:44.872 INFO:tasks.workunit.client.0.vm06.stdout:2/533: chown d3/d4/d12/d71/daa/d77/d81/d64/d6a/f6d 2 1
2026-03-09T17:29:44.873 INFO:tasks.workunit.client.0.vm06.stdout:9/664: sync
2026-03-09T17:29:44.883 INFO:tasks.workunit.client.0.vm06.stdout:8/558: rename d15/d16/d1e/d28 to d15/d16/d1e/d30/db8 0
2026-03-09T17:29:44.883 INFO:tasks.workunit.client.0.vm06.stdout:3/595: truncate dd/d1d/d4e/f7c 3254917 0
2026-03-09T17:29:44.884 INFO:tasks.workunit.client.0.vm06.stdout:5/561: write d4/d52/f8a [1578215,127287] 0
2026-03-09T17:29:44.888 INFO:tasks.workunit.client.0.vm06.stdout:9/665: truncate d3/d15/d48/da8/db9/f89 1324890 0
2026-03-09T17:29:44.892 INFO:tasks.workunit.client.0.vm06.stdout:9/666: dwrite d3/d15/d36/d4c/f55 [0,4194304] 0
2026-03-09T17:29:44.895 INFO:tasks.workunit.client.0.vm06.stdout:1/579: rename d11/d14/d1c/d1f to d11/d14/d1d/d42/d46/d92/dc0 0
2026-03-09T17:29:44.896 INFO:tasks.workunit.client.0.vm06.stdout:6/482: dwrite d6/d47/d96/d40/f67 [0,4194304] 0
2026-03-09T17:29:44.916 INFO:tasks.workunit.client.0.vm06.stdout:6/483: unlink d6/d47/d4d/f6e 0
2026-03-09T17:29:44.917 INFO:tasks.workunit.client.0.vm06.stdout:3/596: rename dd/fc2 to dd/d19/d25/d44/d80/fc4 0
2026-03-09T17:29:44.918 INFO:tasks.workunit.client.0.vm06.stdout:9/667: creat d3/d15/fce x:0 0 0
2026-03-09T17:29:44.925 INFO:tasks.workunit.client.0.vm06.stdout:0/688: write d7/d11/d5d/d64/f6a [1563920,61317] 0
2026-03-09T17:29:44.927 INFO:tasks.workunit.client.0.vm06.stdout:5/562: dread d4/d50/f61 [4194304,4194304] 0
2026-03-09T17:29:44.933 INFO:tasks.workunit.client.0.vm06.stdout:9/668: rename d3/d11/f59 to d3/d15/d36/fcf 0
2026-03-09T17:29:44.933 INFO:tasks.workunit.client.0.vm06.stdout:3/597: dread dd/d59/f84 [0,4194304] 0
2026-03-09T17:29:44.934 INFO:tasks.workunit.client.0.vm06.stdout:0/689: unlink d7/d11/d19/d1d/d87/f92 0
2026-03-09T17:29:44.937 INFO:tasks.workunit.client.0.vm06.stdout:5/563: creat d4/d52/fcd x:0 0 0
2026-03-09T17:29:44.938 INFO:tasks.workunit.client.0.vm06.stdout:9/669: rename d3/d15/d36/d4c/d6a/fa9 to d3/d11/d65/d80/fd0 0
2026-03-09T17:29:44.946 INFO:tasks.workunit.client.0.vm06.stdout:6/484: sync
2026-03-09T17:29:44.947 INFO:tasks.workunit.client.0.vm06.stdout:5/564: creat d4/d22/d64/fce x:0 0 0
2026-03-09T17:29:44.949 INFO:tasks.workunit.client.0.vm06.stdout:3/598: creat dd/d19/d25/d2d/d9b/fc5 x:0 0 0
2026-03-09T17:29:44.950 INFO:tasks.workunit.client.0.vm06.stdout:5/565: mkdir d4/da4/dcf 0
2026-03-09T17:29:44.952 INFO:tasks.workunit.client.0.vm06.stdout:3/599: creat dd/d81/da3/fc6 x:0 0 0
2026-03-09T17:29:44.953 INFO:tasks.workunit.client.0.vm06.stdout:5/566: unlink d4/d52/cae 0
2026-03-09T17:29:44.954 INFO:tasks.workunit.client.0.vm06.stdout:6/485: fsync d6/d47/d96/d40/f67 0
2026-03-09T17:29:44.958 INFO:tasks.workunit.client.0.vm06.stdout:5/567: dwrite d4/d50/d18/f9d [4194304,4194304] 0
2026-03-09T17:29:44.961 INFO:tasks.workunit.client.0.vm06.stdout:0/690: getdents d7/d11/d2d/daf 0
2026-03-09T17:29:44.962 INFO:tasks.workunit.client.0.vm06.stdout:0/691: chown d7/d11/ldc 559979033 1
2026-03-09T17:29:44.966 INFO:tasks.workunit.client.0.vm06.stdout:5/568: sync
2026-03-09T17:29:44.966 INFO:tasks.workunit.client.0.vm06.stdout:0/692: sync
2026-03-09T17:29:44.966 INFO:tasks.workunit.client.0.vm06.stdout:5/569: chown d4/d22/d46/la0 5406005 1
2026-03-09T17:29:44.967 INFO:tasks.workunit.client.0.vm06.stdout:0/693: chown d7/d11/d19/d1d/fb3 22774013 1
2026-03-09T17:29:44.967 INFO:tasks.workunit.client.0.vm06.stdout:6/486: dread d6/d12/d17/f32 [0,4194304] 0
2026-03-09T17:29:44.968 INFO:tasks.workunit.client.0.vm06.stdout:3/600: creat dd/d19/d25/d2d/d9b/fc7 x:0 0 0
2026-03-09T17:29:44.968 INFO:tasks.workunit.client.0.vm06.stdout:7/675: write d5/dd/f48 [1653372,93575] 0
2026-03-09T17:29:44.971 INFO:tasks.workunit.client.0.vm06.stdout:7/676: truncate d5/dd/d79/fb3 1035218 0
2026-03-09T17:29:44.972 INFO:tasks.workunit.client.0.vm06.stdout:7/677: fsync d5/d1f/f8a 0
2026-03-09T17:29:44.976 INFO:tasks.workunit.client.0.vm06.stdout:0/694: dwrite d7/d11/f1c [4194304,4194304] 0
2026-03-09T17:29:44.978 INFO:tasks.workunit.client.0.vm06.stdout:0/695: dread - d7/d11/d5d/d64/fc9 zero size
2026-03-09T17:29:44.979 INFO:tasks.workunit.client.0.vm06.stdout:6/487: dwrite d6/d47/d96/f37 [0,4194304] 0
2026-03-09T17:29:44.980 INFO:tasks.workunit.client.0.vm06.stdout:0/696: stat d7/d11/d19/d8b/da4/d85 0
2026-03-09T17:29:44.986 INFO:tasks.workunit.client.0.vm06.stdout:0/697: dwrite d7/d11/d19/d1d/d39/f7d [0,4194304] 0
2026-03-09T17:29:44.999 INFO:tasks.workunit.client.0.vm06.stdout:5/570: chown d4/da4/fc5 8878 1
2026-03-09T17:29:45.003 INFO:tasks.workunit.client.0.vm06.stdout:4/609: truncate db/df/f4d 2621063 0
2026-03-09T17:29:45.008 INFO:tasks.workunit.client.0.vm06.stdout:7/678: readlink d5/dd/dc5/d64/d6b/l70 0
2026-03-09T17:29:45.010 INFO:tasks.workunit.client.0.vm06.stdout:0/698: creat d7/d11/d19/d3c/ff0 x:0 0 0
2026-03-09T17:29:45.011 INFO:tasks.workunit.client.0.vm06.stdout:5/571: creat d4/d50/d35/d40/d6f/fd0 x:0 0 0
2026-03-09T17:29:45.019 INFO:tasks.workunit.client.0.vm06.stdout:7/679: dread d5/dd/dc5/d64/f77 [0,4194304] 0
2026-03-09T17:29:45.028 INFO:tasks.workunit.client.0.vm06.stdout:5/572: fdatasync d4/d50/d18/f74 0
2026-03-09T17:29:45.031 INFO:tasks.workunit.client.0.vm06.stdout:7/680: creat d5/dd/dc5/dad/fc6 x:0 0 0
2026-03-09T17:29:45.036 INFO:tasks.workunit.client.0.vm06.stdout:0/699: link d7/d11/d19/d23/l90 d7/d11/d19/d3c/lf1 0
2026-03-09T17:29:45.039 INFO:tasks.workunit.client.0.vm06.stdout:7/681: link d5/d7/d2b/l9b d5/d1f/d34/lc7 0
2026-03-09T17:29:45.039 INFO:tasks.workunit.client.0.vm06.stdout:7/682: fsync d5/d1f/d34/f5e 0
2026-03-09T17:29:45.040 INFO:tasks.workunit.client.0.vm06.stdout:7/683: write d5/dd/fa6 [3242481,20920] 0
2026-03-09T17:29:45.042 INFO:tasks.workunit.client.0.vm06.stdout:0/700: rename d7/d11/d5d/cc2 to d7/d11/d89/da8/cf2 0
2026-03-09T17:29:45.045 INFO:tasks.workunit.client.0.vm06.stdout:7/684: rmdir d5/d7/dac 39
2026-03-09T17:29:45.048 INFO:tasks.workunit.client.0.vm06.stdout:7/685: getdents d5/d1f/d34/d3f/d8b 0
2026-03-09T17:29:45.051 INFO:tasks.workunit.client.0.vm06.stdout:7/686: mkdir d5/d7/d2b/dc8 0
2026-03-09T17:29:45.052 INFO:tasks.workunit.client.0.vm06.stdout:7/687: chown d5/dd/f9d 0 1
2026-03-09T17:29:45.053 INFO:tasks.workunit.client.0.vm06.stdout:7/688: write d5/dd/dc5/dad/fc6 [151093,53452] 0
2026-03-09T17:29:45.055 INFO:tasks.workunit.client.0.vm06.stdout:7/689: truncate d5/f18 5702521 0
2026-03-09T17:29:45.056 INFO:tasks.workunit.client.0.vm06.stdout:7/690: getdents d5/d7/d2b/dc8 0
2026-03-09T17:29:45.062 INFO:tasks.workunit.client.0.vm06.stdout:7/691: write d5/dd/dc5/d5f/fb2 [4761642,75242] 0
2026-03-09T17:29:45.065 INFO:tasks.workunit.client.0.vm06.stdout:2/534: dwrite d3/d4/f11 [0,4194304] 0
2026-03-09T17:29:45.069 INFO:tasks.workunit.client.0.vm06.stdout:3/601: read dd/d1d/d4e/f7c [2660466,50468] 0
2026-03-09T17:29:45.069 INFO:tasks.workunit.client.0.vm06.stdout:3/602: dread - dd/d5b/fa7 zero size
2026-03-09T17:29:45.070 INFO:tasks.workunit.client.0.vm06.stdout:3/603: write dd/f38 [788633,98787] 0
2026-03-09T17:29:45.075 INFO:tasks.workunit.client.0.vm06.stdout:7/692: rename d5/d1f/d34/d3f/d91/l9c to d5/d1f/d34/d46/d51/lc9 0
2026-03-09T17:29:45.076 INFO:tasks.workunit.client.0.vm06.stdout:7/693: write d5/dd/dc5/dad/fc6 [1220911,123726] 0
2026-03-09T17:29:45.076 INFO:tasks.workunit.client.0.vm06.stdout:7/694: chown d5/d7/f75 157498 1
2026-03-09T17:29:45.083 INFO:tasks.workunit.client.0.vm06.stdout:3/604: creat dd/d59/da1/fc8 x:0 0 0
2026-03-09T17:29:45.083 INFO:tasks.workunit.client.0.vm06.stdout:3/605: chown dd/d19/d28 166006 1
2026-03-09T17:29:45.083 INFO:tasks.workunit.client.0.vm06.stdout:3/606: chown dd/d1d/d6e/fa0 623174826 1
2026-03-09T17:29:45.084 INFO:tasks.workunit.client.0.vm06.stdout:3/607: write dd/d81/da3/dae/fb5 [105975,127832] 0
2026-03-09T17:29:45.086 INFO:tasks.workunit.client.0.vm06.stdout:7/695: write d5/d1f/d34/d46/f4c [908894,15857] 0
2026-03-09T17:29:45.087 INFO:tasks.workunit.client.0.vm06.stdout:8/559: creat d15/d16/d1e/d30/db8/d5e/fb9 x:0 0 0
2026-03-09T17:29:45.095 INFO:tasks.workunit.client.0.vm06.stdout:3/608: truncate dd/d5b/d65/f92 773860 0
2026-03-09T17:29:45.095 INFO:tasks.workunit.client.0.vm06.stdout:8/560: rename d15/d39/d67/d77/d97/fb6 to d15/d39/d3c/d6c/fba 0
2026-03-09T17:29:45.095 INFO:tasks.workunit.client.0.vm06.stdout:2/535: getdents d3/d4/d12/d2b/d9f 0
2026-03-09T17:29:45.103 INFO:tasks.workunit.client.0.vm06.stdout:1/580: write d11/d14/d1d/d1e/d2a/f43 [29620,22358] 0
2026-03-09T17:29:45.105 INFO:tasks.workunit.client.0.vm06.stdout:1/581: mknod d11/d14/d1d/d1e/cc1 0
2026-03-09T17:29:45.116 INFO:tasks.workunit.client.0.vm06.stdout:9/670: dwrite d3/d15/f74 [0,4194304] 0
2026-03-09T17:29:45.117 INFO:tasks.workunit.client.0.vm06.stdout:9/671: stat d3/d11/f87 0
2026-03-09T17:29:45.126 INFO:tasks.workunit.client.0.vm06.stdout:1/582: unlink d11/f8d 0
2026-03-09T17:29:45.127 INFO:tasks.workunit.client.0.vm06.stdout:9/672: creat d3/d15/d36/d4d/fd1 x:0 0 0
2026-03-09T17:29:45.133 INFO:tasks.workunit.client.0.vm06.stdout:9/673: chown d3/d11/c2b 303 1
2026-03-09T17:29:45.138 INFO:tasks.workunit.client.0.vm06.stdout:4/610: write f7 [11639117,117054] 0
2026-03-09T17:29:45.142 INFO:tasks.workunit.client.0.vm06.stdout:4/611: dwrite db/d1d/d21/d26/f70 [0,4194304] 0
2026-03-09T17:29:45.154 INFO:tasks.workunit.client.0.vm06.stdout:6/488: dwrite d6/d4f/d3e/d52/f89 [0,4194304] 0
2026-03-09T17:29:45.163 INFO:tasks.workunit.client.0.vm06.stdout:6/489: creat d6/d47/d96/d40/f9f x:0 0 0
2026-03-09T17:29:45.195 INFO:tasks.workunit.client.0.vm06.stdout:7/696: rmdir d5/dd/dc5/dad 39
2026-03-09T17:29:45.202 INFO:tasks.workunit.client.0.vm06.stdout:5/573: dwrite d4/d50/f43 [4194304,4194304] 0
2026-03-09T17:29:45.202 INFO:tasks.workunit.client.0.vm06.stdout:5/574: fdatasync d4/f7 0
2026-03-09T17:29:45.210 INFO:tasks.workunit.client.0.vm06.stdout:0/701: dwrite d7/d11/d19/d8b/da4/d85/fc8 [0,4194304] 0
2026-03-09T17:29:45.218 INFO:tasks.workunit.client.0.vm06.stdout:7/697: sync
2026-03-09T17:29:45.223 INFO:tasks.workunit.client.0.vm06.stdout:5/575: symlink d4/d50/d35/d40/d95/ld1 0
2026-03-09T17:29:45.226 INFO:tasks.workunit.client.0.vm06.stdout:5/576: dwrite d4/d50/d35/fc6 [0,4194304] 0
2026-03-09T17:29:45.243 INFO:tasks.workunit.client.0.vm06.stdout:5/577: creat d4/d50/d35/d40/d95/fd2 x:0 0 0
2026-03-09T17:29:45.246 INFO:tasks.workunit.client.0.vm06.stdout:5/578: getdents d4/d50/d35/d40/d96/d98 0
2026-03-09T17:29:45.247 INFO:tasks.workunit.client.0.vm06.stdout:5/579: readlink d4/d50/l1c 0
2026-03-09T17:29:45.249 INFO:tasks.workunit.client.0.vm06.stdout:5/580: unlink d4/d50/f14 0
2026-03-09T17:29:45.284 INFO:tasks.workunit.client.0.vm06.stdout:3/609: write dd/d19/d25/f56 [3936319,115195] 0
2026-03-09T17:29:45.285 INFO:tasks.workunit.client.0.vm06.stdout:2/536: write d3/d4/d12/d71/daa/d77/d81/f50 [580953,57080] 0
2026-03-09T17:29:45.287 INFO:tasks.workunit.client.0.vm06.stdout:3/610: mknod dd/d81/da3/cc9 0
2026-03-09T17:29:45.290 INFO:tasks.workunit.client.0.vm06.stdout:2/537: dwrite d3/d4/d12/d71/daa/d77/d81/d64/d6a/fa3 [0,4194304] 0
2026-03-09T17:29:45.293 INFO:tasks.workunit.client.0.vm06.stdout:8/561: truncate d15/d16/d1a/f1b 2694445 0
2026-03-09T17:29:45.311 INFO:tasks.workunit.client.0.vm06.stdout:8/562: sync
2026-03-09T17:29:45.312 INFO:tasks.workunit.client.0.vm06.stdout:6/490: write d6/d12/d53/d8f/f9e [327815,16516] 0
2026-03-09T17:29:45.316 INFO:tasks.workunit.client.0.vm06.stdout:8/563: symlink d15/d16/d19/d3d/d5f/lbb 0
2026-03-09T17:29:45.317 INFO:tasks.workunit.client.0.vm06.stdout:6/491: rmdir d6/d4f/d73 39
2026-03-09T17:29:45.319 INFO:tasks.workunit.client.0.vm06.stdout:8/564: unlink d15/d16/d1a/d7c/l88 0
2026-03-09T17:29:45.320 INFO:tasks.workunit.client.0.vm06.stdout:6/492: dread d6/d47/d96/d40/f67 [0,4194304] 0
2026-03-09T17:29:45.321 INFO:tasks.workunit.client.0.vm06.stdout:8/565: rmdir d15/d16/d19/d3d 39
2026-03-09T17:29:45.325 INFO:tasks.workunit.client.0.vm06.stdout:6/493: dread - d6/d12/d53/f86 zero size
2026-03-09T17:29:45.326 INFO:tasks.workunit.client.0.vm06.stdout:8/566: symlink d15/lbc 0
2026-03-09T17:29:45.326 INFO:tasks.workunit.client.0.vm06.stdout:6/494: mkdir d6/d47/d4d/da0 0
2026-03-09T17:29:45.327 INFO:tasks.workunit.client.0.vm06.stdout:6/495: chown d6/d12/d53 2 1
2026-03-09T17:29:45.392 INFO:tasks.workunit.client.0.vm06.stdout:8/567: sync
2026-03-09T17:29:45.395 INFO:tasks.workunit.client.0.vm06.stdout:8/568: unlink d15/d16/d1e/d30/db8/f2a 0
2026-03-09T17:29:45.404 INFO:tasks.workunit.client.0.vm06.stdout:8/569: read f7 [2283447,107857] 0
2026-03-09T17:29:45.409 INFO:tasks.workunit.client.0.vm06.stdout:8/570: link d15/d16/d19/d71/f82 d15/d16/d19/fbd 0
2026-03-09T17:29:45.413 INFO:tasks.workunit.client.0.vm06.stdout:8/571: creat d15/da6/fbe x:0 0 0
2026-03-09T17:29:45.489 INFO:tasks.workunit.client.0.vm06.stdout:8/572: sync
2026-03-09T17:29:45.491 INFO:tasks.workunit.client.0.vm06.stdout:8/573: unlink d15/d16/f24 0
2026-03-09T17:29:45.552 INFO:tasks.workunit.client.0.vm06.stdout:1/583: write d11/d14/d1c/f2e [207702,105279] 0
2026-03-09T17:29:45.555 INFO:tasks.workunit.client.0.vm06.stdout:1/584: mkdir d11/d14/d1d/d1e/dc2 0
2026-03-09T17:29:45.556 INFO:tasks.workunit.client.0.vm06.stdout:1/585: write d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/f84 [390347,116710] 0
2026-03-09T17:29:45.558 INFO:tasks.workunit.client.0.vm06.stdout:1/586: creat d11/d14/d1c/d3a/fc3 x:0 0 0
2026-03-09T17:29:45.619 INFO:tasks.workunit.client.0.vm06.stdout:9/674: write d3/d15/f1a [2839197,9947] 0
2026-03-09T17:29:45.622 INFO:tasks.workunit.client.0.vm06.stdout:9/675: creat d3/d11/d65/d80/fd2 x:0 0 0
2026-03-09T17:29:45.635 INFO:tasks.workunit.client.0.vm06.stdout:4/612: dwrite db/d1d/d21/d37/d69/f8b [0,4194304] 0
2026-03-09T17:29:45.642 INFO:tasks.workunit.client.0.vm06.stdout:4/613: rename db/f17 to db/d1d/d21/d26/d89/dab/dae/fdd 0
2026-03-09T17:29:45.647 INFO:tasks.workunit.client.0.vm06.stdout:4/614: creat db/d59/d5f/d45/fde x:0 0 0
2026-03-09T17:29:45.652 INFO:tasks.workunit.client.0.vm06.stdout:4/615: mknod db/d57/dd4/cdf 0
2026-03-09T17:29:45.657 INFO:tasks.workunit.client.0.vm06.stdout:4/616: mkdir db/d1d/d21/d26/d89/dab/dae/dcc/de0 0
2026-03-09T17:29:45.665 INFO:tasks.workunit.client.0.vm06.stdout:4/617: getdents db/d59/d5f/d5d 0
2026-03-09T17:29:45.665 INFO:tasks.workunit.client.0.vm06.stdout:4/618: readlink db/df/l79 0
2026-03-09T17:29:45.672 INFO:tasks.workunit.client.0.vm06.stdout:4/619: fdatasync db/d59/f76 0
2026-03-09T17:29:45.676 INFO:tasks.workunit.client.0.vm06.stdout:4/620: truncate db/d59/d5f/d45/f61 167623 0
2026-03-09T17:29:45.688 INFO:tasks.workunit.client.0.vm06.stdout:0/702: write d7/d11/d19/d1d/fb5 [90122,90744] 0
2026-03-09T17:29:45.692 INFO:tasks.workunit.client.0.vm06.stdout:0/703: mkdir d7/d11/d19/d3c/df3 0
2026-03-09T17:29:45.695 INFO:tasks.workunit.client.0.vm06.stdout:0/704: rename d7/d11/d19/d23/db7/fd1 to d7/d11/d2d/dca/ff4 0
2026-03-09T17:29:45.697 INFO:tasks.workunit.client.0.vm06.stdout:0/705: read - d7/d11/d19/d23/f97 zero size
2026-03-09T17:29:45.697 INFO:tasks.workunit.client.0.vm06.stdout:0/706: chown d7/d11/c42 647 1
2026-03-09T17:29:45.698 INFO:tasks.workunit.client.0.vm06.stdout:0/707: chown d7/d11/d5d/f93 163 1
2026-03-09T17:29:45.702 INFO:tasks.workunit.client.0.vm06.stdout:7/698: truncate d5/dd/f3e 3524800 0
2026-03-09T17:29:45.707 INFO:tasks.workunit.client.0.vm06.stdout:5/581: dwrite d4/f1f [0,4194304] 0
2026-03-09T17:29:45.709 INFO:tasks.workunit.client.0.vm06.stdout:5/582: write d4/d22/d64/fce [118906,14085] 0
2026-03-09T17:29:45.715 INFO:tasks.workunit.client.0.vm06.stdout:7/699: unlink d5/dd/f5d 0
2026-03-09T17:29:45.717 INFO:tasks.workunit.client.0.vm06.stdout:0/708: mknod d7/d11/d2d/cf5 0
2026-03-09T17:29:45.723 INFO:tasks.workunit.client.0.vm06.stdout:5/583: dread d4/d52/f8f [4194304,4194304] 0
2026-03-09T17:29:45.727 INFO:tasks.workunit.client.0.vm06.stdout:0/709: mkdir d7/d11/d89/d99/df6 0
2026-03-09T17:29:45.732 INFO:tasks.workunit.client.0.vm06.stdout:7/700: dread d5/f18 [0,4194304] 0
2026-03-09T17:29:45.733 INFO:tasks.workunit.client.0.vm06.stdout:0/710: creat d7/d11/d19/d3c/db9/ddd/ff7 x:0 0 0
2026-03-09T17:29:45.735 INFO:tasks.workunit.client.0.vm06.stdout:0/711: mkdir d7/d11/d19/d3c/df8 0
2026-03-09T17:29:45.735 INFO:tasks.workunit.client.0.vm06.stdout:0/712: read - d7/d11/d19/d3c/db9/ddd/ff7 zero size
2026-03-09T17:29:45.736 INFO:tasks.workunit.client.0.vm06.stdout:5/584: link d4/d22/d46/la0 d4/d50/d18/d3d/ld3 0
2026-03-09T17:29:45.739 INFO:tasks.workunit.client.0.vm06.stdout:0/713: dwrite d7/d11/d19/d3c/db9/dd8/fed [0,4194304] 0
2026-03-09T17:29:45.742 INFO:tasks.workunit.client.0.vm06.stdout:7/701: creat d5/d1f/d34/d3f/fca x:0 0 0
2026-03-09T17:29:45.743 INFO:tasks.workunit.client.0.vm06.stdout:0/714: truncate d7/fbf 1011130 0
2026-03-09T17:29:45.743 INFO:tasks.workunit.client.0.vm06.stdout:7/702: write d5/dd/f9d [1003120,49676] 0
2026-03-09T17:29:45.745 INFO:tasks.workunit.client.0.vm06.stdout:0/715: write d7/d11/d5d/db8/fc6 [1726241,12592] 0
2026-03-09T17:29:45.746 INFO:tasks.workunit.client.0.vm06.stdout:3/611: dwrite dd/f75 [0,4194304] 0
2026-03-09T17:29:45.747 INFO:tasks.workunit.client.0.vm06.stdout:3/612: readlink dd/l17 0
2026-03-09T17:29:45.752 INFO:tasks.workunit.client.0.vm06.stdout:7/703: creat d5/d7/dac/fcb x:0 0 0
2026-03-09T17:29:45.760 INFO:tasks.workunit.client.0.vm06.stdout:3/613: creat dd/d1d/d4e/fca x:0 0 0
2026-03-09T17:29:45.760 INFO:tasks.workunit.client.0.vm06.stdout:3/614: fdatasync dd/d81/da3/dae/fb5 0
2026-03-09T17:29:45.760 INFO:tasks.workunit.client.0.vm06.stdout:3/615: link dd/f10 dd/d81/da3/dae/fcb 0
2026-03-09T17:29:45.761 INFO:tasks.workunit.client.0.vm06.stdout:0/716: link d7/d11/d19/d23/db7/dbd/la6 d7/d11/lf9 0
2026-03-09T17:29:45.767 INFO:tasks.workunit.client.0.vm06.stdout:3/616: read - dd/d19/d25/d44/f57 zero size
2026-03-09T17:29:45.768 INFO:tasks.workunit.client.0.vm06.stdout:3/617: symlink dd/d1d/d6e/d70/lcc 0
2026-03-09T17:29:45.770 INFO:tasks.workunit.client.0.vm06.stdout:0/717: link d7/d11/d19/f68 d7/ffa 0
2026-03-09T17:29:45.770 INFO:tasks.workunit.client.0.vm06.stdout:3/618: fsync dd/f14 0
2026-03-09T17:29:45.771 INFO:tasks.workunit.client.0.vm06.stdout:3/619: stat dd/d19/d25/l82 0
2026-03-09T17:29:45.775 INFO:tasks.workunit.client.0.vm06.stdout:3/620: symlink dd/d19/d2c/lcd 0
2026-03-09T17:29:45.777 INFO:tasks.workunit.client.0.vm06.stdout:3/621: creat dd/d19/d25/d2d/fce x:0 0 0
2026-03-09T17:29:45.783 INFO:tasks.workunit.client.0.vm06.stdout:3/622: rename dd/d81/fbf to dd/d1d/d2e/d67/fcf 0
2026-03-09T17:29:45.785 INFO:tasks.workunit.client.0.vm06.stdout:3/623: chown dd 123177746 1
2026-03-09T17:29:45.786 INFO:tasks.workunit.client.0.vm06.stdout:3/624: dread dd/f22 [4194304,4194304] 0
2026-03-09T17:29:45.787 INFO:tasks.workunit.client.0.vm06.stdout:5/585: sync
2026-03-09T17:29:45.788 INFO:tasks.workunit.client.0.vm06.stdout:3/625: dread dd/d5c/f66 [0,4194304] 0
2026-03-09T17:29:45.789 INFO:tasks.workunit.client.0.vm06.stdout:0/718: link d7/d11/d19/d1d/d87/le9 d7/d11/d19/d37/lfb 0
2026-03-09T17:29:45.794 INFO:tasks.workunit.client.0.vm06.stdout:3/626: creat dd/d19/d1e/db8/fd0 x:0 0 0
2026-03-09T17:29:45.801 INFO:tasks.workunit.client.0.vm06.stdout:2/538: dwrite d3/d4/d12/f85 [0,4194304] 0
2026-03-09T17:29:45.803 INFO:tasks.workunit.client.0.vm06.stdout:5/586: mknod d4/dbb/cd4 0
2026-03-09T17:29:45.814 INFO:tasks.workunit.client.0.vm06.stdout:5/587: symlink d4/d50/db2/ld5 0
2026-03-09T17:29:45.817 INFO:tasks.workunit.client.0.vm06.stdout:5/588: dread d4/f3a [4194304,4194304] 0
2026-03-09T17:29:45.820 INFO:tasks.workunit.client.0.vm06.stdout:3/627: link dd/d19/d25/d44/d80/fc4 dd/d19/d25/fd1 0
2026-03-09T17:29:45.821 INFO:tasks.workunit.client.0.vm06.stdout:5/589: mkdir d4/d50/dd6 0
2026-03-09T17:29:45.824 INFO:tasks.workunit.client.0.vm06.stdout:3/628: unlink dd/d1d/d6e/fa0 0
2026-03-09T17:29:45.826 INFO:tasks.workunit.client.0.vm06.stdout:3/629: mknod dd/d5c/cd2 0
2026-03-09T17:29:45.830 INFO:tasks.workunit.client.0.vm06.stdout:3/630: fdatasync dd/f14 0
2026-03-09T17:29:45.830 INFO:tasks.workunit.client.0.vm06.stdout:3/631: truncate dd/d81/da3/fbc 50499 0
2026-03-09T17:29:45.831 INFO:tasks.workunit.client.0.vm06.stdout:3/632: read dd/d59/da1/fa4 [2833105,125702] 0
2026-03-09T17:29:45.837 INFO:tasks.workunit.client.0.vm06.stdout:3/633: rename dd/d59/f84 to dd/d19/d25/d48/fd3 0
2026-03-09T17:29:45.839 INFO:tasks.workunit.client.0.vm06.stdout:3/634: symlink dd/d5b/d65/ld4 0
2026-03-09T17:29:45.840 INFO:tasks.workunit.client.0.vm06.stdout:3/635: write dd/d81/da3/dae/fb5 [717679,60718] 0
2026-03-09T17:29:45.844 INFO:tasks.workunit.client.0.vm06.stdout:3/636: dwrite dd/d19/d1e/f3f [0,4194304] 0
2026-03-09T17:29:45.846 INFO:tasks.workunit.client.0.vm06.stdout:6/496: write d6/d4f/f3a [5027345,124524] 0
2026-03-09T17:29:45.848 INFO:tasks.workunit.client.0.vm06.stdout:3/637: symlink dd/d1d/d6e/d70/ld5 0
2026-03-09T17:29:45.856 INFO:tasks.workunit.client.0.vm06.stdout:3/638: dwrite dd/d59/da1/faf [0,4194304] 0
2026-03-09T17:29:45.861 INFO:tasks.workunit.client.0.vm06.stdout:6/497: dwrite d6/d12/d53/d8f/f9e [0,4194304] 0
2026-03-09T17:29:45.865 INFO:tasks.workunit.client.0.vm06.stdout:3/639: creat dd/d19/d25/d44/fd6 x:0 0 0
2026-03-09T17:29:45.865 INFO:tasks.workunit.client.0.vm06.stdout:6/498: chown d6/d12/d53/d91 1 1
2026-03-09T17:29:45.870 INFO:tasks.workunit.client.0.vm06.stdout:6/499: dwrite d6/d47/d96/d40/f67 [0,4194304] 0
2026-03-09T17:29:45.890 INFO:tasks.workunit.client.0.vm06.stdout:6/500: write d6/f97 [476321,58243] 0
2026-03-09T17:29:45.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:45 vm06.local ceph-mon[57307]: pgmap v156: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 29 MiB/s rd, 74 MiB/s wr, 300 op/s
2026-03-09T17:29:45.894 INFO:tasks.workunit.client.0.vm06.stdout:6/501: mkdir d6/d47/d96/da1 0
2026-03-09T17:29:45.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:45 vm09.local ceph-mon[62061]: pgmap v156: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 29 MiB/s rd, 74 MiB/s wr, 300 op/s
2026-03-09T17:29:45.898 INFO:tasks.workunit.client.0.vm06.stdout:6/502: mkdir d6/d47/d4d/d9a/da2 0
2026-03-09T17:29:45.905 INFO:tasks.workunit.client.0.vm06.stdout:6/503: link d6/d4f/f44 d6/d4f/fa3 0
2026-03-09T17:29:45.915 INFO:tasks.workunit.client.0.vm06.stdout:8/574: write d15/d39/d3c/f5d [3632670,52206] 0
2026-03-09T17:29:45.918 INFO:tasks.workunit.client.0.vm06.stdout:8/575: dread d15/d16/d1a/f29 [0,4194304] 0
2026-03-09T17:29:45.920 INFO:tasks.workunit.client.0.vm06.stdout:8/576: chown d15/d16/d19/d3d/f6a 59475376 1
2026-03-09T17:29:45.923 INFO:tasks.workunit.client.0.vm06.stdout:8/577: creat d15/d39/d3c/d6c/fbf x:0 0 0
2026-03-09T17:29:45.924 INFO:tasks.workunit.client.0.vm06.stdout:8/578: creat d15/d16/d19/d3d/fc0 x:0 0 0
2026-03-09T17:29:45.925 INFO:tasks.workunit.client.0.vm06.stdout:8/579: chown d15/d16/d19/d2b 0 1
2026-03-09T17:29:45.954 INFO:tasks.workunit.client.0.vm06.stdout:1/587: write d11/d14/d1d/d1e/d2a/d34/d58/f6a [3167999,66083] 0
2026-03-09T17:29:45.954 INFO:tasks.workunit.client.0.vm06.stdout:1/588: chown d11/d14/d1d/l81 448507376 1
2026-03-09T17:29:45.976 INFO:tasks.workunit.client.0.vm06.stdout:9/676: write d3/d15/d36/d4d/fa4 [4941370,30203] 0
2026-03-09T17:29:45.977 INFO:tasks.workunit.client.0.vm06.stdout:9/677: chown d3/d15/f46 169 1
2026-03-09T17:29:45.978 INFO:tasks.workunit.client.0.vm06.stdout:9/678: mknod d3/d15/d48/cd3 0
2026-03-09T17:29:45.978 INFO:tasks.workunit.client.0.vm06.stdout:9/679: chown d3/d2c/f81 473884521 1
2026-03-09T17:29:45.979 INFO:tasks.workunit.client.0.vm06.stdout:9/680: write d3/d15/fce [165738,27405] 0
2026-03-09T17:29:45.982 INFO:tasks.workunit.client.0.vm06.stdout:9/681: dwrite d3/d15/fbb [0,4194304] 0
2026-03-09T17:29:45.988 INFO:tasks.workunit.client.0.vm06.stdout:9/682: dwrite d3/d11/f87 [0,4194304] 0
2026-03-09T17:29:46.004 INFO:tasks.workunit.client.0.vm06.stdout:9/683: truncate d3/d11/d65/d80/fd0 202826 0
2026-03-09T17:29:46.084 INFO:tasks.workunit.client.0.vm06.stdout:4/621: truncate db/d1d/d21/d37/d69/f8b 3182880 0
2026-03-09T17:29:46.088 INFO:tasks.workunit.client.0.vm06.stdout:4/622: dwrite db/d1d/f1f [0,4194304] 0
2026-03-09T17:29:46.089 INFO:tasks.workunit.client.0.vm06.stdout:4/623: fdatasync db/df/f2d 0
2026-03-09T17:29:46.092 INFO:tasks.workunit.client.0.vm06.stdout:4/624: creat db/d1d/d21/d44/dc1/fe1 x:0 0 0
2026-03-09T17:29:46.147 INFO:tasks.workunit.client.0.vm06.stdout:7/704: write d5/dd/dc5/d64/f77 [2604663,113165] 0
2026-03-09T17:29:46.148 INFO:tasks.workunit.client.0.vm06.stdout:7/705: truncate d5/f8 5409009 0
2026-03-09T17:29:46.149 INFO:tasks.workunit.client.0.vm06.stdout:7/706: write d5/d1f/d34/f41 [3147503,78981] 0
2026-03-09T17:29:46.152 INFO:tasks.workunit.client.0.vm06.stdout:7/707: mknod d5/d7/dac/ccc 0
2026-03-09T17:29:46.153 INFO:tasks.workunit.client.0.vm06.stdout:7/708: fdatasync d5/dd/dc5/d5f/fb1 0
2026-03-09T17:29:46.175 INFO:tasks.workunit.client.0.vm06.stdout:0/719: write d7/d11/d89/fa5 [1833877,57286] 0
2026-03-09T17:29:46.185 INFO:tasks.workunit.client.0.vm06.stdout:0/720: dread d7/d11/d19/d37/f6d [0,4194304] 0
2026-03-09T17:29:46.188 INFO:tasks.workunit.client.0.vm06.stdout:0/721: creat d7/d11/d19/d3c/df8/ffc x:0 0 0
2026-03-09T17:29:46.192 INFO:tasks.workunit.client.0.vm06.stdout:0/722: dwrite d7/d11/d19/d1d/f4c [0,4194304] 0
2026-03-09T17:29:46.216 INFO:tasks.workunit.client.0.vm06.stdout:2/539: write d3/d4/d12/d2b/d2d/f48 [371515,57598] 0
2026-03-09T17:29:46.222 INFO:tasks.workunit.client.0.vm06.stdout:5/590: truncate d4/d22/d64/f70 6088551 0
2026-03-09T17:29:46.222 INFO:tasks.workunit.client.0.vm06.stdout:5/591: chown d4/d50/d35/d40/d95/db8 925104 1
2026-03-09T17:29:46.237 INFO:tasks.workunit.client.0.vm06.stdout:0/723: sync
2026-03-09T17:29:46.238 INFO:tasks.workunit.client.0.vm06.stdout:0/724: write d7/d11/f75 [5016299,39085] 0
2026-03-09T17:29:46.240 INFO:tasks.workunit.client.0.vm06.stdout:5/592: creat d4/d50/fd7 x:0 0 0
2026-03-09T17:29:46.242 INFO:tasks.workunit.client.0.vm06.stdout:5/593: creat d4/d50/d35/d40/d6f/fd8 x:0 0 0
2026-03-09T17:29:46.242 INFO:tasks.workunit.client.0.vm06.stdout:5/594: stat d4/d50/fd7 0
2026-03-09T17:29:46.246 INFO:tasks.workunit.client.0.vm06.stdout:0/725: dread d7/d88/fdb [0,4194304] 0
2026-03-09T17:29:46.248 INFO:tasks.workunit.client.0.vm06.stdout:0/726: truncate d7/d11/d19/d8b/da4/fab 1088306 0
2026-03-09T17:29:46.264 INFO:tasks.workunit.client.0.vm06.stdout:3/640: dwrite dd/d1d/d2e/f3a [0,4194304] 0
2026-03-09T17:29:46.266 INFO:tasks.workunit.client.0.vm06.stdout:3/641: truncate f7 213483 0
2026-03-09T17:29:46.269 INFO:tasks.workunit.client.0.vm06.stdout:3/642: dread dd/f75 [0,4194304] 0
2026-03-09T17:29:46.269 INFO:tasks.workunit.client.0.vm06.stdout:3/643: chown dd/d59/da1/faf 21 1
2026-03-09T17:29:46.279 INFO:tasks.workunit.client.0.vm06.stdout:6/504: dwrite d6/d47/f61 [0,4194304] 0
2026-03-09T17:29:46.281 INFO:tasks.workunit.client.0.vm06.stdout:8/580: rmdir d15 39
2026-03-09T17:29:46.287 INFO:tasks.workunit.client.0.vm06.stdout:6/505: readlink d6/l20 0
2026-03-09T17:29:46.289 INFO:tasks.workunit.client.0.vm06.stdout:8/581: readlink d15/d16/d19/d3d/d5f/lbb 0
2026-03-09T17:29:46.298 INFO:tasks.workunit.client.0.vm06.stdout:6/506: symlink d6/d4f/d3e/d52/d80/la4 0
2026-03-09T17:29:46.299 INFO:tasks.workunit.client.0.vm06.stdout:8/582: mkdir d15/d16/d19/d3d/d5f/d83/dc1 0
2026-03-09T17:29:46.301 INFO:tasks.workunit.client.0.vm06.stdout:8/583: dread d15/d39/f40 [0,4194304] 0
2026-03-09T17:29:46.304 INFO:tasks.workunit.client.0.vm06.stdout:8/584: mknod d15/d39/d67/d77/cc2 0
2026-03-09T17:29:46.319 INFO:tasks.workunit.client.0.vm06.stdout:6/507: sync
2026-03-09T17:29:46.320 INFO:tasks.workunit.client.0.vm06.stdout:6/508: symlink d6/d12/d53/d8f/la5 0
2026-03-09T17:29:46.321 INFO:tasks.workunit.client.0.vm06.stdout:6/509: chown d6/d47/d4d/da0 33214239 1
2026-03-09T17:29:46.328 INFO:tasks.workunit.client.0.vm06.stdout:1/589: dwrite d11/d14/d1d/d4a/fa7 [0,4194304] 0
2026-03-09T17:29:46.348 INFO:tasks.workunit.client.0.vm06.stdout:1/590: creat d11/d14/d1c/d5f/fc4 x:0 0 0
2026-03-09T17:29:46.348 INFO:tasks.workunit.client.0.vm06.stdout:1/591: chown d11/d14/l62 816705557 1
2026-03-09T17:29:46.349 INFO:tasks.workunit.client.0.vm06.stdout:1/592: chown d11/d14/d1d/d42/c9c 821 1
2026-03-09T17:29:46.350 INFO:tasks.workunit.client.0.vm06.stdout:1/593: creat d11/d14/d1c/d3a/fc5 x:0 0 0
2026-03-09T17:29:46.433 INFO:tasks.workunit.client.0.vm06.stdout:9/684: rename d3/d26/d6c/f75 to d3/d6d/d9a/d9c/dcd/fd4 0
2026-03-09T17:29:46.436 INFO:tasks.workunit.client.0.vm06.stdout:9/685: fsync d3/d15/f2e 0
2026-03-09T17:29:46.574 INFO:tasks.workunit.client.0.vm06.stdout:7/709: mknod d5/d1f/d34/d3f/ccd 0
2026-03-09T17:29:46.575 INFO:tasks.workunit.client.0.vm06.stdout:7/710: readlink d5/d7/lb 0
2026-03-09T17:29:46.578 INFO:tasks.workunit.client.0.vm06.stdout:7/711: creat d5/d1f/d34/d3f/d91/fce x:0 0 0
2026-03-09T17:29:46.581 INFO:tasks.workunit.client.0.vm06.stdout:0/727: dread d7/fbf [0,4194304] 0
2026-03-09T17:29:46.583 INFO:tasks.workunit.client.0.vm06.stdout:7/712: mknod d5/dd/dc5/d5f/ccf 0
2026-03-09T17:29:46.588 INFO:tasks.workunit.client.0.vm06.stdout:7/713: symlink d5/d7/d2b/dc8/ld0 0
2026-03-09T17:29:46.589 INFO:tasks.workunit.client.0.vm06.stdout:7/714: write d5/d7/f87 [1423540,27360] 0
2026-03-09T17:29:46.591 INFO:tasks.workunit.client.0.vm06.stdout:7/715: mkdir d5/dd/dc5/d64/d6b/dd1 0
2026-03-09T17:29:46.715 INFO:tasks.workunit.client.0.vm06.stdout:5/595: dwrite d4/d22/d46/f59 [0,4194304] 0
2026-03-09T17:29:46.724 INFO:tasks.workunit.client.0.vm06.stdout:5/596: creat d4/d22/d46/fd9 x:0 0 0
2026-03-09T17:29:46.729 INFO:tasks.workunit.client.0.vm06.stdout:3/644: write dd/f14 [2321523,113029] 0
2026-03-09T17:29:46.730 INFO:tasks.workunit.client.0.vm06.stdout:3/645: fsync dd/d1d/f4b 0
2026-03-09T17:29:46.732 INFO:tasks.workunit.client.0.vm06.stdout:3/646: mkdir dd/d19/d25/d44/d80/dd7 0
2026-03-09T17:29:46.732 INFO:tasks.workunit.client.0.vm06.stdout:3/647: readlink dd/d19/d2c/l94 0
2026-03-09T17:29:46.733 INFO:tasks.workunit.client.0.vm06.stdout:3/648: symlink dd/d1d/ld8 0
2026-03-09T17:29:46.736 INFO:tasks.workunit.client.0.vm06.stdout:3/649: symlink dd/d1d/d6e/d70/ld9 0
2026-03-09T17:29:46.740 INFO:tasks.workunit.client.0.vm06.stdout:3/650: mkdir dd/d19/dda 0
2026-03-09T17:29:46.746 INFO:tasks.workunit.client.0.vm06.stdout:3/651: link dd/d19/d25/fd1 dd/d19/d25/d2d/d9b/fdb 0
2026-03-09T17:29:46.751 INFO:tasks.workunit.client.0.vm06.stdout:6/510: write d6/d4f/f44 [3589204,23664] 0
2026-03-09T17:29:46.754 INFO:tasks.workunit.client.0.vm06.stdout:6/511: symlink d6/d47/d4d/d6d/la6 0
2026-03-09T17:29:46.754 INFO:tasks.workunit.client.0.vm06.stdout:6/512: chown d6/d47/d4d/d6d 433057 1
2026-03-09T17:29:46.759 INFO:tasks.workunit.client.0.vm06.stdout:1/594: write d11/d14/d1d/d4a/fa5 [802970,36115] 0
2026-03-09T17:29:46.761 INFO:tasks.workunit.client.0.vm06.stdout:1/595: readlink d11/d14/d1d/d42/la0 0
2026-03-09T17:29:46.763 INFO:tasks.workunit.client.0.vm06.stdout:1/596: creat d11/d14/d1d/d94/fc6 x:0 0 0
2026-03-09T17:29:46.784 INFO:tasks.workunit.client.0.vm06.stdout:2/540: dread d3/d4/d12/d2b/f32 [0,4194304] 0
2026-03-09T17:29:46.790 INFO:tasks.workunit.client.0.vm06.stdout:5/597: dread d4/d50/d18/f9d [0,4194304] 0
2026-03-09T17:29:46.906 INFO:tasks.workunit.client.0.vm06.stdout:4/625: mkdir db/de2 0
2026-03-09T17:29:46.949 INFO:tasks.workunit.client.0.vm06.stdout:0/728: creat d7/d11/d19/d3c/ffd x:0 0 0
2026-03-09T17:29:46.950 INFO:tasks.workunit.client.0.vm06.stdout:7/716: creat d5/d1f/d34/fd2 x:0 0 0
2026-03-09T17:29:46.985 INFO:tasks.workunit.client.0.vm06.stdout:8/585: rename d15/d16/d19/f93 to d15/d39/d67/d77/fc3 0
2026-03-09T17:29:46.986 INFO:tasks.workunit.client.0.vm06.stdout:8/586: write d15/d16/fb4 [1400422,35778] 0
2026-03-09T17:29:46.989 INFO:tasks.workunit.client.0.vm06.stdout:9/686: rename d3/d11/d65/l73 to d3/d6d/d9a/d9c/dcd/ld5 0
2026-03-09T17:29:46.990 INFO:tasks.workunit.client.0.vm06.stdout:8/587: symlink d15/d16/d19/d71/lc4 0
2026-03-09T17:29:46.991 INFO:tasks.workunit.client.0.vm06.stdout:3/652: rename dd/d59/l8a to dd/d19/d2c/ldc 0
2026-03-09T17:29:46.995 INFO:tasks.workunit.client.0.vm06.stdout:3/653: dwrite dd/f15 [4194304,4194304] 0
2026-03-09T17:29:46.998 INFO:tasks.workunit.client.0.vm06.stdout:6/513: rename d6/d12/d2d/f6c to d6/d12/d17/d85/fa7 0
2026-03-09T17:29:47.002 INFO:tasks.workunit.client.0.vm06.stdout:6/514: read d6/d12/d17/f6b [311093,14550] 0
2026-03-09T17:29:47.014 INFO:tasks.workunit.client.0.vm06.stdout:7/717: dread d5/d1f/d34/d3f/f5b [0,4194304] 0
2026-03-09T17:29:47.016 INFO:tasks.workunit.client.0.vm06.stdout:8/588: dread d15/d16/d1a/f1b [0,4194304] 0
2026-03-09T17:29:47.018 INFO:tasks.workunit.client.0.vm06.stdout:8/589: chown d15/d39/d3c/l70 16 1
2026-03-09T17:29:47.023 INFO:tasks.workunit.client.0.vm06.stdout:7/718: dwrite d5/dd/fa0 [0,4194304] 0
2026-03-09T17:29:47.030
INFO:tasks.workunit.client.0.vm06.stdout:2/541: rename d3/d4/d12/d71/f97 to d3/fad 0 2026-03-09T17:29:47.031 INFO:tasks.workunit.client.0.vm06.stdout:6/515: creat d6/d47/fa8 x:0 0 0 2026-03-09T17:29:47.032 INFO:tasks.workunit.client.0.vm06.stdout:3/654: link dd/d19/d25/d44/d80/fc4 dd/fdd 0 2026-03-09T17:29:47.040 INFO:tasks.workunit.client.0.vm06.stdout:0/729: rename d7/d11/d19/d37/l54 to d7/d11/d89/d99/lfe 0 2026-03-09T17:29:47.042 INFO:tasks.workunit.client.0.vm06.stdout:0/730: write d7/d11/d19/d1d/fec [587326,100665] 0 2026-03-09T17:29:47.043 INFO:tasks.workunit.client.0.vm06.stdout:0/731: truncate d7/d11/d5d/d64/fc9 998057 0 2026-03-09T17:29:47.051 INFO:tasks.workunit.client.0.vm06.stdout:7/719: rename d5/dd/f19 to d5/d1f/d34/d3f/d8b/fd3 0 2026-03-09T17:29:47.054 INFO:tasks.workunit.client.0.vm06.stdout:2/542: mknod d3/cae 0 2026-03-09T17:29:47.056 INFO:tasks.workunit.client.0.vm06.stdout:2/543: truncate d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab 492765 0 2026-03-09T17:29:47.066 INFO:tasks.workunit.client.0.vm06.stdout:1/597: write d11/d14/d1d/d42/d46/d92/dc0/f21 [3082926,3804] 0 2026-03-09T17:29:47.068 INFO:tasks.workunit.client.0.vm06.stdout:7/720: dwrite d5/dd/d79/d7f/f98 [0,4194304] 0 2026-03-09T17:29:47.072 INFO:tasks.workunit.client.0.vm06.stdout:2/544: truncate d3/d4/d22/d72/f54 225193 0 2026-03-09T17:29:47.074 INFO:tasks.workunit.client.0.vm06.stdout:0/732: rename d7/d11/d19/f68 to d7/d11/d89/d99/fff 0 2026-03-09T17:29:47.080 INFO:tasks.workunit.client.0.vm06.stdout:9/687: rmdir d3/d6d 39 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:7/721: mkdir d5/d7/dac/dd4 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:2/545: stat d3/f29 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:0/733: readlink d7/d11/d19/d23/lcc 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:7/722: truncate d5/dd/dc5/f32 2995456 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:7/723: write 
d5/d1f/d34/d46/f4e [2964300,120683] 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:7/724: chown d5/d7/d2b/fa1 613 1 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:7/725: write d5/d1f/d34/d3f/f5b [2559217,23997] 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:2/546: symlink d3/d4/d12/d71/daa/d77/d81/d64/laf 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:0/734: rename d7/d11/d19/d1d/d39/l9a to d7/d11/d89/l100 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:5/598: dwrite d4/d50/d35/d40/fc1 [0,4194304] 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:0/735: dread - d7/d11/d2d/fc3 zero size 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:5/599: mkdir d4/d50/d35/d40/d95/db8/dda 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:5/600: fdatasync d4/d52/f8f 0 2026-03-09T17:29:47.106 INFO:tasks.workunit.client.0.vm06.stdout:9/688: creat d3/d6d/d9a/fd6 x:0 0 0 2026-03-09T17:29:47.109 INFO:tasks.workunit.client.0.vm06.stdout:0/736: rmdir d7/d11/d5d 39 2026-03-09T17:29:47.113 INFO:tasks.workunit.client.0.vm06.stdout:7/726: getdents d5/d1f/d34/d46/d51 0 2026-03-09T17:29:47.115 INFO:tasks.workunit.client.0.vm06.stdout:7/727: truncate d5/d7/d2b/f50 4770155 0 2026-03-09T17:29:47.118 INFO:tasks.workunit.client.0.vm06.stdout:0/737: mkdir d7/d11/d19/d23/db7/dbd/d101 0 2026-03-09T17:29:47.120 INFO:tasks.workunit.client.0.vm06.stdout:7/728: getdents d5/d1f/d34/d46/d51 0 2026-03-09T17:29:47.122 INFO:tasks.workunit.client.0.vm06.stdout:5/601: link d4/d50/c89 d4/d22/dbe/cdb 0 2026-03-09T17:29:47.122 INFO:tasks.workunit.client.0.vm06.stdout:5/602: dread - d4/d22/fcb zero size 2026-03-09T17:29:47.124 INFO:tasks.workunit.client.0.vm06.stdout:7/729: symlink d5/dd/dc5/d5f/ld5 0 2026-03-09T17:29:47.125 INFO:tasks.workunit.client.0.vm06.stdout:0/738: mkdir d7/d102 0 2026-03-09T17:29:47.128 INFO:tasks.workunit.client.0.vm06.stdout:0/739: dwrite d7/fe 
[0,4194304] 0 2026-03-09T17:29:47.129 INFO:tasks.workunit.client.0.vm06.stdout:5/603: mknod d4/d50/d35/d40/d96/d98/cdc 0 2026-03-09T17:29:47.136 INFO:tasks.workunit.client.0.vm06.stdout:0/740: dread - d7/d11/d19/d1d/fb3 zero size 2026-03-09T17:29:47.137 INFO:tasks.workunit.client.0.vm06.stdout:5/604: creat d4/d50/d35/d40/d95/db8/dda/fdd x:0 0 0 2026-03-09T17:29:47.138 INFO:tasks.workunit.client.0.vm06.stdout:0/741: mknod d7/d11/d2d/daf/c103 0 2026-03-09T17:29:47.138 INFO:tasks.workunit.client.0.vm06.stdout:3/655: sync 2026-03-09T17:29:47.142 INFO:tasks.workunit.client.0.vm06.stdout:0/742: dwrite d7/d11/d19/d37/f4f [0,4194304] 0 2026-03-09T17:29:47.149 INFO:tasks.workunit.client.0.vm06.stdout:0/743: dwrite d7/d11/d19/d37/f4f [0,4194304] 0 2026-03-09T17:29:47.161 INFO:tasks.workunit.client.0.vm06.stdout:5/605: dread d4/d50/d35/f4d [0,4194304] 0 2026-03-09T17:29:47.162 INFO:tasks.workunit.client.0.vm06.stdout:5/606: write d4/d50/d18/f3e [7803114,106365] 0 2026-03-09T17:29:47.167 INFO:tasks.workunit.client.0.vm06.stdout:3/656: creat dd/d19/d25/d44/d80/dd7/fde x:0 0 0 2026-03-09T17:29:47.190 INFO:tasks.workunit.client.0.vm06.stdout:5/607: creat d4/d22/fde x:0 0 0 2026-03-09T17:29:47.198 INFO:tasks.workunit.client.0.vm06.stdout:3/657: creat dd/d19/d2c/fdf x:0 0 0 2026-03-09T17:29:47.206 INFO:tasks.workunit.client.0.vm06.stdout:4/626: write db/d59/d5f/d45/f8e [494804,98964] 0 2026-03-09T17:29:47.213 INFO:tasks.workunit.client.0.vm06.stdout:4/627: dwrite db/d1d/d21/d26/d89/dab/dae/dcc/fd0 [0,4194304] 0 2026-03-09T17:29:47.216 INFO:tasks.workunit.client.0.vm06.stdout:5/608: write d4/d50/d18/f9d [483199,98280] 0 2026-03-09T17:29:47.216 INFO:tasks.workunit.client.0.vm06.stdout:3/658: mknod dd/d81/ce0 0 2026-03-09T17:29:47.250 INFO:tasks.workunit.client.0.vm06.stdout:4/628: unlink db/d59/d5f/c9e 0 2026-03-09T17:29:47.251 INFO:tasks.workunit.client.0.vm06.stdout:8/590: write d15/d16/f87 [903242,122215] 0 2026-03-09T17:29:47.257 INFO:tasks.workunit.client.0.vm06.stdout:3/659: 
mknod dd/d1d/d2e/ce1 0 2026-03-09T17:29:47.261 INFO:tasks.workunit.client.0.vm06.stdout:5/609: symlink d4/d50/d18/d3d/ldf 0 2026-03-09T17:29:47.262 INFO:tasks.workunit.client.0.vm06.stdout:4/629: unlink db/d1d/d21/d26/d89/cb9 0 2026-03-09T17:29:47.263 INFO:tasks.workunit.client.0.vm06.stdout:4/630: read db/fc [188996,86123] 0 2026-03-09T17:29:47.266 INFO:tasks.workunit.client.0.vm06.stdout:6/516: dwrite d6/f5c [0,4194304] 0 2026-03-09T17:29:47.276 INFO:tasks.workunit.client.0.vm06.stdout:8/591: truncate d15/d39/f7b 5199271 0 2026-03-09T17:29:47.276 INFO:tasks.workunit.client.0.vm06.stdout:1/598: write f7 [2514329,122472] 0 2026-03-09T17:29:47.277 INFO:tasks.workunit.client.0.vm06.stdout:1/599: chown d11/fa9 847817 1 2026-03-09T17:29:47.284 INFO:tasks.workunit.client.0.vm06.stdout:2/547: write d3/d4/d12/d2b/f32 [2974965,7827] 0 2026-03-09T17:29:47.288 INFO:tasks.workunit.client.0.vm06.stdout:2/548: truncate d3/d4/d46/da5/fa8 360578 0 2026-03-09T17:29:47.288 INFO:tasks.workunit.client.0.vm06.stdout:2/549: dwrite d3/d4/d46/da5/f6c [0,4194304] 0 2026-03-09T17:29:47.291 INFO:tasks.workunit.client.0.vm06.stdout:3/660: dread dd/d1d/d4e/f7d [0,4194304] 0 2026-03-09T17:29:47.292 INFO:tasks.workunit.client.0.vm06.stdout:4/631: creat db/d57/fe3 x:0 0 0 2026-03-09T17:29:47.298 INFO:tasks.workunit.client.0.vm06.stdout:8/592: truncate d15/d16/d19/d3d/f6a 1859545 0 2026-03-09T17:29:47.298 INFO:tasks.workunit.client.0.vm06.stdout:8/593: write d15/d39/d3c/d6c/fbf [229692,28180] 0 2026-03-09T17:29:47.303 INFO:tasks.workunit.client.0.vm06.stdout:1/600: rmdir d11/d14/d1d/d42/d46/d92/dc0/d57 39 2026-03-09T17:29:47.305 INFO:tasks.workunit.client.0.vm06.stdout:9/689: dwrite d3/d15/f17 [4194304,4194304] 0 2026-03-09T17:29:47.306 INFO:tasks.workunit.client.0.vm06.stdout:1/601: read d11/f18 [514199,49206] 0 2026-03-09T17:29:47.314 INFO:tasks.workunit.client.0.vm06.stdout:2/550: unlink d3/d4/d12/d2b/d2d/f6f 0 2026-03-09T17:29:47.328 INFO:tasks.workunit.client.0.vm06.stdout:7/730: dwrite 
d5/f18 [4194304,4194304] 0 2026-03-09T17:29:47.328 INFO:tasks.workunit.client.0.vm06.stdout:6/517: symlink d6/la9 0 2026-03-09T17:29:47.328 INFO:tasks.workunit.client.0.vm06.stdout:0/744: write d7/fb1 [1251888,90927] 0 2026-03-09T17:29:47.329 INFO:tasks.workunit.client.0.vm06.stdout:3/661: dread dd/f5f [0,4194304] 0 2026-03-09T17:29:47.330 INFO:tasks.workunit.client.0.vm06.stdout:1/602: symlink d11/d14/d1c/d5f/lc7 0 2026-03-09T17:29:47.335 INFO:tasks.workunit.client.0.vm06.stdout:6/518: chown d6/d4f/d3e/d52/f89 6702637 1 2026-03-09T17:29:47.336 INFO:tasks.workunit.client.0.vm06.stdout:8/594: mkdir d15/d31/dc5 0 2026-03-09T17:29:47.337 INFO:tasks.workunit.client.0.vm06.stdout:5/610: truncate d4/d50/d18/f4a 5119689 0 2026-03-09T17:29:47.342 INFO:tasks.workunit.client.0.vm06.stdout:0/745: rename d7/d11/d19/d23/f97 to d7/d11/d19/d1d/d87/f104 0 2026-03-09T17:29:47.346 INFO:tasks.workunit.client.0.vm06.stdout:9/690: mkdir d3/d26/dd7 0 2026-03-09T17:29:47.348 INFO:tasks.workunit.client.0.vm06.stdout:3/662: creat dd/d19/d25/d2d/fe2 x:0 0 0 2026-03-09T17:29:47.350 INFO:tasks.workunit.client.0.vm06.stdout:1/603: creat d11/d14/d1d/d94/fc8 x:0 0 0 2026-03-09T17:29:47.354 INFO:tasks.workunit.client.0.vm06.stdout:4/632: rmdir db/d1d/d21/d37/d69/d78/db4/d91 0 2026-03-09T17:29:47.357 INFO:tasks.workunit.client.0.vm06.stdout:4/633: dread db/d59/d5f/d45/f8e [0,4194304] 0 2026-03-09T17:29:47.360 INFO:tasks.workunit.client.0.vm06.stdout:6/519: rename d6/d12/d53/f86 to d6/d4f/d3e/d52/d80/faa 0 2026-03-09T17:29:47.362 INFO:tasks.workunit.client.0.vm06.stdout:0/746: creat d7/d11/d19/d8b/da4/d85/f105 x:0 0 0 2026-03-09T17:29:47.362 INFO:tasks.workunit.client.0.vm06.stdout:0/747: chown d7/d11/d2d/dca/ld7 1645374 1 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:9/691: chown d3/d15/l51 30717337 1 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:8/595: symlink d15/d31/dc5/lc6 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:8/596: chown 
d15/d16/d1a/f29 2074622 1 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:9/692: creat d3/d6d/d9a/d9c/fd8 x:0 0 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:2/551: getdents d3/d4/d12/d71/daa/d77 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:0/748: dread d7/d11/d19/d8b/da4/fab [0,4194304] 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:9/693: dwrite d3/d6d/d9a/d9c/fd8 [0,4194304] 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:9/694: write d3/d26/d6c/f3a [2747927,80664] 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:0/749: dread d7/d11/d19/f24 [4194304,4194304] 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:6/520: symlink d6/d47/d4d/d9a/da2/lab 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:7/731: write d5/dd/dc5/f32 [3804406,86846] 0 2026-03-09T17:29:47.396 INFO:tasks.workunit.client.0.vm06.stdout:9/695: creat d3/d15/d36/d4d/fd9 x:0 0 0 2026-03-09T17:29:47.397 INFO:tasks.workunit.client.0.vm06.stdout:2/552: mkdir d3/d4/d12/d2b/db0 0 2026-03-09T17:29:47.397 INFO:tasks.workunit.client.0.vm06.stdout:2/553: fdatasync d3/d4/d46/da5/fa8 0 2026-03-09T17:29:47.408 INFO:tasks.workunit.client.0.vm06.stdout:5/611: sync 2026-03-09T17:29:47.410 INFO:tasks.workunit.client.0.vm06.stdout:0/750: dwrite d7/d11/d19/d3c/fe8 [4194304,4194304] 0 2026-03-09T17:29:47.414 INFO:tasks.workunit.client.0.vm06.stdout:0/751: truncate d7/d11/d19/d23/db7/dbd/fc0 2056695 0 2026-03-09T17:29:47.415 INFO:tasks.workunit.client.0.vm06.stdout:6/521: creat d6/d47/d4d/d9a/da2/fac x:0 0 0 2026-03-09T17:29:47.415 INFO:tasks.workunit.client.0.vm06.stdout:0/752: dread - d7/d11/d19/d3c/db9/ddd/ff7 zero size 2026-03-09T17:29:47.424 INFO:tasks.workunit.client.0.vm06.stdout:9/696: rename d3/d15/d36/d4d/f62 to d3/d15/d48/fda 0 2026-03-09T17:29:47.439 INFO:tasks.workunit.client.0.vm06.stdout:7/732: symlink d5/dd/dc5/dad/ld6 0 2026-03-09T17:29:47.444 
INFO:tasks.workunit.client.0.vm06.stdout:9/697: chown d3/c47 43057714 1 2026-03-09T17:29:47.454 INFO:tasks.workunit.client.0.vm06.stdout:9/698: chown d3/d15/d16/l82 13873727 1 2026-03-09T17:29:47.454 INFO:tasks.workunit.client.0.vm06.stdout:2/554: mknod d3/d4/d12/d71/daa/d77/d81/cb1 0 2026-03-09T17:29:47.454 INFO:tasks.workunit.client.0.vm06.stdout:6/522: mknod d6/d4f/d3e/cad 0 2026-03-09T17:29:47.456 INFO:tasks.workunit.client.0.vm06.stdout:5/612: rename d4/d22/dbe/cdb to d4/d50/db2/ce0 0 2026-03-09T17:29:47.457 INFO:tasks.workunit.client.0.vm06.stdout:2/555: creat d3/d4/d22/fb2 x:0 0 0 2026-03-09T17:29:47.459 INFO:tasks.workunit.client.0.vm06.stdout:0/753: creat d7/f106 x:0 0 0 2026-03-09T17:29:47.463 INFO:tasks.workunit.client.0.vm06.stdout:0/754: dwrite d7/d11/d19/d3c/fe8 [4194304,4194304] 0 2026-03-09T17:29:47.465 INFO:tasks.workunit.client.0.vm06.stdout:0/755: write d7/d11/d19/d3c/ffd [585070,99552] 0 2026-03-09T17:29:47.476 INFO:tasks.workunit.client.0.vm06.stdout:0/756: read d7/f36 [543337,107156] 0 2026-03-09T17:29:47.489 INFO:tasks.workunit.client.0.vm06.stdout:3/663: write dd/d1d/f29 [4861752,49887] 0 2026-03-09T17:29:47.490 INFO:tasks.workunit.client.0.vm06.stdout:3/664: chown dd/d19/d25/d2d/d9b/fc5 4056380 1 2026-03-09T17:29:47.495 INFO:tasks.workunit.client.0.vm06.stdout:1/604: dwrite d11/d14/d1d/d1e/f47 [0,4194304] 0 2026-03-09T17:29:47.517 INFO:tasks.workunit.client.0.vm06.stdout:2/556: mkdir d3/d4/d12/da7/db3 0 2026-03-09T17:29:47.517 INFO:tasks.workunit.client.0.vm06.stdout:5/613: mkdir d4/d50/d18/de1 0 2026-03-09T17:29:47.518 INFO:tasks.workunit.client.0.vm06.stdout:4/634: write db/d1d/d21/d25/d4b/f4e [423275,82739] 0 2026-03-09T17:29:47.518 INFO:tasks.workunit.client.0.vm06.stdout:0/757: creat d7/d11/d19/d8b/da4/f107 x:0 0 0 2026-03-09T17:29:47.519 INFO:tasks.workunit.client.0.vm06.stdout:4/635: write db/d1d/d21/d26/d89/dab/dae/dcc/fd0 [2811223,73199] 0 2026-03-09T17:29:47.522 INFO:tasks.workunit.client.0.vm06.stdout:8/597: write d15/d16/d1e/f8c 
[799422,58383] 0 2026-03-09T17:29:47.525 INFO:tasks.workunit.client.0.vm06.stdout:8/598: dread d15/d16/f23 [0,4194304] 0 2026-03-09T17:29:47.528 INFO:tasks.workunit.client.0.vm06.stdout:3/665: symlink dd/d5c/le3 0 2026-03-09T17:29:47.535 INFO:tasks.workunit.client.0.vm06.stdout:1/605: dread d11/d14/d1d/d42/d46/d92/dc0/f7f [4194304,4194304] 0 2026-03-09T17:29:47.538 INFO:tasks.workunit.client.0.vm06.stdout:9/699: rename d3/d15/d16/la7 to d3/ldb 0 2026-03-09T17:29:47.545 INFO:tasks.workunit.client.0.vm06.stdout:2/557: symlink d3/d4/d46/da5/lb4 0 2026-03-09T17:29:47.550 INFO:tasks.workunit.client.0.vm06.stdout:7/733: dwrite d5/dd/dc5/fa2 [0,4194304] 0 2026-03-09T17:29:47.552 INFO:tasks.workunit.client.0.vm06.stdout:7/734: chown d5/d7/c3b 1 1 2026-03-09T17:29:47.559 INFO:tasks.workunit.client.0.vm06.stdout:8/599: mkdir d15/d16/d19/d3d/dc7 0 2026-03-09T17:29:47.559 INFO:tasks.workunit.client.0.vm06.stdout:3/666: rmdir dd/d19 39 2026-03-09T17:29:47.561 INFO:tasks.workunit.client.0.vm06.stdout:3/667: write dd/d81/da3/dae/fb5 [1363214,13824] 0 2026-03-09T17:29:47.561 INFO:tasks.workunit.client.0.vm06.stdout:3/668: chown dd/f26 653 1 2026-03-09T17:29:47.570 INFO:tasks.workunit.client.0.vm06.stdout:6/523: rename d6/d12/d2d/f5e to d6/d47/d4d/fae 0 2026-03-09T17:29:47.574 INFO:tasks.workunit.client.0.vm06.stdout:9/700: creat d3/d15/d36/d4c/d6a/d8a/fdc x:0 0 0 2026-03-09T17:29:47.587 INFO:tasks.workunit.client.0.vm06.stdout:7/735: mkdir d5/d7/d2b/dc8/dd7 0 2026-03-09T17:29:47.599 INFO:tasks.workunit.client.0.vm06.stdout:3/669: unlink dd/d59/la6 0 2026-03-09T17:29:47.606 INFO:tasks.workunit.client.0.vm06.stdout:5/614: dwrite d4/d50/d18/f4b [0,4194304] 0 2026-03-09T17:29:47.611 INFO:tasks.workunit.client.0.vm06.stdout:6/524: stat d6/d47/d4d/d6d/c8d 0 2026-03-09T17:29:47.613 INFO:tasks.workunit.client.0.vm06.stdout:9/701: sync 2026-03-09T17:29:47.625 INFO:tasks.workunit.client.0.vm06.stdout:4/636: dwrite db/f39 [0,4194304] 0 2026-03-09T17:29:47.630 
INFO:tasks.workunit.client.0.vm06.stdout:8/600: dwrite d15/d16/d19/d2b/f46 [4194304,4194304] 0 2026-03-09T17:29:47.644 INFO:tasks.workunit.client.0.vm06.stdout:5/615: truncate d4/d50/d18/d3d/fc0 81736 0 2026-03-09T17:29:47.646 INFO:tasks.workunit.client.0.vm06.stdout:6/525: write d6/f5c [3650787,44503] 0 2026-03-09T17:29:47.660 INFO:tasks.workunit.client.0.vm06.stdout:4/637: mkdir db/d1d/d21/d25/d4b/de4 0 2026-03-09T17:29:47.661 INFO:tasks.workunit.client.0.vm06.stdout:6/526: dread d6/d4f/fa3 [0,4194304] 0 2026-03-09T17:29:47.669 INFO:tasks.workunit.client.0.vm06.stdout:0/758: getdents d7/d11/d19/d37 0 2026-03-09T17:29:47.676 INFO:tasks.workunit.client.0.vm06.stdout:1/606: rename d11/d14/d1d/d1e/d96/f9b to d11/d14/d1d/d42/d46/d92/dc0/fc9 0 2026-03-09T17:29:47.680 INFO:tasks.workunit.client.0.vm06.stdout:5/616: creat d4/d50/db2/fe2 x:0 0 0 2026-03-09T17:29:47.686 INFO:tasks.workunit.client.0.vm06.stdout:2/558: getdents d3/d4 0 2026-03-09T17:29:47.692 INFO:tasks.workunit.client.0.vm06.stdout:4/638: creat db/d59/d90/fe5 x:0 0 0 2026-03-09T17:29:47.695 INFO:tasks.workunit.client.0.vm06.stdout:6/527: creat d6/d12/d17/d85/faf x:0 0 0 2026-03-09T17:29:47.696 INFO:tasks.workunit.client.0.vm06.stdout:6/528: chown d6/d47/d4d/d6d/l92 1658667 1 2026-03-09T17:29:47.699 INFO:tasks.workunit.client.0.vm06.stdout:5/617: dread d4/d50/d18/f3e [4194304,4194304] 0 2026-03-09T17:29:47.704 INFO:tasks.workunit.client.0.vm06.stdout:7/736: write d5/d1f/d34/d46/f55 [4667496,113287] 0 2026-03-09T17:29:47.713 INFO:tasks.workunit.client.0.vm06.stdout:8/601: rename d15/d16/fb4 to d15/d31/d58/fc8 0 2026-03-09T17:29:47.721 INFO:tasks.workunit.client.0.vm06.stdout:6/529: mkdir d6/d4f/d3e/d52/d8c/db0 0 2026-03-09T17:29:47.722 INFO:tasks.workunit.client.0.vm06.stdout:0/759: mknod d7/d11/d89/da8/db2/dea/c108 0 2026-03-09T17:29:47.724 INFO:tasks.workunit.client.0.vm06.stdout:5/618: creat d4/d50/d35/d40/d95/db8/fe3 x:0 0 0 2026-03-09T17:29:47.729 INFO:tasks.workunit.client.0.vm06.stdout:8/602: mkdir 
d15/d31/d58/dc9 0 2026-03-09T17:29:47.732 INFO:tasks.workunit.client.0.vm06.stdout:8/603: dwrite d15/d16/f3f [0,4194304] 0 2026-03-09T17:29:47.732 INFO:tasks.workunit.client.0.vm06.stdout:9/702: getdents d3/d15/d48 0 2026-03-09T17:29:47.733 INFO:tasks.workunit.client.0.vm06.stdout:8/604: truncate d15/f3e 5037554 0 2026-03-09T17:29:47.747 INFO:tasks.workunit.client.0.vm06.stdout:6/530: mkdir d6/d47/d4d/d9a/da2/db1 0 2026-03-09T17:29:47.750 INFO:tasks.workunit.client.0.vm06.stdout:0/760: fsync d7/d11/d19/d8b/da4/fab 0 2026-03-09T17:29:47.754 INFO:tasks.workunit.client.0.vm06.stdout:5/619: unlink d4/d52/lbc 0 2026-03-09T17:29:47.757 INFO:tasks.workunit.client.0.vm06.stdout:3/670: link c3 dd/d19/d1e/db8/ce4 0 2026-03-09T17:29:47.757 INFO:tasks.workunit.client.0.vm06.stdout:3/671: chown dd/d19/d2c/lcd 27797 1 2026-03-09T17:29:47.758 INFO:tasks.workunit.client.0.vm06.stdout:3/672: stat dd/d19/d2c/f79 0 2026-03-09T17:29:47.760 INFO:tasks.workunit.client.0.vm06.stdout:9/703: creat d3/d15/d36/fdd x:0 0 0 2026-03-09T17:29:47.763 INFO:tasks.workunit.client.0.vm06.stdout:8/605: dread - d15/d16/d1a/d47/f9c zero size 2026-03-09T17:29:47.764 INFO:tasks.workunit.client.0.vm06.stdout:1/607: dwrite d11/d14/d1d/f4e [4194304,4194304] 0 2026-03-09T17:29:47.772 INFO:tasks.workunit.client.0.vm06.stdout:2/559: dwrite d3/d4/d22/f4b [0,4194304] 0 2026-03-09T17:29:47.779 INFO:tasks.workunit.client.0.vm06.stdout:5/620: rename d4/dca/lb5 to d4/d22/d46/le4 0 2026-03-09T17:29:47.779 INFO:tasks.workunit.client.0.vm06.stdout:5/621: stat d4/d52/d55/c7c 0 2026-03-09T17:29:47.781 INFO:tasks.workunit.client.0.vm06.stdout:5/622: write d4/d22/d46/fd9 [238384,55831] 0 2026-03-09T17:29:47.784 INFO:tasks.workunit.client.0.vm06.stdout:3/673: symlink dd/d19/d25/d44/d80/le5 0 2026-03-09T17:29:47.802 INFO:tasks.workunit.client.0.vm06.stdout:9/704: rename d3/d15/l24 to d3/d15/d16/lde 0 2026-03-09T17:29:47.803 INFO:tasks.workunit.client.0.vm06.stdout:7/737: write d5/d1f/d34/d46/f89 [459829,63607] 0 
2026-03-09T17:29:47.809 INFO:tasks.workunit.client.0.vm06.stdout:3/674: rmdir dd 39 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:6/531: rmdir d6/d4f/d3e/d52/d7c 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:2/560: getdents d3/d4/d12/d2b/d9f 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:2/561: readlink d3/d4/d12/d71/daa/d77/d81/d64/laf 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:0/761: link d7/d11/d2d/daf/c95 d7/d11/d19/d3c/df3/c109 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:4/639: write db/d1d/d21/fa5 [812033,49043] 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:2/562: dread d3/d4/d12/f85 [0,4194304] 0 2026-03-09T17:29:47.816 INFO:tasks.workunit.client.0.vm06.stdout:2/563: readlink d3/d4/d22/d72/la1 0 2026-03-09T17:29:47.818 INFO:tasks.workunit.client.0.vm06.stdout:2/564: read d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab [310466,5041] 0 2026-03-09T17:29:47.824 INFO:tasks.workunit.client.0.vm06.stdout:3/675: write dd/d19/d25/d44/fd6 [221152,67379] 0 2026-03-09T17:29:47.824 INFO:tasks.workunit.client.0.vm06.stdout:3/676: readlink dd/d19/d1e/l90 0 2026-03-09T17:29:47.828 INFO:tasks.workunit.client.0.vm06.stdout:4/640: mknod db/d57/dd4/ce6 0 2026-03-09T17:29:47.828 INFO:tasks.workunit.client.0.vm06.stdout:7/738: dread d5/dd/d79/fb3 [0,4194304] 0 2026-03-09T17:29:47.832 INFO:tasks.workunit.client.0.vm06.stdout:6/532: symlink d6/lb2 0 2026-03-09T17:29:47.833 INFO:tasks.workunit.client.0.vm06.stdout:3/677: creat dd/d19/d25/d44/d80/dd7/fe6 x:0 0 0 2026-03-09T17:29:47.835 INFO:tasks.workunit.client.0.vm06.stdout:0/762: creat d7/d11/d5d/db8/f10a x:0 0 0 2026-03-09T17:29:47.836 INFO:tasks.workunit.client.0.vm06.stdout:7/739: creat d5/dd/dc5/d5f/fd8 x:0 0 0 2026-03-09T17:29:47.838 INFO:tasks.workunit.client.0.vm06.stdout:2/565: rename d3/d4/d22/c8e to d3/d4/d12/d2b/d36/cb5 0 2026-03-09T17:29:47.838 INFO:tasks.workunit.client.0.vm06.stdout:9/705: dread 
d3/d11/d65/d80/fd0 [0,4194304] 0 2026-03-09T17:29:47.839 INFO:tasks.workunit.client.0.vm06.stdout:5/623: link d4/d50/l2a d4/d52/d55/le5 0 2026-03-09T17:29:47.843 INFO:tasks.workunit.client.0.vm06.stdout:3/678: symlink dd/d59/le7 0 2026-03-09T17:29:47.844 INFO:tasks.workunit.client.0.vm06.stdout:5/624: dwrite d4/d50/d35/d40/d6f/fd8 [0,4194304] 0 2026-03-09T17:29:47.844 INFO:tasks.workunit.client.0.vm06.stdout:5/625: stat d4/dca 0 2026-03-09T17:29:47.848 INFO:tasks.workunit.client.0.vm06.stdout:0/763: creat d7/d11/d89/f10b x:0 0 0 2026-03-09T17:29:47.849 INFO:tasks.workunit.client.0.vm06.stdout:0/764: write d7/fb1 [770052,111155] 0 2026-03-09T17:29:47.861 INFO:tasks.workunit.client.0.vm06.stdout:7/740: mknod d5/d7/dac/cd9 0 2026-03-09T17:29:47.861 INFO:tasks.workunit.client.0.vm06.stdout:7/741: chown d5/d1f/d34/f54 16886 1 2026-03-09T17:29:47.862 INFO:tasks.workunit.client.0.vm06.stdout:7/742: write d5/dd/dc5/fa2 [5013903,120940] 0 2026-03-09T17:29:47.865 INFO:tasks.workunit.client.0.vm06.stdout:4/641: rename db/f39 to db/d57/dd4/fe7 0 2026-03-09T17:29:47.868 INFO:tasks.workunit.client.0.vm06.stdout:9/706: creat d3/d6d/d9a/d9c/fdf x:0 0 0 2026-03-09T17:29:47.871 INFO:tasks.workunit.client.0.vm06.stdout:3/679: creat dd/d19/d25/d2d/fe8 x:0 0 0 2026-03-09T17:29:47.880 INFO:tasks.workunit.client.0.vm06.stdout:8/606: write d15/d16/d19/d71/f65 [48226,52094] 0 2026-03-09T17:29:47.881 INFO:tasks.workunit.client.0.vm06.stdout:8/607: chown d15/d16/d1e/d30/db8/da4 232 1 2026-03-09T17:29:47.881 INFO:tasks.workunit.client.0.vm06.stdout:8/608: readlink l1 0 2026-03-09T17:29:47.886 INFO:tasks.workunit.client.0.vm06.stdout:4/642: rename db/d59/d5f/d5d/c92 to db/d1d/d21/d25/d4b/ce8 0 2026-03-09T17:29:47.886 INFO:tasks.workunit.client.0.vm06.stdout:1/608: dwrite d11/f18 [4194304,4194304] 0 2026-03-09T17:29:47.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:47 vm06.local ceph-mon[57307]: pgmap v157: 65 pgs: 65 active+clean; 1.4 GiB data, 5.4 GiB used, 115 GiB / 120 GiB 
avail; 31 MiB/s rd, 77 MiB/s wr, 316 op/s 2026-03-09T17:29:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:47 vm09.local ceph-mon[62061]: pgmap v157: 65 pgs: 65 active+clean; 1.4 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 31 MiB/s rd, 77 MiB/s wr, 316 op/s 2026-03-09T17:29:47.896 INFO:tasks.workunit.client.0.vm06.stdout:3/680: read dd/d1d/f34 [3469413,22994] 0 2026-03-09T17:29:47.900 INFO:tasks.workunit.client.0.vm06.stdout:8/609: mknod d15/d39/d67/d77/d97/cca 0 2026-03-09T17:29:47.902 INFO:tasks.workunit.client.0.vm06.stdout:4/643: creat db/d1d/d21/d25/d4b/fe9 x:0 0 0 2026-03-09T17:29:47.903 INFO:tasks.workunit.client.0.vm06.stdout:4/644: write db/d1d/d21/d88/fd2 [778780,80612] 0 2026-03-09T17:29:47.904 INFO:tasks.workunit.client.0.vm06.stdout:4/645: chown db/d1d/d21/d37/d69/d78/da0/cb0 115775 1 2026-03-09T17:29:47.910 INFO:tasks.workunit.client.0.vm06.stdout:7/743: dread d5/dd/dc5/d64/d6b/f6f [0,4194304] 0 2026-03-09T17:29:47.912 INFO:tasks.workunit.client.0.vm06.stdout:2/566: creat d3/d4/d12/d2b/fb6 x:0 0 0 2026-03-09T17:29:47.915 INFO:tasks.workunit.client.0.vm06.stdout:3/681: creat dd/d19/d2c/fe9 x:0 0 0 2026-03-09T17:29:47.919 INFO:tasks.workunit.client.0.vm06.stdout:3/682: read - dd/d5b/d65/fb7 zero size 2026-03-09T17:29:47.919 INFO:tasks.workunit.client.0.vm06.stdout:8/610: truncate d15/d16/f23 2817251 0 2026-03-09T17:29:47.919 INFO:tasks.workunit.client.0.vm06.stdout:4/646: rename db/d1d/d21/d26/d7a/c86 to db/d57/cea 0 2026-03-09T17:29:47.923 INFO:tasks.workunit.client.0.vm06.stdout:7/744: creat d5/d1f/d34/d46/d51/fda x:0 0 0 2026-03-09T17:29:47.924 INFO:tasks.workunit.client.0.vm06.stdout:7/745: chown d5/d7/dac/ccc 702384 1 2026-03-09T17:29:47.929 INFO:tasks.workunit.client.0.vm06.stdout:6/533: write d6/d47/d96/f3d [474327,107627] 0 2026-03-09T17:29:47.932 INFO:tasks.workunit.client.0.vm06.stdout:0/765: write d7/d11/d19/d1d/fb3 [987733,42511] 0 2026-03-09T17:29:47.936 INFO:tasks.workunit.client.0.vm06.stdout:2/567: creat 
d3/d4/d22/d72/d8f/fb7 x:0 0 0 2026-03-09T17:29:47.938 INFO:tasks.workunit.client.0.vm06.stdout:5/626: link d4/d50/c2f d4/d22/ce6 0 2026-03-09T17:29:47.940 INFO:tasks.workunit.client.0.vm06.stdout:3/683: mknod dd/d5b/d65/cea 0 2026-03-09T17:29:47.942 INFO:tasks.workunit.client.0.vm06.stdout:9/707: write d3/d15/d48/fda [3413006,32855] 0 2026-03-09T17:29:47.943 INFO:tasks.workunit.client.0.vm06.stdout:5/627: dread d4/d22/d46/fd9 [0,4194304] 0 2026-03-09T17:29:47.944 INFO:tasks.workunit.client.0.vm06.stdout:4/647: creat db/d1d/d21/d37/d69/d78/feb x:0 0 0 2026-03-09T17:29:47.949 INFO:tasks.workunit.client.0.vm06.stdout:4/648: dwrite db/d1d/d21/d25/d4b/d85/fbe [0,4194304] 0 2026-03-09T17:29:47.952 INFO:tasks.workunit.client.0.vm06.stdout:7/746: mknod d5/d7/d2b/dc8/cdb 0 2026-03-09T17:29:47.961 INFO:tasks.workunit.client.0.vm06.stdout:7/747: dread d5/d1f/d34/d3f/d91/fb9 [0,4194304] 0 2026-03-09T17:29:47.964 INFO:tasks.workunit.client.0.vm06.stdout:6/534: truncate d6/d4f/f3c 445397 0 2026-03-09T17:29:47.968 INFO:tasks.workunit.client.0.vm06.stdout:6/535: dread d6/d47/d96/d40/f67 [0,4194304] 0 2026-03-09T17:29:47.974 INFO:tasks.workunit.client.0.vm06.stdout:2/568: creat d3/d4/d12/d2b/d9f/fb8 x:0 0 0 2026-03-09T17:29:47.979 INFO:tasks.workunit.client.0.vm06.stdout:3/684: rename dd/f75 to dd/d19/d28/feb 0 2026-03-09T17:29:47.980 INFO:tasks.workunit.client.0.vm06.stdout:1/609: truncate d11/d14/d1d/d42/d46/d92/dc0/f7f 6219280 0 2026-03-09T17:29:47.982 INFO:tasks.workunit.client.0.vm06.stdout:8/611: mkdir d15/d39/d67/d77/d97/dac/dcb 0 2026-03-09T17:29:47.984 INFO:tasks.workunit.client.0.vm06.stdout:5/628: creat d4/d50/d35/d40/d95/db8/dda/fe7 x:0 0 0 2026-03-09T17:29:47.985 INFO:tasks.workunit.client.0.vm06.stdout:4/649: read - db/d57/fc7 zero size 2026-03-09T17:29:47.987 INFO:tasks.workunit.client.0.vm06.stdout:6/536: rmdir d6/d4f 39 2026-03-09T17:29:47.990 INFO:tasks.workunit.client.0.vm06.stdout:3/685: creat dd/d1d/d2e/fec x:0 0 0 2026-03-09T17:29:47.991 
INFO:tasks.workunit.client.0.vm06.stdout:8/612: mknod d15/d16/d19/d3d/d5f/ccc 0 2026-03-09T17:29:47.994 INFO:tasks.workunit.client.0.vm06.stdout:1/610: dwrite d11/d14/d1d/d94/f95 [0,4194304] 0 2026-03-09T17:29:48.008 INFO:tasks.workunit.client.0.vm06.stdout:4/650: fdatasync db/d1d/d21/f2f 0 2026-03-09T17:29:48.012 INFO:tasks.workunit.client.0.vm06.stdout:5/629: dread d4/d50/d18/f73 [0,4194304] 0 2026-03-09T17:29:48.013 INFO:tasks.workunit.client.0.vm06.stdout:3/686: fdatasync dd/d5b/d65/f6a 0 2026-03-09T17:29:48.014 INFO:tasks.workunit.client.0.vm06.stdout:8/613: symlink d15/d39/d67/lcd 0 2026-03-09T17:29:48.016 INFO:tasks.workunit.client.0.vm06.stdout:1/611: truncate d11/d14/d1d/d42/d46/d92/dc0/f4c 4840812 0 2026-03-09T17:29:48.018 INFO:tasks.workunit.client.0.vm06.stdout:7/748: sync 2026-03-09T17:29:48.021 INFO:tasks.workunit.client.0.vm06.stdout:4/651: unlink db/d1d/d21/d26/f70 0 2026-03-09T17:29:48.025 INFO:tasks.workunit.client.0.vm06.stdout:8/614: rename d15/d39/d3c/l70 to d15/d16/d6d/lce 0 2026-03-09T17:29:48.032 INFO:tasks.workunit.client.0.vm06.stdout:4/652: mkdir db/d1d/d21/d44/d8a/dec 0 2026-03-09T17:29:48.034 INFO:tasks.workunit.client.0.vm06.stdout:1/612: unlink d11/d14/d1d/d1e/d2a/f50 0 2026-03-09T17:29:48.037 INFO:tasks.workunit.client.0.vm06.stdout:4/653: fdatasync db/f6f 0 2026-03-09T17:29:48.039 INFO:tasks.workunit.client.0.vm06.stdout:1/613: creat d11/d69/fca x:0 0 0 2026-03-09T17:29:48.043 INFO:tasks.workunit.client.0.vm06.stdout:8/615: creat d15/d16/d1e/d30/fcf x:0 0 0 2026-03-09T17:29:48.047 INFO:tasks.workunit.client.0.vm06.stdout:0/766: write d7/d11/d19/d1d/f4c [2527796,25874] 0 2026-03-09T17:29:48.052 INFO:tasks.workunit.client.0.vm06.stdout:4/654: rename db/cd5 to db/d57/dd4/ced 0 2026-03-09T17:29:48.059 INFO:tasks.workunit.client.0.vm06.stdout:1/614: dread d11/d14/d1d/d42/d46/d92/dc0/f45 [0,4194304] 0 2026-03-09T17:29:48.060 INFO:tasks.workunit.client.0.vm06.stdout:1/615: chown d11/d14/d1d/d42/c9f 36 1 2026-03-09T17:29:48.060 
INFO:tasks.workunit.client.0.vm06.stdout:9/708: write d3/d15/f2e [910845,80592] 0 2026-03-09T17:29:48.061 INFO:tasks.workunit.client.0.vm06.stdout:9/709: chown d3/d15/fbb 13587 1 2026-03-09T17:29:48.065 INFO:tasks.workunit.client.0.vm06.stdout:0/767: dwrite d7/d11/d2d/dca/ff4 [0,4194304] 0 2026-03-09T17:29:48.076 INFO:tasks.workunit.client.0.vm06.stdout:2/569: write d3/d4/f1f [13736,77613] 0 2026-03-09T17:29:48.080 INFO:tasks.workunit.client.0.vm06.stdout:8/616: unlink d15/d16/d1e/d30/d55/c8e 0 2026-03-09T17:29:48.081 INFO:tasks.workunit.client.0.vm06.stdout:8/617: write d15/d16/d1e/d30/f3b [3901877,69098] 0 2026-03-09T17:29:48.081 INFO:tasks.workunit.client.0.vm06.stdout:8/618: readlink d15/d16/d1a/l20 0 2026-03-09T17:29:48.082 INFO:tasks.workunit.client.0.vm06.stdout:8/619: readlink d15/d16/d19/d3d/l92 0 2026-03-09T17:29:48.088 INFO:tasks.workunit.client.0.vm06.stdout:5/630: write d4/fb [4423472,87501] 0 2026-03-09T17:29:48.088 INFO:tasks.workunit.client.0.vm06.stdout:5/631: stat d4/d50/l4c 0 2026-03-09T17:29:48.091 INFO:tasks.workunit.client.0.vm06.stdout:1/616: unlink d11/d14/d1d/d42/d46/d92/dc0/c23 0 2026-03-09T17:29:48.096 INFO:tasks.workunit.client.0.vm06.stdout:7/749: dwrite d5/dd/d79/f97 [0,4194304] 0 2026-03-09T17:29:48.101 INFO:tasks.workunit.client.0.vm06.stdout:7/750: dwrite d5/d1f/d34/fd2 [0,4194304] 0 2026-03-09T17:29:48.106 INFO:tasks.workunit.client.0.vm06.stdout:7/751: read d5/d7/d2b/dbd/fbe [668824,35682] 0 2026-03-09T17:29:48.114 INFO:tasks.workunit.client.0.vm06.stdout:0/768: rmdir d7/d11/d5d/d64 39 2026-03-09T17:29:48.116 INFO:tasks.workunit.client.0.vm06.stdout:2/570: rename d3/d4/d12/d2b/d2d/f9b to d3/d4/d12/d2b/d36/fb9 0 2026-03-09T17:29:48.117 INFO:tasks.workunit.client.0.vm06.stdout:2/571: dread - d3/d4/d12/d2b/d2d/f9d zero size 2026-03-09T17:29:48.119 INFO:tasks.workunit.client.0.vm06.stdout:4/655: symlink db/d1d/d21/d26/d89/dab/dae/dba/lee 0 2026-03-09T17:29:48.129 INFO:tasks.workunit.client.0.vm06.stdout:6/537: dwrite d6/d4f/f3c 
[0,4194304] 0 2026-03-09T17:29:48.130 INFO:tasks.workunit.client.0.vm06.stdout:6/538: fsync d6/d47/f49 0 2026-03-09T17:29:48.147 INFO:tasks.workunit.client.0.vm06.stdout:3/687: write dd/d19/d28/feb [2198519,76659] 0 2026-03-09T17:29:48.148 INFO:tasks.workunit.client.0.vm06.stdout:3/688: readlink dd/d1d/d6e/d70/ld5 0 2026-03-09T17:29:48.151 INFO:tasks.workunit.client.0.vm06.stdout:9/710: link d3/d6d/d9a/fb4 d3/d15/d36/d4c/d6a/d8a/fe0 0 2026-03-09T17:29:48.152 INFO:tasks.workunit.client.0.vm06.stdout:9/711: write d3/d11/d65/f71 [2046725,65643] 0 2026-03-09T17:29:48.153 INFO:tasks.workunit.client.0.vm06.stdout:9/712: chown d3/d26/d6c/d68/f7f 765 1 2026-03-09T17:29:48.154 INFO:tasks.workunit.client.0.vm06.stdout:1/617: dread d11/d14/d1d/f7c [0,4194304] 0 2026-03-09T17:29:48.162 INFO:tasks.workunit.client.0.vm06.stdout:2/572: mkdir d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba 0 2026-03-09T17:29:48.184 INFO:tasks.workunit.client.0.vm06.stdout:5/632: dread d4/d50/f1d [0,4194304] 0 2026-03-09T17:29:48.184 INFO:tasks.workunit.client.0.vm06.stdout:6/539: truncate d6/d12/d17/f29 2811723 0 2026-03-09T17:29:48.184 INFO:tasks.workunit.client.0.vm06.stdout:6/540: fdatasync d6/d47/d96/f37 0 2026-03-09T17:29:48.184 INFO:tasks.workunit.client.0.vm06.stdout:3/689: creat dd/d1d/d2e/d67/fed x:0 0 0 2026-03-09T17:29:48.184 INFO:tasks.workunit.client.0.vm06.stdout:3/690: dwrite dd/d5b/fa7 [0,4194304] 0 2026-03-09T17:29:48.185 INFO:tasks.workunit.client.0.vm06.stdout:1/618: creat d11/d14/d1d/d42/fcb x:0 0 0 2026-03-09T17:29:48.186 INFO:tasks.workunit.client.0.vm06.stdout:7/752: rename d5/d1f/f56 to d5/dd/fdc 0 2026-03-09T17:29:48.187 INFO:tasks.workunit.client.0.vm06.stdout:2/573: creat d3/d4/d12/da7/fbb x:0 0 0 2026-03-09T17:29:48.188 INFO:tasks.workunit.client.0.vm06.stdout:2/574: chown d3/d4/d46/da5 81351 1 2026-03-09T17:29:48.189 INFO:tasks.workunit.client.0.vm06.stdout:2/575: truncate d3/d4/d22/fb2 518560 0 2026-03-09T17:29:48.190 INFO:tasks.workunit.client.0.vm06.stdout:5/633: mknod 
d4/d22/ce8 0 2026-03-09T17:29:48.192 INFO:tasks.workunit.client.0.vm06.stdout:3/691: creat dd/d1d/d2e/d67/fee x:0 0 0 2026-03-09T17:29:48.192 INFO:tasks.workunit.client.0.vm06.stdout:3/692: dread - dd/d19/d25/d2d/fce zero size 2026-03-09T17:29:48.193 INFO:tasks.workunit.client.0.vm06.stdout:3/693: stat dd/d1d/f4b 0 2026-03-09T17:29:48.194 INFO:tasks.workunit.client.0.vm06.stdout:0/769: creat d7/d11/f10c x:0 0 0 2026-03-09T17:29:48.198 INFO:tasks.workunit.client.0.vm06.stdout:9/713: mkdir d3/de1 0 2026-03-09T17:29:48.198 INFO:tasks.workunit.client.0.vm06.stdout:2/576: stat d3/d4/c16 0 2026-03-09T17:29:48.200 INFO:tasks.workunit.client.0.vm06.stdout:5/634: fsync d4/d22/d46/fb9 0 2026-03-09T17:29:48.201 INFO:tasks.workunit.client.0.vm06.stdout:5/635: readlink d4/l12 0 2026-03-09T17:29:48.204 INFO:tasks.workunit.client.0.vm06.stdout:3/694: mkdir dd/d1d/d2e/d67/def 0 2026-03-09T17:29:48.208 INFO:tasks.workunit.client.0.vm06.stdout:3/695: dwrite dd/d19/d25/d2d/d9b/fc5 [0,4194304] 0 2026-03-09T17:29:48.208 INFO:tasks.workunit.client.0.vm06.stdout:3/696: dread - dd/d19/d25/d44/d80/dd7/fde zero size 2026-03-09T17:29:48.210 INFO:tasks.workunit.client.0.vm06.stdout:9/714: rename d3/dad/lc1 to d3/d15/d36/d4c/d6a/le2 0 2026-03-09T17:29:48.224 INFO:tasks.workunit.client.0.vm06.stdout:3/697: dread dd/f1a [0,4194304] 0 2026-03-09T17:29:48.224 INFO:tasks.workunit.client.0.vm06.stdout:9/715: dwrite d3/d15/d36/fdd [0,4194304] 0 2026-03-09T17:29:48.226 INFO:tasks.workunit.client.0.vm06.stdout:9/716: fdatasync d3/d26/d6c/f3a 0 2026-03-09T17:29:48.227 INFO:tasks.workunit.client.0.vm06.stdout:3/698: truncate dd/d5b/d65/fb7 52370 0 2026-03-09T17:29:48.233 INFO:tasks.workunit.client.0.vm06.stdout:1/619: creat d11/d14/d1d/fcc x:0 0 0 2026-03-09T17:29:48.237 INFO:tasks.workunit.client.0.vm06.stdout:5/636: link d4/d50/d35/d40/d95/fa1 d4/d50/dd6/fe9 0 2026-03-09T17:29:48.237 INFO:tasks.workunit.client.0.vm06.stdout:9/717: fdatasync d3/d11/f1c 0 2026-03-09T17:29:48.240 
INFO:tasks.workunit.client.0.vm06.stdout:1/620: creat d11/d14/d1d/d42/d46/fcd x:0 0 0 2026-03-09T17:29:48.246 INFO:tasks.workunit.client.0.vm06.stdout:9/718: creat d3/d15/d36/d83/fe3 x:0 0 0 2026-03-09T17:29:48.247 INFO:tasks.workunit.client.0.vm06.stdout:9/719: mknod d3/d11/d65/d80/ce4 0 2026-03-09T17:29:48.247 INFO:tasks.workunit.client.0.vm06.stdout:3/699: dread dd/f4a [0,4194304] 0 2026-03-09T17:29:48.248 INFO:tasks.workunit.client.0.vm06.stdout:1/621: getdents d11/d14/d1d/d1e/d2a/d99/db0 0 2026-03-09T17:29:48.251 INFO:tasks.workunit.client.0.vm06.stdout:1/622: mkdir d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce 0 2026-03-09T17:29:48.253 INFO:tasks.workunit.client.0.vm06.stdout:1/623: creat d11/d14/d1d/d42/d46/fcf x:0 0 0 2026-03-09T17:29:48.269 INFO:tasks.workunit.client.0.vm06.stdout:0/770: read d7/f76 [3813083,106507] 0 2026-03-09T17:29:48.270 INFO:tasks.workunit.client.0.vm06.stdout:0/771: chown d7/d11/d5d/db8 1 1 2026-03-09T17:29:48.270 INFO:tasks.workunit.client.0.vm06.stdout:0/772: dread - d7/d11/d19/fe5 zero size 2026-03-09T17:29:48.296 INFO:tasks.workunit.client.0.vm06.stdout:8/620: dwrite d15/d16/d19/d71/f80 [0,4194304] 0 2026-03-09T17:29:48.308 INFO:tasks.workunit.client.0.vm06.stdout:2/577: sync 2026-03-09T17:29:48.308 INFO:tasks.workunit.client.0.vm06.stdout:5/637: sync 2026-03-09T17:29:48.308 INFO:tasks.workunit.client.0.vm06.stdout:0/773: sync 2026-03-09T17:29:48.312 INFO:tasks.workunit.client.0.vm06.stdout:3/700: dread dd/d19/d2c/f30 [0,4194304] 0 2026-03-09T17:29:48.314 INFO:tasks.workunit.client.0.vm06.stdout:1/624: dread d11/d14/d1d/d42/d46/d92/dc0/f4c [0,4194304] 0 2026-03-09T17:29:48.317 INFO:tasks.workunit.client.0.vm06.stdout:8/621: creat d15/d39/d67/fd0 x:0 0 0 2026-03-09T17:29:48.321 INFO:tasks.workunit.client.0.vm06.stdout:4/656: write db/d1d/d21/d25/d4b/d85/f98 [622602,42227] 0 2026-03-09T17:29:48.321 INFO:tasks.workunit.client.0.vm06.stdout:4/657: rename db/d1d/d21/d26 to db/d1d/d21/d26/d89/dab/def 22 2026-03-09T17:29:48.330 
INFO:tasks.workunit.client.0.vm06.stdout:2/578: symlink d3/d4/d12/da7/lbc 0 2026-03-09T17:29:48.335 INFO:tasks.workunit.client.0.vm06.stdout:3/701: fdatasync dd/d1d/f4b 0 2026-03-09T17:29:48.336 INFO:tasks.workunit.client.0.vm06.stdout:3/702: chown dd/d59 105302 1 2026-03-09T17:29:48.349 INFO:tasks.workunit.client.0.vm06.stdout:7/753: getdents d5/d1f 0 2026-03-09T17:29:48.352 INFO:tasks.workunit.client.0.vm06.stdout:0/774: dread d7/d11/d19/d1d/f8a [0,4194304] 0 2026-03-09T17:29:48.361 INFO:tasks.workunit.client.0.vm06.stdout:8/622: link d15/d16/d19/d71/f65 d15/d16/d1e/d30/db8/d5e/fd1 0 2026-03-09T17:29:48.362 INFO:tasks.workunit.client.0.vm06.stdout:8/623: chown d15/d16/d1e/d30/db8/d5e 141964 1 2026-03-09T17:29:48.363 INFO:tasks.workunit.client.0.vm06.stdout:8/624: dread - d15/d39/d67/fd0 zero size 2026-03-09T17:29:48.365 INFO:tasks.workunit.client.0.vm06.stdout:7/754: fdatasync d5/d1f/d34/d46/d51/f92 0 2026-03-09T17:29:48.367 INFO:tasks.workunit.client.0.vm06.stdout:7/755: read d5/d1f/d34/d46/f4e [4042658,38974] 0 2026-03-09T17:29:48.369 INFO:tasks.workunit.client.0.vm06.stdout:4/658: truncate db/d59/d5f/d45/f61 214862 0 2026-03-09T17:29:48.371 INFO:tasks.workunit.client.0.vm06.stdout:0/775: truncate d7/d11/d2d/fdf 929296 0 2026-03-09T17:29:48.374 INFO:tasks.workunit.client.0.vm06.stdout:6/541: dwrite d6/d12/d17/d65/f72 [0,4194304] 0 2026-03-09T17:29:48.375 INFO:tasks.workunit.client.0.vm06.stdout:6/542: write d6/d12/d17/f78 [6035260,24412] 0 2026-03-09T17:29:48.376 INFO:tasks.workunit.client.0.vm06.stdout:2/579: mknod d3/d4/d12/da7/db3/cbd 0 2026-03-09T17:29:48.388 INFO:tasks.workunit.client.0.vm06.stdout:3/703: mkdir dd/d19/d25/df0 0 2026-03-09T17:29:48.402 INFO:tasks.workunit.client.0.vm06.stdout:0/776: rmdir d7/d11/d89/d99 39 2026-03-09T17:29:48.403 INFO:tasks.workunit.client.0.vm06.stdout:0/777: read d7/f14 [1760435,27464] 0 2026-03-09T17:29:48.405 INFO:tasks.workunit.client.0.vm06.stdout:6/543: mkdir d6/d12/d2d/db3 0 2026-03-09T17:29:48.408 
INFO:tasks.workunit.client.0.vm06.stdout:2/580: rmdir d3/d4/d22/d72/d8f 39 2026-03-09T17:29:48.414 INFO:tasks.workunit.client.0.vm06.stdout:9/720: write d3/d2c/f81 [359908,67274] 0 2026-03-09T17:29:48.421 INFO:tasks.workunit.client.0.vm06.stdout:0/778: mkdir d7/d88/d10d 0 2026-03-09T17:29:48.424 INFO:tasks.workunit.client.0.vm06.stdout:2/581: creat d3/d4/d12/da7/db3/fbe x:0 0 0 2026-03-09T17:29:48.426 INFO:tasks.workunit.client.0.vm06.stdout:5/638: write d4/d22/f3f [157481,78277] 0 2026-03-09T17:29:48.426 INFO:tasks.workunit.client.0.vm06.stdout:1/625: link d11/c12 d11/d14/d1d/d42/d46/d92/dc0/cd0 0 2026-03-09T17:29:48.432 INFO:tasks.workunit.client.0.vm06.stdout:8/625: rename d15/da6 to d15/d39/dd2 0 2026-03-09T17:29:48.434 INFO:tasks.workunit.client.0.vm06.stdout:6/544: getdents d6/d47/d8a 0 2026-03-09T17:29:48.440 INFO:tasks.workunit.client.0.vm06.stdout:5/639: rmdir d4/da4 39 2026-03-09T17:29:48.440 INFO:tasks.workunit.client.0.vm06.stdout:5/640: stat d4/d22/c47 0 2026-03-09T17:29:48.441 INFO:tasks.workunit.client.0.vm06.stdout:5/641: read - d4/d50/d35/d40/d6f/fd0 zero size 2026-03-09T17:29:48.446 INFO:tasks.workunit.client.0.vm06.stdout:3/704: rename dd/d5c to dd/d19/d25/d2d/d9b/df1 0 2026-03-09T17:29:48.447 INFO:tasks.workunit.client.0.vm06.stdout:8/626: stat d15/c6b 0 2026-03-09T17:29:48.450 INFO:tasks.workunit.client.0.vm06.stdout:4/659: write db/f13 [4910021,87918] 0 2026-03-09T17:29:48.458 INFO:tasks.workunit.client.0.vm06.stdout:2/582: creat d3/d4/d22/d72/d8f/fbf x:0 0 0 2026-03-09T17:29:48.460 INFO:tasks.workunit.client.0.vm06.stdout:2/583: readlink d3/d4/d12/d2b/la4 0 2026-03-09T17:29:48.460 INFO:tasks.workunit.client.0.vm06.stdout:2/584: chown d3/d4/d12/d2b/fb6 15 1 2026-03-09T17:29:48.468 INFO:tasks.workunit.client.0.vm06.stdout:7/756: truncate d5/f18 2105465 0 2026-03-09T17:29:48.472 INFO:tasks.workunit.client.0.vm06.stdout:1/626: mkdir d11/d14/d1d/dd1 0 2026-03-09T17:29:48.474 INFO:tasks.workunit.client.0.vm06.stdout:9/721: write d3/d6d/f78 
[658842,73260] 0 2026-03-09T17:29:48.476 INFO:tasks.workunit.client.0.vm06.stdout:0/779: write d7/ffa [2934900,17262] 0 2026-03-09T17:29:48.481 INFO:tasks.workunit.client.0.vm06.stdout:3/705: symlink dd/d59/lf2 0 2026-03-09T17:29:48.481 INFO:tasks.workunit.client.0.vm06.stdout:0/780: dread d7/d11/d19/d23/db7/dbd/fc0 [0,4194304] 0 2026-03-09T17:29:48.484 INFO:tasks.workunit.client.0.vm06.stdout:4/660: creat db/d1d/d21/d26/d7a/ff0 x:0 0 0 2026-03-09T17:29:48.495 INFO:tasks.workunit.client.0.vm06.stdout:8/627: write d15/d16/d1e/d30/db8/d5e/f98 [390596,44211] 0 2026-03-09T17:29:48.497 INFO:tasks.workunit.client.0.vm06.stdout:5/642: mknod d4/d52/db4/dc2/cea 0 2026-03-09T17:29:48.499 INFO:tasks.workunit.client.0.vm06.stdout:1/627: rename d11/d14/d1d/d4a/fa2 to d11/d14/d1c/d3a/fd2 0 2026-03-09T17:29:48.499 INFO:tasks.workunit.client.0.vm06.stdout:1/628: readlink d11/d14/d1c/l29 0 2026-03-09T17:29:48.500 INFO:tasks.workunit.client.0.vm06.stdout:1/629: write d11/d14/d1d/d42/d46/d92/dc0/f21 [4409473,51428] 0 2026-03-09T17:29:48.501 INFO:tasks.workunit.client.0.vm06.stdout:9/722: truncate d3/d26/d6c/d68/f9b 946436 0 2026-03-09T17:29:48.506 INFO:tasks.workunit.client.0.vm06.stdout:3/706: mkdir dd/d19/d1e/db8/df3 0 2026-03-09T17:29:48.508 INFO:tasks.workunit.client.0.vm06.stdout:0/781: mkdir d7/d11/d19/d3c/db9/ddd/d10e 0 2026-03-09T17:29:48.509 INFO:tasks.workunit.client.0.vm06.stdout:0/782: chown d7/d11/d19/d23/db7/dbd/ca9 22 1 2026-03-09T17:29:48.509 INFO:tasks.workunit.client.0.vm06.stdout:0/783: chown d7/d11/d19/d8b/da4/d85 61593668 1 2026-03-09T17:29:48.511 INFO:tasks.workunit.client.0.vm06.stdout:3/707: dwrite dd/d81/da3/fbc [0,4194304] 0 2026-03-09T17:29:48.521 INFO:tasks.workunit.client.0.vm06.stdout:6/545: truncate d6/d12/d17/f78 4078472 0 2026-03-09T17:29:48.523 INFO:tasks.workunit.client.0.vm06.stdout:2/585: creat d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/fc0 x:0 0 0 2026-03-09T17:29:48.526 INFO:tasks.workunit.client.0.vm06.stdout:7/757: dwrite d5/dd/f22 [0,4194304] 0 
2026-03-09T17:29:48.528 INFO:tasks.workunit.client.0.vm06.stdout:7/758: write d5/d1f/d34/d46/d51/fda [372525,43883] 0 2026-03-09T17:29:48.538 INFO:tasks.workunit.client.0.vm06.stdout:5/643: mknod d4/d52/db4/dc2/ceb 0 2026-03-09T17:29:48.541 INFO:tasks.workunit.client.0.vm06.stdout:9/723: symlink d3/d15/d36/d83/le5 0 2026-03-09T17:29:48.542 INFO:tasks.workunit.client.0.vm06.stdout:9/724: read - d3/d15/f5e zero size 2026-03-09T17:29:48.546 INFO:tasks.workunit.client.0.vm06.stdout:3/708: write dd/d59/da1/fa4 [1640644,41271] 0 2026-03-09T17:29:48.549 INFO:tasks.workunit.client.0.vm06.stdout:6/546: rmdir d6/d47/d4d 39 2026-03-09T17:29:48.553 INFO:tasks.workunit.client.0.vm06.stdout:7/759: creat d5/dd/d79/d7f/fdd x:0 0 0 2026-03-09T17:29:48.557 INFO:tasks.workunit.client.0.vm06.stdout:9/725: mknod d3/d6d/d9a/d9c/dcd/ce6 0 2026-03-09T17:29:48.558 INFO:tasks.workunit.client.0.vm06.stdout:9/726: write d3/d26/d6c/f5b [4151359,55121] 0 2026-03-09T17:29:48.558 INFO:tasks.workunit.client.0.vm06.stdout:9/727: fsync d3/d15/d48/fda 0 2026-03-09T17:29:48.562 INFO:tasks.workunit.client.0.vm06.stdout:3/709: dread - dd/d1d/d6e/f9a zero size 2026-03-09T17:29:48.563 INFO:tasks.workunit.client.0.vm06.stdout:6/547: creat d6/d47/d96/d40/fb4 x:0 0 0 2026-03-09T17:29:48.565 INFO:tasks.workunit.client.0.vm06.stdout:7/760: creat d5/dd/d79/d7f/fde x:0 0 0 2026-03-09T17:29:48.572 INFO:tasks.workunit.client.0.vm06.stdout:9/728: unlink d3/d15/d16/l82 0 2026-03-09T17:29:48.574 INFO:tasks.workunit.client.0.vm06.stdout:0/784: truncate d7/d11/f29 2000751 0 2026-03-09T17:29:48.574 INFO:tasks.workunit.client.0.vm06.stdout:0/785: chown d7/c3e 183791083 1 2026-03-09T17:29:48.575 INFO:tasks.workunit.client.0.vm06.stdout:0/786: readlink d7/d11/d2d/dca/le2 0 2026-03-09T17:29:48.577 INFO:tasks.workunit.client.0.vm06.stdout:4/661: dwrite db/d1d/d21/d44/fb7 [4194304,4194304] 0 2026-03-09T17:29:48.592 INFO:tasks.workunit.client.0.vm06.stdout:5/644: write d4/d50/d18/f4a [1722370,40405] 0 2026-03-09T17:29:48.593 
INFO:tasks.workunit.client.0.vm06.stdout:7/761: creat d5/dd/dc5/dad/fdf x:0 0 0 2026-03-09T17:29:48.593 INFO:tasks.workunit.client.0.vm06.stdout:8/628: getdents d15/d16 0 2026-03-09T17:29:48.594 INFO:tasks.workunit.client.0.vm06.stdout:5/645: chown d4/d22/d64 17 1 2026-03-09T17:29:48.595 INFO:tasks.workunit.client.0.vm06.stdout:8/629: dread - d15/d16/d1e/d30/db8/d5e/fb9 zero size 2026-03-09T17:29:48.595 INFO:tasks.workunit.client.0.vm06.stdout:5/646: chown d4/d50/f1e 664247 1 2026-03-09T17:29:48.595 INFO:tasks.workunit.client.0.vm06.stdout:8/630: chown d15/d31 53 1 2026-03-09T17:29:48.596 INFO:tasks.workunit.client.0.vm06.stdout:1/630: dwrite d11/d14/d1d/d1e/d2a/d34/d58/fa1 [4194304,4194304] 0 2026-03-09T17:29:48.603 INFO:tasks.workunit.client.0.vm06.stdout:8/631: read d15/d16/d1a/d47/fa5 [182934,12931] 0 2026-03-09T17:29:48.607 INFO:tasks.workunit.client.0.vm06.stdout:9/729: write d3/d15/d36/d4d/f61 [3097510,126741] 0 2026-03-09T17:29:48.607 INFO:tasks.workunit.client.0.vm06.stdout:0/787: mkdir d7/d11/d19/d37/d10f 0 2026-03-09T17:29:48.608 INFO:tasks.workunit.client.0.vm06.stdout:4/662: unlink db/df/l79 0 2026-03-09T17:29:48.610 INFO:tasks.workunit.client.0.vm06.stdout:3/710: truncate dd/f10 7506523 0 2026-03-09T17:29:48.613 INFO:tasks.workunit.client.0.vm06.stdout:2/586: getdents d3/d4/d22/d72 0 2026-03-09T17:29:48.614 INFO:tasks.workunit.client.0.vm06.stdout:7/762: creat d5/dd/dc5/d64/fe0 x:0 0 0 2026-03-09T17:29:48.618 INFO:tasks.workunit.client.0.vm06.stdout:1/631: read - d11/d14/d1c/d3a/fd2 zero size 2026-03-09T17:29:48.620 INFO:tasks.workunit.client.0.vm06.stdout:0/788: mknod d7/d11/d2d/dca/c110 0 2026-03-09T17:29:48.621 INFO:tasks.workunit.client.0.vm06.stdout:0/789: write d7/d11/d19/d1d/fec [464830,60125] 0 2026-03-09T17:29:48.622 INFO:tasks.workunit.client.0.vm06.stdout:4/663: write db/d1d/d21/d26/d89/dab/dae/fdd [7500445,108460] 0 2026-03-09T17:29:48.627 INFO:tasks.workunit.client.0.vm06.stdout:4/664: dwrite db/d1d/d21/d44/fb7 [8388608,4194304] 0 
2026-03-09T17:29:48.630 INFO:tasks.workunit.client.0.vm06.stdout:4/665: dread - db/d1d/d21/d37/d69/d78/feb zero size 2026-03-09T17:29:48.637 INFO:tasks.workunit.client.0.vm06.stdout:3/711: symlink dd/d19/d1e/lf4 0 2026-03-09T17:29:48.638 INFO:tasks.workunit.client.0.vm06.stdout:6/548: symlink d6/d47/d4d/da0/lb5 0 2026-03-09T17:29:48.638 INFO:tasks.workunit.client.0.vm06.stdout:6/549: readlink d6/d47/l58 0 2026-03-09T17:29:48.643 INFO:tasks.workunit.client.0.vm06.stdout:5/647: unlink d4/d50/d35/d40/c9c 0 2026-03-09T17:29:48.644 INFO:tasks.workunit.client.0.vm06.stdout:6/550: dwrite d6/d12/d17/d65/f72 [0,4194304] 0 2026-03-09T17:29:48.656 INFO:tasks.workunit.client.0.vm06.stdout:1/632: symlink d11/d14/d1d/d8c/ld3 0 2026-03-09T17:29:48.658 INFO:tasks.workunit.client.0.vm06.stdout:1/633: truncate d11/d14/d1d/d1e/d2a/fba 771969 0 2026-03-09T17:29:48.683 INFO:tasks.workunit.client.0.vm06.stdout:2/587: mkdir d3/d4/d12/d2b/db0/dc1 0 2026-03-09T17:29:48.684 INFO:tasks.workunit.client.0.vm06.stdout:3/712: mkdir dd/d81/d97/df5 0 2026-03-09T17:29:48.685 INFO:tasks.workunit.client.0.vm06.stdout:8/632: write d15/d39/d3c/d6c/f8b [1736954,41647] 0 2026-03-09T17:29:48.691 INFO:tasks.workunit.client.0.vm06.stdout:5/648: mkdir d4/d22/d46/dec 0 2026-03-09T17:29:48.694 INFO:tasks.workunit.client.0.vm06.stdout:5/649: dwrite d4/f71 [0,4194304] 0 2026-03-09T17:29:48.696 INFO:tasks.workunit.client.0.vm06.stdout:5/650: chown d4/f1f 228594281 1 2026-03-09T17:29:48.696 INFO:tasks.workunit.client.0.vm06.stdout:5/651: chown d4/f49 1832718 1 2026-03-09T17:29:48.701 INFO:tasks.workunit.client.0.vm06.stdout:5/652: dwrite d4/d50/db2/fe2 [0,4194304] 0 2026-03-09T17:29:48.703 INFO:tasks.workunit.client.0.vm06.stdout:5/653: chown d4/d22/l2c 37307580 1 2026-03-09T17:29:48.714 INFO:tasks.workunit.client.0.vm06.stdout:1/634: rename d11/d14/d1d/d42/d46/l88 to d11/d14/d1c/d5f/ld4 0 2026-03-09T17:29:48.715 INFO:tasks.workunit.client.0.vm06.stdout:1/635: readlink d11/d14/d1d/d42/la0 0 2026-03-09T17:29:48.719 
INFO:tasks.workunit.client.0.vm06.stdout:7/763: write d5/dd/f29 [5251162,35541] 0 2026-03-09T17:29:48.719 INFO:tasks.workunit.client.0.vm06.stdout:7/764: readlink d5/d1f/d34/d3f/l43 0 2026-03-09T17:29:48.724 INFO:tasks.workunit.client.0.vm06.stdout:0/790: symlink d7/d11/d19/l111 0 2026-03-09T17:29:48.724 INFO:tasks.workunit.client.0.vm06.stdout:0/791: stat d7/d11/d19/fe5 0 2026-03-09T17:29:48.729 INFO:tasks.workunit.client.0.vm06.stdout:2/588: creat d3/d4/d12/da7/db3/fc2 x:0 0 0 2026-03-09T17:29:48.731 INFO:tasks.workunit.client.0.vm06.stdout:3/713: creat dd/d5b/d65/ff6 x:0 0 0 2026-03-09T17:29:48.733 INFO:tasks.workunit.client.0.vm06.stdout:8/633: fsync d15/d16/d1e/f4e 0 2026-03-09T17:29:48.735 INFO:tasks.workunit.client.0.vm06.stdout:8/634: read d15/d16/d1e/f8c [155402,64721] 0 2026-03-09T17:29:48.735 INFO:tasks.workunit.client.0.vm06.stdout:5/654: creat d4/d50/d35/d40/d6f/fed x:0 0 0 2026-03-09T17:29:48.741 INFO:tasks.workunit.client.0.vm06.stdout:9/730: getdents d3/d11/d65 0 2026-03-09T17:29:48.744 INFO:tasks.workunit.client.0.vm06.stdout:4/666: link db/d1d/d21/d44/d8a/fa3 db/d1d/d21/d26/d89/dab/dae/dba/ff1 0 2026-03-09T17:29:48.745 INFO:tasks.workunit.client.0.vm06.stdout:2/589: fdatasync d3/d4/d12/f2e 0 2026-03-09T17:29:48.746 INFO:tasks.workunit.client.0.vm06.stdout:2/590: write d3/d4/d12/d71/daa/d77/d81/d64/f9a [1811094,56345] 0 2026-03-09T17:29:48.747 INFO:tasks.workunit.client.0.vm06.stdout:3/714: creat dd/d19/d28/ff7 x:0 0 0 2026-03-09T17:29:48.747 INFO:tasks.workunit.client.0.vm06.stdout:3/715: write dd/d19/d25/d2d/fe8 [620743,9580] 0 2026-03-09T17:29:48.754 INFO:tasks.workunit.client.0.vm06.stdout:8/635: truncate d15/d39/f4b 4327634 0 2026-03-09T17:29:48.755 INFO:tasks.workunit.client.0.vm06.stdout:5/655: truncate d4/d50/f1d 4572717 0 2026-03-09T17:29:48.757 INFO:tasks.workunit.client.0.vm06.stdout:8/636: dwrite d15/d39/d3c/f5d [0,4194304] 0 2026-03-09T17:29:48.759 INFO:tasks.workunit.client.0.vm06.stdout:8/637: dread - d15/d16/d19/d3d/fc0 zero size 
2026-03-09T17:29:48.762 INFO:tasks.workunit.client.0.vm06.stdout:4/667: creat db/d1d/d21/d25/d4b/ff2 x:0 0 0 2026-03-09T17:29:48.770 INFO:tasks.workunit.client.0.vm06.stdout:2/591: symlink d3/d4/d22/lc3 0 2026-03-09T17:29:48.772 INFO:tasks.workunit.client.0.vm06.stdout:3/716: mkdir dd/d81/da3/dae/df8 0 2026-03-09T17:29:48.780 INFO:tasks.workunit.client.0.vm06.stdout:8/638: read d15/d31/f33 [566321,56472] 0 2026-03-09T17:29:48.787 INFO:tasks.workunit.client.0.vm06.stdout:4/668: fsync db/d1d/d21/d37/d69/f75 0 2026-03-09T17:29:48.788 INFO:tasks.workunit.client.0.vm06.stdout:6/551: dwrite d6/d4f/f26 [4194304,4194304] 0 2026-03-09T17:29:48.804 INFO:tasks.workunit.client.0.vm06.stdout:3/717: unlink dd/d59/da1/fc8 0 2026-03-09T17:29:48.819 INFO:tasks.workunit.client.0.vm06.stdout:4/669: chown db/df/l7e 4 1 2026-03-09T17:29:48.820 INFO:tasks.workunit.client.0.vm06.stdout:3/718: dread dd/f15 [4194304,4194304] 0 2026-03-09T17:29:48.820 INFO:tasks.workunit.client.0.vm06.stdout:3/719: write dd/d19/d25/f56 [578887,45994] 0 2026-03-09T17:29:48.821 INFO:tasks.workunit.client.0.vm06.stdout:3/720: write dd/d81/da3/fc6 [409210,113074] 0 2026-03-09T17:29:48.822 INFO:tasks.workunit.client.0.vm06.stdout:3/721: readlink dd/d19/d25/d44/d80/le5 0 2026-03-09T17:29:48.823 INFO:tasks.workunit.client.0.vm06.stdout:3/722: write dd/d19/d25/d2d/fce [771063,32057] 0 2026-03-09T17:29:48.829 INFO:tasks.workunit.client.0.vm06.stdout:2/592: symlink d3/d4/d12/d2b/d2d/lc4 0 2026-03-09T17:29:48.830 INFO:tasks.workunit.client.0.vm06.stdout:2/593: readlink d3/d4/d12/d71/daa/d77/d81/d64/laf 0 2026-03-09T17:29:48.831 INFO:tasks.workunit.client.0.vm06.stdout:2/594: dread - d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/fc0 zero size 2026-03-09T17:29:48.832 INFO:tasks.workunit.client.0.vm06.stdout:2/595: write d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/fc0 [649190,65475] 0 2026-03-09T17:29:48.838 INFO:tasks.workunit.client.0.vm06.stdout:1/636: write d11/d14/d1d/d42/d46/d92/dc0/f45 [4899303,115419] 0 2026-03-09T17:29:48.839 
INFO:tasks.workunit.client.0.vm06.stdout:1/637: fdatasync d11/d14/d1d/d42/d46/d92/dc0/f21 0 2026-03-09T17:29:48.840 INFO:tasks.workunit.client.0.vm06.stdout:8/639: mknod d15/d16/d19/d3d/d5f/d83/dc1/cd3 0 2026-03-09T17:29:48.843 INFO:tasks.workunit.client.0.vm06.stdout:4/670: creat db/d57/ff3 x:0 0 0 2026-03-09T17:29:48.843 INFO:tasks.workunit.client.0.vm06.stdout:7/765: dwrite d5/d7/fb4 [0,4194304] 0 2026-03-09T17:29:48.845 INFO:tasks.workunit.client.0.vm06.stdout:4/671: write db/d1d/d21/d25/d4b/fe9 [948340,78333] 0 2026-03-09T17:29:48.846 INFO:tasks.workunit.client.0.vm06.stdout:4/672: write db/d1d/d21/d44/dc1/fd8 [849658,814] 0 2026-03-09T17:29:48.852 INFO:tasks.workunit.client.0.vm06.stdout:0/792: dwrite d7/f56 [4194304,4194304] 0 2026-03-09T17:29:48.856 INFO:tasks.workunit.client.0.vm06.stdout:6/552: mknod d6/d47/d96/da1/cb6 0 2026-03-09T17:29:48.857 INFO:tasks.workunit.client.0.vm06.stdout:6/553: truncate d6/d47/d96/f3d 1913640 0 2026-03-09T17:29:48.858 INFO:tasks.workunit.client.0.vm06.stdout:2/596: symlink d3/d4/d12/d71/daa/d77/lc5 0 2026-03-09T17:29:48.859 INFO:tasks.workunit.client.0.vm06.stdout:2/597: write d3/d4/d12/da7/fbb [484605,68736] 0 2026-03-09T17:29:48.870 INFO:tasks.workunit.client.0.vm06.stdout:7/766: symlink d5/d1f/d34/d46/d51/le1 0 2026-03-09T17:29:48.871 INFO:tasks.workunit.client.0.vm06.stdout:4/673: unlink db/d59/d5f/d45/l56 0 2026-03-09T17:29:48.874 INFO:tasks.workunit.client.0.vm06.stdout:3/723: creat dd/d19/d1e/db8/df3/ff9 x:0 0 0 2026-03-09T17:29:48.876 INFO:tasks.workunit.client.0.vm06.stdout:2/598: creat d3/d4/d46/fc6 x:0 0 0 2026-03-09T17:29:48.878 INFO:tasks.workunit.client.0.vm06.stdout:1/638: symlink d11/d14/d1d/d1e/d2a/d34/d64/ld5 0 2026-03-09T17:29:48.881 INFO:tasks.workunit.client.0.vm06.stdout:1/639: dread d11/d14/d1d/d42/d46/d92/dc0/f45 [0,4194304] 0 2026-03-09T17:29:48.882 INFO:tasks.workunit.client.0.vm06.stdout:1/640: read - d11/d14/d1c/d3a/fc3 zero size 2026-03-09T17:29:48.885 
INFO:tasks.workunit.client.0.vm06.stdout:5/656: dwrite d4/d50/d18/d3d/f54 [0,4194304] 0 2026-03-09T17:29:48.889 INFO:tasks.workunit.client.0.vm06.stdout:5/657: chown d4/d50/d18/f8c 3 1 2026-03-09T17:29:48.889 INFO:tasks.workunit.client.0.vm06.stdout:7/767: creat d5/d7/dac/fe2 x:0 0 0 2026-03-09T17:29:48.890 INFO:tasks.workunit.client.0.vm06.stdout:3/724: creat dd/d19/d25/d2d/ffa x:0 0 0 2026-03-09T17:29:48.891 INFO:tasks.workunit.client.0.vm06.stdout:7/768: write d5/d1f/d34/d46/f89 [461130,62001] 0 2026-03-09T17:29:48.892 INFO:tasks.workunit.client.0.vm06.stdout:9/731: dwrite d3/d26/d6c/d68/f9b [0,4194304] 0 2026-03-09T17:29:48.903 INFO:tasks.workunit.client.0.vm06.stdout:2/599: unlink d3/d4/d12/d71/daa/d77/d81/d64/f9a 0 2026-03-09T17:29:48.907 INFO:tasks.workunit.client.0.vm06.stdout:5/658: fdatasync d4/d52/f8a 0 2026-03-09T17:29:48.907 INFO:tasks.workunit.client.0.vm06.stdout:5/659: chown d4/d50 1519 1 2026-03-09T17:29:48.911 INFO:tasks.workunit.client.0.vm06.stdout:9/732: creat d3/d15/d36/d4c/fe7 x:0 0 0 2026-03-09T17:29:48.914 INFO:tasks.workunit.client.0.vm06.stdout:2/600: fsync d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab 0 2026-03-09T17:29:48.915 INFO:tasks.workunit.client.0.vm06.stdout:8/640: getdents d15/d16/d19/d3d/d5f 0 2026-03-09T17:29:48.916 INFO:tasks.workunit.client.0.vm06.stdout:1/641: mkdir d11/d14/d1d/d1e/dd6 0 2026-03-09T17:29:48.917 INFO:tasks.workunit.client.0.vm06.stdout:1/642: dread - d11/d14/d1d/d42/fcb zero size 2026-03-09T17:29:48.919 INFO:tasks.workunit.client.0.vm06.stdout:5/660: mkdir d4/d52/d55/dee 0 2026-03-09T17:29:48.920 INFO:tasks.workunit.client.0.vm06.stdout:3/725: mknod dd/d81/da3/dae/df8/cfb 0 2026-03-09T17:29:48.921 INFO:tasks.workunit.client.0.vm06.stdout:3/726: write dd/d19/d28/fab [980921,74709] 0 2026-03-09T17:29:48.923 INFO:tasks.workunit.client.0.vm06.stdout:7/769: creat d5/dd/dc5/d64/d6b/dd1/fe3 x:0 0 0 2026-03-09T17:29:48.923 INFO:tasks.workunit.client.0.vm06.stdout:9/733: dread - d3/d15/d48/da8/db9/faf zero size 
2026-03-09T17:29:48.926 INFO:tasks.workunit.client.0.vm06.stdout:8/641: fdatasync d15/d16/d1a/d47/f9c 0 2026-03-09T17:29:48.930 INFO:tasks.workunit.client.0.vm06.stdout:8/642: dwrite d15/d16/d19/d2b/f63 [0,4194304] 0 2026-03-09T17:29:48.933 INFO:tasks.workunit.client.0.vm06.stdout:5/661: mknod d4/d22/cef 0 2026-03-09T17:29:48.933 INFO:tasks.workunit.client.0.vm06.stdout:5/662: chown d4/d50/f24 6 1 2026-03-09T17:29:48.935 INFO:tasks.workunit.client.0.vm06.stdout:5/663: dread - d4/d50/d35/d40/d95/db8/dda/fdd zero size 2026-03-09T17:29:48.935 INFO:tasks.workunit.client.0.vm06.stdout:5/664: chown d4/d22/f77 1458 1 2026-03-09T17:29:48.938 INFO:tasks.workunit.client.0.vm06.stdout:4/674: getdents db/d1d/d21/d26/d89/dab 0 2026-03-09T17:29:48.939 INFO:tasks.workunit.client.0.vm06.stdout:4/675: chown db/d1d/c8d 131087 1 2026-03-09T17:29:48.940 INFO:tasks.workunit.client.0.vm06.stdout:3/727: mknod dd/d19/d25/d2d/d9b/cfc 0 2026-03-09T17:29:48.941 INFO:tasks.workunit.client.0.vm06.stdout:7/770: rename d5/dd/dc5/d64/d6b/l8e to d5/dd/dc5/d64/le4 0 2026-03-09T17:29:48.942 INFO:tasks.workunit.client.0.vm06.stdout:9/734: mkdir d3/d15/d48/da8/db9/de8 0 2026-03-09T17:29:48.944 INFO:tasks.workunit.client.0.vm06.stdout:1/643: mknod d11/d14/d1d/d1e/d2a/d99/db0/cd7 0 2026-03-09T17:29:48.946 INFO:tasks.workunit.client.0.vm06.stdout:5/665: unlink d4/f11 0 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:5/666: chown d4/d22/l3b 15416 1 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:4/676: creat db/d59/d90/ff4 x:0 0 0 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:1/644: unlink d11/d14/d1d/d42/fcb 0 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:5/667: creat d4/d22/d46/ff0 x:0 0 0 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:7/771: mkdir d5/d7/de5 0 2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:9/735: mkdir d3/de9 0 2026-03-09T17:29:48.962 
INFO:tasks.workunit.client.0.vm06.stdout:2/601: getdents d3/d4/d12/d2b/db0 0
2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:9/736: write d3/d15/d48/fda [173191,63421] 0
2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:9/737: write d3/d6d/d9a/d9c/fdf [511298,120913] 0
2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:1/645: dread d11/d14/d1d/f31 [0,4194304] 0
2026-03-09T17:29:48.962 INFO:tasks.workunit.client.0.vm06.stdout:1/646: chown d11/d14/d1c/d3a 6729582 1
2026-03-09T17:29:48.963 INFO:tasks.workunit.client.0.vm06.stdout:5/668: mkdir d4/d50/d35/d40/d95/db8/df1 0
2026-03-09T17:29:48.964 INFO:tasks.workunit.client.0.vm06.stdout:9/738: rmdir d3/d15/d36/d4c/d6a 39
2026-03-09T17:29:48.966 INFO:tasks.workunit.client.0.vm06.stdout:4/677: creat db/ff5 x:0 0 0
2026-03-09T17:29:48.967 INFO:tasks.workunit.client.0.vm06.stdout:4/678: read db/d1d/d21/d26/d89/dab/dae/dcc/fd0 [647432,2351] 0
2026-03-09T17:29:48.971 INFO:tasks.workunit.client.0.vm06.stdout:4/679: dwrite db/d1d/f1f [4194304,4194304] 0
2026-03-09T17:29:48.973 INFO:tasks.workunit.client.0.vm06.stdout:5/669: unlink d4/d50/c89 0
2026-03-09T17:29:48.980 INFO:tasks.workunit.client.0.vm06.stdout:2/602: link d3/d4/f70 d3/fc7 0
2026-03-09T17:29:48.981 INFO:tasks.workunit.client.0.vm06.stdout:2/603: write d3/d4/d12/d71/daa/d77/d81/f50 [2824263,40044] 0
2026-03-09T17:29:48.981 INFO:tasks.workunit.client.0.vm06.stdout:2/604: fsync d3/d4/d46/fa0 0
2026-03-09T17:29:48.982 INFO:tasks.workunit.client.0.vm06.stdout:1/647: link d11/d69/c76 d11/d14/d1c/d3a/db7/cd8 0
2026-03-09T17:29:48.986 INFO:tasks.workunit.client.0.vm06.stdout:9/739: creat d3/d26/d35/fea x:0 0 0
2026-03-09T17:29:48.991 INFO:tasks.workunit.client.0.vm06.stdout:9/740: creat d3/d6d/d9a/feb x:0 0 0
2026-03-09T17:29:48.993 INFO:tasks.workunit.client.0.vm06.stdout:9/741: dread - d3/d26/d35/f6f zero size
2026-03-09T17:29:48.996 INFO:tasks.workunit.client.0.vm06.stdout:9/742: creat d3/d15/d48/da8/db9/de8/fec x:0 0 0
2026-03-09T17:29:49.066 INFO:tasks.workunit.client.0.vm06.stdout:0/793: fsync d7/f56 0
2026-03-09T17:29:49.066 INFO:tasks.workunit.client.0.vm06.stdout:9/743: sync
2026-03-09T17:29:49.066 INFO:tasks.workunit.client.0.vm06.stdout:7/772: sync
2026-03-09T17:29:49.066 INFO:tasks.workunit.client.0.vm06.stdout:9/744: stat d3/c47 0
2026-03-09T17:29:49.067 INFO:tasks.workunit.client.0.vm06.stdout:0/794: chown d7/l63 16340970 1
2026-03-09T17:29:49.069 INFO:tasks.workunit.client.0.vm06.stdout:7/773: chown d5/d1f/d34/d46/d51/lc1 7 1
2026-03-09T17:29:49.070 INFO:tasks.workunit.client.0.vm06.stdout:7/774: write d5/dd/fa6 [229348,89143] 0
2026-03-09T17:29:49.071 INFO:tasks.workunit.client.0.vm06.stdout:0/795: unlink d7/d11/d19/d1d/d39/f4a 0
2026-03-09T17:29:49.079 INFO:tasks.workunit.client.0.vm06.stdout:0/796: link d7/d11/d19/d1d/fb3 d7/d11/d89/da8/db2/dea/f112 0
2026-03-09T17:29:49.081 INFO:tasks.workunit.client.0.vm06.stdout:9/745: getdents d3/d26/d6c 0
2026-03-09T17:29:49.083 INFO:tasks.workunit.client.0.vm06.stdout:7/775: link d5/d1f/d34/d3f/c9a d5/dd/d79/d7f/ce6 0
2026-03-09T17:29:49.085 INFO:tasks.workunit.client.0.vm06.stdout:9/746: dwrite d3/d6d/d9a/d9c/fdf [0,4194304] 0
2026-03-09T17:29:49.091 INFO:tasks.workunit.client.0.vm06.stdout:4/680: read db/d59/d5f/d45/f8e [435719,119563] 0
2026-03-09T17:29:49.100 INFO:tasks.workunit.client.0.vm06.stdout:5/670: read d4/d50/f80 [1093246,98326] 0
2026-03-09T17:29:49.106 INFO:tasks.workunit.client.0.vm06.stdout:5/671: dwrite d4/d50/f43 [0,4194304] 0
2026-03-09T17:29:49.109 INFO:tasks.workunit.client.0.vm06.stdout:0/797: dread d7/f2a [0,4194304] 0
2026-03-09T17:29:49.119 INFO:tasks.workunit.client.0.vm06.stdout:4/681: link db/d1d/f3a db/d1d/d21/d44/ff6 0
2026-03-09T17:29:49.120 INFO:tasks.workunit.client.0.vm06.stdout:9/747: creat d3/d15/fed x:0 0 0
2026-03-09T17:29:49.131 INFO:tasks.workunit.client.0.vm06.stdout:8/643: fdatasync d15/d39/f4b 0
2026-03-09T17:29:49.132 INFO:tasks.workunit.client.0.vm06.stdout:8/644: rmdir d15/d16/d19/d3d/d5f/d83 39
2026-03-09T17:29:49.133 INFO:tasks.workunit.client.0.vm06.stdout:8/645: mkdir d15/d16/d19/d3d/d5f/dd4 0
2026-03-09T17:29:49.135 INFO:tasks.workunit.client.0.vm06.stdout:8/646: mkdir d15/d39/d3c/dd5 0
2026-03-09T17:29:49.136 INFO:tasks.workunit.client.0.vm06.stdout:8/647: read d15/d16/d19/d71/f65 [3618907,65198] 0
2026-03-09T17:29:49.137 INFO:tasks.workunit.client.0.vm06.stdout:8/648: chown d15/d16/d1e/d30/db8/d5e/c90 31612078 1
2026-03-09T17:29:49.137 INFO:tasks.workunit.client.0.vm06.stdout:8/649: write d15/d16/d1e/d30/f3b [5502469,3345] 0
2026-03-09T17:29:49.169 INFO:tasks.workunit.client.0.vm06.stdout:6/554: write d6/d47/d4d/f50 [580574,62630] 0
2026-03-09T17:29:49.173 INFO:tasks.workunit.client.0.vm06.stdout:6/555: creat d6/d47/d96/da1/fb7 x:0 0 0
2026-03-09T17:29:49.246 INFO:tasks.workunit.client.0.vm06.stdout:8/650: truncate d15/d16/d1a/d47/f9c 912388 0
2026-03-09T17:29:49.247 INFO:tasks.workunit.client.0.vm06.stdout:8/651: chown d15/d16 3628 1
2026-03-09T17:29:49.250 INFO:tasks.workunit.client.0.vm06.stdout:3/728: write dd/f26 [4133023,81360] 0
2026-03-09T17:29:49.267 INFO:tasks.workunit.client.0.vm06.stdout:8/652: creat d15/d16/d19/d3d/d5f/dd4/fd6 x:0 0 0
2026-03-09T17:29:49.272 INFO:tasks.workunit.client.0.vm06.stdout:8/653: fdatasync d15/d31/f33 0
2026-03-09T17:29:49.281 INFO:tasks.workunit.client.0.vm06.stdout:2/605: write d3/d4/d46/f94 [831547,15857] 0
2026-03-09T17:29:49.288 INFO:tasks.workunit.client.0.vm06.stdout:1/648: write d11/d14/d1d/d1e/d2a/d34/d64/f8a [2012095,89247] 0
2026-03-09T17:29:49.302 INFO:tasks.workunit.client.0.vm06.stdout:1/649: link d11/l32 d11/d14/d1d/d4a/ld9 0
2026-03-09T17:29:49.305 INFO:tasks.workunit.client.0.vm06.stdout:1/650: rename d11/l5d to d11/d14/d1d/d94/lda 0
2026-03-09T17:29:49.313 INFO:tasks.workunit.client.0.vm06.stdout:1/651: dread d11/d14/d1d/d94/f95 [4194304,4194304] 0
2026-03-09T17:29:49.322 INFO:tasks.workunit.client.0.vm06.stdout:1/652: chown d11/l32 187477044 1
2026-03-09T17:29:49.324 INFO:tasks.workunit.client.0.vm06.stdout:1/653: fdatasync f10 0
2026-03-09T17:29:49.328 INFO:tasks.workunit.client.0.vm06.stdout:1/654: read d11/d14/d1d/d1e/d2a/d34/f3b [1247220,59689] 0
2026-03-09T17:29:49.339 INFO:tasks.workunit.client.0.vm06.stdout:7/776: write d5/d1f/d34/d46/d51/f92 [591017,38569] 0
2026-03-09T17:29:49.358 INFO:tasks.workunit.client.0.vm06.stdout:7/777: write d5/dd/dc5/d64/fe0 [5592,75432] 0
2026-03-09T17:29:49.358 INFO:tasks.workunit.client.0.vm06.stdout:0/798: write d7/d88/fbe [403048,89423] 0
2026-03-09T17:29:49.358 INFO:tasks.workunit.client.0.vm06.stdout:0/799: write d7/d11/d19/d8b/da4/f107 [801460,44156] 0
2026-03-09T17:29:49.358 INFO:tasks.workunit.client.0.vm06.stdout:5/672: dwrite d4/dca/f8b [0,4194304] 0
2026-03-09T17:29:49.358 INFO:tasks.workunit.client.0.vm06.stdout:5/673: rename d4/d50/d35/d40/d95/db8/fe3 to d4/d52/db4/dc2/ff2 0
2026-03-09T17:29:49.362 INFO:tasks.workunit.client.0.vm06.stdout:5/674: dwrite d4/d52/db4/dc2/ff2 [0,4194304] 0
2026-03-09T17:29:49.375 INFO:tasks.workunit.client.0.vm06.stdout:5/675: mkdir d4/d22/d64/df3 0
2026-03-09T17:29:49.382 INFO:tasks.workunit.client.0.vm06.stdout:5/676: creat d4/ff4 x:0 0 0
2026-03-09T17:29:49.403 INFO:tasks.workunit.client.0.vm06.stdout:0/800: dread d7/d11/f13 [0,4194304] 0
2026-03-09T17:29:49.403 INFO:tasks.workunit.client.0.vm06.stdout:0/801: readlink d7/d11/d19/le6 0
2026-03-09T17:29:49.406 INFO:tasks.workunit.client.0.vm06.stdout:0/802: dread d7/d11/d19/d1d/fb5 [0,4194304] 0
2026-03-09T17:29:49.445 INFO:tasks.workunit.client.0.vm06.stdout:4/682: write db/d59/d5f/d45/f8e [809495,15959] 0
2026-03-09T17:29:49.475 INFO:tasks.workunit.client.0.vm06.stdout:4/683: unlink db/df/f2d 0
2026-03-09T17:29:49.490 INFO:tasks.workunit.client.0.vm06.stdout:4/684: write db/d1d/f1f [1808662,83872] 0
2026-03-09T17:29:49.596 INFO:tasks.workunit.client.0.vm06.stdout:7/778: sync
2026-03-09T17:29:49.596 INFO:tasks.workunit.client.0.vm06.stdout:7/779: readlink d5/d1f/d34/d46/d51/le1 0
2026-03-09T17:29:49.597 INFO:tasks.workunit.client.0.vm06.stdout:7/780: chown d5/d1f/d34/d46/d51/fda 219647142 1
2026-03-09T17:29:49.597 INFO:tasks.workunit.client.0.vm06.stdout:7/781: chown d5/dd/dc5/c61 54732726 1
2026-03-09T17:29:49.599 INFO:tasks.workunit.client.0.vm06.stdout:7/782: mkdir d5/d7/d2b/de7 0
2026-03-09T17:29:49.602 INFO:tasks.workunit.client.0.vm06.stdout:7/783: mkdir d5/dd/dc5/d64/de8 0
2026-03-09T17:29:49.605 INFO:tasks.workunit.client.0.vm06.stdout:7/784: link d5/dd/d79/d7f/lbf d5/dd/le9 0
2026-03-09T17:29:49.608 INFO:tasks.workunit.client.0.vm06.stdout:7/785: link d5/d1f/d34/d46/f4c d5/dd/dc5/d5f/fea 0
2026-03-09T17:29:49.610 INFO:tasks.workunit.client.0.vm06.stdout:7/786: creat d5/dd/dc5/d64/d6b/feb x:0 0 0
2026-03-09T17:29:49.611 INFO:tasks.workunit.client.0.vm06.stdout:7/787: dread - d5/d7/dac/fe2 zero size
2026-03-09T17:29:49.614 INFO:tasks.workunit.client.0.vm06.stdout:7/788: dwrite d5/dd/f22 [0,4194304] 0
2026-03-09T17:29:49.756 INFO:tasks.workunit.client.0.vm06.stdout:9/748: dwrite d3/d6d/d9a/d9c/fdf [4194304,4194304] 0
2026-03-09T17:29:49.762 INFO:tasks.workunit.client.0.vm06.stdout:9/749: creat d3/d15/d36/d83/fee x:0 0 0
2026-03-09T17:29:49.763 INFO:tasks.workunit.client.0.vm06.stdout:9/750: chown d3/d15/d36/d4d/f61 1251333628 1
2026-03-09T17:29:49.763 INFO:tasks.workunit.client.0.vm06.stdout:9/751: readlink d3/d15/l51 0
2026-03-09T17:29:49.764 INFO:tasks.workunit.client.0.vm06.stdout:9/752: write d3/d2c/f81 [669349,87854] 0
2026-03-09T17:29:49.769 INFO:tasks.workunit.client.0.vm06.stdout:9/753: creat d3/d15/d36/d4c/d6a/d8a/fef x:0 0 0
2026-03-09T17:29:49.770 INFO:tasks.workunit.client.0.vm06.stdout:9/754: readlink d3/d15/l30 0
2026-03-09T17:29:49.785 INFO:tasks.workunit.client.0.vm06.stdout:7/789: sync
2026-03-09T17:29:49.947 INFO:tasks.workunit.client.0.vm06.stdout:6/556: dwrite d6/d4f/d3e/f62 [0,4194304] 0
2026-03-09T17:29:49.949 INFO:tasks.workunit.client.0.vm06.stdout:6/557: read - d6/d12/d17/d85/faf zero size
2026-03-09T17:29:49.951 INFO:tasks.workunit.client.0.vm06.stdout:6/558: chown d6/d12/d53/f64 3894587 1
2026-03-09T17:29:49.952 INFO:tasks.workunit.client.0.vm06.stdout:6/559: write d6/d4f/f3c [4065451,26398] 0
2026-03-09T17:29:49.954 INFO:tasks.workunit.client.0.vm06.stdout:6/560: stat d6/d12/d53/f87 0
2026-03-09T17:29:49.955 INFO:tasks.workunit.client.0.vm06.stdout:6/561: chown d6/d4f/d3e/cad 6906613 1
2026-03-09T17:29:49.958 INFO:tasks.workunit.client.0.vm06.stdout:6/562: dwrite d6/d47/fa8 [0,4194304] 0
2026-03-09T17:29:49.968 INFO:tasks.workunit.client.0.vm06.stdout:6/563: rename d6/d4f/f3c to d6/d47/d4d/d9a/da2/db1/fb8 0
2026-03-09T17:29:49.970 INFO:tasks.workunit.client.0.vm06.stdout:6/564: fdatasync d6/d47/d96/d40/f67 0
2026-03-09T17:29:49.971 INFO:tasks.workunit.client.0.vm06.stdout:6/565: symlink d6/d4f/d73/lb9 0
2026-03-09T17:29:49.972 INFO:tasks.workunit.client.0.vm06.stdout:6/566: write d6/d12/d17/d85/faf [660175,106612] 0
2026-03-09T17:29:50.094 INFO:tasks.workunit.client.0.vm06.stdout:3/729: write dd/f5f [620997,115545] 0
2026-03-09T17:29:50.094 INFO:tasks.workunit.client.0.vm06.stdout:3/730: dread - dd/d19/d2c/fe9 zero size
2026-03-09T17:29:50.096 INFO:tasks.workunit.client.0.vm06.stdout:3/731: chown dd/d19/d25/d44/d80/fc4 8355810 1
2026-03-09T17:29:50.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:49 vm06.local ceph-mon[57307]: pgmap v158: 65 pgs: 65 active+clean; 1.5 GiB data, 5.7 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 78 MiB/s wr, 316 op/s
2026-03-09T17:29:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:49 vm09.local ceph-mon[62061]: pgmap v158: 65 pgs: 65 active+clean; 1.5 GiB data, 5.7 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 78 MiB/s wr, 316 op/s
2026-03-09T17:29:50.168 INFO:tasks.workunit.client.0.vm06.stdout:8/654: dwrite d15/d16/d6d/f89 [0,4194304] 0
2026-03-09T17:29:50.170 INFO:tasks.workunit.client.0.vm06.stdout:8/655: chown d15/d39/f7b 486066014 1
2026-03-09T17:29:50.173 INFO:tasks.workunit.client.0.vm06.stdout:8/656: mknod d15/d16/d1a/d47/cd7 0
2026-03-09T17:29:50.198 INFO:tasks.workunit.client.0.vm06.stdout:2/606: dwrite d3/d4/d12/f85 [0,4194304] 0
2026-03-09T17:29:50.200 INFO:tasks.workunit.client.0.vm06.stdout:2/607: unlink d3/d4/d12/d2b/c99 0
2026-03-09T17:29:50.201 INFO:tasks.workunit.client.0.vm06.stdout:2/608: chown d3/d4/d46/da5/ca9 282 1
2026-03-09T17:29:50.206 INFO:tasks.workunit.client.0.vm06.stdout:2/609: unlink d3/d4/d12/d71/daa/d77/d81/d64/d6a/fa3 0
2026-03-09T17:29:50.211 INFO:tasks.workunit.client.0.vm06.stdout:2/610: truncate d3/d4/d12/d71/daa/d77/d81/d64/d6a/f6d 721123 0
2026-03-09T17:29:50.265 INFO:tasks.workunit.client.0.vm06.stdout:2/611: sync
2026-03-09T17:29:50.269 INFO:tasks.workunit.client.0.vm06.stdout:1/655: write d11/d14/d1d/d42/d46/d92/dc0/f68 [2979712,59513] 0
2026-03-09T17:29:50.270 INFO:tasks.workunit.client.0.vm06.stdout:1/656: write d11/d14/d1c/d3a/fc5 [916290,8855] 0
2026-03-09T17:29:50.273 INFO:tasks.workunit.client.0.vm06.stdout:2/612: symlink d3/d4/d12/d2b/d36/lc8 0
2026-03-09T17:29:50.292 INFO:tasks.workunit.client.0.vm06.stdout:1/657: dread d11/d14/d1d/f90 [0,4194304] 0
2026-03-09T17:29:50.293 INFO:tasks.workunit.client.0.vm06.stdout:1/658: chown d11/d69/fca 16 1
2026-03-09T17:29:50.293 INFO:tasks.workunit.client.0.vm06.stdout:1/659: fsync d11/d69/fca 0
2026-03-09T17:29:50.301 INFO:tasks.workunit.client.0.vm06.stdout:1/660: rmdir d11/d14/d1d/d4a 39
2026-03-09T17:29:50.307 INFO:tasks.workunit.client.0.vm06.stdout:5/677: write d4/d50/d18/d3d/fa7 [756845,76949] 0
2026-03-09T17:29:50.307 INFO:tasks.workunit.client.0.vm06.stdout:5/678: readlink d4/l12 0
2026-03-09T17:29:50.307 INFO:tasks.workunit.client.0.vm06.stdout:5/679: readlink d4/d22/l3b 0
2026-03-09T17:29:50.319 INFO:tasks.workunit.client.0.vm06.stdout:5/680: symlink d4/da4/dcf/lf5 0
2026-03-09T17:29:50.320 INFO:tasks.workunit.client.0.vm06.stdout:1/661: getdents d11/d14/d1d/d42/d46/d92/dc0/d57 0
2026-03-09T17:29:50.329 INFO:tasks.workunit.client.0.vm06.stdout:1/662: link d11/d14/l77 d11/d14/d1d/d42/d46/d92/dc0/d57/ldb 0
2026-03-09T17:29:50.331 INFO:tasks.workunit.client.0.vm06.stdout:1/663: mknod d11/d14/d1d/d42/d46/d92/dc0/cdc 0
2026-03-09T17:29:50.333 INFO:tasks.workunit.client.0.vm06.stdout:1/664: mknod d11/d14/d1d/d4a/cdd 0
2026-03-09T17:29:50.337 INFO:tasks.workunit.client.0.vm06.stdout:1/665: rename d11/d14/d1d/d42/d46/d92/dc0/f4c to d11/d14/d1d/d42/d46/d92/fde 0
2026-03-09T17:29:50.340 INFO:tasks.workunit.client.0.vm06.stdout:1/666: dwrite d11/d14/d1d/d42/d46/d92/dc0/f68 [4194304,4194304] 0
2026-03-09T17:29:50.351 INFO:tasks.workunit.client.0.vm06.stdout:1/667: mkdir d11/d14/d1d/d42/d46/d92/dc0/ddf 0
2026-03-09T17:29:50.354 INFO:tasks.workunit.client.0.vm06.stdout:0/803: write d7/d11/d2d/fe7 [61425,52680] 0
2026-03-09T17:29:50.360 INFO:tasks.workunit.client.0.vm06.stdout:0/804: truncate d7/d11/f30 201184 0
2026-03-09T17:29:50.363 INFO:tasks.workunit.client.0.vm06.stdout:1/668: unlink d11/d14/d1c/d3a/db7/cd8 0
2026-03-09T17:29:50.364 INFO:tasks.workunit.client.0.vm06.stdout:1/669: write d11/d14/d1d/d1e/d2a/d34/d58/f6a [4584207,103485] 0
2026-03-09T17:29:50.374 INFO:tasks.workunit.client.0.vm06.stdout:0/805: link d7/f50 d7/d11/d19/d3c/df8/f113 0
2026-03-09T17:29:50.374 INFO:tasks.workunit.client.0.vm06.stdout:0/806: chown d7/d11/d19/d8b/da4/d85/fc8 0 1
2026-03-09T17:29:50.376 INFO:tasks.workunit.client.0.vm06.stdout:0/807: mkdir d7/d11/d19/d23/db7/dbd/d101/d114 0
2026-03-09T17:29:50.380 INFO:tasks.workunit.client.0.vm06.stdout:0/808: rename d7/d11/d19/d23/db7/ccd to d7/d11/d89/da8/db2/dea/c115 0
2026-03-09T17:29:50.381 INFO:tasks.workunit.client.0.vm06.stdout:0/809: truncate d7/d11/d19/fe5 891624 0
2026-03-09T17:29:50.384 INFO:tasks.workunit.client.0.vm06.stdout:0/810: dwrite d7/d11/f1c [4194304,4194304] 0
2026-03-09T17:29:50.388 INFO:tasks.workunit.client.0.vm06.stdout:4/685: write db/f68 [4403775,74477] 0
2026-03-09T17:29:50.392 INFO:tasks.workunit.client.0.vm06.stdout:4/686: getdents db/d59 0
2026-03-09T17:29:50.522 INFO:tasks.workunit.client.0.vm06.stdout:9/755: write d3/d11/d65/f7c [349161,40708] 0
2026-03-09T17:29:50.527 INFO:tasks.workunit.client.0.vm06.stdout:9/756: read d3/d2c/f9d [1089726,57590] 0
2026-03-09T17:29:50.528 INFO:tasks.workunit.client.0.vm06.stdout:9/757: truncate d3/d15/d36/d4c/d6a/d8a/fdc 282460 0
2026-03-09T17:29:50.533 INFO:tasks.workunit.client.0.vm06.stdout:7/790: dwrite d5/f71 [0,4194304] 0
2026-03-09T17:29:50.535 INFO:tasks.workunit.client.0.vm06.stdout:7/791: write d5/dd/d79/d7f/fdd [924399,130035] 0
2026-03-09T17:29:50.537 INFO:tasks.workunit.client.0.vm06.stdout:9/758: symlink d3/d6d/d9a/d9c/lf0 0
2026-03-09T17:29:50.549 INFO:tasks.workunit.client.0.vm06.stdout:7/792: rmdir d5/dd/dc5/d64/d6b/dd1 39
2026-03-09T17:29:50.560 INFO:tasks.workunit.client.0.vm06.stdout:7/793: dwrite d5/d1f/d34/d3f/d91/fce [0,4194304] 0
2026-03-09T17:29:50.569 INFO:tasks.workunit.client.0.vm06.stdout:7/794: mknod d5/d1f/d34/d3f/d91/cec 0
2026-03-09T17:29:50.576 INFO:tasks.workunit.client.0.vm06.stdout:7/795: symlink d5/d1f/d34/led 0
2026-03-09T17:29:50.578 INFO:tasks.workunit.client.0.vm06.stdout:7/796: rmdir d5/dd/d79/d7f 39
2026-03-09T17:29:50.585 INFO:tasks.workunit.client.0.vm06.stdout:7/797: getdents d5/dd/d79 0
2026-03-09T17:29:50.586 INFO:tasks.workunit.client.0.vm06.stdout:7/798: write d5/d7/dac/fe2 [63122,23871] 0
2026-03-09T17:29:50.592 INFO:tasks.workunit.client.0.vm06.stdout:7/799: mkdir d5/dd/dc5/dee 0
2026-03-09T17:29:50.593 INFO:tasks.workunit.client.0.vm06.stdout:7/800: write d5/d1f/d34/d3f/fca [342451,113083] 0
2026-03-09T17:29:50.594 INFO:tasks.workunit.client.0.vm06.stdout:7/801: chown d5/d1f/d34/d3f/d8b/fd3 2078125 1
2026-03-09T17:29:50.599 INFO:tasks.workunit.client.0.vm06.stdout:7/802: symlink d5/d7/dac/dd4/lef 0
2026-03-09T17:29:50.603 INFO:tasks.workunit.client.0.vm06.stdout:7/803: mknod d5/dd/dc5/d5f/cf0 0
2026-03-09T17:29:50.603 INFO:tasks.workunit.client.0.vm06.stdout:7/804: chown d5/d1f/d34/d46/d51/f7c 736582 1
2026-03-09T17:29:50.616 INFO:tasks.workunit.client.0.vm06.stdout:6/567: dwrite d6/d4f/d3e/d52/d80/faa [0,4194304] 0
2026-03-09T17:29:50.617 INFO:tasks.workunit.client.0.vm06.stdout:6/568: write d6/d47/f49 [1939164,105995] 0
2026-03-09T17:29:50.632 INFO:tasks.workunit.client.0.vm06.stdout:7/805: dread d5/dd/dc5/d64/f8d [4194304,4194304] 0
2026-03-09T17:29:50.640 INFO:tasks.workunit.client.0.vm06.stdout:6/569: dread d6/d47/d96/f7e [0,4194304] 0
2026-03-09T17:29:50.642 INFO:tasks.workunit.client.0.vm06.stdout:7/806: fsync d5/d7/d2b/fa1 0
2026-03-09T17:29:50.643 INFO:tasks.workunit.client.0.vm06.stdout:6/570: dwrite d6/d47/f49 [0,4194304] 0
2026-03-09T17:29:50.669 INFO:tasks.workunit.client.0.vm06.stdout:3/732: dwrite dd/d19/d1e/f41 [0,4194304] 0
2026-03-09T17:29:50.688 INFO:tasks.workunit.client.0.vm06.stdout:8/657: dwrite d15/d16/d1e/f8c [0,4194304] 0
2026-03-09T17:29:50.692 INFO:tasks.workunit.client.0.vm06.stdout:8/658: creat d15/d16/d6d/fd8 x:0 0 0
2026-03-09T17:29:50.699 INFO:tasks.workunit.client.0.vm06.stdout:8/659: write d15/d16/d1e/d30/fcf [1023533,58504] 0
2026-03-09T17:29:50.720 INFO:tasks.workunit.client.0.vm06.stdout:2/613: dwrite d3/d4/d22/d72/f54 [0,4194304] 0
2026-03-09T17:29:50.735 INFO:tasks.workunit.client.0.vm06.stdout:5/681: write d4/d22/d64/f70 [2425723,51538] 0
2026-03-09T17:29:50.738 INFO:tasks.workunit.client.0.vm06.stdout:3/733: sync
2026-03-09T17:29:50.747 INFO:tasks.workunit.client.0.vm06.stdout:3/734: mknod dd/cfd 0
2026-03-09T17:29:50.747 INFO:tasks.workunit.client.0.vm06.stdout:3/735: fsync dd/d1d/d2e/d67/fed 0
2026-03-09T17:29:50.749 INFO:tasks.workunit.client.0.vm06.stdout:3/736: fsync dd/d1d/d2e/d67/fcf 0
2026-03-09T17:29:50.755 INFO:tasks.workunit.client.0.vm06.stdout:3/737: getdents dd/d19/d25/df0 0
2026-03-09T17:29:50.756 INFO:tasks.workunit.client.0.vm06.stdout:1/670: write f8 [1202729,114485] 0
2026-03-09T17:29:50.760 INFO:tasks.workunit.client.0.vm06.stdout:1/671: mkdir d11/de0 0
2026-03-09T17:29:50.761 INFO:tasks.workunit.client.0.vm06.stdout:1/672: readlink d11/d14/d1d/d8c/ld3 0
2026-03-09T17:29:50.767 INFO:tasks.workunit.client.0.vm06.stdout:0/811: write d7/d11/fa3 [656736,108005] 0
2026-03-09T17:29:50.773 INFO:tasks.workunit.client.0.vm06.stdout:0/812: symlink d7/d102/l116 0
2026-03-09T17:29:50.777 INFO:tasks.workunit.client.0.vm06.stdout:4/687: mkdir db/d1d/d21/d25/d4b/df7 0
2026-03-09T17:29:50.790 INFO:tasks.workunit.client.0.vm06.stdout:0/813: dread d7/d11/d19/d3c/df8/f113 [0,4194304] 0
2026-03-09T17:29:50.793 INFO:tasks.workunit.client.0.vm06.stdout:0/814: creat d7/d11/d19/d3c/db9/dd8/f117 x:0 0 0
2026-03-09T17:29:50.794 INFO:tasks.workunit.client.0.vm06.stdout:0/815: readlink d7/l4b 0
2026-03-09T17:29:50.824 INFO:tasks.workunit.client.0.vm06.stdout:9/759: dwrite d3/f4b [0,4194304] 0
2026-03-09T17:29:50.836 INFO:tasks.workunit.client.0.vm06.stdout:9/760: mkdir d3/d26/dcb/df1 0
2026-03-09T17:29:50.839 INFO:tasks.workunit.client.0.vm06.stdout:9/761: rename d3/d26/d6c/d68/c88 to d3/d26/dd7/cf2 0
2026-03-09T17:29:50.843 INFO:tasks.workunit.client.0.vm06.stdout:9/762: symlink d3/d6d/lf3 0
2026-03-09T17:29:50.844 INFO:tasks.workunit.client.0.vm06.stdout:9/763: dread d3/d6d/d9a/d9c/fdf [4194304,4194304] 0
2026-03-09T17:29:50.850 INFO:tasks.workunit.client.0.vm06.stdout:9/764: mkdir d3/d15/d36/df4 0
2026-03-09T17:29:50.855 INFO:tasks.workunit.client.0.vm06.stdout:9/765: dwrite d3/d6d/d9a/feb [0,4194304] 0
2026-03-09T17:29:50.857 INFO:tasks.workunit.client.0.vm06.stdout:9/766: chown d3/d15/d36/d4d 24 1
2026-03-09T17:29:50.857 INFO:tasks.workunit.client.0.vm06.stdout:9/767: chown d3/d11/d65 110 1
2026-03-09T17:29:50.869 INFO:tasks.workunit.client.0.vm06.stdout:7/807: dwrite d5/d1f/f74 [0,4194304] 0
2026-03-09T17:29:50.879 INFO:tasks.workunit.client.0.vm06.stdout:8/660: truncate d15/d39/d3c/f5d 3056951 0
2026-03-09T17:29:50.881 INFO:tasks.workunit.client.0.vm06.stdout:6/571: dwrite d6/d12/d53/f5b [0,4194304] 0
2026-03-09T17:29:50.888 INFO:tasks.workunit.client.0.vm06.stdout:2/614: dwrite d3/d4/d12/f35 [0,4194304] 0
2026-03-09T17:29:50.891 INFO:tasks.workunit.client.0.vm06.stdout:2/615: chown d3/d4/d12/d2b/f32 22618 1
2026-03-09T17:29:50.901 INFO:tasks.workunit.client.0.vm06.stdout:5/682: dwrite d4/f26 [0,4194304] 0
2026-03-09T17:29:50.905 INFO:tasks.workunit.client.0.vm06.stdout:5/683: chown d4/da4/fc5 453161 1
2026-03-09T17:29:50.905 INFO:tasks.workunit.client.0.vm06.stdout:8/661: mknod d15/d39/dd2/cd9 0
2026-03-09T17:29:50.913 INFO:tasks.workunit.client.0.vm06.stdout:3/738: dwrite dd/d1d/d4e/f7d [0,4194304] 0
2026-03-09T17:29:50.919 INFO:tasks.workunit.client.0.vm06.stdout:1/673: write d11/d14/d1c/d3a/fbe [173155,6036] 0
2026-03-09T17:29:50.920 INFO:tasks.workunit.client.0.vm06.stdout:5/684: symlink d4/d50/d35/d40/d6f/lf6 0
2026-03-09T17:29:50.922 INFO:tasks.workunit.client.0.vm06.stdout:8/662: rmdir d15/d16/d19 39
2026-03-09T17:29:50.931 INFO:tasks.workunit.client.0.vm06.stdout:6/572: getdents d6/d4f/d3e/d52/d95 0
2026-03-09T17:29:50.931 INFO:tasks.workunit.client.0.vm06.stdout:0/816: write d7/d11/f29 [382410,97199] 0
2026-03-09T17:29:50.931 INFO:tasks.workunit.client.0.vm06.stdout:7/808: getdents d5/dd/d79 0
2026-03-09T17:29:50.931 INFO:tasks.workunit.client.0.vm06.stdout:0/817: dread d7/f50 [0,4194304] 0
2026-03-09T17:29:50.935 INFO:tasks.workunit.client.0.vm06.stdout:0/818: dwrite d7/d11/d19/f24 [4194304,4194304] 0
2026-03-09T17:29:50.937 INFO:tasks.workunit.client.0.vm06.stdout:0/819: write d7/d11/fa3 [1372357,50612] 0
2026-03-09T17:29:50.941 INFO:tasks.workunit.client.0.vm06.stdout:5/685: read - d4/d50/fa3 zero size
2026-03-09T17:29:50.942 INFO:tasks.workunit.client.0.vm06.stdout:7/809: dread d5/d1f/d34/d3f/fca [0,4194304] 0
2026-03-09T17:29:50.943 INFO:tasks.workunit.client.0.vm06.stdout:7/810: write d5/dd/dc5/fa2 [1320387,13592] 0
2026-03-09T17:29:50.955 INFO:tasks.workunit.client.0.vm06.stdout:1/674: mknod d11/d14/d1d/dd1/ce1 0
2026-03-09T17:29:50.959 INFO:tasks.workunit.client.0.vm06.stdout:5/686: mknod d4/dbb/cf7 0
2026-03-09T17:29:50.965 INFO:tasks.workunit.client.0.vm06.stdout:7/811: rmdir d5/d7/d2b/dbd 39
2026-03-09T17:29:50.967 INFO:tasks.workunit.client.0.vm06.stdout:4/688: rename db/d1d/d21/f9f to db/d1d/d21/d25/d4b/de4/ff8 0
2026-03-09T17:29:50.969 INFO:tasks.workunit.client.0.vm06.stdout:1/675: mkdir d11/d14/d1d/dd1/de2 0
2026-03-09T17:29:50.972 INFO:tasks.workunit.client.0.vm06.stdout:5/687: symlink d4/d52/d55/lf8 0
2026-03-09T17:29:50.976 INFO:tasks.workunit.client.0.vm06.stdout:7/812: rmdir d5/d7/dac/dd4 39
2026-03-09T17:29:50.976 INFO:tasks.workunit.client.0.vm06.stdout:8/663: dread d15/d16/d1a/d47/fa5 [0,4194304] 0
2026-03-09T17:29:50.981 INFO:tasks.workunit.client.0.vm06.stdout:5/688: creat d4/dbb/ff9 x:0 0 0
2026-03-09T17:29:50.982 INFO:tasks.workunit.client.0.vm06.stdout:5/689: dread - d4/d22/d46/ff0 zero size
2026-03-09T17:29:50.984 INFO:tasks.workunit.client.0.vm06.stdout:1/676: dread d11/d14/d1d/f73 [0,4194304] 0
2026-03-09T17:29:50.985 INFO:tasks.workunit.client.0.vm06.stdout:8/664: mknod d15/d39/d3c/cda 0
2026-03-09T17:29:50.985 INFO:tasks.workunit.client.0.vm06.stdout:4/689: mknod db/d1d/d21/d26/d89/dab/dae/dcc/de0/cf9 0
2026-03-09T17:29:50.987 INFO:tasks.workunit.client.0.vm06.stdout:5/690: fdatasync d4/d50/f61 0
2026-03-09T17:29:50.988 INFO:tasks.workunit.client.0.vm06.stdout:7/813: mknod d5/dd/dc5/d64/d6b/dd1/cf1 0
2026-03-09T17:29:50.988 INFO:tasks.workunit.client.0.vm06.stdout:7/814: chown d5/dd/d79/fb3 493512382 1
2026-03-09T17:29:50.989 INFO:tasks.workunit.client.0.vm06.stdout:1/677: readlink d11/d14/d1d/d1e/l71 0
2026-03-09T17:29:50.989 INFO:tasks.workunit.client.0.vm06.stdout:1/678: chown d11/d14/d1c/d3a/fd2 90243900 1
2026-03-09T17:29:50.991 INFO:tasks.workunit.client.0.vm06.stdout:4/690: mknod db/d57/dd4/cfa 0
2026-03-09T17:29:50.995 INFO:tasks.workunit.client.0.vm06.stdout:1/679: fsync d11/d14/d1d/d42/d46/f55 0
2026-03-09T17:29:50.996 INFO:tasks.workunit.client.0.vm06.stdout:1/680: chown d11 70496 1
2026-03-09T17:29:50.996 INFO:tasks.workunit.client.0.vm06.stdout:1/681: truncate f7 3454154 0
2026-03-09T17:29:50.997 INFO:tasks.workunit.client.0.vm06.stdout:5/691: creat d4/d50/dd6/ffa x:0 0 0
2026-03-09T17:29:50.998 INFO:tasks.workunit.client.0.vm06.stdout:7/815: symlink d5/d7/dac/dd4/lf2 0
2026-03-09T17:29:51.002 INFO:tasks.workunit.client.0.vm06.stdout:8/665: creat d15/d16/d1e/d30/fdb x:0 0 0
2026-03-09T17:29:51.005 INFO:tasks.workunit.client.0.vm06.stdout:7/816: fsync d5/d1f/d34/d46/d51/f7c 0
2026-03-09T17:29:51.008 INFO:tasks.workunit.client.0.vm06.stdout:5/692: dread d4/d50/d35/d40/d6f/fc7 [0,4194304] 0
2026-03-09T17:29:51.043 INFO:tasks.workunit.client.0.vm06.stdout:1/682: link d11/fa9 d11/d14/d1c/fe3 0
2026-03-09T17:29:51.045 INFO:tasks.workunit.client.0.vm06.stdout:8/666: sync
2026-03-09T17:29:51.068 INFO:tasks.workunit.client.0.vm06.stdout:8/667: dread d15/d16/d19/f61 [0,4194304] 0
2026-03-09T17:29:51.114 INFO:tasks.workunit.client.0.vm06.stdout:9/768: dwrite d3/d11/d65/d80/fd0 [0,4194304] 0
2026-03-09T17:29:51.123 INFO:tasks.workunit.client.0.vm06.stdout:9/769: fdatasync d3/d6d/d9a/d9c/dcd/fd4 0
2026-03-09T17:29:51.126 INFO:tasks.workunit.client.0.vm06.stdout:9/770: symlink d3/d6d/d9a/lf5 0
2026-03-09T17:29:51.128 INFO:tasks.workunit.client.0.vm06.stdout:9/771: mkdir d3/d26/d35/d9f/df6 0
2026-03-09T17:29:51.129 INFO:tasks.workunit.client.0.vm06.stdout:9/772: chown d3/d11/d65/f71 1501574 1
2026-03-09T17:29:51.150 INFO:tasks.workunit.client.0.vm06.stdout:2/616: fdatasync d3/d4/d12/d71/daa/d77/d81/d64/d6a/f6d 0
2026-03-09T17:29:51.151 INFO:tasks.workunit.client.0.vm06.stdout:2/617: dread - d3/d4/d12/da7/db3/fbe zero size
2026-03-09T17:29:51.158 INFO:tasks.workunit.client.0.vm06.stdout:6/573: write d6/d12/d17/f32 [732871,62955] 0
2026-03-09T17:29:51.160 INFO:tasks.workunit.client.0.vm06.stdout:3/739: write dd/d1d/d4e/f72 [692114,83913] 0
2026-03-09T17:29:51.165 INFO:tasks.workunit.client.0.vm06.stdout:6/574: symlink d6/d4f/d73/lba 0
2026-03-09T17:29:51.166 INFO:tasks.workunit.client.0.vm06.stdout:3/740: creat dd/d19/d28/ffe x:0 0 0
2026-03-09T17:29:51.166 INFO:tasks.workunit.client.0.vm06.stdout:6/575: read d6/d47/d96/d40/f67 [2473674,38475] 0
2026-03-09T17:29:51.167 INFO:tasks.workunit.client.0.vm06.stdout:3/741: write dd/d19/d25/d2d/ffa [31142,30619] 0
2026-03-09T17:29:51.174 INFO:tasks.workunit.client.0.vm06.stdout:0/820: write d7/d11/d19/d1d/d87/fc4 [1038553,124080] 0
2026-03-09T17:29:51.175 INFO:tasks.workunit.client.0.vm06.stdout:0/821: chown d7/d11/d19/d37/c69 1 1
2026-03-09T17:29:51.176 INFO:tasks.workunit.client.0.vm06.stdout:0/822: dread - d7/d11/d2d/daf/fd5 zero size
2026-03-09T17:29:51.180 INFO:tasks.workunit.client.0.vm06.stdout:0/823: getdents d7/d102 0
2026-03-09T17:29:51.186 INFO:tasks.workunit.client.0.vm06.stdout:7/817: getdents d5/dd/dc5/d64/d6b/dd1 0
2026-03-09T17:29:51.201 INFO:tasks.workunit.client.0.vm06.stdout:7/818: dread d5/dd/f9d [0,4194304] 0
2026-03-09T17:29:51.202 INFO:tasks.workunit.client.0.vm06.stdout:7/819: chown d5/dd/d79/la7 7431542 1
2026-03-09T17:29:51.205 INFO:tasks.workunit.client.0.vm06.stdout:7/820: link d5/dd/dc5/d64/le4 d5/dd/dc5/d64/de8/lf3 0
2026-03-09T17:29:51.252 INFO:tasks.workunit.client.0.vm06.stdout:4/691: dwrite db/df/f4d [0,4194304] 0
2026-03-09T17:29:51.344 INFO:tasks.workunit.client.0.vm06.stdout:4/692: dread db/d59/d5f/d6d/f7b [0,4194304] 0
2026-03-09T17:29:51.345 INFO:tasks.workunit.client.0.vm06.stdout:4/693: rmdir db/d1d/d21/d37/d69 39
2026-03-09T17:29:51.348 INFO:tasks.workunit.client.0.vm06.stdout:4/694: write db/d1d/d21/d25/d4b/d85/f98 [1385064,1107] 0
2026-03-09T17:29:51.387 INFO:tasks.workunit.client.0.vm06.stdout:8/668: mknod d15/d39/d67/d77/cdc 0
2026-03-09T17:29:51.608 INFO:tasks.workunit.client.0.vm06.stdout:5/693: dwrite d4/d50/d18/f73 [0,4194304] 0
2026-03-09T17:29:51.622 INFO:tasks.workunit.client.0.vm06.stdout:1/683: dwrite d11/d14/d1d/f90 [0,4194304] 0
2026-03-09T17:29:51.628 INFO:tasks.workunit.client.0.vm06.stdout:5/694: mkdir d4/d22/dbe/dfb 0
2026-03-09T17:29:51.628 INFO:tasks.workunit.client.0.vm06.stdout:5/695: readlink d4/d22/laa 0
2026-03-09T17:29:51.637 INFO:tasks.workunit.client.0.vm06.stdout:5/696: creat d4/d22/dbe/ffc x:0 0 0
2026-03-09T17:29:51.639 INFO:tasks.workunit.client.0.vm06.stdout:1/684: getdents d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce 0
2026-03-09T17:29:51.649 INFO:tasks.workunit.client.0.vm06.stdout:5/697: mknod d4/d52/d55/dee/cfd 0
2026-03-09T17:29:51.651 INFO:tasks.workunit.client.0.vm06.stdout:1/685: fsync d11/d14/d1d/d42/d46/d92/dc0/f7f 0
2026-03-09T17:29:51.652 INFO:tasks.workunit.client.0.vm06.stdout:1/686: dread - d11/d14/d1c/d5f/fc4 zero size
2026-03-09T17:29:51.654 INFO:tasks.workunit.client.0.vm06.stdout:5/698: mkdir d4/d50/d35/d40/d96/dfe 0
2026-03-09T17:29:51.659 INFO:tasks.workunit.client.0.vm06.stdout:1/687: getdents d11/de0 0
2026-03-09T17:29:51.663 INFO:tasks.workunit.client.0.vm06.stdout:1/688: mkdir d11/d14/d1d/d42/d46/d92/dc0/d57/de4 0
2026-03-09T17:29:51.665 INFO:tasks.workunit.client.0.vm06.stdout:5/699: link f0 d4/dca/fff 0
2026-03-09T17:29:51.670 INFO:tasks.workunit.client.0.vm06.stdout:5/700: dwrite d4/d50/d35/d40/d95/db8/dda/fe7 [0,4194304] 0
2026-03-09T17:29:51.672 INFO:tasks.workunit.client.0.vm06.stdout:5/701: truncate d4/d22/fde 169093 0
2026-03-09T17:29:51.682 INFO:tasks.workunit.client.0.vm06.stdout:1/689: creat d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/fe5 x:0 0 0
2026-03-09T17:29:51.702 INFO:tasks.workunit.client.0.vm06.stdout:2/618: dwrite d3/d4/d12/f15 [4194304,4194304] 0
2026-03-09T17:29:51.705 INFO:tasks.workunit.client.0.vm06.stdout:2/619: chown d3/d4/d46/da5/fa8 738 1
2026-03-09T17:29:51.722 INFO:tasks.workunit.client.0.vm06.stdout:3/742: write dd/f1b [986400,44179] 0
2026-03-09T17:29:51.752 INFO:tasks.workunit.client.0.vm06.stdout:3/743: stat f7 0
2026-03-09T17:29:51.752 INFO:tasks.workunit.client.0.vm06.stdout:6/576: dwrite d6/d47/f88 [4194304,4194304] 0
2026-03-09T17:29:51.752 INFO:tasks.workunit.client.0.vm06.stdout:3/744: mkdir dd/d81/da3/dae/df8/dff 0
2026-03-09T17:29:51.755 INFO:tasks.workunit.client.0.vm06.stdout:9/773: rmdir d3/d26/dcb 39
2026-03-09T17:29:51.759 INFO:tasks.workunit.client.0.vm06.stdout:0/824: rmdir d7/d11/d19/d8b/da4 39
2026-03-09T17:29:51.768 INFO:tasks.workunit.client.0.vm06.stdout:0/825: stat d7/d11/d19/d23 0
2026-03-09T17:29:51.768 INFO:tasks.workunit.client.0.vm06.stdout:7/821: mknod d5/d1f/cf4 0
2026-03-09T17:29:51.768 INFO:tasks.workunit.client.0.vm06.stdout:7/822: mknod d5/d1f/d34/d3f/d91/cf5 0
2026-03-09T17:29:51.768 INFO:tasks.workunit.client.0.vm06.stdout:0/826: link d7/fe d7/d11/d19/d3c/df3/f118 0
2026-03-09T17:29:51.770 INFO:tasks.workunit.client.0.vm06.stdout:0/827: creat d7/d11/d19/d23/db7/dbd/f119 x:0 0 0
2026-03-09T17:29:51.775 INFO:tasks.workunit.client.0.vm06.stdout:8/669: write d15/d39/f45 [67007,76271] 0
2026-03-09T17:29:51.801 INFO:tasks.workunit.client.0.vm06.stdout:8/670: mkdir d15/d39/d67/d86/ddd 0
2026-03-09T17:29:51.802 INFO:tasks.workunit.client.0.vm06.stdout:5/702: sync
2026-03-09T17:29:51.804 INFO:tasks.workunit.client.0.vm06.stdout:5/703: chown d4/d22/d46/la0 5068458 1
2026-03-09T17:29:51.810 INFO:tasks.workunit.client.0.vm06.stdout:2/620: symlink d3/d4/d12/lc9 0
2026-03-09T17:29:51.811 INFO:tasks.workunit.client.0.vm06.stdout:2/621: truncate d3/d4/d22/d72/d8f/fbf 829958 0
2026-03-09T17:29:51.817 INFO:tasks.workunit.client.0.vm06.stdout:2/622: mknod d3/d4/d22/d72/d8f/cca 0
2026-03-09T17:29:51.821 INFO:tasks.workunit.client.0.vm06.stdout:2/623: dwrite d3/d4/f1f [0,4194304] 0
2026-03-09T17:29:51.823 INFO:tasks.workunit.client.0.vm06.stdout:2/624: stat d3/d4/d46/fc6 0
2026-03-09T17:29:51.828 INFO:tasks.workunit.client.0.vm06.stdout:2/625: dwrite d3/d4/d46/da5/f6c [0,4194304] 0
2026-03-09T17:29:51.855 INFO:tasks.workunit.client.0.vm06.stdout:2/626: dread d3/d4/f3c [0,4194304] 0
2026-03-09T17:29:51.891 INFO:tasks.workunit.client.0.vm06.stdout:1/690: write d11/d14/fa6 [1187546,73587] 0
2026-03-09T17:29:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:51 vm06.local ceph-mon[57307]: pgmap v159: 65 pgs: 65 active+clean; 1.6 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 72 MiB/s wr, 295 op/s
2026-03-09T17:29:51.922 INFO:tasks.workunit.client.0.vm06.stdout:6/577: rename d6/f7b to d6/d12/fbb 0
2026-03-09T17:29:51.927 INFO:tasks.workunit.client.0.vm06.stdout:6/578: dread d6/d12/d53/f87 [0,4194304] 0
2026-03-09T17:29:51.931 INFO:tasks.workunit.client.0.vm06.stdout:6/579: dread - d6/d12/d17/d85/f9c zero size
2026-03-09T17:29:51.931 INFO:tasks.workunit.client.0.vm06.stdout:6/580: readlink d6/d47/d96/l2c 0
2026-03-09T17:29:51.936 INFO:tasks.workunit.client.0.vm06.stdout:4/695: rmdir db/d59/d5f/d6d 39
2026-03-09T17:29:51.939 INFO:tasks.workunit.client.0.vm06.stdout:7/823: rmdir d5/dd/dc5/dad 39
2026-03-09T17:29:51.939 INFO:tasks.workunit.client.0.vm06.stdout:7/824: chown d5/d1f/d34/d3f 254163282 1
2026-03-09T17:29:51.940 INFO:tasks.workunit.client.0.vm06.stdout:6/581: fdatasync d6/d4f/f3a 0
2026-03-09T17:29:51.945 INFO:tasks.workunit.client.0.vm06.stdout:0/828: rename d7/d11/d19/d1d/d87/c9d to d7/d11/d19/d23/db7/dbd/c11a 0
2026-03-09T17:29:51.949 INFO:tasks.workunit.client.0.vm06.stdout:3/745: dwrite dd/d1d/f4b [0,4194304] 0
2026-03-09T17:29:51.953 INFO:tasks.workunit.client.0.vm06.stdout:4/696: truncate db/d1d/d21/f67 2947318 0
2026-03-09T17:29:51.960 INFO:tasks.workunit.client.0.vm06.stdout:3/746: mkdir dd/d19/d1e/d100 0
2026-03-09T17:29:51.961 INFO:tasks.workunit.client.0.vm06.stdout:4/697: creat db/d1d/d21/d44/dc1/ffb x:0 0 0
2026-03-09T17:29:51.963 INFO:tasks.workunit.client.0.vm06.stdout:6/582: creat d6/d12/fbc x:0 0 0
2026-03-09T17:29:51.963 INFO:tasks.workunit.client.0.vm06.stdout:6/583: stat d6/d47/d96/da1/fb7 0
2026-03-09T17:29:51.965 INFO:tasks.workunit.client.0.vm06.stdout:3/747: mkdir dd/d5b/d101 0
2026-03-09T17:29:51.969 INFO:tasks.workunit.client.0.vm06.stdout:0/829: creat d7/d11/d19/d23/db7/dbd/f11b x:0 0 0
2026-03-09T17:29:51.970 INFO:tasks.workunit.client.0.vm06.stdout:0/830: stat d7/d88 0
2026-03-09T17:29:51.970 INFO:tasks.workunit.client.0.vm06.stdout:0/831: write d7/d11/f10c [352272,14384] 0
2026-03-09T17:29:51.970 INFO:tasks.workunit.client.0.vm06.stdout:0/832: chown d7/d11/d19/d1d/d39/l6e 223279 1
2026-03-09T17:29:51.971 INFO:tasks.workunit.client.0.vm06.stdout:7/825: getdents d5/d7 0
2026-03-09T17:29:51.974 INFO:tasks.workunit.client.0.vm06.stdout:0/833: dwrite d7/d11/d19/f24 [4194304,4194304] 0
2026-03-09T17:29:51.976 INFO:tasks.workunit.client.0.vm06.stdout:6/584: creat d6/d47/d96/d40/fbd x:0 0 0
2026-03-09T17:29:51.976 INFO:tasks.workunit.client.0.vm06.stdout:4/698: mkdir db/d59/d5f/d6d/dfc 0
2026-03-09T17:29:51.977 INFO:tasks.workunit.client.0.vm06.stdout:6/585: stat d6/d47/d4d/d6d/c8d 0
2026-03-09T17:29:51.979 INFO:tasks.workunit.client.0.vm06.stdout:7/826: chown d5/dd/dc5/d64/le4 1 1
2026-03-09T17:29:51.980 INFO:tasks.workunit.client.0.vm06.stdout:7/827: write d5/dd/dc5/d64/d6b/dd1/fe3 [371741,119917] 0
2026-03-09T17:29:51.986 INFO:tasks.workunit.client.0.vm06.stdout:0/834: mkdir d7/d11/d19/d3c/df3/d11c 0
2026-03-09T17:29:51.989 INFO:tasks.workunit.client.0.vm06.stdout:3/748: rmdir dd/d5b/d101 0
2026-03-09T17:29:51.993 INFO:tasks.workunit.client.0.vm06.stdout:3/749: rename dd/d19/d1e/l7f to dd/d19/d25/d44/d80/l102 0
2026-03-09T17:29:51.994 INFO:tasks.workunit.client.0.vm06.stdout:0/835: creat d7/d11/d19/d23/db7/dbd/d101/d114/f11d x:0 0 0
2026-03-09T17:29:52.001 INFO:tasks.workunit.client.0.vm06.stdout:0/836: link d7/d11/d19/fe5 d7/d11/d19/d1d/d87/f11e 0
2026-03-09T17:29:52.009 INFO:tasks.workunit.client.0.vm06.stdout:0/837: dwrite d7/fb1 [0,4194304] 0
2026-03-09T17:29:52.024 INFO:tasks.workunit.client.0.vm06.stdout:0/838: creat d7/d11/d89/f11f x:0 0 0
2026-03-09T17:29:52.025 INFO:tasks.workunit.client.0.vm06.stdout:9/774: write d3/d6d/d9a/d9c/fdf [2509336,18227] 0
2026-03-09T17:29:52.030 INFO:tasks.workunit.client.0.vm06.stdout:8/671: write d15/d16/d1e/f34 [4978229,96958] 0
2026-03-09T17:29:52.034 INFO:tasks.workunit.client.0.vm06.stdout:8/672: dwrite d15/d16/d19/d71/f80 [4194304,4194304] 0
2026-03-09T17:29:52.038 INFO:tasks.workunit.client.0.vm06.stdout:0/839: mknod d7/d11/d19/d23/db7/dbd/d101/d114/c120 0
2026-03-09T17:29:52.043 INFO:tasks.workunit.client.0.vm06.stdout:8/673: dwrite d15/d16/d19/d2b/f46 [0,4194304] 0
2026-03-09T17:29:52.050 INFO:tasks.workunit.client.0.vm06.stdout:0/840: fsync d7/d11/d19/d8b/da4/fa7 0
2026-03-09T17:29:52.055 INFO:tasks.workunit.client.0.vm06.stdout:8/674: creat d15/d39/d3c/dd5/fde x:0 0 0
2026-03-09T17:29:52.060 INFO:tasks.workunit.client.0.vm06.stdout:8/675: link d15/d16/d19/d3d/d5f/lbb d15/d39/dd2/ldf 0
2026-03-09T17:29:52.064 INFO:tasks.workunit.client.0.vm06.stdout:8/676: readlink d15/d16/d6d/lce 0
2026-03-09T17:29:52.074 INFO:tasks.workunit.client.0.vm06.stdout:8/677: rename d15/d16/d19/d3d/d5f/c73 to d15/ce0 0
2026-03-09T17:29:52.074 INFO:tasks.workunit.client.0.vm06.stdout:8/678: readlink d15/d16/d19/d3d/d5f/l84 0
2026-03-09T17:29:52.079 INFO:tasks.workunit.client.0.vm06.stdout:8/679: creat d15/d39/d67/d77/d97/dac/fe1 x:0 0 0
2026-03-09T17:29:52.082 INFO:tasks.workunit.client.0.vm06.stdout:8/680: truncate d15/d16/d1a/d47/f7e 2493934 0
2026-03-09T17:29:52.086 INFO:tasks.workunit.client.0.vm06.stdout:8/681: rmdir d15/d16/d19/d3d/d5f/dd4 39
2026-03-09T17:29:52.111 INFO:tasks.workunit.client.0.vm06.stdout:9/775: dread d3/d15/d36/d4d/f60 [4194304,4194304] 0
2026-03-09T17:29:52.114 INFO:tasks.workunit.client.0.vm06.stdout:9/776: fdatasync d3/d26/d35/fb0 0
2026-03-09T17:29:52.129 INFO:tasks.workunit.client.0.vm06.stdout:5/704: dwrite d4/d50/fad [0,4194304] 0
2026-03-09T17:29:52.133 INFO:tasks.workunit.client.0.vm06.stdout:5/705: creat d4/d52/db4/dc2/f100 x:0 0 0
2026-03-09T17:29:52.136 INFO:tasks.workunit.client.0.vm06.stdout:5/706: creat d4/d50/d18/f101 x:0 0 0
2026-03-09T17:29:52.137 INFO:tasks.workunit.client.0.vm06.stdout:5/707: creat d4/d22/d64/df3/f102 x:0 0 0
2026-03-09T17:29:52.138 INFO:tasks.workunit.client.0.vm06.stdout:5/708: mknod d4/dbb/c103 0
2026-03-09T17:29:52.141 INFO:tasks.workunit.client.0.vm06.stdout:5/709: dwrite d4/d22/d64/df3/f102 [0,4194304] 0
2026-03-09T17:29:52.148 INFO:tasks.workunit.client.0.vm06.stdout:5/710: symlink d4/d50/d35/d40/l104 0
2026-03-09T17:29:52.149 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:51 vm09.local ceph-mon[62061]: pgmap v159: 65 pgs: 65 active+clean; 1.6 GiB data, 5.9 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 72 MiB/s wr, 295 op/s
2026-03-09T17:29:52.223 INFO:tasks.workunit.client.0.vm06.stdout:2/627: write d3/d4/d12/d2b/f89 [1638828,14769] 0
2026-03-09T17:29:52.241 INFO:tasks.workunit.client.0.vm06.stdout:2/628: fsync d3/d4/d12/d2b/d2d/f2a 0
2026-03-09T17:29:52.241 INFO:tasks.workunit.client.0.vm06.stdout:2/629: chown d3/d4/f9c 6044603 1
2026-03-09T17:29:52.242 INFO:tasks.workunit.client.0.vm06.stdout:2/630: chown d3/d4/d46/da5/f6c 185659 1
2026-03-09T17:29:52.246 INFO:tasks.workunit.client.0.vm06.stdout:1/691: dwrite d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0
2026-03-09T17:29:52.246 INFO:tasks.workunit.client.0.vm06.stdout:1/692: dread - d11/d14/d1d/d42/f44 zero size
2026-03-09T17:29:52.253 INFO:tasks.workunit.client.0.vm06.stdout:1/693: dwrite d11/d14/d1d/d42/d46/d92/dc0/f68 [0,4194304] 0
2026-03-09T17:29:52.253 INFO:tasks.workunit.client.0.vm06.stdout:1/694: chown d11/d14/d1c/d3a/c87 0 1
2026-03-09T17:29:52.255 INFO:tasks.workunit.client.0.vm06.stdout:1/695: chown d11/d14/d1d/d1e/d2a/d99/db0 41724573 1
2026-03-09T17:29:52.285 INFO:tasks.workunit.client.0.vm06.stdout:6/586: dwrite d6/d12/d2d/f39 [0,4194304] 0
2026-03-09T17:29:52.291 INFO:tasks.workunit.client.0.vm06.stdout:7/828: write d5/d1f/d34/d46/d51/f7c [550732,57125] 0
2026-03-09T17:29:52.295 INFO:tasks.workunit.client.0.vm06.stdout:4/699: truncate db/d1d/f1f 4255785 0
2026-03-09T17:29:52.307
INFO:tasks.workunit.client.0.vm06.stdout:3/750: truncate dd/d1d/d4e/f7d 3840849 0 2026-03-09T17:29:52.307 INFO:tasks.workunit.client.0.vm06.stdout:2/631: fsync f2 0 2026-03-09T17:29:52.310 INFO:tasks.workunit.client.0.vm06.stdout:7/829: dread d5/d7/f62 [0,4194304] 0 2026-03-09T17:29:52.319 INFO:tasks.workunit.client.0.vm06.stdout:1/696: truncate d11/d14/d1d/d42/d46/d92/dc0/f7f 2117213 0 2026-03-09T17:29:52.327 INFO:tasks.workunit.client.0.vm06.stdout:4/700: mkdir db/d1d/d21/d88/dfd 0 2026-03-09T17:29:52.356 INFO:tasks.workunit.client.0.vm06.stdout:7/830: dread d5/d1f/d34/d46/fa9 [0,4194304] 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/697: write d11/d14/d1d/d94/fc6 [15232,71201] 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/698: rename d11/d14/d1d/d1e/d2a/f38 to d11/d14/d1d/d42/d46/d92/dc0/ddf/fe6 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/699: mkdir d11/d14/d1d/d4a/de7 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:3/751: getdents dd/d81/da3/dae/df8 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:3/752: read - dd/d19/d2c/fad zero size 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:3/753: creat dd/d19/d2c/f103 x:0 0 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/700: getdents d11/d14/d1d/d1e 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/701: dwrite d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/fe5 [0,4194304] 0 2026-03-09T17:29:52.357 INFO:tasks.workunit.client.0.vm06.stdout:1/702: write d11/f18 [9357522,67620] 0 2026-03-09T17:29:52.358 INFO:tasks.workunit.client.0.vm06.stdout:4/701: read db/d1d/d21/d37/d69/f8b [1698679,12469] 0 2026-03-09T17:29:52.358 INFO:tasks.workunit.client.0.vm06.stdout:4/702: readlink l9 0 2026-03-09T17:29:52.359 INFO:tasks.workunit.client.0.vm06.stdout:1/703: dread d11/d14/d1d/d1e/f65 [0,4194304] 0 2026-03-09T17:29:52.361 INFO:tasks.workunit.client.0.vm06.stdout:4/703: creat 
db/d1d/d21/d25/d4b/df7/ffe x:0 0 0 2026-03-09T17:29:52.366 INFO:tasks.workunit.client.0.vm06.stdout:4/704: fsync db/f55 0 2026-03-09T17:29:52.370 INFO:tasks.workunit.client.0.vm06.stdout:7/831: dread d5/dd/d79/d7f/f98 [0,4194304] 0 2026-03-09T17:29:52.374 INFO:tasks.workunit.client.0.vm06.stdout:7/832: read d5/d7/d2b/f42 [35199,12354] 0 2026-03-09T17:29:52.388 INFO:tasks.workunit.client.0.vm06.stdout:6/587: sync 2026-03-09T17:29:52.416 INFO:tasks.workunit.client.0.vm06.stdout:6/588: sync 2026-03-09T17:29:52.416 INFO:tasks.workunit.client.0.vm06.stdout:6/589: dread - d6/d47/d96/d40/fbd zero size 2026-03-09T17:29:52.420 INFO:tasks.workunit.client.0.vm06.stdout:6/590: creat d6/d47/d4d/d6d/fbe x:0 0 0 2026-03-09T17:29:52.431 INFO:tasks.workunit.client.0.vm06.stdout:0/841: write d7/d11/d19/d23/f8e [2808762,122746] 0 2026-03-09T17:29:52.435 INFO:tasks.workunit.client.0.vm06.stdout:0/842: dread - d7/d11/d2d/daf/fd3 zero size 2026-03-09T17:29:52.438 INFO:tasks.workunit.client.0.vm06.stdout:0/843: mknod d7/d11/d19/d3c/df8/c121 0 2026-03-09T17:29:52.446 INFO:tasks.workunit.client.0.vm06.stdout:0/844: link d7/d11/f10c d7/d11/d19/d23/db7/dbd/d101/d114/f122 0 2026-03-09T17:29:52.447 INFO:tasks.workunit.client.0.vm06.stdout:0/845: stat d7/d11/d19/d8b/da4/fa7 0 2026-03-09T17:29:52.454 INFO:tasks.workunit.client.0.vm06.stdout:8/682: truncate d15/d31/d58/fc8 2822010 0 2026-03-09T17:29:52.459 INFO:tasks.workunit.client.0.vm06.stdout:0/846: dread d7/d11/d19/d8b/da4/fa7 [0,4194304] 0 2026-03-09T17:29:52.461 INFO:tasks.workunit.client.0.vm06.stdout:8/683: mkdir d15/d31/de2 0 2026-03-09T17:29:52.463 INFO:tasks.workunit.client.0.vm06.stdout:8/684: rmdir d15/d31/dc5 39 2026-03-09T17:29:52.467 INFO:tasks.workunit.client.0.vm06.stdout:7/833: dread d5/dd/fa0 [0,4194304] 0 2026-03-09T17:29:52.468 INFO:tasks.workunit.client.0.vm06.stdout:7/834: read d5/dd/d79/d7f/f98 [1786280,46444] 0 2026-03-09T17:29:52.469 INFO:tasks.workunit.client.0.vm06.stdout:9/777: dwrite d3/d15/d36/d83/fc6 [0,4194304] 
0 2026-03-09T17:29:52.470 INFO:tasks.workunit.client.0.vm06.stdout:7/835: chown d5/d1f/c37 25603938 1 2026-03-09T17:29:52.471 INFO:tasks.workunit.client.0.vm06.stdout:8/685: sync 2026-03-09T17:29:52.471 INFO:tasks.workunit.client.0.vm06.stdout:7/836: dread - d5/d7/f75 zero size 2026-03-09T17:29:52.471 INFO:tasks.workunit.client.0.vm06.stdout:9/778: readlink d3/d15/d16/l6b 0 2026-03-09T17:29:52.475 INFO:tasks.workunit.client.0.vm06.stdout:9/779: stat d3/d15/d36/d83/fb1 0 2026-03-09T17:29:52.479 INFO:tasks.workunit.client.0.vm06.stdout:9/780: dwrite d3/d15/d36/d4c/d6a/d8a/fdc [0,4194304] 0 2026-03-09T17:29:52.487 INFO:tasks.workunit.client.0.vm06.stdout:8/686: mkdir d15/d39/d67/de3 0 2026-03-09T17:29:52.490 INFO:tasks.workunit.client.0.vm06.stdout:8/687: dread d15/d16/d19/d71/f80 [4194304,4194304] 0 2026-03-09T17:29:52.491 INFO:tasks.workunit.client.0.vm06.stdout:7/837: rmdir d5/dd/dc5/d64/de8 39 2026-03-09T17:29:52.497 INFO:tasks.workunit.client.0.vm06.stdout:5/711: dwrite d4/d50/d18/f3c [8388608,4194304] 0 2026-03-09T17:29:52.499 INFO:tasks.workunit.client.0.vm06.stdout:5/712: truncate d4/dbb/ff9 552697 0 2026-03-09T17:29:52.508 INFO:tasks.workunit.client.0.vm06.stdout:9/781: symlink d3/d26/d6c/d68/lf7 0 2026-03-09T17:29:52.512 INFO:tasks.workunit.client.0.vm06.stdout:9/782: dread d3/d6d/f78 [0,4194304] 0 2026-03-09T17:29:52.515 INFO:tasks.workunit.client.0.vm06.stdout:0/847: getdents d7/d11/d19 0 2026-03-09T17:29:52.516 INFO:tasks.workunit.client.0.vm06.stdout:0/848: truncate d7/d88/fbe 1492513 0 2026-03-09T17:29:52.534 INFO:tasks.workunit.client.0.vm06.stdout:0/849: creat d7/d11/d19/d3c/db9/f123 x:0 0 0 2026-03-09T17:29:52.546 INFO:tasks.workunit.client.0.vm06.stdout:7/838: truncate d5/d1f/d34/f5e 975612 0 2026-03-09T17:29:52.546 INFO:tasks.workunit.client.0.vm06.stdout:7/839: fsync d5/d7/dac/fe2 0 2026-03-09T17:29:52.547 INFO:tasks.workunit.client.0.vm06.stdout:5/713: link d4/d52/f6c d4/d22/d46/dec/f105 0 2026-03-09T17:29:52.565 
INFO:tasks.workunit.client.0.vm06.stdout:2/632: dwrite d3/d4/d12/f31 [0,4194304] 0 2026-03-09T17:29:52.572 INFO:tasks.workunit.client.0.vm06.stdout:3/754: write dd/d19/d2c/fad [97563,115722] 0 2026-03-09T17:29:52.578 INFO:tasks.workunit.client.0.vm06.stdout:1/704: write d11/d14/d1d/f56 [3990,78763] 0 2026-03-09T17:29:52.583 INFO:tasks.workunit.client.0.vm06.stdout:4/705: dwrite db/d59/d5f/d5d/fc2 [0,4194304] 0 2026-03-09T17:29:52.588 INFO:tasks.workunit.client.0.vm06.stdout:4/706: dread db/df/f4d [0,4194304] 0 2026-03-09T17:29:52.593 INFO:tasks.workunit.client.0.vm06.stdout:1/705: symlink d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/le8 0 2026-03-09T17:29:52.593 INFO:tasks.workunit.client.0.vm06.stdout:1/706: chown d11/d14/d1d/d4a/fa5 182813 1 2026-03-09T17:29:52.598 INFO:tasks.workunit.client.0.vm06.stdout:6/591: truncate d6/d12/d2d/f39 3989558 0 2026-03-09T17:29:52.600 INFO:tasks.workunit.client.0.vm06.stdout:4/707: symlink db/d1d/d21/d26/d7a/lff 0 2026-03-09T17:29:52.601 INFO:tasks.workunit.client.0.vm06.stdout:1/707: mkdir d11/d14/d1d/d1e/d2a/d99/de9 0 2026-03-09T17:29:52.605 INFO:tasks.workunit.client.0.vm06.stdout:1/708: creat d11/d14/d1d/d1e/d2a/d34/d64/fea x:0 0 0 2026-03-09T17:29:52.610 INFO:tasks.workunit.client.0.vm06.stdout:6/592: mkdir d6/d12/d53/d91/dbf 0 2026-03-09T17:29:52.614 INFO:tasks.workunit.client.0.vm06.stdout:1/709: unlink d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/f84 0 2026-03-09T17:29:52.618 INFO:tasks.workunit.client.0.vm06.stdout:1/710: rename d11/d14/d1d/f7c to d11/d14/d1d/d1e/d2a/d99/de9/feb 0 2026-03-09T17:29:52.619 INFO:tasks.workunit.client.0.vm06.stdout:1/711: readlink d11/d14/d1d/d42/d46/d92/dc0/lb6 0 2026-03-09T17:29:52.620 INFO:tasks.workunit.client.0.vm06.stdout:1/712: dread d11/d14/d1d/f31 [0,4194304] 0 2026-03-09T17:29:52.623 INFO:tasks.workunit.client.0.vm06.stdout:8/688: dwrite d15/d16/d1e/f4e [0,4194304] 0 2026-03-09T17:29:52.625 INFO:tasks.workunit.client.0.vm06.stdout:9/783: write d3/d6d/d9a/d9c/fab [52236,98228] 0 
2026-03-09T17:29:52.630 INFO:tasks.workunit.client.0.vm06.stdout:0/850: write d7/f76 [1643414,56961] 0 2026-03-09T17:29:52.635 INFO:tasks.workunit.client.0.vm06.stdout:1/713: dread d11/d69/fad [0,4194304] 0 2026-03-09T17:29:52.640 INFO:tasks.workunit.client.0.vm06.stdout:6/593: mknod d6/d4f/d3e/d52/d95/cc0 0 2026-03-09T17:29:52.642 INFO:tasks.workunit.client.0.vm06.stdout:4/708: link db/lbc db/d1d/d21/d25/l100 0 2026-03-09T17:29:52.642 INFO:tasks.workunit.client.0.vm06.stdout:4/709: stat db/d59/d5f/d6d/lb2 0 2026-03-09T17:29:52.643 INFO:tasks.workunit.client.0.vm06.stdout:4/710: chown db/d59/d5f/d5d/f62 278264 1 2026-03-09T17:29:52.646 INFO:tasks.workunit.client.0.vm06.stdout:8/689: mknod d15/d39/d67/d77/d97/dac/ce4 0 2026-03-09T17:29:52.646 INFO:tasks.workunit.client.0.vm06.stdout:7/840: write d5/dd/dc5/d64/f77 [4889157,93488] 0 2026-03-09T17:29:52.651 INFO:tasks.workunit.client.0.vm06.stdout:8/690: dwrite d15/d39/d67/fd0 [0,4194304] 0 2026-03-09T17:29:52.662 INFO:tasks.workunit.client.0.vm06.stdout:5/714: write d4/d22/d46/f6e [3474568,54769] 0 2026-03-09T17:29:52.667 INFO:tasks.workunit.client.0.vm06.stdout:1/714: creat d11/d14/d1d/d1e/d2a/d34/d64/fec x:0 0 0 2026-03-09T17:29:52.669 INFO:tasks.workunit.client.0.vm06.stdout:1/715: truncate d11/d14/d1d/fcc 8159 0 2026-03-09T17:29:52.670 INFO:tasks.workunit.client.0.vm06.stdout:6/594: stat d6/c23 0 2026-03-09T17:29:52.673 INFO:tasks.workunit.client.0.vm06.stdout:7/841: unlink d5/dd/dc5/d5f/cf0 0 2026-03-09T17:29:52.680 INFO:tasks.workunit.client.0.vm06.stdout:2/633: write d3/d4/d12/d2b/d2d/f1b [1175958,11551] 0 2026-03-09T17:29:52.682 INFO:tasks.workunit.client.0.vm06.stdout:2/634: dread d3/d4/d22/d72/d8f/fbf [0,4194304] 0 2026-03-09T17:29:52.687 INFO:tasks.workunit.client.0.vm06.stdout:3/755: truncate dd/f5f 520709 0 2026-03-09T17:29:52.687 INFO:tasks.workunit.client.0.vm06.stdout:3/756: fsync dd/d19/d2c/fe9 0 2026-03-09T17:29:52.689 INFO:tasks.workunit.client.0.vm06.stdout:1/716: mknod 
d11/d14/d1d/d1e/d2a/d34/d64/ced 0 2026-03-09T17:29:52.692 INFO:tasks.workunit.client.0.vm06.stdout:6/595: symlink d6/d47/d96/da1/lc1 0 2026-03-09T17:29:52.695 INFO:tasks.workunit.client.0.vm06.stdout:8/691: creat d15/d39/d67/de3/fe5 x:0 0 0 2026-03-09T17:29:52.697 INFO:tasks.workunit.client.0.vm06.stdout:2/635: symlink d3/d4/d12/d71/lcb 0 2026-03-09T17:29:52.697 INFO:tasks.workunit.client.0.vm06.stdout:2/636: read d3/d4/f1f [1869553,19757] 0 2026-03-09T17:29:52.699 INFO:tasks.workunit.client.0.vm06.stdout:3/757: rmdir dd/d81/d97 39 2026-03-09T17:29:52.706 INFO:tasks.workunit.client.0.vm06.stdout:4/711: creat db/d1d/d21/d37/f101 x:0 0 0 2026-03-09T17:29:52.715 INFO:tasks.workunit.client.0.vm06.stdout:2/637: symlink d3/d4/d46/da5/lcc 0 2026-03-09T17:29:52.717 INFO:tasks.workunit.client.0.vm06.stdout:5/715: rename d4/d22/d46/c67 to d4/d50/c106 0 2026-03-09T17:29:52.724 INFO:tasks.workunit.client.0.vm06.stdout:9/784: dwrite d3/d15/d48/fb7 [0,4194304] 0 2026-03-09T17:29:52.734 INFO:tasks.workunit.client.0.vm06.stdout:0/851: write d7/d11/f13 [2697882,103578] 0 2026-03-09T17:29:52.740 INFO:tasks.workunit.client.0.vm06.stdout:7/842: rename d5/dd/f9d to d5/d1f/d34/ff6 0 2026-03-09T17:29:52.743 INFO:tasks.workunit.client.0.vm06.stdout:7/843: readlink d5/d1f/d34/d3f/l44 0 2026-03-09T17:29:52.743 INFO:tasks.workunit.client.0.vm06.stdout:7/844: chown d5/dd/dc5/f93 55 1 2026-03-09T17:29:52.743 INFO:tasks.workunit.client.0.vm06.stdout:5/716: mkdir d4/d50/d35/d40/d95/db8/d107 0 2026-03-09T17:29:52.745 INFO:tasks.workunit.client.0.vm06.stdout:1/717: creat d11/fee x:0 0 0 2026-03-09T17:29:52.749 INFO:tasks.workunit.client.0.vm06.stdout:4/712: mkdir db/d59/d5f/d102 0 2026-03-09T17:29:52.759 INFO:tasks.workunit.client.0.vm06.stdout:7/845: dread d5/d7/d2b/f50 [0,4194304] 0 2026-03-09T17:29:52.760 INFO:tasks.workunit.client.0.vm06.stdout:7/846: write d5/dd/dc5/d64/d6b/feb [551422,50160] 0 2026-03-09T17:29:52.761 INFO:tasks.workunit.client.0.vm06.stdout:7/847: write d5/dd/dc5/fa2 
[151891,77574] 0 2026-03-09T17:29:52.762 INFO:tasks.workunit.client.0.vm06.stdout:5/717: rename d4/d52/d55/c72 to d4/d50/d35/d40/d95/db8/c108 0 2026-03-09T17:29:52.769 INFO:tasks.workunit.client.0.vm06.stdout:4/713: fdatasync db/f51 0 2026-03-09T17:29:52.769 INFO:tasks.workunit.client.0.vm06.stdout:6/596: dwrite d6/d12/d17/d85/f9c [0,4194304] 0 2026-03-09T17:29:52.775 INFO:tasks.workunit.client.0.vm06.stdout:8/692: dwrite f12 [4194304,4194304] 0 2026-03-09T17:29:52.778 INFO:tasks.workunit.client.0.vm06.stdout:9/785: sync 2026-03-09T17:29:52.780 INFO:tasks.workunit.client.0.vm06.stdout:5/718: sync 2026-03-09T17:29:52.780 INFO:tasks.workunit.client.0.vm06.stdout:4/714: sync 2026-03-09T17:29:52.780 INFO:tasks.workunit.client.0.vm06.stdout:4/715: fsync db/f13 0 2026-03-09T17:29:52.780 INFO:tasks.workunit.client.0.vm06.stdout:5/719: readlink d4/d50/d18/d3d/l7f 0 2026-03-09T17:29:52.781 INFO:tasks.workunit.client.0.vm06.stdout:5/720: chown d4/d22/d46/c60 101878 1 2026-03-09T17:29:52.781 INFO:tasks.workunit.client.0.vm06.stdout:4/716: dread db/d59/d5f/d6d/f7b [0,4194304] 0 2026-03-09T17:29:52.790 INFO:tasks.workunit.client.0.vm06.stdout:2/638: creat d3/d4/d12/d2b/d2d/fcd x:0 0 0 2026-03-09T17:29:52.794 INFO:tasks.workunit.client.0.vm06.stdout:7/848: unlink d5/d1f/f74 0 2026-03-09T17:29:52.795 INFO:tasks.workunit.client.0.vm06.stdout:3/758: getdents dd/d19/d1e 0 2026-03-09T17:29:52.799 INFO:tasks.workunit.client.0.vm06.stdout:6/597: truncate d6/d4f/f3a 4437691 0 2026-03-09T17:29:52.800 INFO:tasks.workunit.client.0.vm06.stdout:0/852: creat d7/d11/d89/d99/df6/f124 x:0 0 0 2026-03-09T17:29:52.802 INFO:tasks.workunit.client.0.vm06.stdout:7/849: dread d5/dd/d79/fb3 [0,4194304] 0 2026-03-09T17:29:52.808 INFO:tasks.workunit.client.0.vm06.stdout:9/786: mkdir d3/d15/d36/d83/df8 0 2026-03-09T17:29:52.813 INFO:tasks.workunit.client.0.vm06.stdout:5/721: fsync d4/d22/d64/fce 0 2026-03-09T17:29:52.816 INFO:tasks.workunit.client.0.vm06.stdout:4/717: symlink db/d1d/d21/d37/d69/l103 0 
2026-03-09T17:29:52.817 INFO:tasks.workunit.client.0.vm06.stdout:3/759: mknod dd/d19/d25/d48/c104 0 2026-03-09T17:29:52.821 INFO:tasks.workunit.client.0.vm06.stdout:6/598: fsync d6/d47/d96/f7e 0 2026-03-09T17:29:52.822 INFO:tasks.workunit.client.0.vm06.stdout:1/718: write d11/d14/d1d/f31 [1026945,73537] 0 2026-03-09T17:29:52.829 INFO:tasks.workunit.client.0.vm06.stdout:6/599: dread d6/d47/d96/f7e [0,4194304] 0 2026-03-09T17:29:52.834 INFO:tasks.workunit.client.0.vm06.stdout:8/693: write fe [634035,22776] 0 2026-03-09T17:29:52.838 INFO:tasks.workunit.client.0.vm06.stdout:0/853: symlink d7/d11/d89/da8/db2/dea/l125 0 2026-03-09T17:29:52.843 INFO:tasks.workunit.client.0.vm06.stdout:7/850: fdatasync d5/dd/dc5/d5f/fb2 0 2026-03-09T17:29:52.849 INFO:tasks.workunit.client.0.vm06.stdout:5/722: fsync d4/f49 0 2026-03-09T17:29:52.852 INFO:tasks.workunit.client.0.vm06.stdout:5/723: dwrite d4/dbb/ff9 [0,4194304] 0 2026-03-09T17:29:52.857 INFO:tasks.workunit.client.0.vm06.stdout:2/639: truncate d3/fc7 13143 0 2026-03-09T17:29:52.859 INFO:tasks.workunit.client.0.vm06.stdout:4/718: rename db/d1d/d21/d25/f53 to db/d1d/d21/d26/d89/dab/dae/dcc/de0/f104 0 2026-03-09T17:29:52.864 INFO:tasks.workunit.client.0.vm06.stdout:8/694: truncate d15/d16/d19/d71/f80 2500233 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:0/854: creat d7/d11/d19/d8b/da4/d85/f126 x:0 0 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:9/787: symlink d3/de1/lf9 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:5/724: rmdir d4 39 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:7/851: rename d5/d7/de5 to d5/dd/d79/d7f/df7 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:4/719: creat db/d1d/d21/d26/d7a/f105 x:0 0 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:8/695: creat d15/d16/d19/d2b/fe6 x:0 0 0 2026-03-09T17:29:52.885 INFO:tasks.workunit.client.0.vm06.stdout:9/788: truncate d3/d26/d35/f6f 164026 0 
2026-03-09T17:29:52.890 INFO:tasks.workunit.client.0.vm06.stdout:6/600: dread d6/d12/f22 [0,4194304] 0 2026-03-09T17:29:52.897 INFO:tasks.workunit.client.0.vm06.stdout:4/720: dread - db/d1d/d21/d44/dc1/fe1 zero size 2026-03-09T17:29:52.908 INFO:tasks.workunit.client.0.vm06.stdout:0/855: truncate d7/d11/d5d/d64/fc9 2036180 0 2026-03-09T17:29:52.908 INFO:tasks.workunit.client.0.vm06.stdout:9/789: dread d3/d26/d6c/d68/f9b [0,4194304] 0 2026-03-09T17:29:52.908 INFO:tasks.workunit.client.0.vm06.stdout:4/721: dwrite db/d1d/d21/d25/d4b/df7/ffe [0,4194304] 0 2026-03-09T17:29:52.916 INFO:tasks.workunit.client.0.vm06.stdout:6/601: dread d6/d12/d17/f32 [0,4194304] 0 2026-03-09T17:29:52.930 INFO:tasks.workunit.client.0.vm06.stdout:5/725: rename d4/d50/d35/d40/d96/d98 to d4/d50/d35/d40/d109 0 2026-03-09T17:29:52.930 INFO:tasks.workunit.client.0.vm06.stdout:5/726: dread - d4/d52/db4/dc2/f100 zero size 2026-03-09T17:29:52.933 INFO:tasks.workunit.client.0.vm06.stdout:6/602: unlink l1 0 2026-03-09T17:29:52.933 INFO:tasks.workunit.client.0.vm06.stdout:6/603: read d6/d12/d2d/f39 [2905865,61411] 0 2026-03-09T17:29:52.934 INFO:tasks.workunit.client.0.vm06.stdout:5/727: truncate d4/d50/f61 7373219 0 2026-03-09T17:29:52.938 INFO:tasks.workunit.client.0.vm06.stdout:4/722: unlink db/d1d/d21/c29 0 2026-03-09T17:29:52.942 INFO:tasks.workunit.client.0.vm06.stdout:6/604: dread d6/d4f/fa3 [0,4194304] 0 2026-03-09T17:29:52.942 INFO:tasks.workunit.client.0.vm06.stdout:6/605: stat d6/d47/d96/d40/fb4 0 2026-03-09T17:29:52.956 INFO:tasks.workunit.client.0.vm06.stdout:4/723: dwrite db/d1d/d21/d44/dc1/ffb [0,4194304] 0 2026-03-09T17:29:52.956 INFO:tasks.workunit.client.0.vm06.stdout:5/728: mknod d4/d50/d35/c10a 0 2026-03-09T17:29:52.956 INFO:tasks.workunit.client.0.vm06.stdout:7/852: getdents d5/d1f 0 2026-03-09T17:29:52.956 INFO:tasks.workunit.client.0.vm06.stdout:6/606: mkdir d6/d12/d53/d8f/dc2 0 2026-03-09T17:29:52.956 INFO:tasks.workunit.client.0.vm06.stdout:4/724: mkdir 
db/d1d/d21/d25/d4b/d85/d106 0 2026-03-09T17:29:52.957 INFO:tasks.workunit.client.0.vm06.stdout:6/607: write d6/d4f/d3e/d52/d80/faa [1437030,98539] 0 2026-03-09T17:29:52.958 INFO:tasks.workunit.client.0.vm06.stdout:6/608: truncate d6/d12/fbc 662286 0 2026-03-09T17:29:52.964 INFO:tasks.workunit.client.0.vm06.stdout:5/729: unlink d4/d52/db4/dc2/ff2 0 2026-03-09T17:29:52.965 INFO:tasks.workunit.client.0.vm06.stdout:0/856: getdents d7/d11/d89 0 2026-03-09T17:29:52.978 INFO:tasks.workunit.client.0.vm06.stdout:0/857: mknod d7/d11/d19/d3c/df8/c127 0 2026-03-09T17:29:52.982 INFO:tasks.workunit.client.0.vm06.stdout:6/609: fsync d6/d12/fbb 0 2026-03-09T17:29:52.989 INFO:tasks.workunit.client.0.vm06.stdout:9/790: sync 2026-03-09T17:29:52.993 INFO:tasks.workunit.client.0.vm06.stdout:9/791: dwrite d3/d2c/f81 [0,4194304] 0 2026-03-09T17:29:53.003 INFO:tasks.workunit.client.0.vm06.stdout:4/725: creat db/df/f107 x:0 0 0 2026-03-09T17:29:53.010 INFO:tasks.workunit.client.0.vm06.stdout:1/719: write d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/f93 [881207,47703] 0 2026-03-09T17:29:53.013 INFO:tasks.workunit.client.0.vm06.stdout:1/720: dwrite d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0 2026-03-09T17:29:53.031 INFO:tasks.workunit.client.0.vm06.stdout:3/760: dwrite dd/d1d/d4e/f7d [4194304,4194304] 0 2026-03-09T17:29:53.032 INFO:tasks.workunit.client.0.vm06.stdout:3/761: readlink dd/d19/d25/d2d/l9d 0 2026-03-09T17:29:53.036 INFO:tasks.workunit.client.0.vm06.stdout:6/610: mknod d6/d4f/d3e/d52/d80/cc3 0 2026-03-09T17:29:53.043 INFO:tasks.workunit.client.0.vm06.stdout:7/853: getdents d5/dd 0 2026-03-09T17:29:53.048 INFO:tasks.workunit.client.0.vm06.stdout:4/726: truncate db/f15 3775277 0 2026-03-09T17:29:53.057 INFO:tasks.workunit.client.0.vm06.stdout:8/696: dwrite d15/d16/f23 [0,4194304] 0 2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:5/730: getdents d4/d50/d35/d40/d95 0 2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:3/762: dread dd/d19/d28/fab [0,4194304] 0 
2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:5/731: readlink d4/d22/laa 0 2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:7/854: truncate d5/d1f/d34/d3f/d8b/faa 782319 0 2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:5/732: chown d4/d50/d35/d40/d109/cdc 16 1 2026-03-09T17:29:53.071 INFO:tasks.workunit.client.0.vm06.stdout:3/763: truncate dd/d19/d28/f6f 887843 0 2026-03-09T17:29:53.072 INFO:tasks.workunit.client.0.vm06.stdout:8/697: dread d15/d16/d1a/f29 [0,4194304] 0 2026-03-09T17:29:53.077 INFO:tasks.workunit.client.0.vm06.stdout:4/727: mkdir db/d1d/d21/d108 0 2026-03-09T17:29:53.082 INFO:tasks.workunit.client.0.vm06.stdout:8/698: fsync d15/d31/d58/d9b/fb1 0 2026-03-09T17:29:53.089 INFO:tasks.workunit.client.0.vm06.stdout:6/611: sync 2026-03-09T17:29:53.090 INFO:tasks.workunit.client.0.vm06.stdout:8/699: truncate d15/d16/d1e/fa9 4347849 0 2026-03-09T17:29:53.094 INFO:tasks.workunit.client.0.vm06.stdout:8/700: fdatasync d15/fb2 0 2026-03-09T17:29:53.097 INFO:tasks.workunit.client.0.vm06.stdout:0/858: dwrite d7/d11/d19/d37/f6d [0,4194304] 0 2026-03-09T17:29:53.099 INFO:tasks.workunit.client.0.vm06.stdout:5/733: getdents d4/d50/dd6 0 2026-03-09T17:29:53.102 INFO:tasks.workunit.client.0.vm06.stdout:8/701: mkdir d15/d39/d67/d77/de7 0 2026-03-09T17:29:53.107 INFO:tasks.workunit.client.0.vm06.stdout:8/702: dwrite d15/d16/d1e/d30/db8/d5e/fb9 [0,4194304] 0 2026-03-09T17:29:53.108 INFO:tasks.workunit.client.0.vm06.stdout:6/612: dread d6/d4f/d3e/d52/f89 [0,4194304] 0 2026-03-09T17:29:53.119 INFO:tasks.workunit.client.0.vm06.stdout:2/640: write d3/fc7 [796579,62518] 0 2026-03-09T17:29:53.120 INFO:tasks.workunit.client.0.vm06.stdout:2/641: fdatasync d3/d4/d12/d2b/d2d/f9d 0 2026-03-09T17:29:53.131 INFO:tasks.workunit.client.0.vm06.stdout:9/792: dwrite d3/d15/d48/da8/db9/faf [0,4194304] 0 2026-03-09T17:29:53.141 INFO:tasks.workunit.client.0.vm06.stdout:1/721: dwrite d11/f13 [4194304,4194304] 0 2026-03-09T17:29:53.148 
INFO:tasks.workunit.client.0.vm06.stdout:6/613: truncate d6/d12/f1c 1031697 0 2026-03-09T17:29:53.148 INFO:tasks.workunit.client.0.vm06.stdout:7/855: truncate d5/f71 3387497 0 2026-03-09T17:29:53.151 INFO:tasks.workunit.client.0.vm06.stdout:3/764: dwrite dd/d19/f2b [0,4194304] 0 2026-03-09T17:29:53.155 INFO:tasks.workunit.client.0.vm06.stdout:5/734: link d4/d50/d18/d3d/fa7 d4/d22/d46/f10b 0 2026-03-09T17:29:53.160 INFO:tasks.workunit.client.0.vm06.stdout:8/703: truncate d15/d16/d19/d71/f82 1720783 0 2026-03-09T17:29:53.168 INFO:tasks.workunit.client.0.vm06.stdout:9/793: dread d3/d15/d36/d4c/f55 [0,4194304] 0 2026-03-09T17:29:53.171 INFO:tasks.workunit.client.0.vm06.stdout:1/722: link d11/d14/d1d/d42/d46/d92/dc0/fc9 d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/fef 0 2026-03-09T17:29:53.172 INFO:tasks.workunit.client.0.vm06.stdout:5/735: mknod d4/d22/dbe/c10c 0 2026-03-09T17:29:53.173 INFO:tasks.workunit.client.0.vm06.stdout:2/642: link d3/d4/d12/d71/f8c d3/d4/d12/d71/daa/d77/d81/d64/fce 0 2026-03-09T17:29:53.174 INFO:tasks.workunit.client.0.vm06.stdout:2/643: dread - d3/d4/d12/d2b/d2d/f9d zero size 2026-03-09T17:29:53.177 INFO:tasks.workunit.client.0.vm06.stdout:6/614: creat d6/d12/d53/d91/dbf/fc4 x:0 0 0 2026-03-09T17:29:53.178 INFO:tasks.workunit.client.0.vm06.stdout:3/765: symlink dd/d81/d97/l105 0 2026-03-09T17:29:53.178 INFO:tasks.workunit.client.0.vm06.stdout:3/766: chown dd/d81/fa9 23042969 1 2026-03-09T17:29:53.179 INFO:tasks.workunit.client.0.vm06.stdout:3/767: write dd/d19/d2c/fe9 [120223,49686] 0 2026-03-09T17:29:53.182 INFO:tasks.workunit.client.0.vm06.stdout:1/723: dwrite d11/d14/d1d/d42/d46/d92/dc0/fc9 [0,4194304] 0 2026-03-09T17:29:53.187 INFO:tasks.workunit.client.0.vm06.stdout:9/794: mknod d3/d6d/d9a/cfa 0 2026-03-09T17:29:53.188 INFO:tasks.workunit.client.0.vm06.stdout:6/615: mknod d6/d47/d8a/cc5 0 2026-03-09T17:29:53.190 INFO:tasks.workunit.client.0.vm06.stdout:1/724: creat d11/d14/d1c/d3a/db7/ff0 x:0 0 0 2026-03-09T17:29:53.191 
INFO:tasks.workunit.client.0.vm06.stdout:2/644: unlink d3/d4/d12/d2b/d36/ca6 0 2026-03-09T17:29:53.201 INFO:tasks.workunit.client.0.vm06.stdout:4/728: write db/d1d/d21/d37/f81 [4509023,117063] 0 2026-03-09T17:29:53.203 INFO:tasks.workunit.client.0.vm06.stdout:6/616: dread d6/d47/f49 [0,4194304] 0 2026-03-09T17:29:53.206 INFO:tasks.workunit.client.0.vm06.stdout:9/795: creat d3/d15/d16/ffb x:0 0 0 2026-03-09T17:29:53.209 INFO:tasks.workunit.client.0.vm06.stdout:0/859: write d7/d11/d2d/daf/fd5 [319657,6878] 0 2026-03-09T17:29:53.209 INFO:tasks.workunit.client.0.vm06.stdout:0/860: stat d7/d11/d19/d3c/ffd 0 2026-03-09T17:29:53.213 INFO:tasks.workunit.client.0.vm06.stdout:3/768: mkdir dd/d19/d25/d106 0 2026-03-09T17:29:53.214 INFO:tasks.workunit.client.0.vm06.stdout:4/729: dread db/f13 [0,4194304] 0 2026-03-09T17:29:53.224 INFO:tasks.workunit.client.0.vm06.stdout:4/730: write db/f68 [3717445,37948] 0 2026-03-09T17:29:53.224 INFO:tasks.workunit.client.0.vm06.stdout:4/731: chown db/d1d/c8d 9680695 1 2026-03-09T17:29:53.224 INFO:tasks.workunit.client.0.vm06.stdout:9/796: fdatasync d3/d6d/f78 0 2026-03-09T17:29:53.224 INFO:tasks.workunit.client.0.vm06.stdout:3/769: creat dd/d19/d25/d48/f107 x:0 0 0 2026-03-09T17:29:53.224 INFO:tasks.workunit.client.0.vm06.stdout:3/770: read dd/fdd [845849,33375] 0 2026-03-09T17:29:53.229 INFO:tasks.workunit.client.0.vm06.stdout:2/645: sync 2026-03-09T17:29:53.240 INFO:tasks.workunit.client.0.vm06.stdout:3/771: mknod dd/d81/da3/dae/df8/c108 0 2026-03-09T17:29:53.272 INFO:tasks.workunit.client.0.vm06.stdout:8/704: write d15/d16/d19/d71/f96 [1056658,8099] 0 2026-03-09T17:29:53.273 INFO:tasks.workunit.client.0.vm06.stdout:7/856: write d5/d1f/d34/d3f/d8b/faa [596805,122654] 0 2026-03-09T17:29:53.278 INFO:tasks.workunit.client.0.vm06.stdout:5/736: dwrite d4/d52/f8a [0,4194304] 0 2026-03-09T17:29:53.298 INFO:tasks.workunit.client.0.vm06.stdout:0/861: dwrite d7/d11/d19/d3c/df8/ffc [0,4194304] 0 2026-03-09T17:29:53.305 
INFO:tasks.workunit.client.0.vm06.stdout:1/725: write d11/d14/d1d/d42/d46/faa [685275,23071] 0 2026-03-09T17:29:53.306 INFO:tasks.workunit.client.0.vm06.stdout:1/726: truncate d11/d14/d1d/d94/fc6 1077175 0 2026-03-09T17:29:53.307 INFO:tasks.workunit.client.0.vm06.stdout:1/727: fdatasync d11/d14/d1d/d1e/d2a/d34/f3b 0 2026-03-09T17:29:53.310 INFO:tasks.workunit.client.0.vm06.stdout:4/732: write f6 [218170,83356] 0 2026-03-09T17:29:53.312 INFO:tasks.workunit.client.0.vm06.stdout:6/617: truncate d6/d4f/d3e/f62 2205362 0 2026-03-09T17:29:53.317 INFO:tasks.workunit.client.0.vm06.stdout:9/797: dwrite d3/d2c/f8c [0,4194304] 0 2026-03-09T17:29:53.345 INFO:tasks.workunit.client.0.vm06.stdout:2/646: fsync d3/d4/d22/fb2 0 2026-03-09T17:29:53.347 INFO:tasks.workunit.client.0.vm06.stdout:8/705: mkdir d15/d39/d67/de3/de8 0 2026-03-09T17:29:53.348 INFO:tasks.workunit.client.0.vm06.stdout:7/857: creat d5/d1f/d34/d3f/d8b/ff8 x:0 0 0 2026-03-09T17:29:53.351 INFO:tasks.workunit.client.0.vm06.stdout:0/862: read - d7/d11/d19/d1d/d87/f104 zero size 2026-03-09T17:29:53.353 INFO:tasks.workunit.client.0.vm06.stdout:1/728: fdatasync d11/d14/d1d/f8f 0 2026-03-09T17:29:53.353 INFO:tasks.workunit.client.0.vm06.stdout:4/733: mknod db/d1d/d21/d26/d89/dab/dae/dcc/c109 0 2026-03-09T17:29:53.354 INFO:tasks.workunit.client.0.vm06.stdout:4/734: stat db/d59/d5f/d5d 0 2026-03-09T17:29:53.355 INFO:tasks.workunit.client.0.vm06.stdout:9/798: creat d3/d2c/ffc x:0 0 0 2026-03-09T17:29:53.357 INFO:tasks.workunit.client.0.vm06.stdout:3/772: mknod dd/d19/d25/c109 0 2026-03-09T17:29:53.358 INFO:tasks.workunit.client.0.vm06.stdout:5/737: symlink d4/d50/d18/de1/l10d 0 2026-03-09T17:29:53.359 INFO:tasks.workunit.client.0.vm06.stdout:0/863: fsync d7/d11/f10c 0 2026-03-09T17:29:53.362 INFO:tasks.workunit.client.0.vm06.stdout:4/735: dread - db/d59/d5f/d6d/fd9 zero size 2026-03-09T17:29:53.363 INFO:tasks.workunit.client.0.vm06.stdout:4/736: stat db/d1d/d21/d26/d89/dab/dae/fdd 0 2026-03-09T17:29:53.364 
INFO:tasks.workunit.client.0.vm06.stdout:9/799: rename d3/d15/d16/c86 to d3/d15/d36/d4d/cfd 0 2026-03-09T17:29:53.365 INFO:tasks.workunit.client.0.vm06.stdout:3/773: rmdir dd/d19/d2c 39 2026-03-09T17:29:53.369 INFO:tasks.workunit.client.0.vm06.stdout:6/618: link d6/d12/d2d/f39 d6/d12/d53/fc6 0 2026-03-09T17:29:53.371 INFO:tasks.workunit.client.0.vm06.stdout:2/647: sync 2026-03-09T17:29:53.371 INFO:tasks.workunit.client.0.vm06.stdout:8/706: sync 2026-03-09T17:29:53.372 INFO:tasks.workunit.client.0.vm06.stdout:7/858: sync 2026-03-09T17:29:53.372 INFO:tasks.workunit.client.0.vm06.stdout:1/729: sync 2026-03-09T17:29:53.372 INFO:tasks.workunit.client.0.vm06.stdout:4/737: sync 2026-03-09T17:29:53.374 INFO:tasks.workunit.client.0.vm06.stdout:7/859: sync 2026-03-09T17:29:53.376 INFO:tasks.workunit.client.0.vm06.stdout:3/774: write dd/d5b/d65/fb7 [684780,130738] 0 2026-03-09T17:29:53.380 INFO:tasks.workunit.client.0.vm06.stdout:6/619: read - d6/d12/d17/f7a zero size 2026-03-09T17:29:53.383 INFO:tasks.workunit.client.0.vm06.stdout:6/620: dwrite d6/d4f/f26 [4194304,4194304] 0 2026-03-09T17:29:53.384 INFO:tasks.workunit.client.0.vm06.stdout:6/621: stat d6/c23 0 2026-03-09T17:29:53.384 INFO:tasks.workunit.client.0.vm06.stdout:6/622: chown d6/d47/d96/d40/f9f 165 1 2026-03-09T17:29:53.393 INFO:tasks.workunit.client.0.vm06.stdout:2/648: mkdir d3/d4/dcf 0 2026-03-09T17:29:53.395 INFO:tasks.workunit.client.0.vm06.stdout:8/707: rmdir d15/d16/d19/d3d/d5f 39 2026-03-09T17:29:53.396 INFO:tasks.workunit.client.0.vm06.stdout:1/730: rename d11/d14/d1d/d94/f95 to d11/de0/ff1 0 2026-03-09T17:29:53.397 INFO:tasks.workunit.client.0.vm06.stdout:4/738: truncate db/d1d/d21/d37/fbd 170257 0 2026-03-09T17:29:53.408 INFO:tasks.workunit.client.0.vm06.stdout:8/708: dread d15/d39/d67/d77/fa0 [0,4194304] 0 2026-03-09T17:29:53.415 INFO:tasks.workunit.client.0.vm06.stdout:1/731: symlink d11/d14/d1c/d3a/lf2 0 2026-03-09T17:29:53.420 INFO:tasks.workunit.client.0.vm06.stdout:1/732: dread d11/d14/f17 
[4194304,4194304] 0 2026-03-09T17:29:53.431 INFO:tasks.workunit.client.0.vm06.stdout:0/864: link d7/d11/d19/cb6 d7/d11/d2d/c128 0 2026-03-09T17:29:53.433 INFO:tasks.workunit.client.0.vm06.stdout:6/623: creat d6/d4f/d3e/d52/d8c/db0/fc7 x:0 0 0 2026-03-09T17:29:53.437 INFO:tasks.workunit.client.0.vm06.stdout:2/649: mknod d3/d4/d12/d2b/db0/dc1/cd0 0 2026-03-09T17:29:53.440 INFO:tasks.workunit.client.0.vm06.stdout:2/650: dwrite d3/d4/d12/f35 [0,4194304] 0 2026-03-09T17:29:53.449 INFO:tasks.workunit.client.0.vm06.stdout:9/800: write d3/d15/f17 [3106818,38349] 0 2026-03-09T17:29:53.451 INFO:tasks.workunit.client.0.vm06.stdout:0/865: dread d7/d11/d19/d1d/f8a [0,4194304] 0 2026-03-09T17:29:53.458 INFO:tasks.workunit.client.0.vm06.stdout:1/733: creat d11/d14/d1d/d1e/d2a/d99/ff3 x:0 0 0 2026-03-09T17:29:53.462 INFO:tasks.workunit.client.0.vm06.stdout:5/738: dwrite d4/f17 [0,4194304] 0 2026-03-09T17:29:53.472 INFO:tasks.workunit.client.0.vm06.stdout:7/860: mknod d5/d7/dac/dd4/cf9 0 2026-03-09T17:29:53.473 INFO:tasks.workunit.client.0.vm06.stdout:3/775: write dd/d19/d25/d44/f88 [86050,3481] 0 2026-03-09T17:29:53.474 INFO:tasks.workunit.client.0.vm06.stdout:6/624: rmdir d6/d4f/d3e/d52 39 2026-03-09T17:29:53.477 INFO:tasks.workunit.client.0.vm06.stdout:2/651: mknod d3/d4/d22/d72/cd1 0 2026-03-09T17:29:53.482 INFO:tasks.workunit.client.0.vm06.stdout:4/739: dwrite db/f6f [0,4194304] 0 2026-03-09T17:29:53.484 INFO:tasks.workunit.client.0.vm06.stdout:4/740: chown db/d1d/d21/d44/dc1/fe1 678 1 2026-03-09T17:29:53.487 INFO:tasks.workunit.client.0.vm06.stdout:1/734: symlink d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/lf4 0 2026-03-09T17:29:53.489 INFO:tasks.workunit.client.0.vm06.stdout:1/735: fsync d11/d14/d1d/d42/d46/fcf 0 2026-03-09T17:29:53.498 INFO:tasks.workunit.client.0.vm06.stdout:3/776: readlink dd/d19/d2c/lcd 0 2026-03-09T17:29:53.502 INFO:tasks.workunit.client.0.vm06.stdout:7/861: dread d5/dd/dc5/f32 [0,4194304] 0 2026-03-09T17:29:53.510 
INFO:tasks.workunit.client.0.vm06.stdout:0/866: mkdir d7/d11/d19/d3c/db9/ddd/d10e/d129 0 2026-03-09T17:29:53.512 INFO:tasks.workunit.client.0.vm06.stdout:5/739: truncate d4/d22/f90 912941 0 2026-03-09T17:29:53.513 INFO:tasks.workunit.client.0.vm06.stdout:1/736: mknod d11/d14/d1d/d1e/d2a/d99/db0/cf5 0 2026-03-09T17:29:53.515 INFO:tasks.workunit.client.0.vm06.stdout:6/625: dread d6/d4f/d3e/d52/f84 [0,4194304] 0 2026-03-09T17:29:53.520 INFO:tasks.workunit.client.0.vm06.stdout:5/740: symlink d4/d22/d46/l10e 0 2026-03-09T17:29:53.521 INFO:tasks.workunit.client.0.vm06.stdout:6/626: mknod d6/d12/d53/d8f/cc8 0 2026-03-09T17:29:53.522 INFO:tasks.workunit.client.0.vm06.stdout:6/627: fsync d6/d12/d17/d85/faf 0 2026-03-09T17:29:53.527 INFO:tasks.workunit.client.0.vm06.stdout:4/741: rename db/d1d/d21/d26/d89/dab/dae to db/d59/d5f/d45/d10a 0 2026-03-09T17:29:53.531 INFO:tasks.workunit.client.0.vm06.stdout:4/742: dwrite db/d1d/d21/d26/d7a/f105 [0,4194304] 0 2026-03-09T17:29:53.539 INFO:tasks.workunit.client.0.vm06.stdout:5/741: truncate d4/d22/d64/fcc 3713015 0 2026-03-09T17:29:53.541 INFO:tasks.workunit.client.0.vm06.stdout:6/628: unlink d6/d47/d4d/d6d/la6 0 2026-03-09T17:29:53.543 INFO:tasks.workunit.client.0.vm06.stdout:0/867: creat d7/f12a x:0 0 0 2026-03-09T17:29:53.554 INFO:tasks.workunit.client.0.vm06.stdout:4/743: creat db/d59/f10b x:0 0 0 2026-03-09T17:29:53.554 INFO:tasks.workunit.client.0.vm06.stdout:1/737: dread d11/d14/d1d/d1e/d2a/f43 [0,4194304] 0 2026-03-09T17:29:53.554 INFO:tasks.workunit.client.0.vm06.stdout:1/738: write d11/d14/d1c/d3a/db7/ff0 [624733,28304] 0 2026-03-09T17:29:53.554 INFO:tasks.workunit.client.0.vm06.stdout:6/629: readlink d6/l20 0 2026-03-09T17:29:53.554 INFO:tasks.workunit.client.0.vm06.stdout:6/630: read - d6/d47/d4d/d6d/fbe zero size 2026-03-09T17:29:53.558 INFO:tasks.workunit.client.0.vm06.stdout:8/709: dread d15/d16/d19/d2b/d85/f9a [0,4194304] 0 2026-03-09T17:29:53.559 INFO:tasks.workunit.client.0.vm06.stdout:0/868: creat 
d7/d11/d19/d8b/da4/d85/f12b x:0 0 0 2026-03-09T17:29:53.562 INFO:tasks.workunit.client.0.vm06.stdout:6/631: mkdir d6/d4f/d3e/d52/d80/dc9 0 2026-03-09T17:29:53.567 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:53 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:29:53.567 INFO:tasks.workunit.client.0.vm06.stdout:8/710: dread - d15/d39/d67/d77/d97/fad zero size 2026-03-09T17:29:53.568 INFO:tasks.workunit.client.0.vm06.stdout:0/869: unlink d7/l2b 0 2026-03-09T17:29:53.569 INFO:tasks.workunit.client.0.vm06.stdout:9/801: dwrite d3/d15/d16/f72 [0,4194304] 0 2026-03-09T17:29:53.569 INFO:tasks.workunit.client.0.vm06.stdout:4/744: creat db/de2/f10c x:0 0 0 2026-03-09T17:29:53.572 INFO:tasks.workunit.client.0.vm06.stdout:8/711: dwrite d15/d16/d1e/f4e [4194304,4194304] 0 2026-03-09T17:29:53.572 INFO:tasks.workunit.client.0.vm06.stdout:3/777: dread dd/d1d/f34 [0,4194304] 0 2026-03-09T17:29:53.576 INFO:tasks.workunit.client.0.vm06.stdout:1/739: sync 2026-03-09T17:29:53.585 INFO:tasks.workunit.client.0.vm06.stdout:8/712: dread d15/d31/f33 [0,4194304] 0 2026-03-09T17:29:53.589 INFO:tasks.workunit.client.0.vm06.stdout:6/632: truncate d6/d12/d53/fc6 2060497 0 2026-03-09T17:29:53.589 INFO:tasks.workunit.client.0.vm06.stdout:6/633: stat d6/d12/d2d/c2e 0 2026-03-09T17:29:53.599 INFO:tasks.workunit.client.0.vm06.stdout:2/652: dwrite d3/d4/f1f [0,4194304] 0 2026-03-09T17:29:53.600 INFO:tasks.workunit.client.0.vm06.stdout:2/653: chown d3/d4/d12 14685170 1 2026-03-09T17:29:53.614 INFO:tasks.workunit.client.0.vm06.stdout:7/862: dwrite d5/dd/dc5/d5f/fd8 [0,4194304] 0 2026-03-09T17:29:53.618 INFO:tasks.workunit.client.0.vm06.stdout:4/745: symlink db/d1d/d21/d26/d89/l10d 0 2026-03-09T17:29:53.620 INFO:tasks.workunit.client.0.vm06.stdout:9/802: truncate d3/d15/d36/d4d/f60 5482605 0 2026-03-09T17:29:53.626 INFO:tasks.workunit.client.0.vm06.stdout:0/870: mkdir 
d7/d11/d19/d23/db7/dbd/d101/d12c 0 2026-03-09T17:29:53.626 INFO:tasks.workunit.client.0.vm06.stdout:0/871: stat d7/d11/d19/d37/f4f 0 2026-03-09T17:29:53.630 INFO:tasks.workunit.client.0.vm06.stdout:1/740: mkdir d11/d14/d1d/d1e/d2a/d34/d64/df6 0 2026-03-09T17:29:53.633 INFO:tasks.workunit.client.0.vm06.stdout:8/713: creat d15/d39/d67/de3/fe9 x:0 0 0 2026-03-09T17:29:53.634 INFO:tasks.workunit.client.0.vm06.stdout:6/634: fdatasync d6/d47/f49 0 2026-03-09T17:29:53.635 INFO:tasks.workunit.client.0.vm06.stdout:6/635: chown d6/d47/f88 29396 1 2026-03-09T17:29:53.638 INFO:tasks.workunit.client.0.vm06.stdout:5/742: write d4/d50/fd7 [68997,124585] 0 2026-03-09T17:29:53.639 INFO:tasks.workunit.client.0.vm06.stdout:2/654: rename d3/d4/d22/fb2 to d3/d4/d22/fd2 0 2026-03-09T17:29:53.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:53 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:29:53.644 INFO:tasks.workunit.client.0.vm06.stdout:9/803: mknod d3/d15/d48/da8/cfe 0 2026-03-09T17:29:53.645 INFO:tasks.workunit.client.0.vm06.stdout:0/872: symlink d7/d88/l12d 0 2026-03-09T17:29:53.651 INFO:tasks.workunit.client.0.vm06.stdout:3/778: link dd/d5b/d65/fb7 dd/d19/d1e/f10a 0 2026-03-09T17:29:53.653 INFO:tasks.workunit.client.0.vm06.stdout:1/741: mkdir d11/d14/d1d/d4a/df7 0 2026-03-09T17:29:53.660 INFO:tasks.workunit.client.0.vm06.stdout:5/743: symlink d4/d50/d35/l10f 0 2026-03-09T17:29:53.663 INFO:tasks.workunit.client.0.vm06.stdout:2/655: mknod d3/d4/d12/d71/daa/cd3 0 2026-03-09T17:29:53.666 INFO:tasks.workunit.client.0.vm06.stdout:7/863: creat d5/d7/d2b/dc8/ffa x:0 0 0 2026-03-09T17:29:53.670 INFO:tasks.workunit.client.0.vm06.stdout:4/746: rename db/d57/ccb to db/d59/c10e 0 2026-03-09T17:29:53.674 INFO:tasks.workunit.client.0.vm06.stdout:9/804: symlink d3/d26/d6c/d68/lff 0 2026-03-09T17:29:53.680 INFO:tasks.workunit.client.0.vm06.stdout:1/742: mknod 
d11/d14/d1d/dd1/cf8 0 2026-03-09T17:29:53.680 INFO:tasks.workunit.client.0.vm06.stdout:1/743: chown d11/d14/d1d/d1e/d2a/d34/d64/df6 252135301 1 2026-03-09T17:29:53.683 INFO:tasks.workunit.client.0.vm06.stdout:0/873: dread d7/d11/f75 [0,4194304] 0 2026-03-09T17:29:53.685 INFO:tasks.workunit.client.0.vm06.stdout:8/714: unlink d15/d16/d1a/f22 0 2026-03-09T17:29:53.687 INFO:tasks.workunit.client.0.vm06.stdout:5/744: symlink d4/d50/d35/d40/d109/l110 0 2026-03-09T17:29:53.688 INFO:tasks.workunit.client.0.vm06.stdout:2/656: mkdir d3/d4/d12/d2b/d36/dd4 0 2026-03-09T17:29:53.698 INFO:tasks.workunit.client.0.vm06.stdout:7/864: dread d5/f71 [0,4194304] 0 2026-03-09T17:29:53.699 INFO:tasks.workunit.client.0.vm06.stdout:6/636: rename d6/d47/d4d/d6d/c75 to d6/d4f/d3e/d52/d95/cca 0 2026-03-09T17:29:53.701 INFO:tasks.workunit.client.0.vm06.stdout:4/747: dread - db/d1d/d21/d44/d8a/fa7 zero size 2026-03-09T17:29:53.701 INFO:tasks.workunit.client.0.vm06.stdout:6/637: dread d6/d47/f49 [0,4194304] 0 2026-03-09T17:29:53.705 INFO:tasks.workunit.client.0.vm06.stdout:6/638: dwrite d6/d47/d96/f3d [0,4194304] 0 2026-03-09T17:29:53.719 INFO:tasks.workunit.client.0.vm06.stdout:9/805: creat d3/d15/d36/d4c/d6a/f100 x:0 0 0 2026-03-09T17:29:53.721 INFO:tasks.workunit.client.0.vm06.stdout:1/744: creat d11/d14/d1d/d1e/d2a/d99/de9/ff9 x:0 0 0 2026-03-09T17:29:53.724 INFO:tasks.workunit.client.0.vm06.stdout:0/874: chown d7/d11/d19/d3c/l83 34 1 2026-03-09T17:29:53.735 INFO:tasks.workunit.client.0.vm06.stdout:6/639: mkdir d6/d12/d53/d91/dcb 0 2026-03-09T17:29:53.735 INFO:tasks.workunit.client.0.vm06.stdout:6/640: fdatasync d6/d47/f88 0 2026-03-09T17:29:53.736 INFO:tasks.workunit.client.0.vm06.stdout:9/806: rmdir d3/d11/d65 39 2026-03-09T17:29:53.737 INFO:tasks.workunit.client.0.vm06.stdout:3/779: creat dd/d19/f10b x:0 0 0 2026-03-09T17:29:53.737 INFO:tasks.workunit.client.0.vm06.stdout:3/780: chown dd/d19/d25/d44/d80/dd7 901855 1 2026-03-09T17:29:53.738 INFO:tasks.workunit.client.0.vm06.stdout:1/745: 
mkdir d11/d14/d1d/d1e/d2a/d34/d64/dfa 0 2026-03-09T17:29:53.739 INFO:tasks.workunit.client.0.vm06.stdout:9/807: read d3/d15/d36/d83/fc6 [478003,23874] 0 2026-03-09T17:29:53.741 INFO:tasks.workunit.client.0.vm06.stdout:7/865: dread d5/d1f/d34/d3f/f5b [0,4194304] 0 2026-03-09T17:29:53.742 INFO:tasks.workunit.client.0.vm06.stdout:7/866: readlink d5/dd/dc5/d64/l72 0 2026-03-09T17:29:53.742 INFO:tasks.workunit.client.0.vm06.stdout:3/781: dwrite dd/d19/d2c/fe9 [0,4194304] 0 2026-03-09T17:29:53.743 INFO:tasks.workunit.client.0.vm06.stdout:7/867: write d5/d1f/d34/d46/d51/fda [712964,97681] 0 2026-03-09T17:29:53.743 INFO:tasks.workunit.client.0.vm06.stdout:3/782: stat dd/d19/d25/d2d/l9d 0 2026-03-09T17:29:53.746 INFO:tasks.workunit.client.0.vm06.stdout:0/875: rmdir d7/d11/d5d 39 2026-03-09T17:29:53.747 INFO:tasks.workunit.client.0.vm06.stdout:2/657: link d3/d4/d22/fd2 d3/d4/d12/d2b/d36/dd4/fd5 0 2026-03-09T17:29:53.776 INFO:tasks.workunit.client.0.vm06.stdout:7/868: dread d5/dd/dc5/d5f/fb2 [0,4194304] 0 2026-03-09T17:29:53.777 INFO:tasks.workunit.client.0.vm06.stdout:7/869: chown d5/d7/d2b/l84 56 1 2026-03-09T17:29:53.780 INFO:tasks.workunit.client.0.vm06.stdout:8/715: dwrite d15/f3e [0,4194304] 0 2026-03-09T17:29:53.793 INFO:tasks.workunit.client.0.vm06.stdout:4/748: rename db/d59/c10e to db/d1d/d21/d37/d69/d78/db4/c10f 0 2026-03-09T17:29:53.794 INFO:tasks.workunit.client.0.vm06.stdout:4/749: chown db/l9d 677 1 2026-03-09T17:29:53.803 INFO:tasks.workunit.client.0.vm06.stdout:2/658: dread d3/d4/d12/d2b/d2d/f2a [0,4194304] 0 2026-03-09T17:29:53.803 INFO:tasks.workunit.client.0.vm06.stdout:3/783: mknod dd/d19/d25/c10c 0 2026-03-09T17:29:53.803 INFO:tasks.workunit.client.0.vm06.stdout:2/659: chown d3/d4/d12/da7/lbc 444796841 1 2026-03-09T17:29:53.804 INFO:tasks.workunit.client.0.vm06.stdout:3/784: truncate dd/d19/d28/ffe 952897 0 2026-03-09T17:29:53.804 INFO:tasks.workunit.client.0.vm06.stdout:3/785: write dd/d81/da3/fbc [3832391,53619] 0 2026-03-09T17:29:53.811 
INFO:tasks.workunit.client.0.vm06.stdout:6/641: dwrite d6/d12/d53/f64 [0,4194304] 0 2026-03-09T17:29:53.814 INFO:tasks.workunit.client.0.vm06.stdout:6/642: write d6/d47/d96/da1/fb7 [580599,101102] 0 2026-03-09T17:29:53.815 INFO:tasks.workunit.client.0.vm06.stdout:9/808: dwrite d3/d6d/f78 [0,4194304] 0 2026-03-09T17:29:53.820 INFO:tasks.workunit.client.0.vm06.stdout:5/745: getdents d4/d22/d46 0 2026-03-09T17:29:53.825 INFO:tasks.workunit.client.0.vm06.stdout:9/809: truncate d3/d15/d36/d4d/fa4 4452906 0 2026-03-09T17:29:53.837 INFO:tasks.workunit.client.0.vm06.stdout:2/660: dread d3/f91 [0,4194304] 0 2026-03-09T17:29:53.846 INFO:tasks.workunit.client.0.vm06.stdout:4/750: mkdir db/d1d/d21/d25/d4b/d85/d106/d110 0 2026-03-09T17:29:53.846 INFO:tasks.workunit.client.0.vm06.stdout:4/751: write db/df/f107 [903949,101375] 0 2026-03-09T17:29:53.847 INFO:tasks.workunit.client.0.vm06.stdout:4/752: chown db/d1d/d21/d37/d69/d78/c7d 26233 1 2026-03-09T17:29:53.855 INFO:tasks.workunit.client.0.vm06.stdout:0/876: getdents d7/d11/d19/d3c 0 2026-03-09T17:29:53.858 INFO:tasks.workunit.client.0.vm06.stdout:0/877: write d7/d11/d19/d3c/db9/f123 [292372,106382] 0 2026-03-09T17:29:53.858 INFO:tasks.workunit.client.0.vm06.stdout:7/870: link d5/d7/d2b/f50 d5/d1f/d34/ffb 0 2026-03-09T17:29:53.861 INFO:tasks.workunit.client.0.vm06.stdout:4/753: truncate db/d1d/d21/f2f 2235477 0 2026-03-09T17:29:53.865 INFO:tasks.workunit.client.0.vm06.stdout:5/746: sync 2026-03-09T17:29:53.866 INFO:tasks.workunit.client.0.vm06.stdout:2/661: mknod d3/d4/dcf/cd6 0 2026-03-09T17:29:53.880 INFO:tasks.workunit.client.0.vm06.stdout:1/746: rename d11/d14/d1d/d42/d46/d92/fde to d11/d14/d1d/d42/d46/d92/dc0/ffb 0 2026-03-09T17:29:53.881 INFO:tasks.workunit.client.0.vm06.stdout:1/747: write d11/d14/d1d/d42/d46/d92/dc0/f68 [8670100,63851] 0 2026-03-09T17:29:53.882 INFO:tasks.workunit.client.0.vm06.stdout:1/748: chown d11/d14/d1c/d5f/c75 78 1 2026-03-09T17:29:53.890 INFO:tasks.workunit.client.0.vm06.stdout:3/786: write 
dd/d19/d25/fd1 [1668568,85110] 0 2026-03-09T17:29:53.892 INFO:tasks.workunit.client.0.vm06.stdout:8/716: dwrite d15/d39/f40 [0,4194304] 0 2026-03-09T17:29:53.893 INFO:tasks.workunit.client.0.vm06.stdout:8/717: chown d15/d16/d1e/d30/db8/d5e/fb9 498858 1 2026-03-09T17:29:53.900 INFO:tasks.workunit.client.0.vm06.stdout:6/643: dwrite d6/d4f/d3e/d52/f89 [0,4194304] 0 2026-03-09T17:29:53.919 INFO:tasks.workunit.client.0.vm06.stdout:9/810: write d3/d11/d65/d80/fd2 [306690,76134] 0 2026-03-09T17:29:53.921 INFO:tasks.workunit.client.0.vm06.stdout:0/878: dwrite d7/d11/d19/d3c/df3/f118 [0,4194304] 0 2026-03-09T17:29:53.925 INFO:tasks.workunit.client.0.vm06.stdout:7/871: dwrite d5/d1f/d34/d46/f55 [0,4194304] 0 2026-03-09T17:29:53.927 INFO:tasks.workunit.client.0.vm06.stdout:5/747: rename d4/dbb/cf7 to d4/d22/d64/df3/c111 0 2026-03-09T17:29:53.928 INFO:tasks.workunit.client.0.vm06.stdout:5/748: chown d4/d50/lb3 24865 1 2026-03-09T17:29:53.928 INFO:tasks.workunit.client.0.vm06.stdout:5/749: stat d4/d50/d35/f4d 0 2026-03-09T17:29:53.930 INFO:tasks.workunit.client.0.vm06.stdout:1/749: fsync d11/d14/d1d/d42/f52 0 2026-03-09T17:29:53.931 INFO:tasks.workunit.client.0.vm06.stdout:1/750: chown d11/d14/d1d/d42/d46/d92/fa3 19414 1 2026-03-09T17:29:53.933 INFO:tasks.workunit.client.0.vm06.stdout:3/787: mkdir dd/d19/d25/d44/d80/dd7/d10d 0 2026-03-09T17:29:53.936 INFO:tasks.workunit.client.0.vm06.stdout:9/811: symlink d3/d26/dd7/l101 0 2026-03-09T17:29:53.937 INFO:tasks.workunit.client.0.vm06.stdout:9/812: chown d3/d15/d36/d83/df8 1371949 1 2026-03-09T17:29:53.945 INFO:tasks.workunit.client.0.vm06.stdout:2/662: creat d3/d4/d12/d2b/d36/dd4/fd7 x:0 0 0 2026-03-09T17:29:53.951 INFO:tasks.workunit.client.0.vm06.stdout:7/872: rename d5/dd/dc5/d64/fe0 to d5/d1f/dae/ffc 0 2026-03-09T17:29:53.953 INFO:tasks.workunit.client.0.vm06.stdout:5/750: fsync d4/d50/f1e 0 2026-03-09T17:29:53.959 INFO:tasks.workunit.client.0.vm06.stdout:1/751: chown d11/d69/c76 29205223 1 2026-03-09T17:29:53.964 
INFO:tasks.workunit.client.0.vm06.stdout:8/718: truncate d15/d39/d67/d77/fc3 3934396 0 2026-03-09T17:29:53.967 INFO:tasks.workunit.client.0.vm06.stdout:8/719: dwrite d15/d16/d1e/fa9 [4194304,4194304] 0 2026-03-09T17:29:53.981 INFO:tasks.workunit.client.0.vm06.stdout:2/663: creat d3/d4/d46/fd8 x:0 0 0 2026-03-09T17:29:53.992 INFO:tasks.workunit.client.0.vm06.stdout:5/751: mkdir d4/d52/d112 0 2026-03-09T17:29:53.995 INFO:tasks.workunit.client.0.vm06.stdout:1/752: dwrite d11/d14/f17 [0,4194304] 0 2026-03-09T17:29:53.997 INFO:tasks.workunit.client.0.vm06.stdout:5/752: dwrite d4/d22/f3f [0,4194304] 0 2026-03-09T17:29:54.009 INFO:tasks.workunit.client.0.vm06.stdout:3/788: truncate dd/d5b/d65/fb7 345894 0 2026-03-09T17:29:54.017 INFO:tasks.workunit.client.0.vm06.stdout:6/644: creat d6/d12/d53/fcc x:0 0 0 2026-03-09T17:29:54.018 INFO:tasks.workunit.client.0.vm06.stdout:4/754: write db/f51 [2267206,52432] 0 2026-03-09T17:29:54.024 INFO:tasks.workunit.client.0.vm06.stdout:8/720: dread d15/d16/f50 [0,4194304] 0 2026-03-09T17:29:54.025 INFO:tasks.workunit.client.0.vm06.stdout:8/721: read - d15/d16/d19/d3d/fc0 zero size 2026-03-09T17:29:54.033 INFO:tasks.workunit.client.0.vm06.stdout:9/813: dwrite d3/d15/f74 [0,4194304] 0 2026-03-09T17:29:54.040 INFO:tasks.workunit.client.0.vm06.stdout:5/753: mknod d4/dca/c113 0 2026-03-09T17:29:54.042 INFO:tasks.workunit.client.0.vm06.stdout:3/789: mknod dd/d1d/c10e 0 2026-03-09T17:29:54.050 INFO:tasks.workunit.client.0.vm06.stdout:4/755: fdatasync db/d1d/d21/d37/d69/f8b 0 2026-03-09T17:29:54.056 INFO:tasks.workunit.client.0.vm06.stdout:8/722: sync 2026-03-09T17:29:54.059 INFO:tasks.workunit.client.0.vm06.stdout:9/814: symlink d3/d6d/d9a/d9c/dcd/l102 0 2026-03-09T17:29:54.061 INFO:tasks.workunit.client.0.vm06.stdout:5/754: fdatasync d4/d52/fcd 0 2026-03-09T17:29:54.068 INFO:tasks.workunit.client.0.vm06.stdout:4/756: mkdir db/d1d/d21/d37/d69/d78/db4/d111 0 2026-03-09T17:29:54.073 INFO:tasks.workunit.client.0.vm06.stdout:7/873: write 
d5/d7/dac/fcb [592759,52157] 0 2026-03-09T17:29:54.075 INFO:tasks.workunit.client.0.vm06.stdout:2/664: write d3/d4/d12/d2b/d36/dd4/fd5 [969758,83876] 0 2026-03-09T17:29:54.076 INFO:tasks.workunit.client.0.vm06.stdout:2/665: chown d3/d4/d46/l80 7 1 2026-03-09T17:29:54.077 INFO:tasks.workunit.client.0.vm06.stdout:6/645: write d6/d12/d17/f6b [197906,16745] 0 2026-03-09T17:29:54.083 INFO:tasks.workunit.client.0.vm06.stdout:0/879: getdents d7/d11/d5d 0 2026-03-09T17:29:54.088 INFO:tasks.workunit.client.0.vm06.stdout:1/753: link d11/d14/d1d/f8f d11/d14/d1d/d1e/d2a/d34/d64/df6/ffc 0 2026-03-09T17:29:54.090 INFO:tasks.workunit.client.0.vm06.stdout:5/755: truncate d4/d50/d35/f39 3174331 0 2026-03-09T17:29:54.101 INFO:tasks.workunit.client.0.vm06.stdout:6/646: symlink d6/d4f/d73/lcd 0 2026-03-09T17:29:54.101 INFO:tasks.workunit.client.0.vm06.stdout:6/647: write d6/d4f/d3e/d52/d80/faa [4259352,125119] 0 2026-03-09T17:29:54.103 INFO:tasks.workunit.client.0.vm06.stdout:9/815: dread d3/d15/f23 [0,4194304] 0 2026-03-09T17:29:54.104 INFO:tasks.workunit.client.0.vm06.stdout:0/880: unlink d7/c3e 0 2026-03-09T17:29:54.112 INFO:tasks.workunit.client.0.vm06.stdout:6/648: dread d6/d4f/f25 [0,4194304] 0 2026-03-09T17:29:54.113 INFO:tasks.workunit.client.0.vm06.stdout:9/816: dread d3/d26/d35/f99 [0,4194304] 0 2026-03-09T17:29:54.115 INFO:tasks.workunit.client.0.vm06.stdout:1/754: truncate d11/d14/d1c/f2e 2056632 0 2026-03-09T17:29:54.115 INFO:tasks.workunit.client.0.vm06.stdout:3/790: write dd/d81/da3/dae/fcb [1995781,53943] 0 2026-03-09T17:29:54.116 INFO:tasks.workunit.client.0.vm06.stdout:1/755: chown d11/d14/d1d/d1e/d2a/d34/f3b 2567 1 2026-03-09T17:29:54.119 INFO:tasks.workunit.client.0.vm06.stdout:7/874: write d5/dd/dc5/d64/d6b/fb8 [965798,62412] 0 2026-03-09T17:29:54.122 INFO:tasks.workunit.client.0.vm06.stdout:5/756: read d4/fd [1373272,69363] 0 2026-03-09T17:29:54.128 INFO:tasks.workunit.client.0.vm06.stdout:4/757: mknod db/d59/d5f/d6d/ddb/c112 0 2026-03-09T17:29:54.134 
INFO:tasks.workunit.client.0.vm06.stdout:0/881: dread - d7/f106 zero size 2026-03-09T17:29:54.139 INFO:tasks.workunit.client.0.vm06.stdout:4/758: sync 2026-03-09T17:29:54.140 INFO:tasks.workunit.client.0.vm06.stdout:1/756: truncate d11/d14/d1d/d1e/d2a/d34/d64/df6/ffc 2990170 0 2026-03-09T17:29:54.140 INFO:tasks.workunit.client.0.vm06.stdout:8/723: link d15/d16/d19/d71/lc4 d15/d39/d67/lea 0 2026-03-09T17:29:54.141 INFO:tasks.workunit.client.0.vm06.stdout:8/724: truncate d15/d16/d1e/f59 5493124 0 2026-03-09T17:29:54.145 INFO:tasks.workunit.client.0.vm06.stdout:7/875: dread d5/d1f/d34/d3f/fca [0,4194304] 0 2026-03-09T17:29:54.146 INFO:tasks.workunit.client.0.vm06.stdout:5/757: symlink d4/d22/d64/l114 0 2026-03-09T17:29:54.154 INFO:tasks.workunit.client.0.vm06.stdout:0/882: symlink d7/d11/d19/d1d/d39/l12e 0 2026-03-09T17:29:54.162 INFO:tasks.workunit.client.0.vm06.stdout:9/817: mkdir d3/d15/d36/d83/df8/d103 0 2026-03-09T17:29:54.162 INFO:tasks.workunit.client.0.vm06.stdout:3/791: mknod dd/d19/d25/df0/c10f 0 2026-03-09T17:29:54.163 INFO:tasks.workunit.client.0.vm06.stdout:1/757: creat d11/de0/ffd x:0 0 0 2026-03-09T17:29:54.164 INFO:tasks.workunit.client.0.vm06.stdout:1/758: write d11/d14/d1d/d94/fc6 [604925,82403] 0 2026-03-09T17:29:54.170 INFO:tasks.workunit.client.0.vm06.stdout:7/876: creat d5/dd/dc5/d5f/ffd x:0 0 0 2026-03-09T17:29:54.173 INFO:tasks.workunit.client.0.vm06.stdout:7/877: read d5/dd/f1a [2041001,63317] 0 2026-03-09T17:29:54.173 INFO:tasks.workunit.client.0.vm06.stdout:7/878: write d5/d7/d2b/dc8/ffa [742223,22699] 0 2026-03-09T17:29:54.174 INFO:tasks.workunit.client.0.vm06.stdout:2/666: getdents d3/d4/d12/d2b/d9f 0 2026-03-09T17:29:54.178 INFO:tasks.workunit.client.0.vm06.stdout:0/883: symlink d7/d11/d19/d3c/db9/l12f 0 2026-03-09T17:29:54.179 INFO:tasks.workunit.client.0.vm06.stdout:2/667: dread f2 [0,4194304] 0 2026-03-09T17:29:54.183 INFO:tasks.workunit.client.0.vm06.stdout:2/668: dwrite d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/fc0 [0,4194304] 0 
2026-03-09T17:29:54.184 INFO:tasks.workunit.client.0.vm06.stdout:0/884: sync 2026-03-09T17:29:54.195 INFO:tasks.workunit.client.0.vm06.stdout:9/818: rename d3/d15/cca to d3/d15/d36/d4c/d6a/c104 0 2026-03-09T17:29:54.197 INFO:tasks.workunit.client.0.vm06.stdout:9/819: dwrite d3/d6d/d9a/d9c/fab [0,4194304] 0 2026-03-09T17:29:54.201 INFO:tasks.workunit.client.0.vm06.stdout:3/792: mkdir dd/d81/da3/dae/d110 0 2026-03-09T17:29:54.206 INFO:tasks.workunit.client.0.vm06.stdout:8/725: creat d15/d39/d67/d77/de7/feb x:0 0 0 2026-03-09T17:29:54.206 INFO:tasks.workunit.client.0.vm06.stdout:5/758: truncate d4/d22/d46/dec/f105 318422 0 2026-03-09T17:29:54.212 INFO:tasks.workunit.client.0.vm06.stdout:6/649: getdents d6/d4f/d3e/d52/d80 0 2026-03-09T17:29:54.217 INFO:tasks.workunit.client.0.vm06.stdout:6/650: dread d6/d12/f22 [4194304,4194304] 0 2026-03-09T17:29:54.219 INFO:tasks.workunit.client.0.vm06.stdout:2/669: fsync d3/d4/d12/d2b/d9f/fb8 0 2026-03-09T17:29:54.220 INFO:tasks.workunit.client.0.vm06.stdout:2/670: read d3/d4/d12/d2b/d36/dd4/fd5 [671077,101528] 0 2026-03-09T17:29:54.227 INFO:tasks.workunit.client.0.vm06.stdout:9/820: dread d3/d26/d35/f6f [0,4194304] 0 2026-03-09T17:29:54.228 INFO:tasks.workunit.client.0.vm06.stdout:4/759: creat db/d1d/d21/d26/f113 x:0 0 0 2026-03-09T17:29:54.233 INFO:tasks.workunit.client.0.vm06.stdout:3/793: dread - dd/d1d/d6e/d70/fbd zero size 2026-03-09T17:29:54.234 INFO:tasks.workunit.client.0.vm06.stdout:1/759: symlink d11/d14/d1d/d1e/dd6/lfe 0 2026-03-09T17:29:54.235 INFO:tasks.workunit.client.0.vm06.stdout:1/760: chown d11/d14/d1d 968138 1 2026-03-09T17:29:54.236 INFO:tasks.workunit.client.0.vm06.stdout:5/759: creat d4/d52/d55/dee/f115 x:0 0 0 2026-03-09T17:29:54.238 INFO:tasks.workunit.client.0.vm06.stdout:5/760: read d4/f49 [1788309,58803] 0 2026-03-09T17:29:54.246 INFO:tasks.workunit.client.0.vm06.stdout:2/671: creat d3/d4/d12/d71/daa/d77/fd9 x:0 0 0 2026-03-09T17:29:54.251 INFO:tasks.workunit.client.0.vm06.stdout:6/651: write 
d6/d12/d17/f32 [5427947,103054] 0 2026-03-09T17:29:54.255 INFO:tasks.workunit.client.0.vm06.stdout:9/821: creat d3/d15/d16/f105 x:0 0 0 2026-03-09T17:29:54.255 INFO:tasks.workunit.client.0.vm06.stdout:9/822: readlink d3/d6d/d9a/d9c/lf0 0 2026-03-09T17:29:54.259 INFO:tasks.workunit.client.0.vm06.stdout:4/760: dwrite db/d1d/d21/d26/d89/fb1 [0,4194304] 0 2026-03-09T17:29:54.260 INFO:tasks.workunit.client.0.vm06.stdout:4/761: write db/d1d/d21/d37/f54 [5803592,1571] 0 2026-03-09T17:29:54.331 INFO:tasks.workunit.client.0.vm06.stdout:2/672: rmdir d3/d4/d12/d2b 39 2026-03-09T17:29:54.331 INFO:tasks.workunit.client.0.vm06.stdout:0/885: creat d7/d11/d19/d23/db7/dbd/f130 x:0 0 0 2026-03-09T17:29:54.334 INFO:tasks.workunit.client.0.vm06.stdout:0/886: dwrite d7/d11/d2d/fe7 [0,4194304] 0 2026-03-09T17:29:54.335 INFO:tasks.workunit.client.0.vm06.stdout:0/887: chown d7/d11/d19/d3c/db9/dd8/f117 93450 1 2026-03-09T17:29:54.337 INFO:tasks.workunit.client.0.vm06.stdout:0/888: write d7/d11/d19/d1d/f4c [788886,125245] 0 2026-03-09T17:29:54.341 INFO:tasks.workunit.client.0.vm06.stdout:4/762: symlink db/d59/d5f/d6d/l114 0 2026-03-09T17:29:54.343 INFO:tasks.workunit.client.0.vm06.stdout:8/726: creat d15/d16/d1e/fec x:0 0 0 2026-03-09T17:29:54.345 INFO:tasks.workunit.client.0.vm06.stdout:3/794: creat dd/d19/d25/d44/d80/dd7/d10d/f111 x:0 0 0 2026-03-09T17:29:54.346 INFO:tasks.workunit.client.0.vm06.stdout:7/879: getdents d5/d7/dac/dd4 0 2026-03-09T17:29:54.346 INFO:tasks.workunit.client.0.vm06.stdout:7/880: chown d5/d7/dac/fcb 8410408 1 2026-03-09T17:29:54.347 INFO:tasks.workunit.client.0.vm06.stdout:7/881: stat d5/dd/dc5/d5f/ffd 0 2026-03-09T17:29:54.347 INFO:tasks.workunit.client.0.vm06.stdout:5/761: creat d4/d22/d46/dec/f116 x:0 0 0 2026-03-09T17:29:54.347 INFO:tasks.workunit.client.0.vm06.stdout:6/652: mkdir d6/d12/d17/dce 0 2026-03-09T17:29:54.355 INFO:tasks.workunit.client.0.vm06.stdout:0/889: creat d7/d11/d19/f131 x:0 0 0 2026-03-09T17:29:54.355 
INFO:tasks.workunit.client.0.vm06.stdout:0/890: chown d7/d11/d89/d99/fff 1830794 1 2026-03-09T17:29:54.355 INFO:tasks.workunit.client.0.vm06.stdout:0/891: chown d7/d11/d19/d1d/c5c 24019 1 2026-03-09T17:29:54.360 INFO:tasks.workunit.client.0.vm06.stdout:8/727: dread d15/d16/d1e/f8c [0,4194304] 0 2026-03-09T17:29:54.362 INFO:tasks.workunit.client.0.vm06.stdout:4/763: rename db/d57/fe3 to db/d59/d5f/d6d/ddb/f115 0 2026-03-09T17:29:54.365 INFO:tasks.workunit.client.0.vm06.stdout:6/653: sync 2026-03-09T17:29:54.373 INFO:tasks.workunit.client.0.vm06.stdout:6/654: sync 2026-03-09T17:29:54.375 INFO:tasks.workunit.client.0.vm06.stdout:1/761: truncate d11/d14/d1d/d42/d46/d92/dc0/f21 4789446 0 2026-03-09T17:29:54.379 INFO:tasks.workunit.client.0.vm06.stdout:3/795: creat dd/d1d/f112 x:0 0 0 2026-03-09T17:29:54.382 INFO:tasks.workunit.client.0.vm06.stdout:1/762: dread d11/d14/d1d/d42/d46/f55 [0,4194304] 0 2026-03-09T17:29:54.392 INFO:tasks.workunit.client.0.vm06.stdout:9/823: dwrite d3/d6d/d9a/fd6 [0,4194304] 0 2026-03-09T17:29:54.395 INFO:tasks.workunit.client.0.vm06.stdout:1/763: mkdir d11/d14/d1d/d42/dff 0 2026-03-09T17:29:54.396 INFO:tasks.workunit.client.0.vm06.stdout:2/673: write d3/d4/d12/f2e [187627,8177] 0 2026-03-09T17:29:54.399 INFO:tasks.workunit.client.0.vm06.stdout:0/892: creat d7/d11/d19/d3c/db9/ddd/d10e/d129/f132 x:0 0 0 2026-03-09T17:29:54.402 INFO:tasks.workunit.client.0.vm06.stdout:6/655: fdatasync d6/d4f/f3a 0 2026-03-09T17:29:54.406 INFO:tasks.workunit.client.0.vm06.stdout:3/796: dread dd/d1d/d2e/f3a [0,4194304] 0 2026-03-09T17:29:54.412 INFO:tasks.workunit.client.0.vm06.stdout:0/893: symlink d7/d11/d19/d1d/d87/l133 0 2026-03-09T17:29:54.414 INFO:tasks.workunit.client.0.vm06.stdout:7/882: write d5/d1f/d34/f41 [3695395,92542] 0 2026-03-09T17:29:54.414 INFO:tasks.workunit.client.0.vm06.stdout:7/883: stat d5/dd/d79/d7f/f98 0 2026-03-09T17:29:54.415 INFO:tasks.workunit.client.0.vm06.stdout:6/656: symlink d6/d4f/d3e/d52/d80/lcf 0 2026-03-09T17:29:54.421 
INFO:tasks.workunit.client.0.vm06.stdout:2/674: dread - d3/d4/d12/d2b/fb6 zero size 2026-03-09T17:29:54.423 INFO:tasks.workunit.client.0.vm06.stdout:8/728: rename d15/d16/d1e/d30/db8/d5e/c7f to d15/d16/d19/ced 0 2026-03-09T17:29:54.435 INFO:tasks.workunit.client.0.vm06.stdout:4/764: rename f7 to db/d57/f116 0 2026-03-09T17:29:54.436 INFO:tasks.workunit.client.0.vm06.stdout:8/729: truncate d15/d31/d58/d9b/fb1 2395781 0 2026-03-09T17:29:54.437 INFO:tasks.workunit.client.0.vm06.stdout:8/730: chown d15/d16/d19/d3d/l92 153543 1 2026-03-09T17:29:54.440 INFO:tasks.workunit.client.0.vm06.stdout:7/884: mkdir d5/d7/d2b/dbd/dfe 0 2026-03-09T17:29:54.441 INFO:tasks.workunit.client.0.vm06.stdout:6/657: mkdir d6/d12/d53/dd0 0 2026-03-09T17:29:54.441 INFO:tasks.workunit.client.0.vm06.stdout:6/658: stat d6/d4f/l4b 0 2026-03-09T17:29:54.446 INFO:tasks.workunit.client.0.vm06.stdout:1/764: rename d11/d14/d1c/d5f/lc7 to d11/d14/d1d/d4a/l100 0 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:4/765: creat db/d59/d5f/d45/d10a/dba/f117 x:0 0 0 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:8/731: symlink d15/d39/d67/d77/de7/lee 0 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:6/659: rmdir d6/d47/d96 39 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:6/660: dread - d6/d47/d4d/d6d/fbe zero size 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:1/765: creat d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/f101 x:0 0 0 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:8/732: fdatasync d15/f9d 0 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:6/661: rmdir d6/d4f/d3e/d52/d95 39 2026-03-09T17:29:54.470 INFO:tasks.workunit.client.0.vm06.stdout:2/675: getdents d3/d4/d12/d2b/d36/dd4 0 2026-03-09T17:29:54.473 INFO:tasks.workunit.client.0.vm06.stdout:1/766: read d11/d14/d1d/d4a/fa7 [2294750,40929] 0 2026-03-09T17:29:54.474 INFO:tasks.workunit.client.0.vm06.stdout:1/767: chown 
d11/d14/d1d/d42/d46/d92/dc0 194 1
2026-03-09T17:29:54.482 INFO:tasks.workunit.client.0.vm06.stdout:7/885: dread d5/dd/fa6 [0,4194304] 0
2026-03-09T17:29:54.482 INFO:tasks.workunit.client.0.vm06.stdout:4/766: sync
2026-03-09T17:29:54.485 INFO:tasks.workunit.client.0.vm06.stdout:5/762: write d4/d22/f45 [851441,98093] 0
2026-03-09T17:29:54.489 INFO:tasks.workunit.client.0.vm06.stdout:9/824: write d3/d15/d36/d4d/fa4 [1513459,68078] 0
2026-03-09T17:29:54.490 INFO:tasks.workunit.client.0.vm06.stdout:9/825: write d3/d6d/f78 [3284858,123211] 0
2026-03-09T17:29:54.492 INFO:tasks.workunit.client.0.vm06.stdout:3/797: dwrite dd/d19/d25/d2d/fe8 [0,4194304] 0
2026-03-09T17:29:54.495 INFO:tasks.workunit.client.0.vm06.stdout:0/894: write d7/d11/d2d/dca/ff4 [5089827,83989] 0
2026-03-09T17:29:54.516 INFO:tasks.workunit.client.0.vm06.stdout:5/763: creat d4/d50/d35/d40/d95/f117 x:0 0 0
2026-03-09T17:29:54.526 INFO:tasks.workunit.client.0.vm06.stdout:3/798: chown dd/d19/d25/f4f 14608693 1
2026-03-09T17:29:54.528 INFO:tasks.workunit.client.0.vm06.stdout:0/895: creat d7/d11/d89/da8/f134 x:0 0 0
2026-03-09T17:29:54.534 INFO:tasks.workunit.client.0.vm06.stdout:1/768: rename d11/c12 to d11/d14/d1d/d1e/d96/c102 0
2026-03-09T17:29:54.537 INFO:tasks.workunit.client.0.vm06.stdout:5/764: creat d4/d50/d35/d40/d6f/f118 x:0 0 0
2026-03-09T17:29:54.539 INFO:tasks.workunit.client.0.vm06.stdout:9/826: creat d3/d15/d36/df4/f106 x:0 0 0
2026-03-09T17:29:54.543 INFO:tasks.workunit.client.0.vm06.stdout:2/676: dwrite d3/d4/f9c [0,4194304] 0
2026-03-09T17:29:54.546 INFO:tasks.workunit.client.0.vm06.stdout:3/799: creat dd/d19/d1e/db8/f113 x:0 0 0
2026-03-09T17:29:54.548 INFO:tasks.workunit.client.0.vm06.stdout:2/677: dwrite d3/d4/d12/d2b/f89 [0,4194304] 0
2026-03-09T17:29:54.548 INFO:tasks.workunit.client.0.vm06.stdout:0/896: creat d7/d11/d19/d3c/db9/ddd/d10e/d129/f135 x:0 0 0
2026-03-09T17:29:54.549 INFO:tasks.workunit.client.0.vm06.stdout:3/800: write dd/d19/f2b [5079786,72069] 0
2026-03-09T17:29:54.550 INFO:tasks.workunit.client.0.vm06.stdout:3/801: chown dd/d19 1147 1
2026-03-09T17:29:54.552 INFO:tasks.workunit.client.0.vm06.stdout:3/802: read dd/d19/d25/d44/f88 [2665604,98184] 0
2026-03-09T17:29:54.552 INFO:tasks.workunit.client.0.vm06.stdout:4/767: write db/df/f30 [4958446,26911] 0
2026-03-09T17:29:54.553 INFO:tasks.workunit.client.0.vm06.stdout:7/886: rename d5/dd/dc5/c61 to d5/d7/d2b/dbd/dfe/cff 0
2026-03-09T17:29:54.554 INFO:tasks.workunit.client.0.vm06.stdout:8/733: dread d15/d31/d58/d9b/fb1 [0,4194304] 0
2026-03-09T17:29:54.554 INFO:tasks.workunit.client.0.vm06.stdout:4/768: chown db/d1d/c1e 6708 1
2026-03-09T17:29:54.558 INFO:tasks.workunit.client.0.vm06.stdout:9/827: fdatasync d3/d26/d6c/d68/f7f 0
2026-03-09T17:29:54.565 INFO:tasks.workunit.client.0.vm06.stdout:0/897: mkdir d7/d11/d5d/d136 0
2026-03-09T17:29:54.567 INFO:tasks.workunit.client.0.vm06.stdout:6/662: getdents d6/d4f/d3e/d52 0
2026-03-09T17:29:54.570 INFO:tasks.workunit.client.0.vm06.stdout:7/887: symlink d5/d7/d2b/dbd/dfe/l100 0
2026-03-09T17:29:54.572 INFO:tasks.workunit.client.0.vm06.stdout:5/765: dread d4/dca/fab [0,4194304] 0
2026-03-09T17:29:54.573 INFO:tasks.workunit.client.0.vm06.stdout:3/803: mknod dd/d1d/d2e/c114 0
2026-03-09T17:29:54.577 INFO:tasks.workunit.client.0.vm06.stdout:5/766: dwrite d4/d52/db4/dc2/f100 [0,4194304] 0
2026-03-09T17:29:54.578 INFO:tasks.workunit.client.0.vm06.stdout:8/734: mkdir d15/d16/d1e/d30/d55/def 0
2026-03-09T17:29:54.579 INFO:tasks.workunit.client.0.vm06.stdout:3/804: dwrite dd/f10 [0,4194304] 0
2026-03-09T17:29:54.579 INFO:tasks.workunit.client.0.vm06.stdout:5/767: chown d4/d52/d55 1 1
2026-03-09T17:29:54.582 INFO:tasks.workunit.client.0.vm06.stdout:5/768: write d4/d22/d64/f70 [6438090,51098] 0
2026-03-09T17:29:54.582 INFO:tasks.workunit.client.0.vm06.stdout:9/828: dread - d3/dad/fb2 zero size
2026-03-09T17:29:54.583 INFO:tasks.workunit.client.0.vm06.stdout:5/769: chown d4/d22/d46/f78 3 1
2026-03-09T17:29:54.584 INFO:tasks.workunit.client.0.vm06.stdout:0/898: mkdir d7/d11/d19/d3c/df8/d137 0
2026-03-09T17:29:54.585 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:54 vm09.local ceph-mon[62061]: pgmap v160: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 80 MiB/s wr, 332 op/s
2026-03-09T17:29:54.585 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:54 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:29:54.585 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:54 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T17:29:54.585 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:54 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T17:29:54.585 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:54 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:29:54.606 INFO:tasks.workunit.client.0.vm06.stdout:3/805: readlink dd/d19/d25/l63 0
2026-03-09T17:29:54.606 INFO:tasks.workunit.client.0.vm06.stdout:3/806: stat dd/d1d/d2e/d67/fed 0
2026-03-09T17:29:54.607 INFO:tasks.workunit.client.0.vm06.stdout:3/807: chown dd/d5b/d65/cea 1 1
2026-03-09T17:29:54.607 INFO:tasks.workunit.client.0.vm06.stdout:3/808: chown dd/d59/da1/fa4 2284339 1
2026-03-09T17:29:54.619 INFO:tasks.workunit.client.0.vm06.stdout:7/888: write d5/f16 [5231567,110614] 0
2026-03-09T17:29:54.620 INFO:tasks.workunit.client.0.vm06.stdout:0/899: truncate d7/f14 841463 0
2026-03-09T17:29:54.621 INFO:tasks.workunit.client.0.vm06.stdout:6/663: creat d6/d12/d53/d8f/dc2/fd1 x:0 0 0
2026-03-09T17:29:54.623 INFO:tasks.workunit.client.0.vm06.stdout:2/678: mkdir d3/d4/d22/d72/d8f/dda 0
2026-03-09T17:29:54.628 INFO:tasks.workunit.client.0.vm06.stdout:2/679: dwrite d3/d4/d22/f2f [0,4194304] 0
2026-03-09T17:29:54.635 INFO:tasks.workunit.client.0.vm06.stdout:4/769: rename db/d59/d5f/d45/d10a/cc9 to db/d1d/d21/d25/d4b/c118 0
2026-03-09T17:29:54.636 INFO:tasks.workunit.client.0.vm06.stdout:1/769: getdents d11/d14/d1d/d1e/d2a/d34/d64/df6 0
2026-03-09T17:29:54.637 INFO:tasks.workunit.client.0.vm06.stdout:1/770: fdatasync d11/de0/ff1 0
2026-03-09T17:29:54.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:54 vm06.local ceph-mon[57307]: pgmap v160: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 31 MiB/s rd, 80 MiB/s wr, 332 op/s
2026-03-09T17:29:54.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:54 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei'
2026-03-09T17:29:54.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:54 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T17:29:54.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:54 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T17:29:54.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:54 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:29:54.643 INFO:tasks.workunit.client.0.vm06.stdout:6/664: creat d6/d4f/d3e/d52/fd2 x:0 0 0
2026-03-09T17:29:54.647 INFO:tasks.workunit.client.0.vm06.stdout:2/680: symlink d3/d4/d12/d2b/d36/dd4/ldb 0
2026-03-09T17:29:54.650 INFO:tasks.workunit.client.0.vm06.stdout:4/770: chown db/d59/d5f/d6d/ddb/f115 1753752 1
2026-03-09T17:29:54.660 INFO:tasks.workunit.client.0.vm06.stdout:8/735: mknod d15/d16/d19/d3d/d5f/dd4/cf0 0
2026-03-09T17:29:54.669 INFO:tasks.workunit.client.0.vm06.stdout:9/829: rmdir d3/de9 0
2026-03-09T17:29:54.673 INFO:tasks.workunit.client.0.vm06.stdout:9/830: dwrite d3/d2c/ffc [0,4194304] 0
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:5/770: mkdir d4/d50/db2/d119 0
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:7/889: mknod d5/dd/dc5/c101 0
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:0/900: getdents d7/d11/d19/d23/db7/dbd/d101/d12c 0
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:0/901: read - d7/d11/d19/d23/db7/dbd/f119 zero size
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:0/902: chown d7/d11/d2d/dca/ff4 24 1
2026-03-09T17:29:54.686 INFO:tasks.workunit.client.0.vm06.stdout:0/903: stat d7/d11/d19/d1d/l9b 0
2026-03-09T17:29:54.687 INFO:tasks.workunit.client.0.vm06.stdout:6/665: chown d6/d47/d96/d40/l5f 983817 1
2026-03-09T17:29:54.695 INFO:tasks.workunit.client.0.vm06.stdout:3/809: dwrite dd/d19/d25/d2d/d9b/fc7 [0,4194304] 0
2026-03-09T17:29:54.696 INFO:tasks.workunit.client.0.vm06.stdout:3/810: chown dd/d81/da3/dae/df8/dff 36 1
2026-03-09T17:29:54.697 INFO:tasks.workunit.client.0.vm06.stdout:3/811: chown dd/d19/d1e/lf4 19469 1
2026-03-09T17:29:54.712 INFO:tasks.workunit.client.0.vm06.stdout:1/771: mkdir d11/d14/d1d/d1e/dc2/d103 0
2026-03-09T17:29:54.719 INFO:tasks.workunit.client.0.vm06.stdout:5/771: rename d4/d22/l27 to d4/d52/d55/l11a 0
2026-03-09T17:29:54.723 INFO:tasks.workunit.client.0.vm06.stdout:7/890: rmdir d5/dd/dc5/d5f 39
2026-03-09T17:29:54.733 INFO:tasks.workunit.client.0.vm06.stdout:3/812: dread - dd/d59/f83 zero size
2026-03-09T17:29:54.738 INFO:tasks.workunit.client.0.vm06.stdout:4/771: write fa [3184431,113367] 0
2026-03-09T17:29:54.743 INFO:tasks.workunit.client.0.vm06.stdout:8/736: rename d15/d16/d19 to d15/d31/dc5/df1 0
2026-03-09T17:29:54.745 INFO:tasks.workunit.client.0.vm06.stdout:5/772: dread - d4/d52/fcd zero size
2026-03-09T17:29:54.747 INFO:tasks.workunit.client.0.vm06.stdout:7/891: fsync d5/dd/d79/d7f/fde 0
2026-03-09T17:29:54.754 INFO:tasks.workunit.client.0.vm06.stdout:6/666: dwrite d6/d4f/d3e/d52/f89 [0,4194304] 0
2026-03-09T17:29:54.759 INFO:tasks.workunit.client.0.vm06.stdout:3/813: creat dd/d19/d25/d44/f115 x:0 0 0
2026-03-09T17:29:54.759 INFO:tasks.workunit.client.0.vm06.stdout:9/831: dwrite d3/d11/d65/faa [0,4194304] 0
2026-03-09T17:29:54.761 INFO:tasks.workunit.client.0.vm06.stdout:3/814: stat dd/d19/d1e/db8/fd0 0
2026-03-09T17:29:54.771 INFO:tasks.workunit.client.0.vm06.stdout:7/892: mkdir d5/d1f/d102 0
2026-03-09T17:29:54.772 INFO:tasks.workunit.client.0.vm06.stdout:7/893: chown d5/dd/dc5/d64/de8/lf3 21 1
2026-03-09T17:29:54.774 INFO:tasks.workunit.client.0.vm06.stdout:0/904: mknod d7/d11/d5d/c138 0
2026-03-09T17:29:54.775 INFO:tasks.workunit.client.0.vm06.stdout:0/905: chown d7/d11/d19/d23/f60 6572999 1
2026-03-09T17:29:54.779 INFO:tasks.workunit.client.0.vm06.stdout:2/681: getdents d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba 0
2026-03-09T17:29:54.784 INFO:tasks.workunit.client.0.vm06.stdout:2/682: stat d3/d4/d12/d2b/db0/dc1 0
2026-03-09T17:29:54.784 INFO:tasks.workunit.client.0.vm06.stdout:2/683: fsync d3/d4/d12/da7/db3/fbe 0
2026-03-09T17:29:54.784 INFO:tasks.workunit.client.0.vm06.stdout:9/832: creat d3/d15/d36/df4/f107 x:0 0 0
2026-03-09T17:29:54.788 INFO:tasks.workunit.client.0.vm06.stdout:7/894: symlink d5/d1f/d34/d3f/d8b/l103 0
2026-03-09T17:29:54.791 INFO:tasks.workunit.client.0.vm06.stdout:6/667: mkdir d6/d4f/dd3 0
2026-03-09T17:29:54.791 INFO:tasks.workunit.client.0.vm06.stdout:6/668: write d6/d12/d17/f32 [9083644,7605] 0
2026-03-09T17:29:54.795 INFO:tasks.workunit.client.0.vm06.stdout:4/772: creat db/d1d/d21/d25/d4b/f119 x:0 0 0
2026-03-09T17:29:54.796 INFO:tasks.workunit.client.0.vm06.stdout:9/833: mkdir d3/d15/d48/da8/db9/de8/d108 0
2026-03-09T17:29:54.796 INFO:tasks.workunit.client.0.vm06.stdout:9/834: fsync d3/d2c/f81 0
2026-03-09T17:29:54.798 INFO:tasks.workunit.client.0.vm06.stdout:5/773: link d4/d22/c47 d4/d50/d35/d40/d95/db8/c11b 0
2026-03-09T17:29:54.799 INFO:tasks.workunit.client.0.vm06.stdout:7/895: stat d5/d7/f75 0
2026-03-09T17:29:54.800 INFO:tasks.workunit.client.0.vm06.stdout:6/669: write d6/d4f/d3e/d52/d8c/db0/fc7 [976746,5945] 0
2026-03-09T17:29:54.806 INFO:tasks.workunit.client.0.vm06.stdout:4/773: dwrite db/d1d/d21/d25/d4b/d85/f98 [0,4194304] 0
2026-03-09T17:29:54.811 INFO:tasks.workunit.client.0.vm06.stdout:9/835: read - d3/d26/d35/fb5 zero size
2026-03-09T17:29:54.811 INFO:tasks.workunit.client.0.vm06.stdout:9/836: chown d3/d15/d36/d4c/d6a/d8a/cc5 78 1
2026-03-09T17:29:54.811 INFO:tasks.workunit.client.0.vm06.stdout:9/837: chown d3/d26/dd7 120059 1
2026-03-09T17:29:54.811 INFO:tasks.workunit.client.0.vm06.stdout:8/737: getdents d15/d31/dc5/df1 0
2026-03-09T17:29:54.830 INFO:tasks.workunit.client.0.vm06.stdout:3/815: sync
2026-03-09T17:29:54.830 INFO:tasks.workunit.client.0.vm06.stdout:2/684: sync
2026-03-09T17:29:54.837 INFO:tasks.workunit.client.0.vm06.stdout:9/838: rename d3/d26/d6c/d68/cc0 to d3/d15/d36/d4c/d6a/c109 0
2026-03-09T17:29:54.838 INFO:tasks.workunit.client.0.vm06.stdout:4/774: dread db/d59/d5f/d5d/fc2 [0,4194304] 0
2026-03-09T17:29:54.840 INFO:tasks.workunit.client.0.vm06.stdout:3/816: rmdir dd/d59 39
2026-03-09T17:29:54.846 INFO:tasks.workunit.client.0.vm06.stdout:4/775: symlink db/d59/d5f/d45/d10a/dcc/de0/l11a 0
2026-03-09T17:29:54.849 INFO:tasks.workunit.client.0.vm06.stdout:3/817: stat dd/d1d/d2e/fb4 0
2026-03-09T17:29:54.852 INFO:tasks.workunit.client.0.vm06.stdout:6/670: dread d6/d47/f88 [4194304,4194304] 0
2026-03-09T17:29:54.855 INFO:tasks.workunit.client.0.vm06.stdout:3/818: dread dd/d19/d25/d2d/d9b/fdb [0,4194304] 0
2026-03-09T17:29:54.857 INFO:tasks.workunit.client.0.vm06.stdout:9/839: creat d3/d15/d36/d83/df8/d103/f10a x:0 0 0
2026-03-09T17:29:54.858 INFO:tasks.workunit.client.0.vm06.stdout:4/776: symlink db/d1d/d21/d25/d4b/d85/l11b 0
2026-03-09T17:29:54.861 INFO:tasks.workunit.client.0.vm06.stdout:9/840: dread - d3/d15/fed zero size
2026-03-09T17:29:54.862 INFO:tasks.workunit.client.0.vm06.stdout:4/777: write db/d59/d5f/d5d/fc2 [2362741,98086] 0
2026-03-09T17:29:54.863 INFO:tasks.workunit.client.0.vm06.stdout:4/778: chown db/d59/d90/fe5 12 1
2026-03-09T17:29:54.864 INFO:tasks.workunit.client.0.vm06.stdout:4/779: chown db/d1d/d21/d88/l95 649379758 1
2026-03-09T17:29:54.864 INFO:tasks.workunit.client.0.vm06.stdout:2/685: getdents d3/d4/d12/d71/daa/d77/d81 0
2026-03-09T17:29:54.864 INFO:tasks.workunit.client.0.vm06.stdout:4/780: write db/f6f [3770614,116748] 0
2026-03-09T17:29:54.866 INFO:tasks.workunit.client.0.vm06.stdout:4/781: write db/d59/d5f/d5d/fc2 [2611402,26178] 0
2026-03-09T17:29:54.866 INFO:tasks.workunit.client.0.vm06.stdout:3/819: getdents dd/d19/d1e/d100 0
2026-03-09T17:29:54.868 INFO:tasks.workunit.client.0.vm06.stdout:9/841: mkdir d3/d6d/d10b 0
2026-03-09T17:29:54.874 INFO:tasks.workunit.client.0.vm06.stdout:6/671: link d6/d4f/d3e/d52/f89 d6/d12/fd4 0
2026-03-09T17:29:54.876 INFO:tasks.workunit.client.0.vm06.stdout:6/672: read d6/d12/d17/d85/faf [37325,82149] 0
2026-03-09T17:29:54.884 INFO:tasks.workunit.client.0.vm06.stdout:1/772: dwrite d11/d14/d1d/d1e/d96/fab [0,4194304] 0
2026-03-09T17:29:54.886 INFO:tasks.workunit.client.0.vm06.stdout:1/773: chown d11/d14/d1d/c36 2130952470 1
2026-03-09T17:29:54.898 INFO:tasks.workunit.client.0.vm06.stdout:9/842: dread d3/d15/f2e [0,4194304] 0
2026-03-09T17:29:54.899 INFO:tasks.workunit.client.0.vm06.stdout:9/843: chown d3/d6d/d9a/d9c/dcd/l102 3822 1
2026-03-09T17:29:54.902 INFO:tasks.workunit.client.0.vm06.stdout:9/844: dread d3/d6d/f78 [0,4194304] 0
2026-03-09T17:29:54.904 INFO:tasks.workunit.client.0.vm06.stdout:1/774: mknod d11/d14/d1d/d1e/d2a/d99/db0/c104 0
2026-03-09T17:29:54.906 INFO:tasks.workunit.client.0.vm06.stdout:1/775: truncate d11/d14/d1d/d1e/d2a/d99/de9/ff9 597762 0
2026-03-09T17:29:54.917 INFO:tasks.workunit.client.0.vm06.stdout:7/896: write d5/dd/d79/d7f/fde [220077,21587] 0
2026-03-09T17:29:54.928 INFO:tasks.workunit.client.0.vm06.stdout:3/820: link dd/f1a dd/d81/d97/df5/f116 0
2026-03-09T17:29:54.937 INFO:tasks.workunit.client.0.vm06.stdout:3/821: truncate dd/d19/d25/d44/d80/dd7/fe6 315590 0
2026-03-09T17:29:54.947 INFO:tasks.workunit.client.0.vm06.stdout:3/822: dwrite dd/d19/d25/d44/d80/fc4 [0,4194304] 0
2026-03-09T17:29:54.957 INFO:tasks.workunit.client.0.vm06.stdout:7/897: creat d5/d7/f104 x:0 0 0
2026-03-09T17:29:54.960 INFO:tasks.workunit.client.0.vm06.stdout:8/738: dwrite d15/d16/f50 [0,4194304] 0
2026-03-09T17:29:54.961 INFO:tasks.workunit.client.0.vm06.stdout:5/774: dwrite d4/d22/f77 [0,4194304] 0
2026-03-09T17:29:54.980 INFO:tasks.workunit.client.0.vm06.stdout:5/775: symlink d4/d52/d55/l11c 0
2026-03-09T17:29:54.983 INFO:tasks.workunit.client.0.vm06.stdout:8/739: rename d15/d16/d1a/f29 to d15/d31/dc5/df1/d2b/d85/ff2 0
2026-03-09T17:29:54.983 INFO:tasks.workunit.client.0.vm06.stdout:8/740: stat d15/d16/d1e/d30/c4a 0
2026-03-09T17:29:54.984 INFO:tasks.workunit.client.0.vm06.stdout:8/741: chown d15/d39/d67/d77/cab 7285 1
2026-03-09T17:29:54.996 INFO:tasks.workunit.client.0.vm06.stdout:7/898: creat d5/d7/d2b/dc8/dd7/f105 x:0 0 0
2026-03-09T17:29:55.003 INFO:tasks.workunit.client.0.vm06.stdout:7/899: rename d5/d7/d2b/dbd/fbe to d5/d7/dac/dd4/f106 0
2026-03-09T17:29:55.004 INFO:tasks.workunit.client.0.vm06.stdout:5/776: symlink d4/d52/d112/l11d 0
2026-03-09T17:29:55.011 INFO:tasks.workunit.client.0.vm06.stdout:8/742: mkdir d15/d16/d1e/d30/d55/def/df3 0
2026-03-09T17:29:55.011 INFO:tasks.workunit.client.0.vm06.stdout:8/743: fsync f12 0
2026-03-09T17:29:55.013 INFO:tasks.workunit.client.0.vm06.stdout:7/900: chown d5/d7/c3b 0 1
2026-03-09T17:29:55.017 INFO:tasks.workunit.client.0.vm06.stdout:5/777: rename d4/d50/d35/d40/d95/db8/df1 to d4/d52/d55/d11e 0
2026-03-09T17:29:55.028 INFO:tasks.workunit.client.0.vm06.stdout:8/744: rename d15/d31/dc5/df1/d71/f65 to d15/d31/d58/d9b/ff4 0
2026-03-09T17:29:55.029 INFO:tasks.workunit.client.0.vm06.stdout:8/745: chown d15/d16/d1e/d30/d55/def 3025 1
2026-03-09T17:29:55.032 INFO:tasks.workunit.client.0.vm06.stdout:5/778: mkdir d4/d50/d35/d40/d109/d11f 0
2026-03-09T17:29:55.033 INFO:tasks.workunit.client.0.vm06.stdout:2/686: write d3/d4/d12/d71/daa/f5f [851991,85037] 0
2026-03-09T17:29:55.048 INFO:tasks.workunit.client.0.vm06.stdout:2/687: rmdir d3/d4/d22/d72 39
2026-03-09T17:29:55.049 INFO:tasks.workunit.client.0.vm06.stdout:4/782: dwrite db/d59/f76 [0,4194304] 0
2026-03-09T17:29:55.054 INFO:tasks.workunit.client.0.vm06.stdout:8/746: creat d15/d31/dc5/df1/d3d/d5f/d83/ff5 x:0 0 0
2026-03-09T17:29:55.061 INFO:tasks.workunit.client.0.vm06.stdout:2/688: rmdir d3/d4/d12/d71/daa/d77/d81 39
2026-03-09T17:29:55.065 INFO:tasks.workunit.client.0.vm06.stdout:4/783: creat db/d59/d5f/d45/d10a/dcc/f11c x:0 0 0
2026-03-09T17:29:55.066 INFO:tasks.workunit.client.0.vm06.stdout:8/747: creat d15/d16/d1e/d30/d55/ff6 x:0 0 0
2026-03-09T17:29:55.068 INFO:tasks.workunit.client.0.vm06.stdout:6/673: write d6/d4f/d3e/d52/f84 [374370,25962] 0
2026-03-09T17:29:55.070 INFO:tasks.workunit.client.0.vm06.stdout:2/689: unlink d3/d4/d46/da5/lb4 0
2026-03-09T17:29:55.073 INFO:tasks.workunit.client.0.vm06.stdout:4/784: chown db/f15 908 1
2026-03-09T17:29:55.075 INFO:tasks.workunit.client.0.vm06.stdout:6/674: dwrite d6/d4f/d3e/d52/d8c/db0/fc7 [0,4194304] 0
2026-03-09T17:29:55.087 INFO:tasks.workunit.client.0.vm06.stdout:8/748: fdatasync d15/d39/d67/d77/d97/fad 0
2026-03-09T17:29:55.091 INFO:tasks.workunit.client.0.vm06.stdout:9/845: dwrite d3/f1b [0,4194304] 0
2026-03-09T17:29:55.106 INFO:tasks.workunit.client.0.vm06.stdout:2/690: creat d3/d4/d12/d2b/fdc x:0 0 0
2026-03-09T17:29:55.106 INFO:tasks.workunit.client.0.vm06.stdout:4/785: sync
2026-03-09T17:29:55.112 INFO:tasks.workunit.client.0.vm06.stdout:6/675: creat d6/d12/d53/fd5 x:0 0 0
2026-03-09T17:29:55.112 INFO:tasks.workunit.client.0.vm06.stdout:2/691: mknod d3/d4/d12/da7/cdd 0
2026-03-09T17:29:55.112 INFO:tasks.workunit.client.0.vm06.stdout:9/846: creat d3/d15/f10c x:0 0 0
2026-03-09T17:29:55.113 INFO:tasks.workunit.client.0.vm06.stdout:4/786: mknod db/d59/d5f/c11d 0
2026-03-09T17:29:55.115 INFO:tasks.workunit.client.0.vm06.stdout:9/847: rename d3/d11/f1c to d3/d6d/f10d 0
2026-03-09T17:29:55.115 INFO:tasks.workunit.client.0.vm06.stdout:9/848: readlink d3/d11/d65/lc9 0
2026-03-09T17:29:55.117 INFO:tasks.workunit.client.0.vm06.stdout:9/849: read d3/d15/d48/f64 [2620688,79165] 0
2026-03-09T17:29:55.118 INFO:tasks.workunit.client.0.vm06.stdout:6/676: symlink d6/d47/ld6 0
2026-03-09T17:29:55.118 INFO:tasks.workunit.client.0.vm06.stdout:6/677: stat d6/d4f/l4b 0
2026-03-09T17:29:55.123 INFO:tasks.workunit.client.0.vm06.stdout:1/776: truncate d11/f18 7134641 0
2026-03-09T17:29:55.128 INFO:tasks.workunit.client.0.vm06.stdout:9/850: fsync d3/d26/d6c/f5b 0
2026-03-09T17:29:55.128 INFO:tasks.workunit.client.0.vm06.stdout:9/851: chown d3/d15/f10c 83938 1
2026-03-09T17:29:55.128 INFO:tasks.workunit.client.0.vm06.stdout:4/787: creat db/d1d/d21/d44/d8a/dec/f11e x:0 0 0
2026-03-09T17:29:55.130 INFO:tasks.workunit.client.0.vm06.stdout:4/788: mkdir db/d1d/d21/d37/d69/d11f 0
2026-03-09T17:29:55.135 INFO:tasks.workunit.client.0.vm06.stdout:4/789: rmdir db/d59/d90 39
2026-03-09T17:29:55.136 INFO:tasks.workunit.client.0.vm06.stdout:4/790: write db/d59/d5f/d45/d10a/dba/f117 [486267,10300] 0
2026-03-09T17:29:55.139 INFO:tasks.workunit.client.0.vm06.stdout:1/777: link d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/fef d11/f105 0
2026-03-09T17:29:55.141 INFO:tasks.workunit.client.0.vm06.stdout:9/852: creat d3/d15/f10e x:0 0 0
2026-03-09T17:29:55.145 INFO:tasks.workunit.client.0.vm06.stdout:9/853: dwrite d3/d15/d36/d83/fee [0,4194304] 0
2026-03-09T17:29:55.148 INFO:tasks.workunit.client.0.vm06.stdout:9/854: truncate d3/d11/d65/d80/fd2 992029 0
2026-03-09T17:29:55.148 INFO:tasks.workunit.client.0.vm06.stdout:9/855: readlink d3/d26/l56 0
2026-03-09T17:29:55.149 INFO:tasks.workunit.client.0.vm06.stdout:0/906: dwrite d7/d11/d5d/db8/f10a [0,4194304] 0
2026-03-09T17:29:55.160 INFO:tasks.workunit.client.0.vm06.stdout:1/778: truncate d11/d14/d1d/d1e/f47 1021621 0
2026-03-09T17:29:55.167 INFO:tasks.workunit.client.0.vm06.stdout:0/907: fsync d7/d11/d19/d8b/da4/d85/f105 0
2026-03-09T17:29:55.167 INFO:tasks.workunit.client.0.vm06.stdout:9/856: link d3/d15/d36/d4c/d6a/d8a/fef d3/d15/d36/d4c/d6a/f10f 0
2026-03-09T17:29:55.173 INFO:tasks.workunit.client.0.vm06.stdout:0/908: creat d7/d11/d2d/daf/f139 x:0 0 0
2026-03-09T17:29:55.174 INFO:tasks.workunit.client.0.vm06.stdout:4/791: dread db/d59/d5f/d45/fa9 [0,4194304] 0
2026-03-09T17:29:55.179 INFO:tasks.workunit.client.0.vm06.stdout:0/909: mkdir d7/d11/d89/da8/db2/d13a 0
2026-03-09T17:29:55.180 INFO:tasks.workunit.client.0.vm06.stdout:4/792: dwrite db/d57/dd4/fe7 [4194304,4194304] 0
2026-03-09T17:29:55.183 INFO:tasks.workunit.client.0.vm06.stdout:0/910: dread - d7/d11/d89/f10b zero size
2026-03-09T17:29:55.185 INFO:tasks.workunit.client.0.vm06.stdout:4/793: chown db/d1d/f1f 16681 1
2026-03-09T17:29:55.187 INFO:tasks.workunit.client.0.vm06.stdout:4/794: symlink db/d1d/d21/d88/l120 0
2026-03-09T17:29:55.192 INFO:tasks.workunit.client.0.vm06.stdout:4/795: dread db/d59/f76 [0,4194304] 0
2026-03-09T17:29:55.192 INFO:tasks.workunit.client.0.vm06.stdout:4/796: chown db/d1d/d21/d25/d4b/ce8 2670288 1
2026-03-09T17:29:55.195 INFO:tasks.workunit.client.0.vm06.stdout:4/797: rename db/d59/d5f/d45/d10a/dcc/c109 to db/d59/d5f/d45/d10a/dcc/de0/c121 0
2026-03-09T17:29:55.199 INFO:tasks.workunit.client.0.vm06.stdout:4/798: dwrite db/d1d/d21/d26/d7a/f105 [0,4194304] 0
2026-03-09T17:29:55.215 INFO:tasks.workunit.client.0.vm06.stdout:4/799: link db/d1d/d21/d37/d69/c7c db/d1d/d21/d25/c122 0
2026-03-09T17:29:55.217 INFO:tasks.workunit.client.0.vm06.stdout:4/800: creat db/d59/d5f/d45/d10a/dcc/f123 x:0 0 0
2026-03-09T17:29:55.218 INFO:tasks.workunit.client.0.vm06.stdout:4/801: rmdir db/d1d/d21/d26 39
2026-03-09T17:29:55.218 INFO:tasks.workunit.client.0.vm06.stdout:4/802: chown db/d1d/d21/d25/f35 22942 1
2026-03-09T17:29:55.235 INFO:tasks.workunit.client.0.vm06.stdout:4/803: dread db/df/f2a [0,4194304] 0
2026-03-09T17:29:55.250 INFO:tasks.workunit.client.0.vm06.stdout:7/901: dwrite d5/dd/dc5/f93 [0,4194304] 0
2026-03-09T17:29:55.252 INFO:tasks.workunit.client.0.vm06.stdout:7/902: write d5/d7/f104 [111760,122552] 0
2026-03-09T17:29:55.255 INFO:tasks.workunit.client.0.vm06.stdout:5/779: write d4/d50/d35/f94 [392037,57024] 0
2026-03-09T17:29:55.270 INFO:tasks.workunit.client.0.vm06.stdout:7/903: mkdir d5/d7/dac/d107 0
2026-03-09T17:29:55.270 INFO:tasks.workunit.client.0.vm06.stdout:7/904: chown d5/dd/dc5/d64 0 1
2026-03-09T17:29:55.276 INFO:tasks.workunit.client.0.vm06.stdout:7/905: fsync d5/d1f/d34/d3f/d8b/fd3 0
2026-03-09T17:29:55.280 INFO:tasks.workunit.client.0.vm06.stdout:7/906: rename d5/d1f/d34/d46/c88 to d5/dd/dc5/dee/c108 0
2026-03-09T17:29:55.283 INFO:tasks.workunit.client.0.vm06.stdout:5/780: sync
2026-03-09T17:29:55.284 INFO:tasks.workunit.client.0.vm06.stdout:7/907: fdatasync d5/d1f/d34/f54 0
2026-03-09T17:29:55.288 INFO:tasks.workunit.client.0.vm06.stdout:7/908: sync
2026-03-09T17:29:55.293 INFO:tasks.workunit.client.0.vm06.stdout:7/909: dwrite d5/d1f/d34/d46/d51/f7c [0,4194304] 0
2026-03-09T17:29:55.319 INFO:tasks.workunit.client.0.vm06.stdout:8/749: write d15/d31/dc5/df1/d3d/f6a [1125570,38190] 0
2026-03-09T17:29:55.323 INFO:tasks.workunit.client.0.vm06.stdout:8/750: unlink d15/d39/d67/fd0 0
2026-03-09T17:29:55.331 INFO:tasks.workunit.client.0.vm06.stdout:8/751: read d15/d39/d3c/d6c/fbf [8347,37319] 0
2026-03-09T17:29:55.331 INFO:tasks.workunit.client.0.vm06.stdout:8/752: stat d15/d31/dc5/df1/d3d/dc7 0
2026-03-09T17:29:55.339 INFO:tasks.workunit.client.0.vm06.stdout:8/753: symlink d15/d31/dc5/df1/d2b/d85/lf7 0
2026-03-09T17:29:55.340 INFO:tasks.workunit.client.0.vm06.stdout:8/754: write d15/d39/d67/de3/fe9 [806188,116768] 0
2026-03-09T17:29:55.340 INFO:tasks.workunit.client.0.vm06.stdout:8/755: chown d15/d31/dc5/df1/d3d/d5f/d83 25007 1
2026-03-09T17:29:55.345 INFO:tasks.workunit.client.0.vm06.stdout:2/692: dwrite d3/d4/d12/d71/daa/d77/d81/f50 [0,4194304] 0
2026-03-09T17:29:55.347 INFO:tasks.workunit.client.0.vm06.stdout:2/693: fsync d3/d4/f70 0
2026-03-09T17:29:55.357 INFO:tasks.workunit.client.0.vm06.stdout:8/756: read d15/d39/d67/d77/fc3 [400345,30610] 0
2026-03-09T17:29:55.359 INFO:tasks.workunit.client.0.vm06.stdout:6/678: dwrite d6/d4f/d3e/d52/f89 [0,4194304] 0
2026-03-09T17:29:55.368 INFO:tasks.workunit.client.0.vm06.stdout:2/694: link d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/fc0 d3/d4/d12/d2b/d36/fde 0
2026-03-09T17:29:55.369 INFO:tasks.workunit.client.0.vm06.stdout:2/695: write d3/d4/d12/d2b/f89 [4146554,56419] 0
2026-03-09T17:29:55.372 INFO:tasks.workunit.client.0.vm06.stdout:8/757: symlink d15/d31/dc5/df1/d3d/d5f/d83/dc1/lf8 0
2026-03-09T17:29:55.387 INFO:tasks.workunit.client.0.vm06.stdout:8/758: fsync d15/d39/d67/d77/fa0 0
2026-03-09T17:29:55.393 INFO:tasks.workunit.client.0.vm06.stdout:2/696: dwrite d3/d4/d22/d72/f54 [0,4194304] 0
2026-03-09T17:29:55.480 INFO:tasks.workunit.client.0.vm06.stdout:3/823: dwrite dd/d19/d25/d44/d80/dd7/fe6 [0,4194304] 0
2026-03-09T17:29:55.490 INFO:tasks.workunit.client.0.vm06.stdout:3/824: symlink dd/d19/d25/d44/l117 0
2026-03-09T17:29:55.498 INFO:tasks.workunit.client.0.vm06.stdout:3/825: dread dd/d19/f2b [0,4194304] 0
2026-03-09T17:29:55.508 INFO:tasks.workunit.client.0.vm06.stdout:3/826: getdents dd/d1d 0
2026-03-09T17:29:55.509 INFO:tasks.workunit.client.0.vm06.stdout:3/827: truncate dd/d1d/d4e/f74 472097 0
2026-03-09T17:29:55.510 INFO:tasks.workunit.client.0.vm06.stdout:3/828: fsync dd/d19/d2c/f103 0
2026-03-09T17:29:55.511 INFO:tasks.workunit.client.0.vm06.stdout:3/829: truncate dd/d19/d25/d44/fd6 503497 0
2026-03-09T17:29:55.512 INFO:tasks.workunit.client.0.vm06.stdout:3/830: dread - dd/d1d/f9f zero size
2026-03-09T17:29:55.515 INFO:tasks.workunit.client.0.vm06.stdout:3/831: chown dd/d1d/d6e/d70/c76 295 1
2026-03-09T17:29:55.517 INFO:tasks.workunit.client.0.vm06.stdout:3/832: mkdir dd/d118 0
2026-03-09T17:29:55.520 INFO:tasks.workunit.client.0.vm06.stdout:3/833: mkdir dd/d1d/d6e/d70/d119 0
2026-03-09T17:29:55.529 INFO:tasks.workunit.client.0.vm06.stdout:1/779: write d11/d14/d1d/d42/d46/d92/dc0/f21 [4030068,89338] 0
2026-03-09T17:29:55.532 INFO:tasks.workunit.client.0.vm06.stdout:9/857: write d3/d2c/f9d [624195,81782] 0
2026-03-09T17:29:55.536 INFO:tasks.workunit.client.0.vm06.stdout:0/911: rmdir d7/d11/d2d/daf 39
2026-03-09T17:29:55.550 INFO:tasks.workunit.client.0.vm06.stdout:1/780: rename d11/d14/d1d/d1e/d96 to d11/d14/d1d/d4a/df7/d106 0
2026-03-09T17:29:55.566 INFO:tasks.workunit.client.0.vm06.stdout:0/912: fdatasync d7/d11/f30 0
2026-03-09T17:29:55.571 INFO:tasks.workunit.client.0.vm06.stdout:7/910: truncate d5/dd/f22 2669837 0
2026-03-09T17:29:55.575 INFO:tasks.workunit.client.0.vm06.stdout:7/911: dread d5/d1f/d34/d46/d51/f6e [0,4194304] 0
2026-03-09T17:29:55.576 INFO:tasks.workunit.client.0.vm06.stdout:4/804: dwrite db/d1d/d21/d44/d8a/fa7 [0,4194304] 0
2026-03-09T17:29:55.580 INFO:tasks.workunit.client.0.vm06.stdout:5/781: write d4/d50/d18/d3d/f81 [649713,28667] 0
2026-03-09T17:29:55.591 INFO:tasks.workunit.client.0.vm06.stdout:4/805: mknod db/d57/c124 0
2026-03-09T17:29:55.592 INFO:tasks.workunit.client.0.vm06.stdout:6/679: write d6/d47/f49 [3139178,58738] 0
2026-03-09T17:29:55.593 INFO:tasks.workunit.client.0.vm06.stdout:6/680: chown d6/d47/d96/da1/cb6 126166 1
2026-03-09T17:29:55.595 INFO:tasks.workunit.client.0.vm06.stdout:5/782: dread - d4/d52/d55/f84 zero size
2026-03-09T17:29:55.598 INFO:tasks.workunit.client.0.vm06.stdout:0/913: creat d7/d11/d89/da8/db2/d13a/f13b x:0 0 0
2026-03-09T17:29:55.601 INFO:tasks.workunit.client.0.vm06.stdout:4/806: rmdir db/d1d/d21/d88 39
2026-03-09T17:29:55.602 INFO:tasks.workunit.client.0.vm06.stdout:4/807: write db/d57/dd4/fe7 [3537395,117245] 0
2026-03-09T17:29:55.605 INFO:tasks.workunit.client.0.vm06.stdout:8/759: dwrite d15/d39/f7b [0,4194304] 0
2026-03-09T17:29:55.606 INFO:tasks.workunit.client.0.vm06.stdout:8/760: chown d15/d16/d1e/d30/db8/d5e/fb9 4089 1
2026-03-09T17:29:55.615 INFO:tasks.workunit.client.0.vm06.stdout:7/912: link d5/d1f/d34/d46/d51/f6e d5/dd/dc5/d64/d6b/dd1/f109 0
2026-03-09T17:29:55.621 INFO:tasks.workunit.client.0.vm06.stdout:1/781: getdents d11/d14/d1c/d3a/db7 0
2026-03-09T17:29:55.627 INFO:tasks.workunit.client.0.vm06.stdout:7/913: creat d5/d1f/d34/d3f/d91/f10a x:0 0 0
2026-03-09T17:29:55.630 INFO:tasks.workunit.client.0.vm06.stdout:4/808: unlink db/d1d/d21/d37/d69/d78/db4/lb8 0
2026-03-09T17:29:55.630 INFO:tasks.workunit.client.0.vm06.stdout:6/681: sync
2026-03-09T17:29:55.631 INFO:tasks.workunit.client.0.vm06.stdout:4/809: write db/d1d/d21/d44/d8a/fa7 [570788,77493] 0
2026-03-09T17:29:55.633 INFO:tasks.workunit.client.0.vm06.stdout:8/761: symlink d15/d31/d58/dc9/lf9 0
2026-03-09T17:29:55.634 INFO:tasks.workunit.client.0.vm06.stdout:8/762: fdatasync d15/d16/d1a/f1b 0
2026-03-09T17:29:55.635 INFO:tasks.workunit.client.0.vm06.stdout:8/763: chown d15/d31/dc5/df1/d3d/f6a 368633486 1
2026-03-09T17:29:55.640 INFO:tasks.workunit.client.0.vm06.stdout:0/914: getdents d7/d11/d2d/dca 0
2026-03-09T17:29:55.641 INFO:tasks.workunit.client.0.vm06.stdout:0/915: chown d7/d11/d2d/dca/le2 438586807 1
2026-03-09T17:29:55.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:55 vm06.local ceph-mon[57307]: Upgrade: Updating mgr.vm09.lqzvkh
2026-03-09T17:29:55.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:55 vm06.local ceph-mon[57307]: Deploying daemon mgr.vm09.lqzvkh on vm09
2026-03-09T17:29:55.643 INFO:tasks.workunit.client.0.vm06.stdout:8/764: fdatasync d15/d16/f51 0
2026-03-09T17:29:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:55 vm09.local ceph-mon[62061]: Upgrade: Updating mgr.vm09.lqzvkh
2026-03-09T17:29:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:55 vm09.local ceph-mon[62061]: Deploying daemon mgr.vm09.lqzvkh on vm09
2026-03-09T17:29:55.645 INFO:tasks.workunit.client.0.vm06.stdout:4/810: mkdir db/d1d/d21/d26/d125 0
2026-03-09T17:29:55.650 INFO:tasks.workunit.client.0.vm06.stdout:4/811: creat db/d1d/d21/d108/f126 x:0 0 0
2026-03-09T17:29:55.651 INFO:tasks.workunit.client.0.vm06.stdout:7/914: dread d5/dd/d79/d7f/fdd [0,4194304] 0
2026-03-09T17:29:55.652 INFO:tasks.workunit.client.0.vm06.stdout:7/915: fdatasync d5/dd/dc5/d64/d6b/dd1/fe3 0
2026-03-09T17:29:55.653 INFO:tasks.workunit.client.0.vm06.stdout:4/812: fdatasync db/d59/d5f/d6d/f7b 0
2026-03-09T17:29:55.655 INFO:tasks.workunit.client.0.vm06.stdout:8/765: rename d15/d16/f87 to d15/d16/d1e/ffa 0
2026-03-09T17:29:55.659 INFO:tasks.workunit.client.0.vm06.stdout:4/813: truncate db/d57/fc7 369942 0
2026-03-09T17:29:55.662 INFO:tasks.workunit.client.0.vm06.stdout:7/916: sync
2026-03-09T17:29:55.667 INFO:tasks.workunit.client.0.vm06.stdout:4/814: dread db/f13 [0,4194304] 0
2026-03-09T17:29:55.670 INFO:tasks.workunit.client.0.vm06.stdout:7/917: dread - d5/d1f/d34/d3f/fbb zero size
2026-03-09T17:29:55.671 INFO:tasks.workunit.client.0.vm06.stdout:8/766: link d15/d31/dc5/df1/d71/f96 d15/d31/dc5/df1/d71/ffb 0
2026-03-09T17:29:55.672 INFO:tasks.workunit.client.0.vm06.stdout:8/767: dread - d15/d39/d67/d77/d97/dac/fe1 zero size
2026-03-09T17:29:55.676 INFO:tasks.workunit.client.0.vm06.stdout:2/697: dwrite d3/d4/d12/da7/db3/fc2 [0,4194304] 0
2026-03-09T17:29:55.678 INFO:tasks.workunit.client.0.vm06.stdout:4/815: chown db/d59/d5f/d45/d10a/dcc/de0/f104 3 1
2026-03-09T17:29:55.678 INFO:tasks.workunit.client.0.vm06.stdout:4/816: fdatasync db/d1d/d21/d26/d7a/f105 0
2026-03-09T17:29:55.679 INFO:tasks.workunit.client.0.vm06.stdout:4/817: chown db/f68 9 1
2026-03-09T17:29:55.679 INFO:tasks.workunit.client.0.vm06.stdout:4/818: chown db/d1d/d21/d37/d69/d78/da0/db6 222468668 1
2026-03-09T17:29:55.686 INFO:tasks.workunit.client.0.vm06.stdout:2/698: dwrite d3/d4/d12/d2b/d2d/fcd [0,4194304] 0
2026-03-09T17:29:55.687 INFO:tasks.workunit.client.0.vm06.stdout:2/699: readlink d3/d4/d22/l3d 0
2026-03-09T17:29:55.687 INFO:tasks.workunit.client.0.vm06.stdout:2/700: chown d3/d4/d22/d72 443582448 1
2026-03-09T17:29:55.707 INFO:tasks.workunit.client.0.vm06.stdout:7/918: mknod d5/dd/dc5/dad/c10b 0
2026-03-09T17:29:55.708 INFO:tasks.workunit.client.0.vm06.stdout:8/768: symlink d15/d16/lfc 0
2026-03-09T17:29:55.711 INFO:tasks.workunit.client.0.vm06.stdout:7/919: readlink d5/dd/dc5/d5f/l9e 0
2026-03-09T17:29:55.715 INFO:tasks.workunit.client.0.vm06.stdout:8/769: creat d15/d16/d1e/d30/d55/ffd x:0 0 0
2026-03-09T17:29:55.718 INFO:tasks.workunit.client.0.vm06.stdout:7/920: creat d5/dd/dc5/dee/f10c x:0 0 0
2026-03-09T17:29:55.721 INFO:tasks.workunit.client.0.vm06.stdout:4/819: rename db/d59/d5f/d45/f61 to db/d1d/d21/f127 0
2026-03-09T17:29:55.728 INFO:tasks.workunit.client.0.vm06.stdout:8/770: rename d15/f3e to d15/d31/dc5/df1/d71/ffe 0
2026-03-09T17:29:55.734 INFO:tasks.workunit.client.0.vm06.stdout:7/921: mkdir d5/d10d 0
2026-03-09T17:29:55.737 INFO:tasks.workunit.client.0.vm06.stdout:7/922: dread d5/d1f/d34/ff6 [0,4194304] 0
2026-03-09T17:29:55.740 INFO:tasks.workunit.client.0.vm06.stdout:8/771: rename d15/d16/d1e/d30/db8/d5e/c90 to d15/d39/d67/d77/d97/dac/dcb/cff 0
2026-03-09T17:29:55.743 INFO:tasks.workunit.client.0.vm06.stdout:4/820: mknod db/d1d/d21/d25/d4b/c128 0
2026-03-09T17:29:55.749 INFO:tasks.workunit.client.0.vm06.stdout:8/772: readlink d15/d31/dc5/df1/d3d/d5f/lbb 0
2026-03-09T17:29:55.752 INFO:tasks.workunit.client.0.vm06.stdout:8/773: symlink d15/d31/dc5/df1/d3d/d5f/d83/dc1/l100 0
2026-03-09T17:29:55.752 INFO:tasks.workunit.client.0.vm06.stdout:8/774: chown d15/d39/f40 352190370 1
2026-03-09T17:29:55.755 INFO:tasks.workunit.client.0.vm06.stdout:4/821: sync
2026-03-09T17:29:55.759 INFO:tasks.workunit.client.0.vm06.stdout:4/822: symlink db/d1d/d21/d26/d89/l129 0
2026-03-09T17:29:55.763 INFO:tasks.workunit.client.0.vm06.stdout:4/823: dwrite db/d1d/d21/d108/f126 [0,4194304] 0
2026-03-09T17:29:55.765 INFO:tasks.workunit.client.0.vm06.stdout:4/824: chown db/d59/f10b 577 1
2026-03-09T17:29:55.765 INFO:tasks.workunit.client.0.vm06.stdout:4/825: write db/de2/f10c [81520,55225] 0
2026-03-09T17:29:55.766 INFO:tasks.workunit.client.0.vm06.stdout:4/826: dread - db/d1d/d21/d37/f101 zero size
2026-03-09T17:29:55.770 INFO:tasks.workunit.client.0.vm06.stdout:4/827: sync
2026-03-09T17:29:55.777 INFO:tasks.workunit.client.0.vm06.stdout:4/828: creat db/d1d/d21/d37/d69/d78/db4/f12a x:0 0 0
2026-03-09T17:29:55.783 INFO:tasks.workunit.client.0.vm06.stdout:4/829: getdents db/d59/d5f/d6d/ddb 0
2026-03-09T17:29:55.788 INFO:tasks.workunit.client.0.vm06.stdout:4/830: rename db/d1d/d21/d25/d4b/c118 to db/d59/d5f/c12b 0
2026-03-09T17:29:55.788 INFO:tasks.workunit.client.0.vm06.stdout:4/831: stat f6 0
2026-03-09T17:29:55.792 INFO:tasks.workunit.client.0.vm06.stdout:4/832: mkdir db/d1d/d21/d37/d69/d78/da0/db6/d12c 0
2026-03-09T17:29:55.819 INFO:tasks.workunit.client.0.vm06.stdout:3/834: dwrite dd/d19/d25/d48/f4c [0,4194304] 0
2026-03-09T17:29:55.828 INFO:tasks.workunit.client.0.vm06.stdout:9/858: dwrite d3/f27 [0,4194304] 0
2026-03-09T17:29:55.844 INFO:tasks.workunit.client.0.vm06.stdout:9/859: fsync d3/d26/d35/fea 0
2026-03-09T17:29:55.847 INFO:tasks.workunit.client.0.vm06.stdout:9/860: mkdir d3/d15/d36/d4c/d6a/d8a/d110 0
2026-03-09T17:29:55.852 INFO:tasks.workunit.client.0.vm06.stdout:9/861: mknod d3/d15/d36/d4c/d6a/d8a/dc3/c111 0
2026-03-09T17:29:55.853 INFO:tasks.workunit.client.0.vm06.stdout:9/862: dread - d3/d6d/f9e zero size
2026-03-09T17:29:55.855 INFO:tasks.workunit.client.0.vm06.stdout:9/863: fsync d3/d15/d36/d4d/fd1 0
2026-03-09T17:29:55.860 INFO:tasks.workunit.client.0.vm06.stdout:5/783: truncate d4/dbb/ff9 690493 0
2026-03-09T17:29:55.867 INFO:tasks.workunit.client.0.vm06.stdout:6/682: write d6/d4f/fa3 [2880207,51395] 0
2026-03-09T17:29:55.867 INFO:tasks.workunit.client.0.vm06.stdout:6/683: chown d6/d12/d53/dd0 11 1
2026-03-09T17:29:55.872 INFO:tasks.workunit.client.0.vm06.stdout:0/916: write d7/fb1 [4454724,26746] 0
2026-03-09T17:29:55.875 INFO:tasks.workunit.client.0.vm06.stdout:5/784: sync
2026-03-09T17:29:55.877 INFO:tasks.workunit.client.0.vm06.stdout:6/684: mkdir d6/d47/dd7 0
2026-03-09T17:29:55.878 INFO:tasks.workunit.client.0.vm06.stdout:6/685: chown d6/d47/d96/d40/l5f 0 1
2026-03-09T17:29:55.883 INFO:tasks.workunit.client.0.vm06.stdout:9/864: dread d3/d15/d48/f64 [0,4194304] 0
2026-03-09T17:29:55.884 INFO:tasks.workunit.client.0.vm06.stdout:0/917: fsync d7/d11/d2d/fc3 0
2026-03-09T17:29:55.885 INFO:tasks.workunit.client.0.vm06.stdout:5/785: truncate d4/d22/d46/f58 119824 0
2026-03-09T17:29:55.886 INFO:tasks.workunit.client.0.vm06.stdout:6/686: dread d6/d4f/fa3 [0,4194304] 0
2026-03-09T17:29:55.891 INFO:tasks.workunit.client.0.vm06.stdout:6/687: dwrite d6/d47/d96/d40/fbd [0,4194304] 0
2026-03-09T17:29:55.894 INFO:tasks.workunit.client.0.vm06.stdout:9/865: fsync d3/d15/d36/f49 0
2026-03-09T17:29:55.905 INFO:tasks.workunit.client.0.vm06.stdout:6/688: mknod d6/d4f/d3e/d52/d8c/db0/cd8 0
2026-03-09T17:29:55.906 INFO:tasks.workunit.client.0.vm06.stdout:6/689: readlink d6/d47/d96/d40/l94 0
2026-03-09T17:29:55.906 INFO:tasks.workunit.client.0.vm06.stdout:6/690: chown d6/d47/d96/d40/c45 213789683 1
2026-03-09T17:29:55.912 INFO:tasks.workunit.client.0.vm06.stdout:2/701: dwrite d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab [0,4194304] 0
2026-03-09T17:29:55.916 INFO:tasks.workunit.client.0.vm06.stdout:6/691: write d6/d12/d17/f29 [3661166,27178] 0
2026-03-09T17:29:55.916 INFO:tasks.workunit.client.0.vm06.stdout:2/702: dread d3/d4/f70 [0,4194304] 0
2026-03-09T17:29:55.926 INFO:tasks.workunit.client.0.vm06.stdout:9/866: dread d3/d15/d36/d83/fb1 [0,4194304] 0
2026-03-09T17:29:55.927 INFO:tasks.workunit.client.0.vm06.stdout:9/867: stat d3/d15/d36/d4d/f61 0
2026-03-09T17:29:55.933 INFO:tasks.workunit.client.0.vm06.stdout:7/923: dwrite d5/dd/fa6 [0,4194304] 0
2026-03-09T17:29:55.934 INFO:tasks.workunit.client.0.vm06.stdout:7/924: truncate d5/d7/dac/fe2 441698 0
2026-03-09T17:29:55.950 INFO:tasks.workunit.client.0.vm06.stdout:7/925: mkdir d5/d7/d2b/dbd/d10e 0
2026-03-09T17:29:55.950 INFO:tasks.workunit.client.0.vm06.stdout:7/926: stat d5/d1f/d34/d46/d51/lc1 0
2026-03-09T17:29:55.953 INFO:tasks.workunit.client.0.vm06.stdout:8/775: write d15/d31/dc5/df1/d71/f80 [1005209,102835] 0
2026-03-09T17:29:55.972 INFO:tasks.workunit.client.0.vm06.stdout:9/868: mknod d3/d26/dcb/df1/c112 0
2026-03-09T17:29:55.974 INFO:tasks.workunit.client.0.vm06.stdout:2/703: dread d3/d4/d12/f35 [0,4194304] 0
2026-03-09T17:29:55.977 INFO:tasks.workunit.client.0.vm06.stdout:9/869: mkdir d3/d15/d16/d113 0
2026-03-09T17:29:55.979 INFO:tasks.workunit.client.0.vm06.stdout:7/927: dread d5/d1f/d34/d3f/d91/fce [0,4194304] 0
2026-03-09T17:29:55.984 INFO:tasks.workunit.client.0.vm06.stdout:8/776: link d15/d39/d67/d77/d97/cca d15/d31/d58/d9b/c101 0
2026-03-09T17:29:55.984 INFO:tasks.workunit.client.0.vm06.stdout:8/777: stat d15/d31/dc5 0
2026-03-09T17:29:55.992 INFO:tasks.workunit.client.0.vm06.stdout:8/778: dread d15/d31/dc5/df1/d71/ffe [0,4194304] 0
2026-03-09T17:29:55.992 INFO:tasks.workunit.client.0.vm06.stdout:8/779: chown d15 1 1
2026-03-09T17:29:55.992 INFO:tasks.workunit.client.0.vm06.stdout:8/780: chown d15 51 1
2026-03-09T17:29:55.993 INFO:tasks.workunit.client.0.vm06.stdout:7/928: write d5/d1f/d34/d3f/d91/fce [87953,8002] 0
2026-03-09T17:29:55.998 INFO:tasks.workunit.client.0.vm06.stdout:9/870: symlink d3/d26/d35/l114 0
2026-03-09T17:29:56.007 INFO:tasks.workunit.client.0.vm06.stdout:8/781: rename d15/d31/dc5/df1/fbd to d15/d16/d1e/d30/d55/def/df3/f102 0
2026-03-09T17:29:56.010 INFO:tasks.workunit.client.0.vm06.stdout:8/782: dwrite d15/d16/d1e/d30/fcf [0,4194304] 0
2026-03-09T17:29:56.064 INFO:tasks.workunit.client.0.vm06.stdout:4/833: dwrite db/d59/d5f/d45/d10a/dba/ff1 [0,4194304] 0 2026-03-09T17:29:56.075 INFO:tasks.workunit.client.0.vm06.stdout:7/929: link d5/d7/c31 d5/dd/dc5/dee/c10f 0 2026-03-09T17:29:56.077 INFO:tasks.workunit.client.0.vm06.stdout:4/834: getdents db/d1d/d21/d25/d4b/d85/d106 0 2026-03-09T17:29:56.078 INFO:tasks.workunit.client.0.vm06.stdout:4/835: mknod db/d59/d5f/d5d/c12d 0 2026-03-09T17:29:56.087 INFO:tasks.workunit.client.0.vm06.stdout:4/836: dread db/d57/fc7 [0,4194304] 0 2026-03-09T17:29:56.090 INFO:tasks.workunit.client.0.vm06.stdout:4/837: dread db/d59/f76 [0,4194304] 0 2026-03-09T17:29:56.091 INFO:tasks.workunit.client.0.vm06.stdout:4/838: dread - db/d1d/d21/d37/d69/f75 zero size 2026-03-09T17:29:56.098 INFO:tasks.workunit.client.0.vm06.stdout:3/835: truncate dd/d19/d25/fd1 3062856 0 2026-03-09T17:29:56.102 INFO:tasks.workunit.client.0.vm06.stdout:1/782: dwrite d11/d14/d1d/d1e/d2a/f43 [4194304,4194304] 0 2026-03-09T17:29:56.103 INFO:tasks.workunit.client.0.vm06.stdout:3/836: creat dd/d81/da3/dae/d110/f11a x:0 0 0 2026-03-09T17:29:56.108 INFO:tasks.workunit.client.0.vm06.stdout:1/783: dwrite d11/d14/d1d/d1e/d2a/d99/de9/ff9 [0,4194304] 0 2026-03-09T17:29:56.118 INFO:tasks.workunit.client.0.vm06.stdout:4/839: sync 2026-03-09T17:29:56.121 INFO:tasks.workunit.client.0.vm06.stdout:4/840: stat db/d1d/d21/d44/d8a 0 2026-03-09T17:29:56.138 INFO:tasks.workunit.client.0.vm06.stdout:4/841: mknod db/d1d/d21/d25/d4b/d85/d106/d110/c12e 0 2026-03-09T17:29:56.143 INFO:tasks.workunit.client.0.vm06.stdout:4/842: unlink db/d1d/d21/d26/d7a/lff 0 2026-03-09T17:29:56.196 INFO:tasks.workunit.client.0.vm06.stdout:0/918: dwrite d7/d11/d19/d1d/d39/f7d [0,4194304] 0 2026-03-09T17:29:56.214 INFO:tasks.workunit.client.0.vm06.stdout:5/786: dwrite d4/d52/fc8 [0,4194304] 0 2026-03-09T17:29:56.240 INFO:tasks.workunit.client.0.vm06.stdout:0/919: creat d7/d11/d19/d8b/f13c x:0 0 0 2026-03-09T17:29:56.242 
INFO:tasks.workunit.client.0.vm06.stdout:0/920: dread - d7/d11/d19/d23/db7/dbd/f119 zero size 2026-03-09T17:29:56.247 INFO:tasks.workunit.client.0.vm06.stdout:6/692: write d6/d12/d17/d65/f72 [1962259,39182] 0 2026-03-09T17:29:56.248 INFO:tasks.workunit.client.0.vm06.stdout:6/693: chown d6/d47/d96/l9d 0 1 2026-03-09T17:29:56.298 INFO:tasks.workunit.client.0.vm06.stdout:2/704: write d3/f29 [1838534,99812] 0 2026-03-09T17:29:56.315 INFO:tasks.workunit.client.0.vm06.stdout:9/871: write d3/d6d/d9a/d9c/dcd/fd4 [669319,87582] 0 2026-03-09T17:29:56.367 INFO:tasks.workunit.client.0.vm06.stdout:6/694: symlink d6/d47/d4d/d6d/ld9 0 2026-03-09T17:29:56.367 INFO:tasks.workunit.client.0.vm06.stdout:2/705: creat d3/d4/d46/fdf x:0 0 0 2026-03-09T17:29:56.371 INFO:tasks.workunit.client.0.vm06.stdout:9/872: write d3/d11/d65/f7c [76917,118338] 0 2026-03-09T17:29:56.379 INFO:tasks.workunit.client.0.vm06.stdout:6/695: symlink d6/d12/d53/d8f/lda 0 2026-03-09T17:29:56.380 INFO:tasks.workunit.client.0.vm06.stdout:6/696: chown d6/d12/d2d/l4c 3 1 2026-03-09T17:29:56.386 INFO:tasks.workunit.client.0.vm06.stdout:9/873: rename d3/d26/dd7 to d3/d15/d36/d83/d115 0 2026-03-09T17:29:56.388 INFO:tasks.workunit.client.0.vm06.stdout:5/787: getdents d4/d22/d64/df3 0 2026-03-09T17:29:56.390 INFO:tasks.workunit.client.0.vm06.stdout:6/697: dread d6/d12/d17/f32 [8388608,4194304] 0 2026-03-09T17:29:56.406 INFO:tasks.workunit.client.0.vm06.stdout:5/788: getdents d4/d50 0 2026-03-09T17:29:56.409 INFO:tasks.workunit.client.0.vm06.stdout:6/698: dread d6/d4f/f33 [0,4194304] 0 2026-03-09T17:29:56.417 INFO:tasks.workunit.client.0.vm06.stdout:6/699: dread d6/d4f/d3e/d52/f89 [0,4194304] 0 2026-03-09T17:29:56.435 INFO:tasks.workunit.client.0.vm06.stdout:5/789: dread d4/d22/d64/f70 [0,4194304] 0 2026-03-09T17:29:56.438 INFO:tasks.workunit.client.0.vm06.stdout:6/700: truncate d6/d4f/f33 366123 0 2026-03-09T17:29:56.440 INFO:tasks.workunit.client.0.vm06.stdout:6/701: fsync d6/d47/fa8 0 2026-03-09T17:29:56.446 
INFO:tasks.workunit.client.0.vm06.stdout:5/790: mknod d4/d22/dbe/dfb/c120 0 2026-03-09T17:29:56.447 INFO:tasks.workunit.client.0.vm06.stdout:5/791: dread - d4/d22/dbe/ffc zero size 2026-03-09T17:29:56.451 INFO:tasks.workunit.client.0.vm06.stdout:5/792: chown d4/d22/d46/dec/f105 904886860 1 2026-03-09T17:29:56.454 INFO:tasks.workunit.client.0.vm06.stdout:5/793: read d4/d50/d35/d40/d6f/fc7 [45578,126227] 0 2026-03-09T17:29:56.455 INFO:tasks.workunit.client.0.vm06.stdout:5/794: mkdir d4/da4/dcf/d121 0 2026-03-09T17:29:56.457 INFO:tasks.workunit.client.0.vm06.stdout:8/783: write d15/f7d [341427,72889] 0 2026-03-09T17:29:56.458 INFO:tasks.workunit.client.0.vm06.stdout:8/784: write d15/d31/dc5/df1/d2b/f46 [5623916,42893] 0 2026-03-09T17:29:56.460 INFO:tasks.workunit.client.0.vm06.stdout:8/785: chown fe 307801073 1 2026-03-09T17:29:56.462 INFO:tasks.workunit.client.0.vm06.stdout:6/702: dread d6/d12/f76 [0,4194304] 0 2026-03-09T17:29:56.463 INFO:tasks.workunit.client.0.vm06.stdout:6/703: chown d6/d12/d53/d91 48959710 1 2026-03-09T17:29:56.470 INFO:tasks.workunit.client.0.vm06.stdout:5/795: creat d4/d50/d35/d40/d109/d11f/f122 x:0 0 0 2026-03-09T17:29:56.470 INFO:tasks.workunit.client.0.vm06.stdout:6/704: creat d6/d4f/d3e/d52/d8c/db0/fdb x:0 0 0 2026-03-09T17:29:56.471 INFO:tasks.workunit.client.0.vm06.stdout:8/786: link d15/d39/d67/de3/fe9 d15/d39/d67/d86/ddd/f103 0 2026-03-09T17:29:56.485 INFO:tasks.workunit.client.0.vm06.stdout:1/784: dwrite d11/d14/d1d/d42/d46/d92/dc0/f7f [0,4194304] 0 2026-03-09T17:29:56.486 INFO:tasks.workunit.client.0.vm06.stdout:3/837: dwrite dd/f15 [0,4194304] 0 2026-03-09T17:29:56.489 INFO:tasks.workunit.client.0.vm06.stdout:6/705: rename d6/d12/d53/d8f/dc2 to d6/d12/d17/d65/ddc 0 2026-03-09T17:29:56.490 INFO:tasks.workunit.client.0.vm06.stdout:1/785: fdatasync d11/d14/d1c/d5f/fc4 0 2026-03-09T17:29:56.490 INFO:tasks.workunit.client.0.vm06.stdout:3/838: chown dd/d19/d1e/f41 132445 1 2026-03-09T17:29:56.498 
INFO:tasks.workunit.client.0.vm06.stdout:4/843: dwrite db/d1d/d21/d25/f80 [0,4194304] 0 2026-03-09T17:29:56.528 INFO:tasks.workunit.client.0.vm06.stdout:6/706: read d6/f97 [10,27598] 0 2026-03-09T17:29:56.529 INFO:tasks.workunit.client.0.vm06.stdout:6/707: dread - d6/d47/d4d/d6d/fbe zero size 2026-03-09T17:29:56.588 INFO:tasks.workunit.client.0.vm06.stdout:6/708: creat d6/d47/d96/d40/fdd x:0 0 0 2026-03-09T17:29:56.589 INFO:tasks.workunit.client.0.vm06.stdout:3/839: mkdir dd/d19/d1e/d100/d11b 0 2026-03-09T17:29:56.598 INFO:tasks.workunit.client.0.vm06.stdout:4/844: link db/d1d/d21/d26/f113 db/d59/d90/f12f 0 2026-03-09T17:29:56.603 INFO:tasks.workunit.client.0.vm06.stdout:9/874: mkdir d3/d6d/d9a/d9c/d116 0 2026-03-09T17:29:56.603 INFO:tasks.workunit.client.0.vm06.stdout:4/845: mknod db/d1d/d21/d37/d69/d78/c130 0 2026-03-09T17:29:56.604 INFO:tasks.workunit.client.0.vm06.stdout:4/846: chown db/d1d/d21/d37/f81 1 1 2026-03-09T17:29:56.607 INFO:tasks.workunit.client.0.vm06.stdout:4/847: read db/d59/d5f/d45/d10a/dcc/de0/f104 [769978,99157] 0 2026-03-09T17:29:56.609 INFO:tasks.workunit.client.0.vm06.stdout:9/875: fdatasync d3/d15/d36/d4c/d6a/f10f 0 2026-03-09T17:29:56.610 INFO:tasks.workunit.client.0.vm06.stdout:9/876: write d3/d2c/ffc [2288151,55829] 0 2026-03-09T17:29:56.622 INFO:tasks.workunit.client.0.vm06.stdout:9/877: fsync d3/d6d/f78 0 2026-03-09T17:29:56.628 INFO:tasks.workunit.client.0.vm06.stdout:9/878: truncate d3/d15/f2e 1966973 0 2026-03-09T17:29:56.638 INFO:tasks.workunit.client.0.vm06.stdout:9/879: link d3/d6d/f7a d3/d15/d36/d83/df8/f117 0 2026-03-09T17:29:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:56 vm06.local ceph-mon[57307]: pgmap v161: 65 pgs: 65 active+clean; 1.7 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 30 MiB/s rd, 75 MiB/s wr, 336 op/s 2026-03-09T17:29:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:56 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
2026-03-09T17:29:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:56 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:56 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:29:56.642 INFO:tasks.workunit.client.0.vm06.stdout:9/880: fdatasync d3/d15/d36/d4d/fd1 0 2026-03-09T17:29:56.642 INFO:tasks.workunit.client.0.vm06.stdout:9/881: write d3/d11/d65/faa [1364299,77006] 0 2026-03-09T17:29:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:56 vm09.local ceph-mon[62061]: pgmap v161: 65 pgs: 65 active+clean; 1.7 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 30 MiB/s rd, 75 MiB/s wr, 336 op/s 2026-03-09T17:29:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:56 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:56 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:29:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:56 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:29:56.666 INFO:tasks.workunit.client.0.vm06.stdout:2/706: dwrite d3/d4/d22/d72/d8f/fb7 [0,4194304] 0 2026-03-09T17:29:56.668 INFO:tasks.workunit.client.0.vm06.stdout:2/707: stat d3/d4/d12/d71/daa/f5f 0 2026-03-09T17:29:56.678 INFO:tasks.workunit.client.0.vm06.stdout:0/921: creat d7/d11/d19/d3c/f13d x:0 0 0 2026-03-09T17:29:56.678 INFO:tasks.workunit.client.0.vm06.stdout:0/922: readlink d7/d11/d19/d1d/d87/l133 0 2026-03-09T17:29:56.681 INFO:tasks.workunit.client.0.vm06.stdout:0/923: chown d7/d11/d5d/f93 256358205 1 2026-03-09T17:29:56.696 
INFO:tasks.workunit.client.0.vm06.stdout:7/930: rmdir d5/d7/d2b 39 2026-03-09T17:29:56.710 INFO:tasks.workunit.client.0.vm06.stdout:0/924: getdents d7/d11/d89/da8/db2/d13a 0 2026-03-09T17:29:56.720 INFO:tasks.workunit.client.0.vm06.stdout:2/708: dread d3/d4/d12/f66 [0,4194304] 0 2026-03-09T17:29:56.725 INFO:tasks.workunit.client.0.vm06.stdout:7/931: creat d5/d1f/d34/d3f/f110 x:0 0 0 2026-03-09T17:29:56.736 INFO:tasks.workunit.client.0.vm06.stdout:2/709: mkdir d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0 0 2026-03-09T17:29:56.744 INFO:tasks.workunit.client.0.vm06.stdout:8/787: dwrite d15/d31/dc5/df1/d71/f82 [0,4194304] 0 2026-03-09T17:29:56.747 INFO:tasks.workunit.client.0.vm06.stdout:5/796: rename d4/d52/fc8 to d4/da4/dcf/f123 0 2026-03-09T17:29:56.758 INFO:tasks.workunit.client.0.vm06.stdout:0/925: rmdir d7/d11/d19/d3c/df8/d137 0 2026-03-09T17:29:56.762 INFO:tasks.workunit.client.0.vm06.stdout:2/710: truncate f2 206996 0 2026-03-09T17:29:56.763 INFO:tasks.workunit.client.0.vm06.stdout:1/786: rename d11/d14/d1d/d1e/d2a/d99/de9/ff9 to d11/d14/d1d/d42/dff/f107 0 2026-03-09T17:29:56.765 INFO:tasks.workunit.client.0.vm06.stdout:6/709: dwrite d6/d47/d96/d40/fb4 [0,4194304] 0 2026-03-09T17:29:56.767 INFO:tasks.workunit.client.0.vm06.stdout:2/711: fdatasync d3/d4/f11 0 2026-03-09T17:29:56.772 INFO:tasks.workunit.client.0.vm06.stdout:4/848: dwrite db/d57/ff3 [0,4194304] 0 2026-03-09T17:29:56.777 INFO:tasks.workunit.client.0.vm06.stdout:4/849: dwrite db/d1d/d21/d44/dc1/ffb [0,4194304] 0 2026-03-09T17:29:56.797 INFO:tasks.workunit.client.0.vm06.stdout:6/710: truncate d6/f97 36510 0 2026-03-09T17:29:56.808 INFO:tasks.workunit.client.0.vm06.stdout:2/712: dread d3/d4/d12/d2b/f32 [0,4194304] 0 2026-03-09T17:29:56.810 INFO:tasks.workunit.client.0.vm06.stdout:4/850: fsync db/d1d/d21/d44/ff6 0 2026-03-09T17:29:56.815 INFO:tasks.workunit.client.0.vm06.stdout:9/882: dwrite d3/d15/f23 [0,4194304] 0 2026-03-09T17:29:56.818 INFO:tasks.workunit.client.0.vm06.stdout:7/932: rename 
d5/dd/dc5/d64/d6b/dd1/f109 to d5/dd/dc5/d64/de8/f111 0 2026-03-09T17:29:56.823 INFO:tasks.workunit.client.0.vm06.stdout:6/711: fdatasync d6/d47/d4d/d9a/da2/db1/fb8 0 2026-03-09T17:29:56.830 INFO:tasks.workunit.client.0.vm06.stdout:0/926: link d7/d11/d89/da8/f134 d7/d11/d5d/f13e 0 2026-03-09T17:29:56.831 INFO:tasks.workunit.client.0.vm06.stdout:5/797: link d4/d50/cf d4/d50/d18/de1/c124 0 2026-03-09T17:29:56.835 INFO:tasks.workunit.client.0.vm06.stdout:9/883: rename d3/d15/d36/d4c/d6a/f8b to d3/d6d/d9a/d9c/d116/f118 0 2026-03-09T17:29:56.840 INFO:tasks.workunit.client.0.vm06.stdout:2/713: symlink d3/d4/d22/d72/d8f/dda/le1 0 2026-03-09T17:29:56.846 INFO:tasks.workunit.client.0.vm06.stdout:3/840: dwrite dd/fdd [0,4194304] 0 2026-03-09T17:29:56.849 INFO:tasks.workunit.client.0.vm06.stdout:0/927: dwrite d7/d11/d2d/daf/fd5 [0,4194304] 0 2026-03-09T17:29:56.852 INFO:tasks.workunit.client.0.vm06.stdout:0/928: dread - d7/d11/d19/d23/db7/dbd/f130 zero size 2026-03-09T17:29:56.868 INFO:tasks.workunit.client.0.vm06.stdout:7/933: fsync d5/f18 0 2026-03-09T17:29:56.868 INFO:tasks.workunit.client.0.vm06.stdout:7/934: read d5/dd/fa0 [3593296,103308] 0 2026-03-09T17:29:56.870 INFO:tasks.workunit.client.0.vm06.stdout:8/788: dwrite d15/d31/dc5/df1/f4f [0,4194304] 0 2026-03-09T17:29:56.885 INFO:tasks.workunit.client.0.vm06.stdout:9/884: symlink d3/d6d/l119 0 2026-03-09T17:29:56.892 INFO:tasks.workunit.client.0.vm06.stdout:2/714: creat d3/d4/d46/fe2 x:0 0 0 2026-03-09T17:29:56.892 INFO:tasks.workunit.client.0.vm06.stdout:2/715: fsync d3/d4/d46/da5/f6c 0 2026-03-09T17:29:56.893 INFO:tasks.workunit.client.0.vm06.stdout:2/716: chown d3/d4/d22/d72/d8f/fb7 18475217 1 2026-03-09T17:29:56.896 INFO:tasks.workunit.client.0.vm06.stdout:3/841: creat dd/d19/d25/d44/f11c x:0 0 0 2026-03-09T17:29:56.903 INFO:tasks.workunit.client.0.vm06.stdout:1/787: write d11/d69/fad [3008470,94511] 0 2026-03-09T17:29:56.909 INFO:tasks.workunit.client.0.vm06.stdout:7/935: creat d5/d1f/d34/d46/d51/f112 x:0 0 0 
2026-03-09T17:29:56.910 INFO:tasks.workunit.client.0.vm06.stdout:8/789: fdatasync d15/d16/d6d/f89 0 2026-03-09T17:29:56.917 INFO:tasks.workunit.client.0.vm06.stdout:2/717: truncate d3/d4/d12/f92 716600 0 2026-03-09T17:29:56.919 INFO:tasks.workunit.client.0.vm06.stdout:3/842: mkdir dd/d1d/d6e/d70/d11d 0 2026-03-09T17:29:56.921 INFO:tasks.workunit.client.0.vm06.stdout:0/929: mknod d7/c13f 0 2026-03-09T17:29:56.923 INFO:tasks.workunit.client.0.vm06.stdout:1/788: dread - d11/d14/d1d/d42/d46/fcd zero size 2026-03-09T17:29:56.924 INFO:tasks.workunit.client.0.vm06.stdout:7/936: rename d5/dd/d79/fb3 to d5/d1f/dae/f113 0 2026-03-09T17:29:56.926 INFO:tasks.workunit.client.0.vm06.stdout:4/851: write db/d59/d5f/d45/d10a/dcc/fce [339948,34400] 0 2026-03-09T17:29:56.930 INFO:tasks.workunit.client.0.vm06.stdout:0/930: dread d7/d11/d19/d3c/fe8 [4194304,4194304] 0 2026-03-09T17:29:56.934 INFO:tasks.workunit.client.0.vm06.stdout:6/712: getdents d6/d47/d96 0 2026-03-09T17:29:56.935 INFO:tasks.workunit.client.0.vm06.stdout:6/713: chown d6/d47/d4d/d9a/da2/db1/fb8 7373321 1 2026-03-09T17:29:56.935 INFO:tasks.workunit.client.0.vm06.stdout:6/714: chown d6/d12/d17/f6b 97 1 2026-03-09T17:29:56.936 INFO:tasks.workunit.client.0.vm06.stdout:6/715: read d6/d12/d17/d85/faf [743571,104345] 0 2026-03-09T17:29:56.944 INFO:tasks.workunit.client.0.vm06.stdout:5/798: dwrite d4/d52/d55/f84 [0,4194304] 0 2026-03-09T17:29:56.946 INFO:tasks.workunit.client.0.vm06.stdout:4/852: stat db/d1d/d21/d25/c122 0 2026-03-09T17:29:56.957 INFO:tasks.workunit.client.0.vm06.stdout:9/885: link d3/d15/d16/l6b d3/d6d/d9a/d9c/l11a 0 2026-03-09T17:29:56.960 INFO:tasks.workunit.client.0.vm06.stdout:6/716: dread - d6/d12/d17/d85/fa7 zero size 2026-03-09T17:29:56.963 INFO:tasks.workunit.client.0.vm06.stdout:5/799: read d4/d50/fad [3364983,99145] 0 2026-03-09T17:29:56.967 INFO:tasks.workunit.client.0.vm06.stdout:1/789: symlink d11/d14/d1d/d1e/l108 0 2026-03-09T17:29:56.968 INFO:tasks.workunit.client.0.vm06.stdout:1/790: dread - 
d11/d14/d1d/d42/d46/d92/dc0/d57/fac zero size 2026-03-09T17:29:56.974 INFO:tasks.workunit.client.0.vm06.stdout:4/853: rename db/d59/d5f/d6d/lb2 to db/d1d/d21/d37/d69/d78/da0/l131 0 2026-03-09T17:29:56.977 INFO:tasks.workunit.client.0.vm06.stdout:8/790: rmdir d15/d39/d67/de3/de8 0 2026-03-09T17:29:56.977 INFO:tasks.workunit.client.0.vm06.stdout:8/791: stat d15/d31/dc5/df1/d2b/d85 0 2026-03-09T17:29:56.980 INFO:tasks.workunit.client.0.vm06.stdout:9/886: symlink d3/d15/d36/d83/df8/l11b 0 2026-03-09T17:29:56.982 INFO:tasks.workunit.client.0.vm06.stdout:6/717: dread - d6/d47/d96/d40/f9f zero size 2026-03-09T17:29:56.983 INFO:tasks.workunit.client.0.vm06.stdout:3/843: write dd/d1d/d2e/fec [546779,58444] 0 2026-03-09T17:29:56.989 INFO:tasks.workunit.client.0.vm06.stdout:1/791: dread - d11/d69/fca zero size 2026-03-09T17:29:56.994 INFO:tasks.workunit.client.0.vm06.stdout:1/792: dwrite d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/fe5 [0,4194304] 0 2026-03-09T17:29:57.002 INFO:tasks.workunit.client.0.vm06.stdout:7/937: write d5/d1f/d34/d46/f4e [5315475,52521] 0 2026-03-09T17:29:57.005 INFO:tasks.workunit.client.0.vm06.stdout:4/854: rename db/d57/dd4 to db/de2/d132 0 2026-03-09T17:29:57.008 INFO:tasks.workunit.client.0.vm06.stdout:0/931: link d7/d88/l12d d7/d11/d19/d23/db7/dbd/d101/l140 0 2026-03-09T17:29:57.014 INFO:tasks.workunit.client.0.vm06.stdout:2/718: getdents d3/d4/d12/d71 0 2026-03-09T17:29:57.014 INFO:tasks.workunit.client.0.vm06.stdout:8/792: dread d15/d16/f52 [0,4194304] 0 2026-03-09T17:29:57.016 INFO:tasks.workunit.client.0.vm06.stdout:3/844: dread dd/d81/da3/fc6 [0,4194304] 0 2026-03-09T17:29:57.025 INFO:tasks.workunit.client.0.vm06.stdout:7/938: rmdir d5/d7 39 2026-03-09T17:29:57.027 INFO:tasks.workunit.client.0.vm06.stdout:4/855: mkdir db/d1d/d21/d44/dc1/d133 0 2026-03-09T17:29:57.029 INFO:tasks.workunit.client.0.vm06.stdout:0/932: rename d7/d11/d2d/l98 to d7/d11/d19/d3c/df3/d11c/l141 0 2026-03-09T17:29:57.031 INFO:tasks.workunit.client.0.vm06.stdout:8/793: 
truncate d15/d31/dc5/df1/d3d/d5f/dd4/fd6 666144 0 2026-03-09T17:29:57.033 INFO:tasks.workunit.client.0.vm06.stdout:1/793: truncate d11/d14/d1d/d1e/d2a/d34/d58/fb9 964763 0 2026-03-09T17:29:57.034 INFO:tasks.workunit.client.0.vm06.stdout:4/856: mkdir db/d1d/d21/d44/d8a/d134 0 2026-03-09T17:29:57.035 INFO:tasks.workunit.client.0.vm06.stdout:0/933: fsync d7/ffa 0 2026-03-09T17:29:57.045 INFO:tasks.workunit.client.0.vm06.stdout:0/934: truncate d7/d11/d19/d8b/da4/d85/fc8 4873957 0 2026-03-09T17:29:57.045 INFO:tasks.workunit.client.0.vm06.stdout:0/935: chown d7/fb1 17 1 2026-03-09T17:29:57.046 INFO:tasks.workunit.client.0.vm06.stdout:1/794: mknod d11/d14/d1d/d42/d46/d92/dc0/c109 0 2026-03-09T17:29:57.047 INFO:tasks.workunit.client.0.vm06.stdout:1/795: write d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/fe5 [4450248,101431] 0 2026-03-09T17:29:57.050 INFO:tasks.workunit.client.0.vm06.stdout:7/939: creat d5/dd/dc5/d64/d6b/f114 x:0 0 0 2026-03-09T17:29:57.053 INFO:tasks.workunit.client.0.vm06.stdout:1/796: chown d11/d14/d1d/d1e/d2a/c98 31010 1 2026-03-09T17:29:57.057 INFO:tasks.workunit.client.0.vm06.stdout:1/797: dwrite d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0 2026-03-09T17:29:57.058 INFO:tasks.workunit.client.0.vm06.stdout:1/798: fdatasync d11/f105 0 2026-03-09T17:29:57.060 INFO:tasks.workunit.client.0.vm06.stdout:0/936: creat d7/d11/d19/d1d/f142 x:0 0 0 2026-03-09T17:29:57.061 INFO:tasks.workunit.client.0.vm06.stdout:7/940: link d5/dd/c21 d5/d1f/c115 0 2026-03-09T17:29:57.067 INFO:tasks.workunit.client.0.vm06.stdout:1/799: creat d11/d14/d1c/d3a/f10a x:0 0 0 2026-03-09T17:29:57.069 INFO:tasks.workunit.client.0.vm06.stdout:0/937: fdatasync d7/d88/fdb 0 2026-03-09T17:29:57.071 INFO:tasks.workunit.client.0.vm06.stdout:0/938: dread d7/d11/f75 [0,4194304] 0 2026-03-09T17:29:57.073 INFO:tasks.workunit.client.0.vm06.stdout:1/800: mkdir d11/d14/d1d/dd1/d10b 0 2026-03-09T17:29:57.077 INFO:tasks.workunit.client.0.vm06.stdout:0/939: creat d7/d102/f143 x:0 0 0 2026-03-09T17:29:57.082 
INFO:tasks.workunit.client.0.vm06.stdout:0/940: mknod d7/d11/d19/d1d/c144 0 2026-03-09T17:29:57.083 INFO:tasks.workunit.client.0.vm06.stdout:0/941: read - d7/d11/d89/da8/f134 zero size 2026-03-09T17:29:57.086 INFO:tasks.workunit.client.0.vm06.stdout:0/942: symlink d7/d11/d2d/daf/l145 0 2026-03-09T17:29:57.119 INFO:tasks.workunit.client.0.vm06.stdout:0/943: fsync d7/d11/f13 0 2026-03-09T17:29:57.209 INFO:tasks.workunit.client.0.vm06.stdout:3/845: sync 2026-03-09T17:29:57.209 INFO:tasks.workunit.client.0.vm06.stdout:3/846: chown dd/d19/d1e/d100/d11b 144370 1 2026-03-09T17:29:57.210 INFO:tasks.workunit.client.0.vm06.stdout:3/847: read dd/d19/d25/fd1 [2932270,115382] 0 2026-03-09T17:29:57.244 INFO:tasks.workunit.client.0.vm06.stdout:3/848: dread dd/f38 [0,4194304] 0 2026-03-09T17:29:57.250 INFO:tasks.workunit.client.0.vm06.stdout:3/849: mkdir dd/d59/da1/d11e 0 2026-03-09T17:29:57.252 INFO:tasks.workunit.client.0.vm06.stdout:3/850: dread - dd/d19/d25/d44/d80/dd7/d10d/f111 zero size 2026-03-09T17:29:57.296 INFO:tasks.workunit.client.0.vm06.stdout:9/887: dwrite d3/f21 [0,4194304] 0 2026-03-09T17:29:57.296 INFO:tasks.workunit.client.0.vm06.stdout:5/800: write d4/d50/d35/f39 [2367489,109691] 0 2026-03-09T17:29:57.297 INFO:tasks.workunit.client.0.vm06.stdout:9/888: write d3/d15/f10c [588444,1698] 0 2026-03-09T17:29:57.300 INFO:tasks.workunit.client.0.vm06.stdout:6/718: write d6/d12/f1c [1597588,1732] 0 2026-03-09T17:29:57.301 INFO:tasks.workunit.client.0.vm06.stdout:5/801: fdatasync d4/d50/f1e 0 2026-03-09T17:29:57.304 INFO:tasks.workunit.client.0.vm06.stdout:5/802: mkdir d4/d50/db2/d125 0 2026-03-09T17:29:57.307 INFO:tasks.workunit.client.0.vm06.stdout:5/803: fdatasync d4/d50/d18/f3e 0 2026-03-09T17:29:57.308 INFO:tasks.workunit.client.0.vm06.stdout:5/804: mknod d4/d50/db2/d119/c126 0 2026-03-09T17:29:57.312 INFO:tasks.workunit.client.0.vm06.stdout:5/805: dwrite d4/d50/d35/d40/d6f/f8e [0,4194304] 0 2026-03-09T17:29:57.317 INFO:tasks.workunit.client.0.vm06.stdout:2/719: 
dwrite d3/d4/d12/d71/daa/d77/d81/d64/fce [4194304,4194304] 0 2026-03-09T17:29:57.322 INFO:tasks.workunit.client.0.vm06.stdout:9/889: sync 2026-03-09T17:29:57.323 INFO:tasks.workunit.client.0.vm06.stdout:8/794: dwrite d15/d31/dc5/df1/f61 [0,4194304] 0 2026-03-09T17:29:57.335 INFO:tasks.workunit.client.0.vm06.stdout:2/720: truncate d3/d4/d12/d2b/d36/d37/f3a 2252780 0 2026-03-09T17:29:57.336 INFO:tasks.workunit.client.0.vm06.stdout:9/890: creat d3/d11/d65/f11c x:0 0 0 2026-03-09T17:29:57.340 INFO:tasks.workunit.client.0.vm06.stdout:8/795: mknod d15/d39/d3c/dd5/c104 0 2026-03-09T17:29:57.343 INFO:tasks.workunit.client.0.vm06.stdout:8/796: creat d15/d31/dc5/df1/d71/f105 x:0 0 0 2026-03-09T17:29:57.345 INFO:tasks.workunit.client.0.vm06.stdout:9/891: symlink d3/d26/d35/l11d 0 2026-03-09T17:29:57.345 INFO:tasks.workunit.client.0.vm06.stdout:9/892: stat d3/d6d/d9a/lf5 0 2026-03-09T17:29:57.368 INFO:tasks.workunit.client.0.vm06.stdout:9/893: sync 2026-03-09T17:29:57.386 INFO:tasks.workunit.client.0.vm06.stdout:4/857: dwrite db/d1d/fd3 [0,4194304] 0 2026-03-09T17:29:57.456 INFO:tasks.workunit.client.0.vm06.stdout:4/858: dread db/d59/d5f/d45/d10a/dcc/de0/f104 [0,4194304] 0 2026-03-09T17:29:57.457 INFO:tasks.workunit.client.0.vm06.stdout:4/859: dread - db/d1d/d21/d44/dc1/fe1 zero size 2026-03-09T17:29:57.461 INFO:tasks.workunit.client.0.vm06.stdout:4/860: dwrite db/d59/d5f/d45/d10a/dba/f117 [0,4194304] 0 2026-03-09T17:29:57.465 INFO:tasks.workunit.client.0.vm06.stdout:4/861: chown db/d1d/d21/d25/d4b/lad 160 1 2026-03-09T17:29:57.465 INFO:tasks.workunit.client.0.vm06.stdout:4/862: chown db/d1d/f3a 1349721082 1 2026-03-09T17:29:57.468 INFO:tasks.workunit.client.0.vm06.stdout:4/863: read db/d1d/d21/d25/d4b/fe9 [139338,54091] 0 2026-03-09T17:29:57.481 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:29:57 vm09.local ceph-mon[62061]: pgmap v162: 65 pgs: 65 active+clean; 1.9 GiB data, 6.8 GiB used, 113 GiB / 120 GiB avail; 32 MiB/s rd, 85 MiB/s wr, 365 op/s 2026-03-09T17:29:57.530 
INFO:tasks.workunit.client.0.vm06.stdout:4/864: dread db/d1d/d21/d37/f81 [0,4194304] 0 2026-03-09T17:29:57.539 INFO:tasks.workunit.client.0.vm06.stdout:1/801: dwrite d11/d14/d1d/d4a/f9d [0,4194304] 0 2026-03-09T17:29:57.552 INFO:tasks.workunit.client.0.vm06.stdout:1/802: creat d11/de0/f10c x:0 0 0 2026-03-09T17:29:57.571 INFO:tasks.workunit.client.0.vm06.stdout:4/865: read db/d59/d5f/d45/d10a/fdd [3145831,28625] 0 2026-03-09T17:29:57.571 INFO:tasks.workunit.client.0.vm06.stdout:4/866: dread - db/d1d/d21/d37/f101 zero size 2026-03-09T17:29:57.574 INFO:tasks.workunit.client.0.vm06.stdout:4/867: dwrite db/d1d/d21/d37/f101 [0,4194304] 0 2026-03-09T17:29:57.617 INFO:tasks.workunit.client.0.vm06.stdout:4/868: dread db/d1d/d21/d25/f35 [0,4194304] 0 2026-03-09T17:29:57.620 INFO:tasks.workunit.client.0.vm06.stdout:4/869: symlink db/d1d/d21/d44/d8a/dec/l135 0 2026-03-09T17:29:57.621 INFO:tasks.workunit.client.0.vm06.stdout:7/941: write d5/d7/d2b/fa1 [2417763,64867] 0 2026-03-09T17:29:57.623 INFO:tasks.workunit.client.0.vm06.stdout:4/870: creat db/d1d/d21/d44/d8a/f136 x:0 0 0 2026-03-09T17:29:57.625 INFO:tasks.workunit.client.0.vm06.stdout:7/942: creat d5/d7/d2b/de7/f116 x:0 0 0 2026-03-09T17:29:57.628 INFO:tasks.workunit.client.0.vm06.stdout:4/871: truncate db/d1d/d21/d25/d4b/f66 4072846 0 2026-03-09T17:29:57.628 INFO:tasks.workunit.client.0.vm06.stdout:4/872: write db/d59/d5f/d45/d10a/dcc/f123 [213371,25451] 0 2026-03-09T17:29:57.631 INFO:tasks.workunit.client.0.vm06.stdout:4/873: dwrite db/df/f30 [0,4194304] 0 2026-03-09T17:29:57.632 INFO:tasks.workunit.client.0.vm06.stdout:4/874: stat db/d1d/d21/d26/d7a 0 2026-03-09T17:29:57.639 INFO:tasks.workunit.client.0.vm06.stdout:4/875: write db/d1d/f5b [4308446,34189] 0 2026-03-09T17:29:57.726 INFO:tasks.workunit.client.0.vm06.stdout:0/944: dwrite d7/d11/d2d/daf/fd3 [0,4194304] 0 2026-03-09T17:29:57.727 INFO:tasks.workunit.client.0.vm06.stdout:0/945: chown d7/d11/d2d/dca/c110 58 1 2026-03-09T17:29:57.731 
INFO:tasks.workunit.client.0.vm06.stdout:0/946: dwrite d7/d11/d19/d3c/db9/f123 [0,4194304] 0
2026-03-09T17:29:57.741 INFO:tasks.workunit.client.0.vm06.stdout:0/947: mknod d7/d11/d5d/d64/c146 0
2026-03-09T17:29:57.744 INFO:tasks.workunit.client.0.vm06.stdout:0/948: stat d7/d11/d89/da8/cf2 0
2026-03-09T17:29:57.746 INFO:tasks.workunit.client.0.vm06.stdout:0/949: read d7/d11/d5d/d64/fc9 [1750863,65455] 0
2026-03-09T17:29:57.751 INFO:tasks.workunit.client.0.vm06.stdout:0/950: fsync d7/d11/d19/d23/db7/dbd/f119 0
2026-03-09T17:29:57.751 INFO:tasks.workunit.client.0.vm06.stdout:0/951: symlink d7/d11/d89/da8/db2/dea/l147 0
2026-03-09T17:29:57.751 INFO:tasks.workunit.client.0.vm06.stdout:0/952: dread - d7/d11/d2d/fc3 zero size
2026-03-09T17:29:57.752 INFO:tasks.workunit.client.0.vm06.stdout:0/953: creat d7/d11/d19/d3c/db9/f148 x:0 0 0
2026-03-09T17:29:57.753 INFO:tasks.workunit.client.0.vm06.stdout:0/954: truncate d7/d11/d19/d1d/f142 348882 0
2026-03-09T17:29:57.756 INFO:tasks.workunit.client.0.vm06.stdout:0/955: unlink d7/d11/d19/d1d/d87/la1 0
2026-03-09T17:29:57.757 INFO:tasks.workunit.client.0.vm06.stdout:0/956: mknod d7/d11/d19/d3c/df8/c149 0
2026-03-09T17:29:57.759 INFO:tasks.workunit.client.0.vm06.stdout:0/957: dread - d7/d11/d19/d23/db7/dbd/f11b zero size
2026-03-09T17:29:57.761 INFO:tasks.workunit.client.0.vm06.stdout:0/958: creat d7/d11/d5d/d136/f14a x:0 0 0
2026-03-09T17:29:57.762 INFO:tasks.workunit.client.0.vm06.stdout:0/959: truncate d7/d11/d19/d3c/db9/ddd/ff7 360155 0
2026-03-09T17:29:57.764 INFO:tasks.workunit.client.0.vm06.stdout:0/960: rename d7/d11/d19/d23/db7/dbd/dc1 to d7/d11/d19/d8b/da4/d14b 0
2026-03-09T17:29:57.806 INFO:tasks.workunit.client.0.vm06.stdout:0/961: sync
2026-03-09T17:29:57.806 INFO:tasks.workunit.client.0.vm06.stdout:0/962: readlink d7/d11/d19/d1d/d39/l12e 0
2026-03-09T17:29:57.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:29:57 vm06.local ceph-mon[57307]: pgmap v162: 65 pgs: 65 active+clean; 1.9 GiB data, 6.8 GiB used, 113 GiB / 120 GiB avail; 32 MiB/s rd, 85 MiB/s wr, 365 op/s
2026-03-09T17:29:57.904 INFO:tasks.workunit.client.1.vm09.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-09T17:29:57.907 INFO:tasks.workunit.client.1.vm09.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-09T17:29:57.907 INFO:tasks.workunit.client.1.vm09.stderr:+ make
2026-03-09T17:29:57.993 INFO:tasks.workunit.client.0.vm06.stdout:3/851: write dd/d59/da1/faf [1687563,91487] 0
2026-03-09T17:29:57.996 INFO:tasks.workunit.client.0.vm06.stdout:3/852: symlink dd/d19/d25/d44/d80/dd7/l11f 0
2026-03-09T17:29:57.997 INFO:tasks.workunit.client.0.vm06.stdout:3/853: stat dd/d19/d25/d2d/d9b/fdb 0
2026-03-09T17:29:57.999 INFO:tasks.workunit.client.0.vm06.stdout:3/854: dread dd/d81/da3/dae/fbb [0,4194304] 0
2026-03-09T17:29:58.000 INFO:tasks.workunit.client.0.vm06.stdout:3/855: read dd/d19/d25/fd1 [2977091,27681] 0
2026-03-09T17:29:58.015 INFO:tasks.workunit.client.0.vm06.stdout:3/856: sync
2026-03-09T17:29:58.025 INFO:tasks.workunit.client.0.vm06.stdout:3/857: dread dd/d19/d2c/f79 [0,4194304] 0
2026-03-09T17:29:58.026 INFO:tasks.workunit.client.0.vm06.stdout:3/858: stat dd/d19/d25/fd1 0
2026-03-09T17:29:58.031 INFO:tasks.workunit.client.0.vm06.stdout:6/719: write d6/f5c [412551,89402] 0
2026-03-09T17:29:58.034 INFO:tasks.workunit.client.0.vm06.stdout:6/720: dread d6/d4f/d3e/d52/f89 [0,4194304] 0
2026-03-09T17:29:58.038 INFO:tasks.workunit.client.0.vm06.stdout:6/721: dwrite d6/d12/d53/fcc [0,4194304] 0
2026-03-09T17:29:58.040 INFO:tasks.workunit.client.0.vm06.stdout:6/722: write d6/d47/d4d/d9a/da2/db1/fb8 [3178170,71778] 0
2026-03-09T17:29:58.053 INFO:tasks.workunit.client.0.vm06.stdout:6/723: dread d6/d4f/f33 [0,4194304] 0
2026-03-09T17:29:58.054 INFO:tasks.workunit.client.0.vm06.stdout:6/724: creat d6/d12/d53/d91/dbf/fde x:0 0 0
2026-03-09T17:29:58.067 INFO:tasks.workunit.client.0.vm06.stdout:6/725: sync
2026-03-09T17:29:58.085 INFO:tasks.workunit.client.0.vm06.stdout:5/806: dwrite d4/d50/d18/f8c [0,4194304] 0
2026-03-09T17:29:58.088 INFO:tasks.workunit.client.0.vm06.stdout:5/807: getdents d4/d22/d64 0
2026-03-09T17:29:58.088 INFO:tasks.workunit.client.0.vm06.stdout:5/808: chown d4/d22/d64/f9f 28356989 1
2026-03-09T17:29:58.089 INFO:tasks.workunit.client.0.vm06.stdout:5/809: write d4/d22/d46/f6e [4288370,33417] 0
2026-03-09T17:29:58.094 INFO:tasks.workunit.client.0.vm06.stdout:5/810: rmdir d4/d52 39
2026-03-09T17:29:58.097 INFO:tasks.workunit.client.0.vm06.stdout:5/811: fdatasync d4/d22/d64/fcc 0
2026-03-09T17:29:58.099 INFO:tasks.workunit.client.0.vm06.stdout:5/812: rmdir d4/d52/db4 39
2026-03-09T17:29:58.131 INFO:tasks.workunit.client.0.vm06.stdout:2/721: write d3/d4/d22/f67 [528097,16849] 0
2026-03-09T17:29:58.132 INFO:tasks.workunit.client.0.vm06.stdout:2/722: chown d3/f3b 4273 1
2026-03-09T17:29:58.132 INFO:tasks.workunit.client.0.vm06.stdout:2/723: chown d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba 10990098 1
2026-03-09T17:29:58.138 INFO:tasks.workunit.client.0.vm06.stdout:8/797: dwrite d15/d31/dc5/df1/d71/f96 [0,4194304] 0
2026-03-09T17:29:58.152 INFO:tasks.workunit.client.0.vm06.stdout:9/894: dwrite d3/d15/d36/d4d/fd1 [0,4194304] 0
2026-03-09T17:29:58.154 INFO:tasks.workunit.client.0.vm06.stdout:8/798: chown c3 21 1
2026-03-09T17:29:58.154 INFO:tasks.workunit.client.0.vm06.stdout:8/799: stat fe 0
2026-03-09T17:29:58.164 INFO:tasks.workunit.client.0.vm06.stdout:8/800: symlink d15/d39/l106 0
2026-03-09T17:29:58.165 INFO:tasks.workunit.client.0.vm06.stdout:9/895: dread d3/d26/d35/f6f [0,4194304] 0
2026-03-09T17:29:58.167 INFO:tasks.workunit.client.0.vm06.stdout:9/896: mknod d3/d26/d35/c11e 0
2026-03-09T17:29:58.169 INFO:tasks.workunit.client.0.vm06.stdout:8/801: dread d15/d39/d67/d77/fa0 [0,4194304] 0
2026-03-09T17:29:58.169 INFO:tasks.workunit.client.0.vm06.stdout:9/897: fdatasync d3/d26/d6c/d68/f7f 0
2026-03-09T17:29:58.171 INFO:tasks.workunit.client.0.vm06.stdout:8/802: mknod d15/d39/d3c/c107 0
2026-03-09T17:29:58.172 INFO:tasks.workunit.client.0.vm06.stdout:9/898: creat d3/d11/d65/d80/f11f x:0 0 0
2026-03-09T17:29:58.173 INFO:tasks.workunit.client.0.vm06.stdout:8/803: symlink d15/d31/dc5/df1/d2b/d85/l108 0
2026-03-09T17:29:58.177 INFO:tasks.workunit.client.0.vm06.stdout:8/804: rename d15/d39/dd2/ldf to d15/d31/dc5/df1/d3d/d5f/l109 0
2026-03-09T17:29:58.189 INFO:tasks.workunit.client.0.vm06.stdout:8/805: write d15/d31/dc5/df1/d3d/d5f/d83/ff5 [200558,97029] 0
2026-03-09T17:29:58.197 INFO:tasks.workunit.client.0.vm06.stdout:8/806: sync
2026-03-09T17:29:58.199 INFO:tasks.workunit.client.0.vm06.stdout:8/807: creat d15/d39/d67/d77/de7/f10a x:0 0 0
2026-03-09T17:29:58.204 INFO:tasks.workunit.client.0.vm06.stdout:8/808: sync
2026-03-09T17:29:58.205 INFO:tasks.workunit.client.0.vm06.stdout:8/809: chown d15/d31/dc5/df1/d3d/d5f/d83/dc1/l100 2274957 1
2026-03-09T17:29:58.207 INFO:tasks.workunit.client.0.vm06.stdout:8/810: creat d15/d16/d6d/f10b x:0 0 0
2026-03-09T17:29:58.209 INFO:tasks.workunit.client.0.vm06.stdout:8/811: unlink d15/d31/dc5/df1/d2b/f46 0
2026-03-09T17:29:58.221 INFO:tasks.workunit.client.0.vm06.stdout:1/803: write f7 [3399329,30565] 0
2026-03-09T17:29:58.225 INFO:tasks.workunit.client.0.vm06.stdout:1/804: dwrite d11/d14/d1d/d42/d46/d92/dc0/f7f [0,4194304] 0
2026-03-09T17:29:58.236 INFO:tasks.workunit.client.0.vm06.stdout:1/805: symlink d11/d14/d1d/d42/dff/l10d 0
2026-03-09T17:29:58.237 INFO:tasks.workunit.client.0.vm06.stdout:1/806: chown d11/d14/d1d/d42/d46/d92/dc0/daf 531125587 1
2026-03-09T17:29:58.241 INFO:tasks.workunit.client.0.vm06.stdout:1/807: creat d11/d14/d1d/d1e/d2a/d99/f10e x:0 0 0
2026-03-09T17:29:58.247 INFO:tasks.workunit.client.0.vm06.stdout:8/812: dread d15/d16/d1a/d47/faf [0,4194304] 0
2026-03-09T17:29:58.253 INFO:tasks.workunit.client.0.vm06.stdout:7/943: dwrite d5/d1f/d34/d3f/fca [0,4194304] 0
2026-03-09T17:29:58.262 INFO:tasks.workunit.client.0.vm06.stdout:7/944: creat d5/d7/dac/f117 x:0 0 0
2026-03-09T17:29:58.262 INFO:tasks.workunit.client.0.vm06.stdout:7/945: chown d5/d1f/d34/d3f/l73 0 1
2026-03-09T17:29:58.265 INFO:tasks.workunit.client.0.vm06.stdout:7/946: unlink d5/f18 0
2026-03-09T17:29:58.266 INFO:tasks.workunit.client.0.vm06.stdout:7/947: readlink d5/dd/dc5/d64/l69 0
2026-03-09T17:29:58.269 INFO:tasks.workunit.client.0.vm06.stdout:7/948: dwrite d5/dd/dc5/f93 [4194304,4194304] 0
2026-03-09T17:29:58.282 INFO:tasks.workunit.client.0.vm06.stdout:7/949: sync
2026-03-09T17:29:58.283 INFO:tasks.workunit.client.0.vm06.stdout:7/950: fsync d5/d7/d2b/fa1 0
2026-03-09T17:29:58.284 INFO:tasks.workunit.client.0.vm06.stdout:7/951: creat d5/d7/dac/dd4/f118 x:0 0 0
2026-03-09T17:29:58.300 INFO:tasks.workunit.client.0.vm06.stdout:7/952: dread d5/dd/d79/f97 [0,4194304] 0
2026-03-09T17:29:58.301 INFO:tasks.workunit.client.0.vm06.stdout:7/953: mknod d5/d7/d2b/de7/c119 0
2026-03-09T17:29:58.304 INFO:tasks.workunit.client.0.vm06.stdout:7/954: creat d5/d7/f11a x:0 0 0
2026-03-09T17:29:58.304 INFO:tasks.workunit.client.0.vm06.stdout:7/955: write d5/d7/d2b/fa1 [4220056,73771] 0
2026-03-09T17:29:58.305 INFO:tasks.workunit.client.0.vm06.stdout:7/956: truncate d5/d7/d2b/f50 5667902 0
2026-03-09T17:29:58.307 INFO:tasks.workunit.client.0.vm06.stdout:7/957: truncate d5/dd/f22 3194563 0
2026-03-09T17:29:58.308 INFO:tasks.workunit.client.0.vm06.stdout:7/958: rmdir d5/dd/dc5 39
2026-03-09T17:29:58.309 INFO:tasks.workunit.client.0.vm06.stdout:7/959: rmdir d5/dd 39
2026-03-09T17:29:58.311 INFO:tasks.workunit.client.0.vm06.stdout:7/960: creat d5/dd/d79/d7f/df7/f11b x:0 0 0
2026-03-09T17:29:58.315 INFO:tasks.workunit.client.0.vm06.stdout:7/961: rmdir d5/d1f/d34 39
2026-03-09T17:29:58.316 INFO:tasks.workunit.client.0.vm06.stdout:7/962: creat d5/d1f/d102/f11c x:0 0 0
2026-03-09T17:29:58.323 INFO:tasks.workunit.client.0.vm06.stdout:7/963: link d5/dd/dc5/d64/d6b/f114 d5/dd/dc5/d64/d6b/dd1/f11d 0
2026-03-09T17:29:58.325 INFO:tasks.workunit.client.0.vm06.stdout:7/964: mkdir d5/dd/dc5/d64/d11e 0
2026-03-09T17:29:58.328 INFO:tasks.workunit.client.0.vm06.stdout:7/965: getdents d5/d1f/d34/d3f 0
2026-03-09T17:29:58.336 INFO:tasks.workunit.client.0.vm06.stdout:7/966: dread d5/d1f/d34/d46/fa9 [0,4194304] 0
2026-03-09T17:29:58.340 INFO:tasks.workunit.client.0.vm06.stdout:4/876: dwrite db/d59/d5f/d6d/f7b [0,4194304] 0
2026-03-09T17:29:58.342 INFO:tasks.workunit.client.0.vm06.stdout:7/967: fsync d5/d1f/d34/d3f/d91/fb9 0
2026-03-09T17:29:58.344 INFO:tasks.workunit.client.0.vm06.stdout:7/968: creat d5/d7/d2b/dbd/dfe/f11f x:0 0 0
2026-03-09T17:29:58.346 INFO:tasks.workunit.client.0.vm06.stdout:7/969: creat d5/d1f/d34/d3f/f120 x:0 0 0
2026-03-09T17:29:58.470 INFO:tasks.workunit.client.0.vm06.stdout:0/963: dwrite d7/d11/f10c [0,4194304] 0
2026-03-09T17:29:58.471 INFO:tasks.workunit.client.0.vm06.stdout:0/964: fdatasync d7/d11/d5d/d64/f7f 0
2026-03-09T17:29:58.475 INFO:tasks.workunit.client.0.vm06.stdout:0/965: dwrite d7/d11/d19/d8b/da4/d85/f126 [0,4194304] 0
2026-03-09T17:29:58.478 INFO:tasks.workunit.client.0.vm06.stdout:0/966: creat d7/d11/d19/d3c/db9/dd8/f14c x:0 0 0
2026-03-09T17:29:58.479 INFO:tasks.workunit.client.0.vm06.stdout:0/967: dread - d7/d11/d19/d23/db7/fd9 zero size
2026-03-09T17:29:58.482 INFO:tasks.workunit.client.0.vm06.stdout:0/968: creat d7/d11/d5d/f14d x:0 0 0
2026-03-09T17:29:58.484 INFO:tasks.workunit.client.0.vm06.stdout:0/969: symlink d7/d11/d19/d3c/db9/ddd/de4/l14e 0
2026-03-09T17:29:58.486 INFO:tasks.workunit.client.0.vm06.stdout:0/970: creat d7/d11/d19/d3c/df3/d11c/f14f x:0 0 0
2026-03-09T17:29:58.488 INFO:tasks.workunit.client.0.vm06.stdout:0/971: truncate d7/d11/d5d/db8/fc6 4345275 0
2026-03-09T17:29:58.546 INFO:tasks.workunit.client.0.vm06.stdout:3/859: write dd/d1d/d6e/d70/f73 [488904,78293] 0
2026-03-09T17:29:58.549 INFO:tasks.workunit.client.0.vm06.stdout:3/860: mkdir dd/d19/d25/d44/d80/dd7/d120 0
2026-03-09T17:29:58.553 INFO:tasks.workunit.client.1.vm09.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-09T17:29:58.558 INFO:tasks.workunit.client.0.vm06.stdout:3/861: dread dd/d19/d25/d44/d80/dd7/fe6 [0,4194304] 0
2026-03-09T17:29:58.560 INFO:tasks.workunit.client.0.vm06.stdout:3/862: symlink dd/d81/d97/df5/l121 0
2026-03-09T17:29:58.562 INFO:tasks.workunit.client.0.vm06.stdout:6/726: write d6/d47/d96/d40/f67 [2020976,113531] 0
2026-03-09T17:29:58.563 INFO:tasks.workunit.client.0.vm06.stdout:3/863: symlink dd/d19/d28/l122 0
2026-03-09T17:29:58.566 INFO:tasks.workunit.client.0.vm06.stdout:6/727: dwrite d6/d4f/f26 [8388608,4194304] 0
2026-03-09T17:29:58.579 INFO:tasks.workunit.client.0.vm06.stdout:6/728: getdents d6/d12/d53/d91/dbf 0
2026-03-09T17:29:58.582 INFO:tasks.workunit.client.0.vm06.stdout:6/729: rename d6/d12/d53/d8f/cc8 to d6/d4f/d3e/d52/d95/cdf 0
2026-03-09T17:29:58.585 INFO:tasks.workunit.client.0.vm06.stdout:6/730: rmdir d6/d4f/d3e/d52/d80/dc9 0
2026-03-09T17:29:58.585 INFO:tasks.workunit.client.0.vm06.stdout:6/731: readlink d6/d47/d4d/d6d/ld9 0
2026-03-09T17:29:58.600 INFO:tasks.workunit.client.0.vm06.stdout:5/813: dwrite d4/d50/f43 [0,4194304] 0
2026-03-09T17:29:58.602 INFO:tasks.workunit.client.0.vm06.stdout:5/814: fdatasync d4/d50/d18/d3d/fa7 0
2026-03-09T17:29:58.607 INFO:tasks.workunit.client.0.vm06.stdout:5/815: mkdir d4/dbb/d127 0
2026-03-09T17:29:58.608 INFO:tasks.workunit.client.0.vm06.stdout:5/816: chown d4/d50/d35/d40/d95/fd2 6605270 1
2026-03-09T17:29:58.609 INFO:tasks.workunit.client.0.vm06.stdout:5/817: symlink d4/d50/d35/d40/d6f/l128 0
2026-03-09T17:29:58.638 INFO:tasks.workunit.client.0.vm06.stdout:5/818: sync
2026-03-09T17:29:58.643 INFO:tasks.workunit.client.0.vm06.stdout:2/724: truncate d3/d4/d12/d2b/d2d/f1b 1003282 0
2026-03-09T17:29:58.644 INFO:tasks.workunit.client.0.vm06.stdout:2/725: mkdir d3/d4/d12/da7/de3 0
2026-03-09T17:29:58.651 INFO:tasks.workunit.client.0.vm06.stdout:2/726: dread d3/fc7 [0,4194304] 0
2026-03-09T17:29:58.654 INFO:tasks.workunit.client.0.vm06.stdout:2/727: getdents d3/d4/d22/d72/d8f/dda 0
2026-03-09T17:29:58.655 INFO:tasks.workunit.client.0.vm06.stdout:9/899: write d3/d15/d36/d4d/f60 [4851602,97701] 0
2026-03-09T17:29:58.657 INFO:tasks.workunit.client.0.vm06.stdout:9/900: chown d3/d15/d48/da8/db9/ca3 433029013 1
2026-03-09T17:29:58.660 INFO:tasks.workunit.client.0.vm06.stdout:2/728: creat d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/fe4 x:0 0 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:8/813: getdents d15/d16/d6d 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:8/814: read d15/d31/dc5/df1/f4f [2459952,89828] 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:8/815: dread - d15/fb2 zero size
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:8/816: fdatasync d15/d31/dc5/df1/d71/f105 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:2/729: mkdir d3/d4/d12/d71/daa/d77/d81/d64/de5 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:9/901: truncate d3/d15/d36/fcf 852960 0
2026-03-09T17:29:58.676 INFO:tasks.workunit.client.0.vm06.stdout:9/902: dwrite d3/d15/d36/d4d/fa4 [0,4194304] 0
2026-03-09T17:29:58.681 INFO:tasks.workunit.client.0.vm06.stdout:2/730: symlink d3/d4/d22/le6 0
2026-03-09T17:29:58.682 INFO:tasks.workunit.client.0.vm06.stdout:2/731: chown d3/d4/d12/da7/fbb 34 1
2026-03-09T17:29:58.683 INFO:tasks.workunit.client.0.vm06.stdout:9/903: mknod d3/d6d/d10b/c120 0
2026-03-09T17:29:58.686 INFO:tasks.workunit.client.0.vm06.stdout:9/904: link d3/d26/d35/c11e d3/d15/d36/d4c/c121 0
2026-03-09T17:29:58.688 INFO:tasks.workunit.client.0.vm06.stdout:9/905: truncate d3/d15/d36/d83/fb1 969168 0
2026-03-09T17:29:58.688 INFO:tasks.workunit.client.0.vm06.stdout:9/906: readlink d3/d15/d16/l20 0
2026-03-09T17:29:58.691 INFO:tasks.workunit.client.0.vm06.stdout:1/808: write d11/d14/d1d/d1e/d2a/fba [182348,92218] 0
2026-03-09T17:29:58.698 INFO:tasks.workunit.client.0.vm06.stdout:1/809: write d11/d14/d1d/d4a/f9d [255737,97076] 0
2026-03-09T17:29:58.700 INFO:tasks.workunit.client.0.vm06.stdout:1/810: dread d11/d14/d1d/d1e/d2a/d34/d58/fb9 [0,4194304] 0
2026-03-09T17:29:58.701 INFO:tasks.workunit.client.0.vm06.stdout:9/907: getdents d3/d15/d16 0
2026-03-09T17:29:58.703 INFO:tasks.workunit.client.0.vm06.stdout:9/908: fsync d3/d6d/d9a/fb4 0
2026-03-09T17:29:58.706 INFO:tasks.workunit.client.0.vm06.stdout:9/909: dwrite d3/d11/d65/d80/f11f [0,4194304] 0
2026-03-09T17:29:58.710 INFO:tasks.workunit.client.0.vm06.stdout:2/732: dread d3/d4/d12/f85 [0,4194304] 0
2026-03-09T17:29:58.721 INFO:tasks.workunit.client.0.vm06.stdout:9/910: rename d3/f21 to d3/d15/d36/d4d/f122 0
2026-03-09T17:29:58.727 INFO:tasks.workunit.client.0.vm06.stdout:1/811: getdents d11/d14/d1d/d42 0
2026-03-09T17:29:58.735 INFO:tasks.workunit.client.0.vm06.stdout:1/812: mknod d11/d14/d1d/d1e/d2a/d34/d58/c10f 0
2026-03-09T17:29:58.735 INFO:tasks.workunit.client.0.vm06.stdout:4/877: write db/df/f14 [2934004,95846] 0
2026-03-09T17:29:58.735 INFO:tasks.workunit.client.0.vm06.stdout:4/878: dread db/d1d/d21/d37/f101 [0,4194304] 0
2026-03-09T17:29:58.738 INFO:tasks.workunit.client.0.vm06.stdout:4/879: dwrite db/d59/d5f/d5d/f62 [4194304,4194304] 0
2026-03-09T17:29:58.739 INFO:tasks.workunit.client.0.vm06.stdout:9/911: link d3/d6d/d9a/d9c/d116/f118 d3/d15/f123 0
2026-03-09T17:29:58.764 INFO:tasks.workunit.client.0.vm06.stdout:1/813: mkdir d11/d14/d1d/d1e/dc2/d103/d110 0
2026-03-09T17:29:58.764 INFO:tasks.workunit.client.0.vm06.stdout:1/814: readlink d11/d14/d1d/d42/la0 0
2026-03-09T17:29:58.765 INFO:tasks.workunit.client.0.vm06.stdout:4/880: unlink db/d59/l64 0
2026-03-09T17:29:58.765 INFO:tasks.workunit.client.0.vm06.stdout:7/970: read d5/d1f/d34/d3f/d91/fce [3574931,3591] 0
2026-03-09T17:29:58.765 INFO:tasks.workunit.client.0.vm06.stdout:9/912: mkdir d3/d11/d65/d124 0
2026-03-09T17:29:58.765 INFO:tasks.workunit.client.0.vm06.stdout:7/971: creat d5/dd/d79/d7f/f121 x:0 0 0
2026-03-09T17:29:58.769 INFO:tasks.workunit.client.0.vm06.stdout:0/972: dwrite d7/f106 [0,4194304] 0
2026-03-09T17:29:58.773 INFO:tasks.workunit.client.0.vm06.stdout:0/973: dread - d7/d11/d19/d3c/df3/d11c/f14f zero size
2026-03-09T17:29:58.792 INFO:tasks.workunit.client.0.vm06.stdout:7/972: dread d5/dd/dc5/fa2 [0,4194304] 0
2026-03-09T17:29:58.794 INFO:tasks.workunit.client.0.vm06.stdout:7/973: mknod d5/d1f/d34/d3f/d8b/c122 0
2026-03-09T17:29:58.796 INFO:tasks.workunit.client.0.vm06.stdout:7/974: rmdir d5/d1f/d102 39
2026-03-09T17:29:58.796 INFO:tasks.workunit.client.0.vm06.stdout:7/975: readlink d5/d7/dac/dd4/lef 0
2026-03-09T17:29:58.797 INFO:tasks.workunit.client.0.vm06.stdout:7/976: fdatasync d5/d7/d2b/dc8/dd7/f105 0
2026-03-09T17:29:58.805 INFO:tasks.workunit.client.0.vm06.stdout:9/913: dread d3/d15/d48/f64 [0,4194304] 0
2026-03-09T17:29:58.806 INFO:tasks.workunit.client.0.vm06.stdout:7/977: unlink d5/d7/d2b/dbd/dfe/cff 0
2026-03-09T17:29:58.814 INFO:tasks.workunit.client.0.vm06.stdout:1/815: dread d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/fef [0,4194304] 0
2026-03-09T17:29:58.814 INFO:tasks.workunit.client.0.vm06.stdout:9/914: symlink d3/d26/d6c/d68/l125 0
2026-03-09T17:29:58.814 INFO:tasks.workunit.client.0.vm06.stdout:7/978: mknod d5/dd/dc5/d64/de8/c123 0
2026-03-09T17:29:58.816 INFO:tasks.workunit.client.0.vm06.stdout:1/816: symlink d11/d14/d1d/dd1/d10b/l111 0
2026-03-09T17:29:58.819 INFO:tasks.workunit.client.0.vm06.stdout:1/817: fsync d11/d14/d1d/d42/d46/fcd 0
2026-03-09T17:29:58.823 INFO:tasks.workunit.client.0.vm06.stdout:3/864: write dd/f1a [762451,107546] 0
2026-03-09T17:29:58.827 INFO:tasks.workunit.client.0.vm06.stdout:3/865: creat dd/d1d/d2e/d67/def/f123 x:0 0 0
2026-03-09T17:29:58.828 INFO:tasks.workunit.client.0.vm06.stdout:6/732: write d6/d47/f61 [2111039,4096] 0
2026-03-09T17:29:58.830 INFO:tasks.workunit.client.0.vm06.stdout:6/733: unlink d6/d4f/d3e/d52/d8c/l8e 0
2026-03-09T17:29:58.833 INFO:tasks.workunit.client.0.vm06.stdout:7/979: sync
2026-03-09T17:29:58.835 INFO:tasks.workunit.client.0.vm06.stdout:6/734: dwrite d6/d47/d96/da1/fb7 [0,4194304] 0
2026-03-09T17:29:58.840 INFO:tasks.workunit.client.0.vm06.stdout:5/819: write d4/dca/fff [3833328,61879] 0
2026-03-09T17:29:58.841 INFO:tasks.workunit.client.0.vm06.stdout:5/820: dread - d4/d50/d35/d40/d95/fd2 zero size
2026-03-09T17:29:58.844 INFO:tasks.workunit.client.0.vm06.stdout:7/980: rename d5/d1f/d34/d46/d51/f7c to d5/d1f/d34/f124 0
2026-03-09T17:29:58.847 INFO:tasks.workunit.client.0.vm06.stdout:6/735: creat d6/d47/d4d/da0/fe0 x:0 0 0
2026-03-09T17:29:58.848 INFO:tasks.workunit.client.0.vm06.stdout:7/981: sync
2026-03-09T17:29:58.851 INFO:tasks.workunit.client.0.vm06.stdout:3/866: dread dd/d19/d28/feb [0,4194304] 0
2026-03-09T17:29:58.852 INFO:tasks.workunit.client.0.vm06.stdout:7/982: creat d5/d7/d2b/dc8/f125 x:0 0 0
2026-03-09T17:29:58.856 INFO:tasks.workunit.client.0.vm06.stdout:8/817: dwrite d15/d39/f45 [0,4194304] 0
2026-03-09T17:29:58.856 INFO:tasks.workunit.client.0.vm06.stdout:3/867: mknod dd/d5b/c124 0
2026-03-09T17:29:58.858 INFO:tasks.workunit.client.0.vm06.stdout:6/736: mknod d6/d47/ce1 0
2026-03-09T17:29:58.861 INFO:tasks.workunit.client.0.vm06.stdout:5/821: link d4/d50/d18/f3c d4/d50/d18/de1/f129 0
2026-03-09T17:29:58.867 INFO:tasks.workunit.client.0.vm06.stdout:8/818: truncate d15/d39/d67/d77/fa0 1080386 0
2026-03-09T17:29:58.869 INFO:tasks.workunit.client.0.vm06.stdout:2/733: dwrite d3/d4/d22/d72/d8f/f95 [0,4194304] 0
2026-03-09T17:29:58.880 INFO:tasks.workunit.client.0.vm06.stdout:5/822: creat d4/d22/d64/df3/f12a x:0 0 0
2026-03-09T17:29:58.881 INFO:tasks.workunit.client.0.vm06.stdout:3/868: link dd/d1d/d2e/c114 dd/d59/da1/c125 0
2026-03-09T17:29:58.882 INFO:tasks.workunit.client.0.vm06.stdout:8/819: mknod d15/d39/d67/d77/d99/c10c 0
2026-03-09T17:29:58.884 INFO:tasks.workunit.client.0.vm06.stdout:5/823: chown d4/d50/l2a 19511 1
2026-03-09T17:29:58.889 INFO:tasks.workunit.client.0.vm06.stdout:8/820: mkdir d15/d16/d1e/d30/d55/d10d 0
2026-03-09T17:29:58.889 INFO:tasks.workunit.client.0.vm06.stdout:4/881: dwrite db/d1d/d21/d88/fd2 [0,4194304] 0
2026-03-09T17:29:58.889 INFO:tasks.workunit.client.0.vm06.stdout:0/974: write d7/d11/d5d/d64/fd2 [155945,124864] 0
2026-03-09T17:29:58.892 INFO:tasks.workunit.client.0.vm06.stdout:4/882: mkdir db/d1d/d21/d25/d4b/d85/d137 0
2026-03-09T17:29:58.893 INFO:tasks.workunit.client.0.vm06.stdout:4/883: chown db/d59/d5f/d45/c6a 17808 1
2026-03-09T17:29:58.893 INFO:tasks.workunit.client.0.vm06.stdout:4/884: chown db/d59/d5f/d6d/ddb/c112 530292 1
2026-03-09T17:29:58.894 INFO:tasks.workunit.client.0.vm06.stdout:2/734: getdents d3/d4/d12/d71/daa/d77/d81/d64 0
2026-03-09T17:29:58.895 INFO:tasks.workunit.client.0.vm06.stdout:8/821: chown d15/d31/dc5/df1/d3d/d5f/l84 468 1
2026-03-09T17:29:58.896 INFO:tasks.workunit.client.0.vm06.stdout:9/915: dwrite d3/d11/d65/f66 [0,4194304] 0
2026-03-09T17:29:58.898 INFO:tasks.workunit.client.0.vm06.stdout:4/885: mkdir db/d1d/d21/d25/d4b/df7/d138 0
2026-03-09T17:29:58.907 INFO:tasks.workunit.client.0.vm06.stdout:5/824: creat d4/d50/db2/f12b x:0 0 0
2026-03-09T17:29:58.908 INFO:tasks.workunit.client.0.vm06.stdout:8/822: symlink d15/d16/d1a/d7c/l10e 0
2026-03-09T17:29:58.909 INFO:tasks.workunit.client.0.vm06.stdout:9/916: creat d3/d15/d36/d4d/f126 x:0 0 0
2026-03-09T17:29:58.914 INFO:tasks.workunit.client.0.vm06.stdout:4/886: creat db/d1d/d21/d88/dc3/f139 x:0 0 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:5/825: creat d4/d50/db2/d125/f12c x:0 0 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:9/917: getdents d3/d11/d65/d124 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:9/918: creat d3/d15/d36/d4c/d6a/d8a/dc3/f127 x:0 0 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:9/919: readlink d3/d15/l30 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:5/826: link d4/dca/fff d4/d52/d112/f12d 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:9/920: symlink d3/d26/dcb/df1/l128 0
2026-03-09T17:29:58.922 INFO:tasks.workunit.client.0.vm06.stdout:8/823: dread d15/d31/dc5/df1/d3d/d5f/dd4/fd6 [0,4194304] 0
2026-03-09T17:29:58.923 INFO:tasks.workunit.client.0.vm06.stdout:5/827: symlink d4/d22/l12e 0
2026-03-09T17:29:58.923 INFO:tasks.workunit.client.0.vm06.stdout:8/824: chown d15/d39/d67/d77/fc3 349058039 1
2026-03-09T17:29:58.923 INFO:tasks.workunit.client.0.vm06.stdout:5/828: dread - d4/d50/d35/d40/d95/db8/dda/fdd zero size
2026-03-09T17:29:58.924 INFO:tasks.workunit.client.0.vm06.stdout:9/921: dwrite d3/d15/f17 [0,4194304] 0
2026-03-09T17:29:58.930 INFO:tasks.workunit.client.0.vm06.stdout:5/829: dwrite d4/d50/d18/d3d/f81 [0,4194304] 0
2026-03-09T17:29:58.933 INFO:tasks.workunit.client.0.vm06.stdout:9/922: link d3/d15/d36/d4c/d6a/le2 d3/d26/d35/d9f/l129 0
2026-03-09T17:29:59.053 INFO:tasks.workunit.client.0.vm06.stdout:1/818: write f8 [281876,33459] 0
2026-03-09T17:29:59.057 INFO:tasks.workunit.client.0.vm06.stdout:1/819: mkdir d11/d14/d1d/d4a/df7/d106/d112 0
2026-03-09T17:29:59.071 INFO:tasks.workunit.client.0.vm06.stdout:6/737: write d6/d47/d96/f7e [508044,120653] 0
2026-03-09T17:29:59.073 INFO:tasks.workunit.client.0.vm06.stdout:6/738: dread d6/d12/d53/fcc [0,4194304] 0
2026-03-09T17:29:59.080 INFO:tasks.workunit.client.0.vm06.stdout:7/983: write d5/d1f/d34/d46/d51/f6e [1151302,49051] 0
2026-03-09T17:29:59.081 INFO:tasks.workunit.client.0.vm06.stdout:5/830: dread d4/d50/d18/fa8 [0,4194304] 0
2026-03-09T17:29:59.082 INFO:tasks.workunit.client.0.vm06.stdout:5/831: chown d4/d50/d35/f94 133991483 1
2026-03-09T17:29:59.083 INFO:tasks.workunit.client.0.vm06.stdout:7/984: mkdir d5/dd/dc5/d64/de8/d126 0
2026-03-09T17:29:59.083 INFO:tasks.workunit.client.0.vm06.stdout:7/985: dread - d5/dd/d79/d7f/df7/f11b zero size
2026-03-09T17:29:59.086 INFO:tasks.workunit.client.0.vm06.stdout:3/869: write dd/d19/d25/d2d/d9b/fc5 [393159,83546] 0
2026-03-09T17:29:59.087 INFO:tasks.workunit.client.0.vm06.stdout:5/832: truncate d4/d50/d35/d40/fc1 1890992 0
2026-03-09T17:29:59.087 INFO:tasks.workunit.client.0.vm06.stdout:5/833: chown d4/d22/d64 122 1
2026-03-09T17:29:59.088 INFO:tasks.workunit.client.0.vm06.stdout:7/986: chown d5/d7/f99 1030284 1
2026-03-09T17:29:59.089 INFO:tasks.workunit.client.0.vm06.stdout:3/870: creat dd/d5b/f126 x:0 0 0
2026-03-09T17:29:59.090 INFO:tasks.workunit.client.0.vm06.stdout:5/834: symlink d4/d50/l12f 0
2026-03-09T17:29:59.091 INFO:tasks.workunit.client.0.vm06.stdout:5/835: chown d4/d22/d64/l92 59320 1
2026-03-09T17:29:59.093 INFO:tasks.workunit.client.0.vm06.stdout:3/871: unlink dd/d19/d1e/db8/f113 0
2026-03-09T17:29:59.097 INFO:tasks.workunit.client.0.vm06.stdout:3/872: rename dd/d19/d25/d2d/ffa to dd/d81/f127 0
2026-03-09T17:29:59.097 INFO:tasks.workunit.client.0.vm06.stdout:3/873: chown dd/d1d/d6e 51497 1
2026-03-09T17:29:59.099 INFO:tasks.workunit.client.0.vm06.stdout:3/874: truncate dd/d5b/d65/f6a 3053128 0
2026-03-09T17:29:59.099 INFO:tasks.workunit.client.0.vm06.stdout:3/875: dread - dd/d1d/d2e/d67/def/f123 zero size
2026-03-09T17:29:59.101 INFO:tasks.workunit.client.0.vm06.stdout:7/987: dread d5/d1f/dae/f113 [0,4194304] 0
2026-03-09T17:29:59.104 INFO:tasks.workunit.client.0.vm06.stdout:0/975: dwrite d7/d11/d19/d1d/f8a [4194304,4194304] 0
2026-03-09T17:29:59.108 INFO:tasks.workunit.client.0.vm06.stdout:2/735: write d3/f5a [488987,56476] 0
2026-03-09T17:29:59.109 INFO:tasks.workunit.client.0.vm06.stdout:4/887: write db/d1d/f3a [1521390,119213] 0
2026-03-09T17:29:59.115 INFO:tasks.workunit.client.0.vm06.stdout:4/888: creat db/d1d/d21/d25/d4b/d85/d106/d110/f13a x:0 0 0
2026-03-09T17:29:59.117 INFO:tasks.workunit.client.0.vm06.stdout:0/976: fsync d7/d11/d19/d8b/da4/d85/fc8 0
2026-03-09T17:29:59.118 INFO:tasks.workunit.client.0.vm06.stdout:0/977: chown d7/d11/d19/d1d/fec 8920326 1
2026-03-09T17:29:59.119 INFO:tasks.workunit.client.0.vm06.stdout:2/736: mknod d3/d4/d12/d71/daa/d77/d81/d64/de5/ce7 0
2026-03-09T17:29:59.119 INFO:tasks.workunit.client.0.vm06.stdout:0/978: write d7/d11/d19/d3c/db9/ddd/d10e/d129/f132 [667651,10511] 0
2026-03-09T17:29:59.130 INFO:tasks.workunit.client.0.vm06.stdout:4/889: rmdir db/d1d/d21/d44/dc1/d133 0
2026-03-09T17:29:59.131 INFO:tasks.workunit.client.0.vm06.stdout:2/737: getdents d3/d4/d12/d71/daa 0
2026-03-09T17:29:59.132 INFO:tasks.workunit.client.0.vm06.stdout:4/890: mkdir db/d59/d5f/d45/d10a/dcc/de0/d13b 0
2026-03-09T17:29:59.132 INFO:tasks.workunit.client.0.vm06.stdout:4/891: stat db/d1d/d21/d44/l7f 0
2026-03-09T17:29:59.134 INFO:tasks.workunit.client.0.vm06.stdout:2/738: rmdir d3/d4/d12/d2b 39
2026-03-09T17:29:59.136 INFO:tasks.workunit.client.0.vm06.stdout:2/739: chown d3/d4/d12/d2b/d36/l3f 25 1
2026-03-09T17:29:59.141 INFO:tasks.workunit.client.0.vm06.stdout:2/740: dread d3/d4/f52 [0,4194304] 0
2026-03-09T17:29:59.143 INFO:tasks.workunit.client.0.vm06.stdout:2/741: creat d3/d4/d12/da7/fe8 x:0 0 0
2026-03-09T17:29:59.146 INFO:tasks.workunit.client.0.vm06.stdout:2/742: dwrite d3/d4/f9c [0,4194304] 0
2026-03-09T17:29:59.155 INFO:tasks.workunit.client.0.vm06.stdout:4/892: sync
2026-03-09T17:29:59.157 INFO:tasks.workunit.client.0.vm06.stdout:4/893: mknod db/d1d/d21/d37/d69/d11f/c13c 0
2026-03-09T17:29:59.164 INFO:tasks.workunit.client.1.vm09.stderr:++ readlink -f fsstress
2026-03-09T17:29:59.167 INFO:tasks.workunit.client.0.vm06.stdout:3/876: dread dd/d19/d2c/f30 [0,4194304] 0
2026-03-09T17:29:59.168 INFO:tasks.workunit.client.0.vm06.stdout:3/877: creat dd/d118/f128 x:0 0 0
2026-03-09T17:29:59.169 INFO:tasks.workunit.client.1.vm09.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-09T17:29:59.169 INFO:tasks.workunit.client.1.vm09.stderr:+ popd
2026-03-09T17:29:59.169 INFO:tasks.workunit.client.0.vm06.stdout:3/878: chown dd/d1d/d2e/d67/def 6241200 1
2026-03-09T17:29:59.170 INFO:tasks.workunit.client.1.vm09.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-09T17:29:59.170 INFO:tasks.workunit.client.1.vm09.stderr:+ popd
2026-03-09T17:29:59.170 INFO:tasks.workunit.client.1.vm09.stdout:~/cephtest/mnt.1/client.1/tmp
2026-03-09T17:29:59.171 INFO:tasks.workunit.client.0.vm06.stdout:3/879: truncate dd/d1d/d4e/fca 118100 0
2026-03-09T17:29:59.174 INFO:tasks.workunit.client.0.vm06.stdout:3/880: dwrite dd/d81/da3/dae/d110/f11a [0,4194304] 0
2026-03-09T17:29:59.175 INFO:tasks.workunit.client.1.vm09.stderr:++ mktemp -d -p .
2026-03-09T17:29:59.179 INFO:tasks.workunit.client.0.vm06.stdout:3/881: creat dd/d118/f129 x:0 0 0
2026-03-09T17:29:59.181 INFO:tasks.workunit.client.0.vm06.stdout:3/882: creat dd/d1d/d6e/d70/d119/f12a x:0 0 0
2026-03-09T17:29:59.183 INFO:tasks.workunit.client.1.vm09.stderr:+ T=./tmp.KQiywNAPAT
2026-03-09T17:29:59.183 INFO:tasks.workunit.client.1.vm09.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.KQiywNAPAT -l 1 -n 1000 -p 10 -v
2026-03-09T17:29:59.185 INFO:tasks.workunit.client.0.vm06.stdout:4/894: dread db/d1d/fd3 [0,4194304] 0
2026-03-09T17:29:59.186 INFO:tasks.workunit.client.0.vm06.stdout:3/883: link dd/d1d/d2e/cb0 dd/d1d/d4e/c12b 0
2026-03-09T17:29:59.189 INFO:tasks.workunit.client.0.vm06.stdout:3/884: mkdir dd/d19/d25/d44/d80/d12c 0
2026-03-09T17:29:59.192 INFO:tasks.workunit.client.1.vm09.stdout:seed = 1773006642
2026-03-09T17:29:59.192 INFO:tasks.workunit.client.0.vm06.stdout:3/885: mknod dd/d19/d25/d44/c12d 0
2026-03-09T17:29:59.197 INFO:tasks.workunit.client.1.vm09.stdout:7/0: creat f0 x:0 0 0
2026-03-09T17:29:59.198 INFO:tasks.workunit.client.1.vm09.stdout:4/0: dwrite - no filename
2026-03-09T17:29:59.198 INFO:tasks.workunit.client.1.vm09.stdout:2/0: truncate - no filename
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.1.vm09.stdout:2/1: write - no filename
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.1.vm09.stdout:2/2: fsync - no filename
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.1.vm09.stdout:7/1: dread - f0 zero size
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.1.vm09.stdout:4/1: symlink l0 0
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.0.vm06.stdout:8/825: dwrite d15/d31/d58/fc8 [0,4194304] 0
2026-03-09T17:29:59.203 INFO:tasks.workunit.client.0.vm06.stdout:3/886: sync
2026-03-09T17:29:59.204 INFO:tasks.workunit.client.0.vm06.stdout:8/826: stat d15/d39/d67/d77/de7/feb 0
2026-03-09T17:29:59.207 INFO:tasks.workunit.client.0.vm06.stdout:8/827: rename d15/d16/d1e/d30/db8/da4/ca1 to d15/d39/d67/d77/d99/c10f 0
2026-03-09T17:29:59.208 INFO:tasks.workunit.client.1.vm09.stdout:2/3: creat f0 x:0 0 0
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:4/2: creat f1 x:0 0 0
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:8/0: rmdir - no directory
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:8/1: dwrite - no filename
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:8/2: chown . 361664730 1
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.0.vm06.stdout:3/887: rmdir dd/d19/d25/d44/d80/d12c 0
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:8/3: chown . 60 1
2026-03-09T17:29:59.209 INFO:tasks.workunit.client.1.vm09.stdout:8/4: dwrite - no filename
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.0.vm06.stdout:3/888: dread - dd/d1d/f112 zero size
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.0.vm06.stdout:8/828: symlink d15/d16/d1e/d30/l110 0
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.1.vm09.stdout:8/5: chown . 1594663 1
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.1.vm09.stdout:8/6: stat - no entries
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.1.vm09.stdout:8/7: truncate - no filename
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.1.vm09.stdout:8/8: write - no filename
2026-03-09T17:29:59.210 INFO:tasks.workunit.client.1.vm09.stdout:8/9: unlink - no file
2026-03-09T17:29:59.211 INFO:tasks.workunit.client.0.vm06.stdout:8/829: write d15/d16/d1e/fa9 [1781025,62908] 0
2026-03-09T17:29:59.212 INFO:tasks.workunit.client.1.vm09.stdout:2/4: creat f1 x:0 0 0
2026-03-09T17:29:59.215 INFO:tasks.workunit.client.0.vm06.stdout:8/830: chown d15/d16/d1a/d47/c57 5814259 1
2026-03-09T17:29:59.216 INFO:tasks.workunit.client.1.vm09.stdout:4/3: rename f1 to f2 0
2026-03-09T17:29:59.216 INFO:tasks.workunit.client.1.vm09.stdout:5/0: rmdir - no directory
2026-03-09T17:29:59.216 INFO:tasks.workunit.client.1.vm09.stdout:5/1: chown . 103789 1
2026-03-09T17:29:59.216 INFO:tasks.workunit.client.1.vm09.stdout:5/2: rename - no filename
2026-03-09T17:29:59.219 INFO:tasks.workunit.client.0.vm06.stdout:9/923: dwrite d3/d26/f33 [0,4194304] 0
2026-03-09T17:29:59.221 INFO:tasks.workunit.client.0.vm06.stdout:8/831: creat d15/d39/d67/d86/f111 x:0 0 0
2026-03-09T17:29:59.223 INFO:tasks.workunit.client.0.vm06.stdout:1/820: dwrite d11/d14/d1d/f73 [0,4194304] 0
2026-03-09T17:29:59.235 INFO:tasks.workunit.client.0.vm06.stdout:3/889: getdents dd/d1d/d2e/d67/def 0
2026-03-09T17:29:59.235 INFO:tasks.workunit.client.1.vm09.stdout:8/10: symlink l0 0
2026-03-09T17:29:59.242 INFO:tasks.workunit.client.1.vm09.stdout:2/5: link f0 f2 0
2026-03-09T17:29:59.242 INFO:tasks.workunit.client.1.vm09.stdout:2/6: chown f0 0 1
2026-03-09T17:29:59.243 INFO:tasks.workunit.client.0.vm06.stdout:8/832: mknod d15/d31/de2/c112 0
2026-03-09T17:29:59.244 INFO:tasks.workunit.client.1.vm09.stdout:6/0: stat - no entries
2026-03-09T17:29:59.244 INFO:tasks.workunit.client.1.vm09.stdout:6/1: dread - no filename
2026-03-09T17:29:59.244 INFO:tasks.workunit.client.0.vm06.stdout:3/890: symlink dd/d19/d1e/l12e 0
2026-03-09T17:29:59.245 INFO:tasks.workunit.client.0.vm06.stdout:1/821: symlink d11/d14/d1d/d4a/de7/l113 0
2026-03-09T17:29:59.245 INFO:tasks.workunit.client.0.vm06.stdout:1/822: chown d11/d69/c76 806 1
2026-03-09T17:29:59.249 INFO:tasks.workunit.client.1.vm09.stdout:8/11: mkdir d1 0
2026-03-09T17:29:59.249 INFO:tasks.workunit.client.0.vm06.stdout:1/823: dwrite d11/d14/d1d/d1e/d2a/fba [0,4194304] 0
2026-03-09T17:29:59.261 INFO:tasks.workunit.client.1.vm09.stdout:5/3: mkdir d0 0
2026-03-09T17:29:59.261 INFO:tasks.workunit.client.1.vm09.stdout:5/4: dread - no filename
2026-03-09T17:29:59.265 INFO:tasks.workunit.client.1.vm09.stdout:4/4: dwrite f2 [0,4194304] 0
2026-03-09T17:29:59.267 INFO:tasks.workunit.client.1.vm09.stdout:9/0: creat f0 x:0 0 0
2026-03-09T17:29:59.268 INFO:tasks.workunit.client.1.vm09.stdout:2/7: write f2 [591147,11729] 0
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:8/12: symlink d1/l2 0
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:8/13: truncate - no filename
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:8/14: dwrite - no filename
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:6/2: creat f0 x:0 0 0
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:6/3: rmdir - no directory
2026-03-09T17:29:59.273 INFO:tasks.workunit.client.1.vm09.stdout:6/4: write f0 [775416,26913] 0
2026-03-09T17:29:59.274 INFO:tasks.workunit.client.0.vm06.stdout:3/891: read - dd/d1d/d2e/d67/fcf zero size
2026-03-09T17:29:59.276 INFO:tasks.workunit.client.1.vm09.stdout:0/0: write - no filename
2026-03-09T17:29:59.276 INFO:tasks.workunit.client.1.vm09.stdout:8/15: rmdir d1 39
2026-03-09T17:29:59.276 INFO:tasks.workunit.client.1.vm09.stdout:0/1: chown . 6035023 1
2026-03-09T17:29:59.276 INFO:tasks.workunit.client.1.vm09.stdout:0/2: readlink - no filename
2026-03-09T17:29:59.276 INFO:tasks.workunit.client.1.vm09.stdout:0/3: truncate - no filename
2026-03-09T17:29:59.277 INFO:tasks.workunit.client.1.vm09.stdout:2/8: mknod c3 0
2026-03-09T17:29:59.277 INFO:tasks.workunit.client.1.vm09.stdout:0/4: chown . 0 1
2026-03-09T17:29:59.277 INFO:tasks.workunit.client.1.vm09.stdout:0/5: rmdir - no directory
2026-03-09T17:29:59.277 INFO:tasks.workunit.client.1.vm09.stdout:0/6: unlink - no file
2026-03-09T17:29:59.278 INFO:tasks.workunit.client.0.vm06.stdout:8/833: mknod d15/d31/dc5/df1/d3d/d5f/c113 0
2026-03-09T17:29:59.279 INFO:tasks.workunit.client.0.vm06.stdout:1/824: mkdir d11/d14/d1d/d4a/df7/d106/d112/d114 0
2026-03-09T17:29:59.280 INFO:tasks.workunit.client.0.vm06.stdout:1/825: readlink d11/d14/d1d/d4a/de7/l113 0
2026-03-09T17:29:59.281 INFO:tasks.workunit.client.1.vm09.stdout:2/9: write f0 [1015719,3972] 0
2026-03-09T17:29:59.281 INFO:tasks.workunit.client.1.vm09.stdout:3/0: creat f0 x:0 0 0
2026-03-09T17:29:59.284 INFO:tasks.workunit.client.1.vm09.stdout:2/10: creat f4 x:0 0 0
2026-03-09T17:29:59.286 INFO:tasks.workunit.client.0.vm06.stdout:8/834: rename d15/d39/f6f to d15/d31/f114 0
2026-03-09T17:29:59.286 INFO:tasks.workunit.client.1.vm09.stdout:0/7: getdents . 0
2026-03-09T17:29:59.286 INFO:tasks.workunit.client.0.vm06.stdout:1/826: creat d11/d14/d1d/d1e/d2a/f115 x:0 0 0
2026-03-09T17:29:59.294 INFO:tasks.workunit.client.1.vm09.stdout:2/11: creat f5 x:0 0 0
2026-03-09T17:29:59.295 INFO:tasks.workunit.client.1.vm09.stdout:0/8: symlink l0 0
2026-03-09T17:29:59.295 INFO:tasks.workunit.client.1.vm09.stdout:0/9: truncate - no filename
2026-03-09T17:29:59.295 INFO:tasks.workunit.client.1.vm09.stdout:3/1: dwrite f0 [0,4194304] 0
2026-03-09T17:29:59.295 INFO:tasks.workunit.client.1.vm09.stdout:2/12: unlink f2 0
2026-03-09T17:29:59.296 INFO:tasks.workunit.client.1.vm09.stdout:0/10: chown l0 1 1
2026-03-09T17:29:59.296 INFO:tasks.workunit.client.1.vm09.stdout:0/11: dwrite - no filename
2026-03-09T17:29:59.296 INFO:tasks.workunit.client.1.vm09.stdout:0/12: readlink l0 0
2026-03-09T17:29:59.300 INFO:tasks.workunit.client.1.vm09.stdout:0/13: rename l0 to l1 0
2026-03-09T17:29:59.300 INFO:tasks.workunit.client.1.vm09.stdout:0/14: write - no filename
2026-03-09T17:29:59.300 INFO:tasks.workunit.client.1.vm09.stdout:2/13: rename f5 to f6 0
2026-03-09T17:29:59.325 INFO:tasks.workunit.client.1.vm09.stdout:2/14: rename f4 to f7 0
2026-03-09T17:29:59.331 INFO:tasks.workunit.client.1.vm09.stdout:2/15: creat f8 x:0 0 0
2026-03-09T17:29:59.338 INFO:tasks.workunit.client.1.vm09.stdout:2/16: write f7 [1029541,30649] 0
2026-03-09T17:29:59.338 INFO:tasks.workunit.client.1.vm09.stdout:2/17: rmdir - no directory
2026-03-09T17:29:59.338 INFO:tasks.workunit.client.1.vm09.stdout:2/18: readlink - no filename
2026-03-09T17:29:59.340 INFO:tasks.workunit.client.1.vm09.stdout:2/19: chown f8 12719 1
2026-03-09T17:29:59.341 INFO:tasks.workunit.client.1.vm09.stdout:2/20: stat f1 0
2026-03-09T17:29:59.341 INFO:tasks.workunit.client.1.vm09.stdout:2/21: readlink - no filename
2026-03-09T17:29:59.344 INFO:tasks.workunit.client.1.vm09.stdout:2/22: chown f7 341070364 1
2026-03-09T17:29:59.345 INFO:tasks.workunit.client.0.vm06.stdout:6/739: dwrite d6/f97 [0,4194304] 0
2026-03-09T17:29:59.351 INFO:tasks.workunit.client.0.vm06.stdout:5/836: write d4/d50/fad [485385,78566] 0 2026-03-09T17:29:59.353 INFO:tasks.workunit.client.0.vm06.stdout:6/740: mkdir d6/d47/d4d/d6d/de2 0 2026-03-09T17:29:59.354 INFO:tasks.workunit.client.0.vm06.stdout:6/741: chown d6/d12/fd4 209176556 1 2026-03-09T17:29:59.355 INFO:tasks.workunit.client.0.vm06.stdout:5/837: truncate d4/f1f 1178058 0 2026-03-09T17:29:59.356 INFO:tasks.workunit.client.1.vm09.stdout:2/23: write f6 [834129,18249] 0 2026-03-09T17:29:59.359 INFO:tasks.workunit.client.1.vm09.stdout:2/24: chown f7 1070916 1 2026-03-09T17:29:59.379 INFO:tasks.workunit.client.1.vm09.stdout:2/25: dwrite f0 [0,4194304] 0 2026-03-09T17:29:59.417 INFO:tasks.workunit.client.1.vm09.stdout:3/2: fdatasync f0 0 2026-03-09T17:29:59.418 INFO:tasks.workunit.client.1.vm09.stdout:3/3: creat f1 x:0 0 0 2026-03-09T17:29:59.418 INFO:tasks.workunit.client.1.vm09.stdout:3/4: chown f0 313122 1 2026-03-09T17:29:59.426 INFO:tasks.workunit.client.0.vm06.stdout:7/988: write d5/dd/dc5/dad/fdf [424374,101987] 0 2026-03-09T17:29:59.432 INFO:tasks.workunit.client.0.vm06.stdout:7/989: dwrite d5/dd/dc5/d5f/ffd [0,4194304] 0 2026-03-09T17:29:59.439 INFO:tasks.workunit.client.0.vm06.stdout:7/990: link d5/dd/dc5/f32 d5/dd/dc5/d64/f127 0 2026-03-09T17:29:59.444 INFO:tasks.workunit.client.1.vm09.stdout:3/5: dwrite f0 [0,4194304] 0 2026-03-09T17:29:59.445 INFO:tasks.workunit.client.0.vm06.stdout:0/979: truncate d7/d11/d19/d3c/db9/ddd/d10e/d129/f132 299 0 2026-03-09T17:29:59.454 INFO:tasks.workunit.client.0.vm06.stdout:0/980: dread d7/d11/d2d/daf/fd3 [0,4194304] 0 2026-03-09T17:29:59.455 INFO:tasks.workunit.client.0.vm06.stdout:2/743: write d3/d4/d12/d71/daa/d77/d81/d64/d6a/f96 [892985,51515] 0 2026-03-09T17:29:59.458 INFO:tasks.workunit.client.0.vm06.stdout:0/981: creat d7/d88/f150 x:0 0 0 2026-03-09T17:29:59.463 INFO:tasks.workunit.client.0.vm06.stdout:0/982: creat d7/d11/d19/d23/db7/dbd/f151 x:0 0 0 2026-03-09T17:29:59.463 
INFO:tasks.workunit.client.0.vm06.stdout:4/895: write db/d1d/d21/fa5 [79568,1954] 0 2026-03-09T17:29:59.466 INFO:tasks.workunit.client.0.vm06.stdout:4/896: creat db/d1d/f13d x:0 0 0 2026-03-09T17:29:59.466 INFO:tasks.workunit.client.0.vm06.stdout:4/897: chown db/d1d/d21/d88 413143 1 2026-03-09T17:29:59.472 INFO:tasks.workunit.client.0.vm06.stdout:4/898: symlink db/d59/d5f/d45/l13e 0 2026-03-09T17:29:59.475 INFO:tasks.workunit.client.0.vm06.stdout:0/983: dread d7/ffa [0,4194304] 0 2026-03-09T17:29:59.484 INFO:tasks.workunit.client.1.vm09.stdout:7/2: getdents . 0 2026-03-09T17:29:59.484 INFO:tasks.workunit.client.1.vm09.stdout:7/3: dread - f0 zero size 2026-03-09T17:29:59.484 INFO:tasks.workunit.client.1.vm09.stdout:7/4: write f0 [825842,43243] 0 2026-03-09T17:29:59.484 INFO:tasks.workunit.client.1.vm09.stdout:7/5: rmdir - no directory 2026-03-09T17:29:59.489 INFO:tasks.workunit.client.1.vm09.stdout:7/6: dwrite f0 [0,4194304] 0 2026-03-09T17:29:59.489 INFO:tasks.workunit.client.0.vm06.stdout:9/924: write d3/d26/d35/f99 [352280,104051] 0 2026-03-09T17:29:59.498 INFO:tasks.workunit.client.0.vm06.stdout:9/925: mkdir d3/d15/d36/d12a 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:9/1: getdents . 
0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:9/2: readlink - no filename 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:9/3: write f0 [1043609,28320] 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:9/4: write f0 [1210615,126773] 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:4/5: write f2 [4915969,42769] 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:6/5: fsync f0 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:2/26: rename f7 to f9 0 2026-03-09T17:29:59.568 INFO:tasks.workunit.client.1.vm09.stdout:2/27: write f0 [1395697,41805] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.1.vm09.stdout:2/28: read f0 [2076866,2018] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:9/926: stat d3/d26/d35/c93 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:9/927: rmdir d3 39 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:3/892: write dd/d1d/d4e/f5a [807967,39046] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:3/893: rmdir dd/d19/dda 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:1/827: write d11/d14/d1d/d1e/d2a/d34/d58/fa1 [8228459,52734] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:3/894: getdents dd/d59/da1/d11e 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:3/895: readlink dd/d19/d2c/l94 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:6/742: write d6/d12/fbb [1268111,29846] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:5/838: dwrite d4/d50/fa3 [0,4194304] 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:6/743: creat d6/d12/d53/dd0/fe3 x:0 0 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:6/744: mknod d6/d12/d53/dd0/ce4 0 2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:6/745: rename d6/d12/d17/c43 to d6/d12/d17/d65/ce5 0 
2026-03-09T17:29:59.569 INFO:tasks.workunit.client.0.vm06.stdout:6/746: write d6/d47/d96/d40/f67 [3277936,67005] 0 2026-03-09T17:29:59.570 INFO:tasks.workunit.client.0.vm06.stdout:6/747: link d6/d47/d4d/d6d/fbe d6/d12/d53/d91/dbf/fe6 0 2026-03-09T17:29:59.572 INFO:tasks.workunit.client.0.vm06.stdout:6/748: symlink d6/d12/d2d/db3/le7 0 2026-03-09T17:29:59.573 INFO:tasks.workunit.client.0.vm06.stdout:6/749: rmdir d6/d4f/d3e/d52/d95 39 2026-03-09T17:29:59.574 INFO:tasks.workunit.client.0.vm06.stdout:6/750: mknod d6/d12/d17/d65/ce8 0 2026-03-09T17:29:59.576 INFO:tasks.workunit.client.0.vm06.stdout:6/751: unlink d6/d4f/f3a 0 2026-03-09T17:29:59.580 INFO:tasks.workunit.client.0.vm06.stdout:6/752: dwrite d6/d12/d17/d65/f72 [0,4194304] 0 2026-03-09T17:29:59.590 INFO:tasks.workunit.client.1.vm09.stdout:8/16: creat d1/f3 x:0 0 0 2026-03-09T17:29:59.590 INFO:tasks.workunit.client.1.vm09.stdout:8/17: truncate d1/f3 367624 0 2026-03-09T17:29:59.592 INFO:tasks.workunit.client.1.vm09.stdout:9/5: mknod c1 0 2026-03-09T17:29:59.593 INFO:tasks.workunit.client.1.vm09.stdout:4/6: creat f3 x:0 0 0 2026-03-09T17:29:59.593 INFO:tasks.workunit.client.0.vm06.stdout:6/753: mknod d6/d4f/d3e/ce9 0 2026-03-09T17:29:59.593 INFO:tasks.workunit.client.1.vm09.stdout:4/7: truncate f2 5348986 0 2026-03-09T17:29:59.594 INFO:tasks.workunit.client.1.vm09.stdout:6/6: symlink l1 0 2026-03-09T17:29:59.595 INFO:tasks.workunit.client.1.vm09.stdout:0/15: rename l1 to l2 0 2026-03-09T17:29:59.597 INFO:tasks.workunit.client.0.vm06.stdout:6/754: unlink d6/d12/d53/d91/dbf/fde 0 2026-03-09T17:29:59.597 INFO:tasks.workunit.client.1.vm09.stdout:2/29: rename f1 to fa 0 2026-03-09T17:29:59.598 INFO:tasks.workunit.client.1.vm09.stdout:7/7: getdents . 
0 2026-03-09T17:29:59.598 INFO:tasks.workunit.client.1.vm09.stdout:7/8: stat f0 0 2026-03-09T17:29:59.598 INFO:tasks.workunit.client.1.vm09.stdout:7/9: write f0 [2769486,48155] 0 2026-03-09T17:29:59.599 INFO:tasks.workunit.client.1.vm09.stdout:9/6: creat f2 x:0 0 0 2026-03-09T17:29:59.599 INFO:tasks.workunit.client.0.vm06.stdout:6/755: dread d6/d12/fbc [0,4194304] 0 2026-03-09T17:29:59.601 INFO:tasks.workunit.client.1.vm09.stdout:4/8: creat f4 x:0 0 0 2026-03-09T17:29:59.602 INFO:tasks.workunit.client.1.vm09.stdout:6/7: unlink l1 0 2026-03-09T17:29:59.603 INFO:tasks.workunit.client.1.vm09.stdout:6/8: read f0 [696236,1536] 0 2026-03-09T17:29:59.603 INFO:tasks.workunit.client.1.vm09.stdout:7/10: symlink l1 0 2026-03-09T17:29:59.605 INFO:tasks.workunit.client.1.vm09.stdout:4/9: creat f5 x:0 0 0 2026-03-09T17:29:59.605 INFO:tasks.workunit.client.1.vm09.stdout:4/10: read - f3 zero size 2026-03-09T17:29:59.606 INFO:tasks.workunit.client.1.vm09.stdout:7/11: creat f2 x:0 0 0 2026-03-09T17:29:59.607 INFO:tasks.workunit.client.1.vm09.stdout:7/12: chown f0 19 1 2026-03-09T17:29:59.607 INFO:tasks.workunit.client.1.vm09.stdout:4/11: symlink l6 0 2026-03-09T17:29:59.607 INFO:tasks.workunit.client.1.vm09.stdout:7/13: read - f2 zero size 2026-03-09T17:29:59.610 INFO:tasks.workunit.client.1.vm09.stdout:6/9: dread f0 [0,4194304] 0 2026-03-09T17:29:59.610 INFO:tasks.workunit.client.1.vm09.stdout:7/14: unlink f0 0 2026-03-09T17:29:59.611 INFO:tasks.workunit.client.1.vm09.stdout:6/10: fsync f0 0 2026-03-09T17:29:59.612 INFO:tasks.workunit.client.1.vm09.stdout:7/15: creat f3 x:0 0 0 2026-03-09T17:29:59.613 INFO:tasks.workunit.client.1.vm09.stdout:7/16: mknod c4 0 2026-03-09T17:29:59.613 INFO:tasks.workunit.client.1.vm09.stdout:4/12: dwrite f2 [4194304,4194304] 0 2026-03-09T17:29:59.614 INFO:tasks.workunit.client.1.vm09.stdout:7/17: truncate f2 388329 0 2026-03-09T17:29:59.614 INFO:tasks.workunit.client.1.vm09.stdout:7/18: rmdir - no directory 2026-03-09T17:29:59.615 
INFO:tasks.workunit.client.1.vm09.stdout:4/13: readlink l0 0 2026-03-09T17:29:59.615 INFO:tasks.workunit.client.1.vm09.stdout:6/11: rename f0 to f2 0 2026-03-09T17:29:59.619 INFO:tasks.workunit.client.1.vm09.stdout:4/14: truncate f5 413828 0 2026-03-09T17:29:59.619 INFO:tasks.workunit.client.1.vm09.stdout:7/19: write f3 [870127,79421] 0 2026-03-09T17:29:59.620 INFO:tasks.workunit.client.1.vm09.stdout:6/12: mkdir d3 0 2026-03-09T17:29:59.624 INFO:tasks.workunit.client.1.vm09.stdout:7/20: mknod c5 0 2026-03-09T17:29:59.626 INFO:tasks.workunit.client.1.vm09.stdout:4/15: dread f5 [0,4194304] 0 2026-03-09T17:29:59.627 INFO:tasks.workunit.client.1.vm09.stdout:6/13: dwrite f2 [0,4194304] 0 2026-03-09T17:29:59.629 INFO:tasks.workunit.client.1.vm09.stdout:7/21: rename c4 to c6 0 2026-03-09T17:29:59.629 INFO:tasks.workunit.client.1.vm09.stdout:4/16: mknod c7 0 2026-03-09T17:29:59.630 INFO:tasks.workunit.client.1.vm09.stdout:4/17: write f5 [1310291,51225] 0 2026-03-09T17:29:59.631 INFO:tasks.workunit.client.1.vm09.stdout:6/14: symlink d3/l4 0 2026-03-09T17:29:59.631 INFO:tasks.workunit.client.1.vm09.stdout:4/18: write f5 [1221243,10065] 0 2026-03-09T17:29:59.632 INFO:tasks.workunit.client.1.vm09.stdout:4/19: stat l6 0 2026-03-09T17:29:59.898 INFO:tasks.workunit.client.0.vm06.stdout:2/744: sync 2026-03-09T17:29:59.899 INFO:tasks.workunit.client.0.vm06.stdout:2/745: chown d3/d4/d22/lc3 0 1 2026-03-09T17:29:59.900 INFO:tasks.workunit.client.0.vm06.stdout:2/746: mknod d3/d4/ce9 0 2026-03-09T17:29:59.901 INFO:tasks.workunit.client.0.vm06.stdout:2/747: creat d3/d4/fea x:0 0 0 2026-03-09T17:29:59.902 INFO:tasks.workunit.client.0.vm06.stdout:2/748: fsync d3/d4/d12/d2b/d36/fb9 0 2026-03-09T17:29:59.903 INFO:tasks.workunit.client.0.vm06.stdout:2/749: creat d3/d4/d12/feb x:0 0 0 2026-03-09T17:29:59.917 INFO:tasks.workunit.client.1.vm09.stdout:8/18: dread d1/f3 [0,4194304] 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.1.vm09.stdout:8/19: rename d1/l2 to d1/l4 0 
2026-03-09T17:29:59.957 INFO:tasks.workunit.client.1.vm09.stdout:8/20: chown d1 5472189 1 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.1.vm09.stdout:8/21: chown d1 9996479 1 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/756: sync 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/757: dwrite d6/d47/d4d/da0/fe0 [0,4194304] 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/758: unlink d6/d4f/d73/l98 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/759: truncate d6/d4f/d3e/d52/d8c/db0/fc7 4546196 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/760: mkdir d6/d12/d17/d85/dea 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/761: mknod d6/d12/d17/d85/ceb 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/762: truncate d6/d47/d4d/d6d/fbe 365741 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/763: truncate d6/d12/f76 4637754 0 2026-03-09T17:29:59.957 INFO:tasks.workunit.client.0.vm06.stdout:6/764: link d6/d47/d4d/d6d/ld9 d6/d12/lec 0 2026-03-09T17:30:00.109 INFO:tasks.workunit.client.1.vm09.stdout:3/6: fsync f0 0 2026-03-09T17:30:00.112 INFO:tasks.workunit.client.1.vm09.stdout:3/7: rename f0 to f2 0 2026-03-09T17:30:00.113 INFO:tasks.workunit.client.1.vm09.stdout:3/8: creat f3 x:0 0 0 2026-03-09T17:30:00.113 INFO:tasks.workunit.client.1.vm09.stdout:3/9: write f1 [566038,24350] 0 2026-03-09T17:30:00.115 INFO:tasks.workunit.client.1.vm09.stdout:3/10: dread f2 [0,4194304] 0 2026-03-09T17:30:00.115 INFO:tasks.workunit.client.1.vm09.stdout:3/11: readlink - no filename 2026-03-09T17:30:00.139 INFO:tasks.workunit.client.1.vm09.stdout:3/12: read f1 [197710,46675] 0 2026-03-09T17:30:00.140 INFO:tasks.workunit.client.1.vm09.stdout:3/13: write f3 [19997,39982] 0 2026-03-09T17:30:00.143 INFO:tasks.workunit.client.1.vm09.stdout:3/14: dread f2 [0,4194304] 0 2026-03-09T17:30:00.162 INFO:tasks.workunit.client.1.vm09.stdout:9/7: 
dread f0 [0,4194304] 0 2026-03-09T17:30:00.163 INFO:tasks.workunit.client.1.vm09.stdout:9/8: dread - f2 zero size 2026-03-09T17:30:00.164 INFO:tasks.workunit.client.1.vm09.stdout:9/9: symlink l3 0 2026-03-09T17:30:00.171 INFO:tasks.workunit.client.1.vm09.stdout:9/10: dwrite f2 [0,4194304] 0 2026-03-09T17:30:00.183 INFO:tasks.workunit.client.1.vm09.stdout:9/11: creat f4 x:0 0 0 2026-03-09T17:30:00.184 INFO:tasks.workunit.client.1.vm09.stdout:9/12: rmdir - no directory 2026-03-09T17:30:00.184 INFO:tasks.workunit.client.1.vm09.stdout:9/13: stat f2 0 2026-03-09T17:30:00.184 INFO:tasks.workunit.client.1.vm09.stdout:9/14: chown c1 2023894 1 2026-03-09T17:30:00.185 INFO:tasks.workunit.client.1.vm09.stdout:9/15: mkdir d5 0 2026-03-09T17:30:00.188 INFO:tasks.workunit.client.1.vm09.stdout:9/16: unlink l3 0 2026-03-09T17:30:00.193 INFO:tasks.workunit.client.1.vm09.stdout:9/17: unlink f0 0 2026-03-09T17:30:00.197 INFO:tasks.workunit.client.1.vm09.stdout:9/18: symlink d5/l6 0 2026-03-09T17:30:00.198 INFO:tasks.workunit.client.0.vm06.stdout:5/839: dread d4/f26 [0,4194304] 0 2026-03-09T17:30:00.201 INFO:tasks.workunit.client.0.vm06.stdout:5/840: getdents d4/d50/d35/d40/d96/dfe 0 2026-03-09T17:30:00.201 INFO:tasks.workunit.client.0.vm06.stdout:5/841: chown d4/d50/d18/d3d/l7f 5739169 1 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:5/5: sync 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:4/20: sync 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/0: sync 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/1: dread - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/2: dread - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/3: chown . 
21 1 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/4: write - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/5: write - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/6: write - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/7: write - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/8: dwrite - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/9: truncate - no filename 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/10: rmdir - no directory 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/11: chown . 338494 1 2026-03-09T17:30:00.213 INFO:tasks.workunit.client.1.vm09.stdout:1/12: dwrite - no filename 2026-03-09T17:30:00.214 INFO:tasks.workunit.client.1.vm09.stdout:1/13: stat - no entries 2026-03-09T17:30:00.221 INFO:tasks.workunit.client.1.vm09.stdout:5/6: creat d0/f1 x:0 0 0 2026-03-09T17:30:00.221 INFO:tasks.workunit.client.1.vm09.stdout:5/7: truncate d0/f1 1024530 0 2026-03-09T17:30:00.221 INFO:tasks.workunit.client.1.vm09.stdout:5/8: chown d0/f1 0 1 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:5/9: chown d0 7 1 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:5/10: stat d0 0 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:1/14: mknod c0 0 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:1/15: dwrite - no filename 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:1/16: readlink - no filename 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:1/17: truncate - no filename 2026-03-09T17:30:00.222 INFO:tasks.workunit.client.1.vm09.stdout:1/18: readlink - no filename 2026-03-09T17:30:00.224 INFO:tasks.workunit.client.1.vm09.stdout:5/11: mkdir d0/d2 0 2026-03-09T17:30:00.227 INFO:tasks.workunit.client.0.vm06.stdout:7/991: dwrite d5/d7/fb4 [4194304,4194304] 0 
2026-03-09T17:30:00.229 INFO:tasks.workunit.client.0.vm06.stdout:7/992: chown d5/dd/d79/d7f/f121 40 1 2026-03-09T17:30:00.238 INFO:tasks.workunit.client.1.vm09.stdout:1/19: mknod c1 0 2026-03-09T17:30:00.241 INFO:tasks.workunit.client.1.vm09.stdout:5/12: symlink d0/l3 0 2026-03-09T17:30:00.243 INFO:tasks.workunit.client.1.vm09.stdout:5/13: read d0/f1 [65650,65913] 0 2026-03-09T17:30:00.246 INFO:tasks.workunit.client.1.vm09.stdout:1/20: creat f2 x:0 0 0 2026-03-09T17:30:00.259 INFO:tasks.workunit.client.1.vm09.stdout:5/14: dwrite d0/f1 [0,4194304] 0 2026-03-09T17:30:00.261 INFO:tasks.workunit.client.1.vm09.stdout:5/15: readlink d0/l3 0 2026-03-09T17:30:00.263 INFO:tasks.workunit.client.0.vm06.stdout:7/993: creat d5/d10d/f128 x:0 0 0 2026-03-09T17:30:00.265 INFO:tasks.workunit.client.0.vm06.stdout:7/994: symlink d5/d7/d2b/dc8/l129 0 2026-03-09T17:30:00.267 INFO:tasks.workunit.client.1.vm09.stdout:1/21: dwrite f2 [0,4194304] 0 2026-03-09T17:30:00.270 INFO:tasks.workunit.client.1.vm09.stdout:1/22: creat f3 x:0 0 0 2026-03-09T17:30:00.270 INFO:tasks.workunit.client.1.vm09.stdout:5/16: creat d0/f4 x:0 0 0 2026-03-09T17:30:00.274 INFO:tasks.workunit.client.1.vm09.stdout:5/17: symlink d0/l5 0 2026-03-09T17:30:00.274 INFO:tasks.workunit.client.1.vm09.stdout:5/18: chown d0/f1 288 1 2026-03-09T17:30:00.274 INFO:tasks.workunit.client.1.vm09.stdout:5/19: write d0/f4 [289344,44050] 0 2026-03-09T17:30:00.280 INFO:tasks.workunit.client.1.vm09.stdout:1/23: sync 2026-03-09T17:30:00.280 INFO:tasks.workunit.client.1.vm09.stdout:1/24: readlink - no filename 2026-03-09T17:30:00.280 INFO:tasks.workunit.client.1.vm09.stdout:5/20: creat d0/d2/f6 x:0 0 0 2026-03-09T17:30:00.280 INFO:tasks.workunit.client.1.vm09.stdout:5/21: write d0/f4 [1223860,36170] 0 2026-03-09T17:30:00.282 INFO:tasks.workunit.client.1.vm09.stdout:1/25: rename c1 to c4 0 2026-03-09T17:30:00.296 INFO:tasks.workunit.client.1.vm09.stdout:5/22: dread d0/f1 [0,4194304] 0 2026-03-09T17:30:00.300 
INFO:tasks.workunit.client.1.vm09.stdout:5/23: mkdir d0/d7 0 2026-03-09T17:30:00.301 INFO:tasks.workunit.client.1.vm09.stdout:5/24: chown d0/d2 2841 1 2026-03-09T17:30:00.309 INFO:tasks.workunit.client.1.vm09.stdout:1/26: dwrite f2 [0,4194304] 0 2026-03-09T17:30:00.314 INFO:tasks.workunit.client.1.vm09.stdout:1/27: mknod c5 0 2026-03-09T17:30:00.317 INFO:tasks.workunit.client.0.vm06.stdout:0/984: dwrite d7/f76 [0,4194304] 0 2026-03-09T17:30:00.318 INFO:tasks.workunit.client.0.vm06.stdout:0/985: write d7/d11/f10c [2099233,95429] 0 2026-03-09T17:30:00.318 INFO:tasks.workunit.client.1.vm09.stdout:1/28: write f3 [22318,62678] 0 2026-03-09T17:30:00.332 INFO:tasks.workunit.client.0.vm06.stdout:4/899: write db/d1d/d21/d37/fbd [1120859,89845] 0 2026-03-09T17:30:00.333 INFO:tasks.workunit.client.1.vm09.stdout:5/25: dwrite d0/d2/f6 [0,4194304] 0 2026-03-09T17:30:00.336 INFO:tasks.workunit.client.0.vm06.stdout:4/900: dwrite db/d59/d5f/d6d/f7b [4194304,4194304] 0 2026-03-09T17:30:00.339 INFO:tasks.workunit.client.0.vm06.stdout:0/986: creat d7/d11/d19/d23/db7/dbd/d101/f152 x:0 0 0 2026-03-09T17:30:00.341 INFO:tasks.workunit.client.1.vm09.stdout:5/26: read d0/f4 [415863,58003] 0 2026-03-09T17:30:00.345 INFO:tasks.workunit.client.1.vm09.stdout:1/29: dwrite f2 [0,4194304] 0 2026-03-09T17:30:00.352 INFO:tasks.workunit.client.1.vm09.stdout:5/27: chown d0/d2/f6 1699 1 2026-03-09T17:30:00.355 INFO:tasks.workunit.client.0.vm06.stdout:4/901: dwrite db/d1d/fd3 [0,4194304] 0 2026-03-09T17:30:00.365 INFO:tasks.workunit.client.1.vm09.stdout:1/30: dread f2 [0,4194304] 0 2026-03-09T17:30:00.365 INFO:tasks.workunit.client.1.vm09.stdout:5/28: dread d0/f1 [0,4194304] 0 2026-03-09T17:30:00.370 INFO:tasks.workunit.client.0.vm06.stdout:4/902: mknod db/d1d/d21/d25/d4b/de4/c13f 0 2026-03-09T17:30:00.372 INFO:tasks.workunit.client.0.vm06.stdout:4/903: dread db/d1d/d21/d88/fd2 [0,4194304] 0 2026-03-09T17:30:00.377 INFO:tasks.workunit.client.1.vm09.stdout:1/31: creat f6 x:0 0 0 2026-03-09T17:30:00.379 
INFO:tasks.workunit.client.1.vm09.stdout:5/29: mkdir d0/d2/d8 0 2026-03-09T17:30:00.386 INFO:tasks.workunit.client.1.vm09.stdout:1/32: symlink l7 0 2026-03-09T17:30:00.388 INFO:tasks.workunit.client.1.vm09.stdout:1/33: write f6 [76978,16546] 0 2026-03-09T17:30:00.389 INFO:tasks.workunit.client.1.vm09.stdout:5/30: dread d0/f4 [0,4194304] 0 2026-03-09T17:30:00.389 INFO:tasks.workunit.client.1.vm09.stdout:1/34: chown c0 5491571 1 2026-03-09T17:30:00.395 INFO:tasks.workunit.client.1.vm09.stdout:5/31: chown d0/f4 108 1 2026-03-09T17:30:00.396 INFO:tasks.workunit.client.1.vm09.stdout:5/32: unlink d0/f4 0 2026-03-09T17:30:00.416 INFO:tasks.workunit.client.1.vm09.stdout:1/35: dwrite f3 [0,4194304] 0 2026-03-09T17:30:00.418 INFO:tasks.workunit.client.1.vm09.stdout:5/33: dwrite d0/f1 [0,4194304] 0 2026-03-09T17:30:00.424 INFO:tasks.workunit.client.1.vm09.stdout:5/34: read d0/f1 [1934481,55706] 0 2026-03-09T17:30:00.429 INFO:tasks.workunit.client.1.vm09.stdout:1/36: creat f8 x:0 0 0 2026-03-09T17:30:00.454 INFO:tasks.workunit.client.0.vm06.stdout:9/928: getdents d3/d15/d36 0 2026-03-09T17:30:00.457 INFO:tasks.workunit.client.1.vm09.stdout:5/35: sync 2026-03-09T17:30:00.520 INFO:tasks.workunit.client.0.vm06.stdout:8/835: write d15/d16/d1e/f4e [9182564,23314] 0 2026-03-09T17:30:00.524 INFO:tasks.workunit.client.0.vm06.stdout:8/836: getdents d15/d39/d67/d77 0 2026-03-09T17:30:00.526 INFO:tasks.workunit.client.0.vm06.stdout:8/837: creat d15/d39/d67/d77/d99/f115 x:0 0 0 2026-03-09T17:30:00.528 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:00 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:00.528 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:00 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:00.528 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:00 vm09.local ceph-mon[62061]: pgmap v163: 65 pgs: 65 active+clean; 1.9 GiB data, 7.0 GiB used, 
113 GiB / 120 GiB avail; 34 MiB/s rd, 82 MiB/s wr, 350 op/s 2026-03-09T17:30:00.535 INFO:tasks.workunit.client.0.vm06.stdout:1/828: write d11/de0/ffd [831712,7704] 0 2026-03-09T17:30:00.546 INFO:tasks.workunit.client.0.vm06.stdout:3/896: dwrite dd/d81/fa9 [0,4194304] 0 2026-03-09T17:30:00.551 INFO:tasks.workunit.client.0.vm06.stdout:3/897: mkdir dd/d19/d25/d44/d12f 0 2026-03-09T17:30:00.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:00 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:00.585 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:00 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:00.585 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:00 vm06.local ceph-mon[57307]: pgmap v163: 65 pgs: 65 active+clean; 1.9 GiB data, 7.0 GiB used, 113 GiB / 120 GiB avail; 34 MiB/s rd, 82 MiB/s wr, 350 op/s 2026-03-09T17:30:00.587 INFO:tasks.workunit.client.0.vm06.stdout:1/829: sync 2026-03-09T17:30:00.590 INFO:tasks.workunit.client.0.vm06.stdout:1/830: symlink d11/d14/d1c/d5f/l116 0 2026-03-09T17:30:00.593 INFO:tasks.workunit.client.0.vm06.stdout:1/831: chown d11/d14/d1d/d42/d46/d92/dc0/cd0 119282 1 2026-03-09T17:30:00.593 INFO:tasks.workunit.client.0.vm06.stdout:1/832: dread - d11/d14/d1d/d42/d46/d92/dc0/d57/fac zero size 2026-03-09T17:30:00.610 INFO:tasks.workunit.client.1.vm09.stdout:0/16: getdents . 
0 2026-03-09T17:30:00.612 INFO:tasks.workunit.client.1.vm09.stdout:0/17: chown l2 19046 1 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/18: symlink l3 0 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/19: stat l2 0 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/20: read - no filename 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/21: dread - no filename 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/22: dwrite - no filename 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/23: dwrite - no filename 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:0/24: dread - no filename 2026-03-09T17:30:00.615 INFO:tasks.workunit.client.1.vm09.stdout:2/30: rename fa to fb 0 2026-03-09T17:30:00.616 INFO:tasks.workunit.client.1.vm09.stdout:0/25: stat l2 0 2026-03-09T17:30:00.616 INFO:tasks.workunit.client.1.vm09.stdout:2/31: chown f8 7421365 1 2026-03-09T17:30:00.617 INFO:tasks.workunit.client.1.vm09.stdout:9/19: fsync f2 0 2026-03-09T17:30:00.618 INFO:tasks.workunit.client.1.vm09.stdout:0/26: mknod c4 0 2026-03-09T17:30:00.618 INFO:tasks.workunit.client.1.vm09.stdout:0/27: dread - no filename 2026-03-09T17:30:00.618 INFO:tasks.workunit.client.1.vm09.stdout:0/28: write - no filename 2026-03-09T17:30:00.619 INFO:tasks.workunit.client.1.vm09.stdout:0/29: rename l3 to l5 0 2026-03-09T17:30:00.620 INFO:tasks.workunit.client.1.vm09.stdout:9/20: rename f4 to d5/f7 0 2026-03-09T17:30:00.621 INFO:tasks.workunit.client.1.vm09.stdout:9/21: mknod d5/c8 0 2026-03-09T17:30:00.621 INFO:tasks.workunit.client.1.vm09.stdout:2/32: dwrite f8 [0,4194304] 0 2026-03-09T17:30:00.623 INFO:tasks.workunit.client.1.vm09.stdout:9/22: fdatasync f2 0 2026-03-09T17:30:00.624 INFO:tasks.workunit.client.1.vm09.stdout:0/30: sync 2026-03-09T17:30:00.624 INFO:tasks.workunit.client.1.vm09.stdout:0/31: write - no filename 2026-03-09T17:30:00.624 
INFO:tasks.workunit.client.1.vm09.stdout:0/32: readlink l2 0 2026-03-09T17:30:00.626 INFO:tasks.workunit.client.1.vm09.stdout:2/33: symlink lc 0 2026-03-09T17:30:00.626 INFO:tasks.workunit.client.1.vm09.stdout:0/33: mkdir d6 0 2026-03-09T17:30:00.629 INFO:tasks.workunit.client.1.vm09.stdout:9/23: dread f2 [0,4194304] 0 2026-03-09T17:30:00.629 INFO:tasks.workunit.client.1.vm09.stdout:2/34: fdatasync fb 0 2026-03-09T17:30:00.631 INFO:tasks.workunit.client.1.vm09.stdout:9/24: fsync f2 0 2026-03-09T17:30:00.631 INFO:tasks.workunit.client.1.vm09.stdout:9/25: write f2 [1024984,78541] 0 2026-03-09T17:30:00.632 INFO:tasks.workunit.client.1.vm09.stdout:0/34: creat d6/f7 x:0 0 0 2026-03-09T17:30:00.632 INFO:tasks.workunit.client.1.vm09.stdout:2/35: sync 2026-03-09T17:30:00.633 INFO:tasks.workunit.client.1.vm09.stdout:0/35: write d6/f7 [257437,124051] 0 2026-03-09T17:30:00.633 INFO:tasks.workunit.client.1.vm09.stdout:2/36: chown f6 7060486 1 2026-03-09T17:30:00.634 INFO:tasks.workunit.client.1.vm09.stdout:0/36: write d6/f7 [414171,46842] 0 2026-03-09T17:30:00.634 INFO:tasks.workunit.client.1.vm09.stdout:2/37: creat fd x:0 0 0 2026-03-09T17:30:00.635 INFO:tasks.workunit.client.1.vm09.stdout:2/38: creat fe x:0 0 0 2026-03-09T17:30:00.640 INFO:tasks.workunit.client.1.vm09.stdout:2/39: rename lc to lf 0 2026-03-09T17:30:00.642 INFO:tasks.workunit.client.1.vm09.stdout:2/40: symlink l10 0 2026-03-09T17:30:00.643 INFO:tasks.workunit.client.1.vm09.stdout:0/37: dwrite d6/f7 [0,4194304] 0 2026-03-09T17:30:00.647 INFO:tasks.workunit.client.1.vm09.stdout:2/41: symlink l11 0 2026-03-09T17:30:00.655 INFO:tasks.workunit.client.1.vm09.stdout:2/42: mknod c12 0 2026-03-09T17:30:00.655 INFO:tasks.workunit.client.1.vm09.stdout:0/38: link d6/f7 d6/f8 0 2026-03-09T17:30:00.656 INFO:tasks.workunit.client.1.vm09.stdout:2/43: mkdir d13 0 2026-03-09T17:30:00.657 INFO:tasks.workunit.client.1.vm09.stdout:0/39: creat d6/f9 x:0 0 0 2026-03-09T17:30:00.666 INFO:tasks.workunit.client.1.vm09.stdout:2/44: 
dwrite fe [0,4194304] 0 2026-03-09T17:30:00.668 INFO:tasks.workunit.client.1.vm09.stdout:2/45: rename fe to d13/f14 0 2026-03-09T17:30:00.669 INFO:tasks.workunit.client.1.vm09.stdout:2/46: mkdir d13/d15 0 2026-03-09T17:30:00.718 INFO:tasks.workunit.client.1.vm09.stdout:4/21: getdents . 0 2026-03-09T17:30:00.719 INFO:tasks.workunit.client.1.vm09.stdout:7/22: rename c6 to c7 0 2026-03-09T17:30:00.719 INFO:tasks.workunit.client.1.vm09.stdout:4/22: creat f8 x:0 0 0 2026-03-09T17:30:00.720 INFO:tasks.workunit.client.1.vm09.stdout:4/23: write f5 [1511511,37944] 0 2026-03-09T17:30:00.727 INFO:tasks.workunit.client.1.vm09.stdout:4/24: symlink l9 0 2026-03-09T17:30:00.729 INFO:tasks.workunit.client.1.vm09.stdout:4/25: creat fa x:0 0 0 2026-03-09T17:30:00.731 INFO:tasks.workunit.client.1.vm09.stdout:4/26: creat fb x:0 0 0 2026-03-09T17:30:00.733 INFO:tasks.workunit.client.1.vm09.stdout:4/27: link f5 fc 0 2026-03-09T17:30:00.733 INFO:tasks.workunit.client.1.vm09.stdout:6/15: truncate f2 2906654 0 2026-03-09T17:30:00.734 INFO:tasks.workunit.client.1.vm09.stdout:4/28: write f4 [463536,125510] 0 2026-03-09T17:30:00.744 INFO:tasks.workunit.client.0.vm06.stdout:2/750: write d3/d4/d22/f4b [2950531,43722] 0 2026-03-09T17:30:00.745 INFO:tasks.workunit.client.0.vm06.stdout:2/751: creat d3/d4/d22/d72/d8f/fec x:0 0 0 2026-03-09T17:30:00.745 INFO:tasks.workunit.client.0.vm06.stdout:2/752: readlink d3/d4/d12/d2b/d36/dd4/ldb 0 2026-03-09T17:30:00.751 INFO:tasks.workunit.client.1.vm09.stdout:8/22: chown d1/l4 61 1 2026-03-09T17:30:00.752 INFO:tasks.workunit.client.1.vm09.stdout:8/23: creat d1/f5 x:0 0 0 2026-03-09T17:30:00.765 INFO:tasks.workunit.client.0.vm06.stdout:6/765: dwrite d6/d12/d53/d8f/f9e [4194304,4194304] 0 2026-03-09T17:30:00.767 INFO:tasks.workunit.client.0.vm06.stdout:6/766: fsync d6/d47/d96/f3d 0 2026-03-09T17:30:00.767 INFO:tasks.workunit.client.0.vm06.stdout:6/767: fsync d6/f5c 0 2026-03-09T17:30:00.770 INFO:tasks.workunit.client.1.vm09.stdout:3/15: unlink f2 0 
2026-03-09T17:30:00.773 INFO:tasks.workunit.client.0.vm06.stdout:6/768: dwrite d6/d47/d4d/fae [0,4194304] 0 2026-03-09T17:30:00.775 INFO:tasks.workunit.client.1.vm09.stdout:3/16: dread f1 [0,4194304] 0 2026-03-09T17:30:00.779 INFO:tasks.workunit.client.0.vm06.stdout:6/769: creat d6/fed x:0 0 0 2026-03-09T17:30:00.780 INFO:tasks.workunit.client.0.vm06.stdout:6/770: fsync d6/d12/d53/dd0/fe3 0 2026-03-09T17:30:00.780 INFO:tasks.workunit.client.0.vm06.stdout:6/771: fsync d6/d4f/d3e/d52/d8c/db0/fc7 0 2026-03-09T17:30:00.783 INFO:tasks.workunit.client.1.vm09.stdout:3/17: link f1 f4 0 2026-03-09T17:30:00.786 INFO:tasks.workunit.client.1.vm09.stdout:3/18: write f4 [1330455,72592] 0 2026-03-09T17:30:00.790 INFO:tasks.workunit.client.1.vm09.stdout:3/19: chown f1 334 1 2026-03-09T17:30:00.792 INFO:tasks.workunit.client.0.vm06.stdout:6/772: dread d6/d12/d53/f87 [0,4194304] 0 2026-03-09T17:30:00.812 INFO:tasks.workunit.client.0.vm06.stdout:6/773: dread d6/d47/d96/d40/fb4 [0,4194304] 0 2026-03-09T17:30:00.813 INFO:tasks.workunit.client.0.vm06.stdout:6/774: fdatasync d6/d47/d96/da1/fb7 0 2026-03-09T17:30:00.830 INFO:tasks.workunit.client.0.vm06.stdout:5/842: dwrite d4/d50/d18/fa2 [0,4194304] 0 2026-03-09T17:30:00.914 INFO:tasks.workunit.client.0.vm06.stdout:7/995: write d5/d1f/d34/d3f/f5b [1123733,70834] 0 2026-03-09T17:30:00.915 INFO:tasks.workunit.client.0.vm06.stdout:7/996: chown d5/d1f/d34/d3f/d91/fb9 7 1 2026-03-09T17:30:00.915 INFO:tasks.workunit.client.1.vm09.stdout:1/37: fsync f3 0 2026-03-09T17:30:00.915 INFO:tasks.workunit.client.1.vm09.stdout:1/38: readlink l7 0 2026-03-09T17:30:00.917 INFO:tasks.workunit.client.1.vm09.stdout:1/39: mkdir d9 0 2026-03-09T17:30:00.918 INFO:tasks.workunit.client.0.vm06.stdout:7/997: mkdir d5/dd/dc5/d64/d12a 0 2026-03-09T17:30:00.918 INFO:tasks.workunit.client.0.vm06.stdout:7/998: chown d5/dd/d79/la7 9 1 2026-03-09T17:30:00.919 INFO:tasks.workunit.client.0.vm06.stdout:7/999: read - d5/d7/dac/f117 zero size 2026-03-09T17:30:00.926 
INFO:tasks.workunit.client.1.vm09.stdout:1/40: dwrite f6 [0,4194304] 0 2026-03-09T17:30:00.929 INFO:tasks.workunit.client.1.vm09.stdout:1/41: symlink d9/la 0 2026-03-09T17:30:00.943 INFO:tasks.workunit.client.1.vm09.stdout:1/42: dwrite f8 [0,4194304] 0 2026-03-09T17:30:00.946 INFO:tasks.workunit.client.1.vm09.stdout:1/43: stat f6 0 2026-03-09T17:30:00.948 INFO:tasks.workunit.client.1.vm09.stdout:1/44: mknod d9/cb 0 2026-03-09T17:30:00.954 INFO:tasks.workunit.client.1.vm09.stdout:1/45: dread f8 [0,4194304] 0 2026-03-09T17:30:00.955 INFO:tasks.workunit.client.1.vm09.stdout:1/46: mkdir d9/dc 0 2026-03-09T17:30:00.956 INFO:tasks.workunit.client.1.vm09.stdout:1/47: read f2 [4145785,8013] 0 2026-03-09T17:30:00.957 INFO:tasks.workunit.client.1.vm09.stdout:1/48: mkdir d9/dc/dd 0 2026-03-09T17:30:00.959 INFO:tasks.workunit.client.1.vm09.stdout:5/36: write d0/d2/f6 [4652187,83237] 0 2026-03-09T17:30:00.963 INFO:tasks.workunit.client.1.vm09.stdout:1/49: dread f6 [0,4194304] 0 2026-03-09T17:30:00.964 INFO:tasks.workunit.client.1.vm09.stdout:1/50: truncate f3 4891813 0 2026-03-09T17:30:00.964 INFO:tasks.workunit.client.1.vm09.stdout:5/37: dread d0/f1 [0,4194304] 0 2026-03-09T17:30:00.967 INFO:tasks.workunit.client.1.vm09.stdout:1/51: creat d9/dc/dd/fe x:0 0 0 2026-03-09T17:30:00.969 INFO:tasks.workunit.client.1.vm09.stdout:1/52: rename c0 to d9/cf 0 2026-03-09T17:30:00.970 INFO:tasks.workunit.client.1.vm09.stdout:1/53: mkdir d9/d10 0 2026-03-09T17:30:00.973 INFO:tasks.workunit.client.1.vm09.stdout:1/54: creat d9/f11 x:0 0 0 2026-03-09T17:30:00.973 INFO:tasks.workunit.client.1.vm09.stdout:5/38: dread d0/d2/f6 [0,4194304] 0 2026-03-09T17:30:00.973 INFO:tasks.workunit.client.1.vm09.stdout:1/55: chown d9/cb 33 1 2026-03-09T17:30:00.977 INFO:tasks.workunit.client.1.vm09.stdout:5/39: rename d0/d7 to d0/d9 0 2026-03-09T17:30:00.978 INFO:tasks.workunit.client.1.vm09.stdout:5/40: mknod d0/d9/ca 0 2026-03-09T17:30:00.979 INFO:tasks.workunit.client.1.vm09.stdout:5/41: chown d0/d2 8690 1 
2026-03-09T17:30:00.989 INFO:tasks.workunit.client.1.vm09.stdout:5/42: dwrite d0/f1 [0,4194304] 0 2026-03-09T17:30:00.992 INFO:tasks.workunit.client.1.vm09.stdout:5/43: symlink d0/d2/lb 0 2026-03-09T17:30:00.996 INFO:tasks.workunit.client.1.vm09.stdout:5/44: mkdir d0/dc 0 2026-03-09T17:30:00.998 INFO:tasks.workunit.client.1.vm09.stdout:5/45: symlink d0/dc/ld 0 2026-03-09T17:30:01.002 INFO:tasks.workunit.client.0.vm06.stdout:0/987: write d7/fbf [277241,7640] 0 2026-03-09T17:30:01.006 INFO:tasks.workunit.client.1.vm09.stdout:5/46: write d0/f1 [1115055,35220] 0 2026-03-09T17:30:01.034 INFO:tasks.workunit.client.0.vm06.stdout:0/988: sync 2026-03-09T17:30:01.035 INFO:tasks.workunit.client.1.vm09.stdout:5/47: sync 2026-03-09T17:30:01.047 INFO:tasks.workunit.client.0.vm06.stdout:0/989: unlink d7/d11/d89/d99/cd6 0 2026-03-09T17:30:01.051 INFO:tasks.workunit.client.1.vm09.stdout:5/48: mkdir d0/de 0 2026-03-09T17:30:01.051 INFO:tasks.workunit.client.0.vm06.stdout:4/904: write db/d59/f10b [618159,128653] 0 2026-03-09T17:30:01.053 INFO:tasks.workunit.client.1.vm09.stdout:5/49: creat d0/ff x:0 0 0 2026-03-09T17:30:01.067 INFO:tasks.workunit.client.0.vm06.stdout:9/929: dwrite d3/d26/d35/fb5 [0,4194304] 0 2026-03-09T17:30:01.126 INFO:tasks.workunit.client.0.vm06.stdout:8/838: write d15/d39/f4b [5124363,18554] 0 2026-03-09T17:30:01.144 INFO:tasks.workunit.client.0.vm06.stdout:3/898: write dd/d81/da3/fc6 [31526,107671] 0 2026-03-09T17:30:01.150 INFO:tasks.workunit.client.0.vm06.stdout:1/833: write d11/d14/d1c/f2e [1718815,89477] 0 2026-03-09T17:30:01.152 INFO:tasks.workunit.client.0.vm06.stdout:1/834: rename d11/fee to d11/d14/d1c/dbc/f117 0 2026-03-09T17:30:01.159 INFO:tasks.workunit.client.1.vm09.stdout:9/26: rename d5/f7 to d5/f9 0 2026-03-09T17:30:01.161 INFO:tasks.workunit.client.1.vm09.stdout:9/27: dread - d5/f9 zero size 2026-03-09T17:30:01.175 INFO:tasks.workunit.client.1.vm09.stdout:2/47: readlink lf 0 2026-03-09T17:30:01.175 INFO:tasks.workunit.client.1.vm09.stdout:0/40: 
fsync d6/f8 0 2026-03-09T17:30:01.180 INFO:tasks.workunit.client.1.vm09.stdout:0/41: mknod d6/ca 0 2026-03-09T17:30:01.184 INFO:tasks.workunit.client.1.vm09.stdout:0/42: mknod d6/cb 0 2026-03-09T17:30:01.186 INFO:tasks.workunit.client.1.vm09.stdout:2/48: dread f6 [0,4194304] 0 2026-03-09T17:30:01.189 INFO:tasks.workunit.client.1.vm09.stdout:2/49: mknod d13/c16 0 2026-03-09T17:30:01.189 INFO:tasks.workunit.client.1.vm09.stdout:0/43: link l2 d6/lc 0 2026-03-09T17:30:01.191 INFO:tasks.workunit.client.1.vm09.stdout:2/50: write fb [285595,41918] 0 2026-03-09T17:30:01.191 INFO:tasks.workunit.client.1.vm09.stdout:0/44: readlink d6/lc 0 2026-03-09T17:30:01.199 INFO:tasks.workunit.client.1.vm09.stdout:0/45: truncate d6/f7 2183979 0 2026-03-09T17:30:01.200 INFO:tasks.workunit.client.1.vm09.stdout:0/46: readlink d6/lc 0 2026-03-09T17:30:01.201 INFO:tasks.workunit.client.1.vm09.stdout:0/47: creat d6/fd x:0 0 0 2026-03-09T17:30:01.201 INFO:tasks.workunit.client.1.vm09.stdout:0/48: chown d6/fd 2 1 2026-03-09T17:30:01.203 INFO:tasks.workunit.client.1.vm09.stdout:0/49: chown d6/lc 0 1 2026-03-09T17:30:01.204 INFO:tasks.workunit.client.1.vm09.stdout:0/50: write d6/f9 [41164,78717] 0 2026-03-09T17:30:01.209 INFO:tasks.workunit.client.1.vm09.stdout:0/51: chown l5 408 1 2026-03-09T17:30:01.243 INFO:tasks.workunit.client.1.vm09.stdout:7/23: rename c7 to c8 0 2026-03-09T17:30:01.249 INFO:tasks.workunit.client.1.vm09.stdout:7/24: dwrite f2 [0,4194304] 0 2026-03-09T17:30:01.258 INFO:tasks.workunit.client.1.vm09.stdout:7/25: unlink f2 0 2026-03-09T17:30:01.258 INFO:tasks.workunit.client.1.vm09.stdout:4/29: getdents . 
0 2026-03-09T17:30:01.263 INFO:tasks.workunit.client.1.vm09.stdout:4/30: unlink f8 0 2026-03-09T17:30:01.265 INFO:tasks.workunit.client.1.vm09.stdout:6/16: dwrite f2 [0,4194304] 0 2026-03-09T17:30:01.274 INFO:tasks.workunit.client.1.vm09.stdout:4/31: creat fd x:0 0 0 2026-03-09T17:30:01.285 INFO:tasks.workunit.client.1.vm09.stdout:6/17: dwrite f2 [4194304,4194304] 0 2026-03-09T17:30:01.290 INFO:tasks.workunit.client.1.vm09.stdout:6/18: mknod d3/c5 0 2026-03-09T17:30:01.290 INFO:tasks.workunit.client.1.vm09.stdout:4/32: dwrite f3 [0,4194304] 0 2026-03-09T17:30:01.295 INFO:tasks.workunit.client.1.vm09.stdout:7/26: dread f3 [0,4194304] 0 2026-03-09T17:30:01.309 INFO:tasks.workunit.client.1.vm09.stdout:7/27: creat f9 x:0 0 0 2026-03-09T17:30:01.312 INFO:tasks.workunit.client.0.vm06.stdout:2/753: write d3/d4/d12/f92 [1436598,126244] 0 2026-03-09T17:30:01.315 INFO:tasks.workunit.client.0.vm06.stdout:2/754: creat d3/d4/d12/d2b/d36/dd4/fed x:0 0 0 2026-03-09T17:30:01.319 INFO:tasks.workunit.client.1.vm09.stdout:4/33: rename f4 to fe 0 2026-03-09T17:30:01.319 INFO:tasks.workunit.client.0.vm06.stdout:2/755: creat d3/d4/d22/d72/fee x:0 0 0 2026-03-09T17:30:01.319 INFO:tasks.workunit.client.1.vm09.stdout:4/34: read - fa zero size 2026-03-09T17:30:01.320 INFO:tasks.workunit.client.1.vm09.stdout:7/28: dwrite f9 [0,4194304] 0 2026-03-09T17:30:01.328 INFO:tasks.workunit.client.1.vm09.stdout:7/29: fsync f3 0 2026-03-09T17:30:01.328 INFO:tasks.workunit.client.1.vm09.stdout:7/30: read f9 [4100571,8119] 0 2026-03-09T17:30:01.328 INFO:tasks.workunit.client.1.vm09.stdout:7/31: rmdir - no directory 2026-03-09T17:30:01.330 INFO:tasks.workunit.client.1.vm09.stdout:7/32: mkdir da 0 2026-03-09T17:30:01.333 INFO:tasks.workunit.client.1.vm09.stdout:4/35: rename f5 to ff 0 2026-03-09T17:30:01.335 INFO:tasks.workunit.client.1.vm09.stdout:7/33: creat da/fb x:0 0 0 2026-03-09T17:30:01.341 INFO:tasks.workunit.client.1.vm09.stdout:7/34: creat da/fc x:0 0 0 2026-03-09T17:30:01.341 
INFO:tasks.workunit.client.1.vm09.stdout:7/35: stat da 0 2026-03-09T17:30:01.367 INFO:tasks.workunit.client.1.vm09.stdout:4/36: sync 2026-03-09T17:30:01.367 INFO:tasks.workunit.client.1.vm09.stdout:4/37: chown fd 129358694 1 2026-03-09T17:30:01.368 INFO:tasks.workunit.client.1.vm09.stdout:4/38: dread - fb zero size 2026-03-09T17:30:01.382 INFO:tasks.workunit.client.1.vm09.stdout:4/39: dwrite fb [0,4194304] 0 2026-03-09T17:30:01.383 INFO:tasks.workunit.client.1.vm09.stdout:4/40: rmdir - no directory 2026-03-09T17:30:01.384 INFO:tasks.workunit.client.1.vm09.stdout:4/41: chown l0 2 1 2026-03-09T17:30:01.391 INFO:tasks.workunit.client.1.vm09.stdout:4/42: creat f10 x:0 0 0 2026-03-09T17:30:01.391 INFO:tasks.workunit.client.1.vm09.stdout:4/43: read fe [328577,59184] 0 2026-03-09T17:30:01.394 INFO:tasks.workunit.client.1.vm09.stdout:4/44: mkdir d11 0 2026-03-09T17:30:01.395 INFO:tasks.workunit.client.1.vm09.stdout:8/24: fsync d1/f5 0 2026-03-09T17:30:01.396 INFO:tasks.workunit.client.1.vm09.stdout:4/45: unlink l9 0 2026-03-09T17:30:01.403 INFO:tasks.workunit.client.1.vm09.stdout:4/46: creat d11/f12 x:0 0 0 2026-03-09T17:30:01.405 INFO:tasks.workunit.client.1.vm09.stdout:4/47: rename fa to d11/f13 0 2026-03-09T17:30:01.405 INFO:tasks.workunit.client.1.vm09.stdout:4/48: dread - d11/f12 zero size 2026-03-09T17:30:01.409 INFO:tasks.workunit.client.1.vm09.stdout:4/49: mknod d11/c14 0 2026-03-09T17:30:01.414 INFO:tasks.workunit.client.1.vm09.stdout:4/50: creat d11/f15 x:0 0 0 2026-03-09T17:30:01.416 INFO:tasks.workunit.client.1.vm09.stdout:4/51: rename fb to d11/f16 0 2026-03-09T17:30:01.416 INFO:tasks.workunit.client.1.vm09.stdout:4/52: chown d11/c14 57 1 2026-03-09T17:30:01.438 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:01 vm09.local ceph-mon[62061]: overall HEALTH_OK 2026-03-09T17:30:01.438 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:01 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:01.438 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:01 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:01.460 INFO:tasks.workunit.client.1.vm09.stdout:3/20: getdents . 0 2026-03-09T17:30:01.479 INFO:tasks.workunit.client.1.vm09.stdout:3/21: sync 2026-03-09T17:30:01.480 INFO:tasks.workunit.client.1.vm09.stdout:3/22: sync 2026-03-09T17:30:01.491 INFO:tasks.workunit.client.1.vm09.stdout:3/23: dwrite f1 [0,4194304] 0 2026-03-09T17:30:01.494 INFO:tasks.workunit.client.0.vm06.stdout:6/775: write d6/d4f/d3e/f62 [1061432,111099] 0 2026-03-09T17:30:01.496 INFO:tasks.workunit.client.0.vm06.stdout:5/843: write d4/d22/d46/dec/f105 [889965,58340] 0 2026-03-09T17:30:01.496 INFO:tasks.workunit.client.0.vm06.stdout:5/844: readlink d4/d52/d55/le5 0 2026-03-09T17:30:01.503 INFO:tasks.workunit.client.0.vm06.stdout:5/845: truncate d4/d50/d35/d40/d6f/f9e 18494 0 2026-03-09T17:30:01.504 INFO:tasks.workunit.client.1.vm09.stdout:3/24: write f1 [411925,2337] 0 2026-03-09T17:30:01.506 INFO:tasks.workunit.client.0.vm06.stdout:5/846: creat d4/d52/d55/d11e/f130 x:0 0 0 2026-03-09T17:30:01.517 INFO:tasks.workunit.client.1.vm09.stdout:1/56: getdents d9/dc/dd 0 2026-03-09T17:30:01.517 INFO:tasks.workunit.client.1.vm09.stdout:5/50: rmdir d0/d9 39 2026-03-09T17:30:01.517 INFO:tasks.workunit.client.1.vm09.stdout:5/51: fdatasync d0/ff 0 2026-03-09T17:30:01.517 INFO:tasks.workunit.client.1.vm09.stdout:1/57: dread f8 [0,4194304] 0 2026-03-09T17:30:01.517 INFO:tasks.workunit.client.1.vm09.stdout:1/58: dread - d9/dc/dd/fe zero size 2026-03-09T17:30:01.522 INFO:tasks.workunit.client.1.vm09.stdout:3/25: mkdir d5 0 2026-03-09T17:30:01.523 INFO:tasks.workunit.client.1.vm09.stdout:1/59: write f8 [1355909,23734] 0 2026-03-09T17:30:01.524 INFO:tasks.workunit.client.1.vm09.stdout:3/26: mkdir d5/d6 0 2026-03-09T17:30:01.525 INFO:tasks.workunit.client.1.vm09.stdout:1/60: rename c5 to d9/dc/c12 0 2026-03-09T17:30:01.525 
INFO:tasks.workunit.client.1.vm09.stdout:3/27: mknod d5/c7 0 2026-03-09T17:30:01.525 INFO:tasks.workunit.client.1.vm09.stdout:3/28: readlink - no filename 2026-03-09T17:30:01.526 INFO:tasks.workunit.client.1.vm09.stdout:3/29: symlink d5/l8 0 2026-03-09T17:30:01.528 INFO:tasks.workunit.client.1.vm09.stdout:3/30: mkdir d5/d9 0 2026-03-09T17:30:01.528 INFO:tasks.workunit.client.1.vm09.stdout:3/31: fdatasync f1 0 2026-03-09T17:30:01.528 INFO:tasks.workunit.client.1.vm09.stdout:1/61: dread f3 [0,4194304] 0 2026-03-09T17:30:01.528 INFO:tasks.workunit.client.1.vm09.stdout:3/32: read f1 [780413,25525] 0 2026-03-09T17:30:01.529 INFO:tasks.workunit.client.1.vm09.stdout:1/62: read f2 [155504,89809] 0 2026-03-09T17:30:01.531 INFO:tasks.workunit.client.1.vm09.stdout:3/33: rename d5/l8 to d5/d9/la 0 2026-03-09T17:30:01.534 INFO:tasks.workunit.client.1.vm09.stdout:1/63: dwrite f2 [0,4194304] 0 2026-03-09T17:30:01.540 INFO:tasks.workunit.client.1.vm09.stdout:1/64: dwrite f2 [0,4194304] 0 2026-03-09T17:30:01.545 INFO:tasks.workunit.client.1.vm09.stdout:1/65: mknod d9/dc/c13 0 2026-03-09T17:30:01.547 INFO:tasks.workunit.client.1.vm09.stdout:1/66: dread f2 [0,4194304] 0 2026-03-09T17:30:01.547 INFO:tasks.workunit.client.1.vm09.stdout:1/67: readlink d9/la 0 2026-03-09T17:30:01.548 INFO:tasks.workunit.client.1.vm09.stdout:1/68: write f3 [4301605,35019] 0 2026-03-09T17:30:01.552 INFO:tasks.workunit.client.1.vm09.stdout:1/69: dread f3 [0,4194304] 0 2026-03-09T17:30:01.552 INFO:tasks.workunit.client.1.vm09.stdout:1/70: dread - d9/f11 zero size 2026-03-09T17:30:01.552 INFO:tasks.workunit.client.1.vm09.stdout:1/71: fdatasync f8 0 2026-03-09T17:30:01.555 INFO:tasks.workunit.client.1.vm09.stdout:1/72: getdents d9 0 2026-03-09T17:30:01.555 INFO:tasks.workunit.client.1.vm09.stdout:1/73: readlink l7 0 2026-03-09T17:30:01.560 INFO:tasks.workunit.client.1.vm09.stdout:1/74: dwrite f8 [0,4194304] 0 2026-03-09T17:30:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:01 vm06.local 
ceph-mon[57307]: overall HEALTH_OK 2026-03-09T17:30:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:01 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:01.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:01 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:01.741 INFO:tasks.workunit.client.0.vm06.stdout:4/905: write db/d1d/d21/d26/d89/fb1 [4791549,52492] 0 2026-03-09T17:30:01.745 INFO:tasks.workunit.client.0.vm06.stdout:4/906: read db/d1d/d21/d25/fc0 [1086246,98472] 0 2026-03-09T17:30:01.745 INFO:tasks.workunit.client.0.vm06.stdout:4/907: stat db/d1d/d21/d37/l47 0 2026-03-09T17:30:01.749 INFO:tasks.workunit.client.0.vm06.stdout:9/930: write d3/d15/d36/d4d/fa1 [1421299,11595] 0 2026-03-09T17:30:01.753 INFO:tasks.workunit.client.0.vm06.stdout:9/931: getdents d3/d15/d36/d83/d115 0 2026-03-09T17:30:01.755 INFO:tasks.workunit.client.0.vm06.stdout:9/932: mknod d3/d15/d48/da8/db9/de8/d108/c12b 0 2026-03-09T17:30:01.756 INFO:tasks.workunit.client.0.vm06.stdout:9/933: symlink d3/d15/d36/df4/l12c 0 2026-03-09T17:30:01.757 INFO:tasks.workunit.client.0.vm06.stdout:9/934: chown d3/d26/d35 3714947 1 2026-03-09T17:30:01.757 INFO:tasks.workunit.client.0.vm06.stdout:9/935: stat d3/d26/d35/l11d 0 2026-03-09T17:30:01.759 INFO:tasks.workunit.client.0.vm06.stdout:9/936: creat d3/d15/d48/da8/db9/de8/f12d x:0 0 0 2026-03-09T17:30:01.760 INFO:tasks.workunit.client.0.vm06.stdout:8/839: write d15/d16/d1e/d30/db8/d5e/fd1 [1704524,113661] 0 2026-03-09T17:30:01.762 INFO:tasks.workunit.client.0.vm06.stdout:9/937: rename d3/d26/dcb/df1/l128 to d3/d15/d36/d4d/l12e 0 2026-03-09T17:30:01.764 INFO:tasks.workunit.client.0.vm06.stdout:8/840: write d15/d31/dc5/df1/d71/f80 [2094919,115145] 0 2026-03-09T17:30:01.770 INFO:tasks.workunit.client.0.vm06.stdout:8/841: rename d15/d31/dc5/df1/d3d/d5f/f81 to d15/d39/d67/d77/d97/dac/dcb/f116 0 2026-03-09T17:30:01.774 
INFO:tasks.workunit.client.0.vm06.stdout:3/899: write dd/d19/d25/d2d/fe8 [5082455,52334] 0 2026-03-09T17:30:01.778 INFO:tasks.workunit.client.0.vm06.stdout:3/900: truncate dd/d81/da3/dae/fcb 4808463 0 2026-03-09T17:30:01.783 INFO:tasks.workunit.client.1.vm09.stdout:9/28: unlink d5/f9 0 2026-03-09T17:30:01.784 INFO:tasks.workunit.client.1.vm09.stdout:2/51: getdents d13 0 2026-03-09T17:30:01.784 INFO:tasks.workunit.client.1.vm09.stdout:2/52: chown c3 226 1 2026-03-09T17:30:01.785 INFO:tasks.workunit.client.1.vm09.stdout:9/29: rename d5/l6 to d5/la 0 2026-03-09T17:30:01.785 INFO:tasks.workunit.client.1.vm09.stdout:9/30: symlink d5/lb 0 2026-03-09T17:30:01.786 INFO:tasks.workunit.client.1.vm09.stdout:9/31: write f2 [16397,24414] 0 2026-03-09T17:30:01.795 INFO:tasks.workunit.client.1.vm09.stdout:9/32: symlink d5/lc 0 2026-03-09T17:30:01.804 INFO:tasks.workunit.client.1.vm09.stdout:2/53: sync 2026-03-09T17:30:01.809 INFO:tasks.workunit.client.1.vm09.stdout:2/54: link c12 d13/d15/c17 0 2026-03-09T17:30:01.810 INFO:tasks.workunit.client.1.vm09.stdout:2/55: creat d13/d15/f18 x:0 0 0 2026-03-09T17:30:01.817 INFO:tasks.workunit.client.1.vm09.stdout:0/52: dread d6/f8 [0,4194304] 0 2026-03-09T17:30:01.817 INFO:tasks.workunit.client.1.vm09.stdout:0/53: read d6/f8 [454745,46477] 0 2026-03-09T17:30:01.824 INFO:tasks.workunit.client.1.vm09.stdout:0/54: dread d6/f9 [0,4194304] 0 2026-03-09T17:30:01.825 INFO:tasks.workunit.client.1.vm09.stdout:0/55: rmdir d6 39 2026-03-09T17:30:01.826 INFO:tasks.workunit.client.1.vm09.stdout:0/56: fdatasync d6/f9 0 2026-03-09T17:30:01.830 INFO:tasks.workunit.client.1.vm09.stdout:0/57: dwrite d6/fd [0,4194304] 0 2026-03-09T17:30:01.831 INFO:tasks.workunit.client.1.vm09.stdout:0/58: chown d6/ca 186 1 2026-03-09T17:30:01.842 INFO:tasks.workunit.client.1.vm09.stdout:0/59: mkdir d6/de 0 2026-03-09T17:30:01.845 INFO:tasks.workunit.client.1.vm09.stdout:6/19: getdents d3 0 2026-03-09T17:30:01.848 INFO:tasks.workunit.client.1.vm09.stdout:0/60: dwrite d6/f9 
[0,4194304] 0 2026-03-09T17:30:01.848 INFO:tasks.workunit.client.1.vm09.stdout:0/61: chown d6/de 7 1 2026-03-09T17:30:01.850 INFO:tasks.workunit.client.0.vm06.stdout:2/756: dwrite d3/d4/d12/da7/db3/fbe [0,4194304] 0 2026-03-09T17:30:01.852 INFO:tasks.workunit.client.1.vm09.stdout:6/20: dwrite f2 [4194304,4194304] 0 2026-03-09T17:30:01.855 INFO:tasks.workunit.client.1.vm09.stdout:0/62: unlink d6/fd 0 2026-03-09T17:30:01.858 INFO:tasks.workunit.client.1.vm09.stdout:6/21: chown d3 3443 1 2026-03-09T17:30:01.860 INFO:tasks.workunit.client.1.vm09.stdout:0/63: creat d6/de/ff x:0 0 0 2026-03-09T17:30:01.860 INFO:tasks.workunit.client.1.vm09.stdout:6/22: getdents d3 0 2026-03-09T17:30:01.862 INFO:tasks.workunit.client.1.vm09.stdout:0/64: dread d6/f9 [0,4194304] 0 2026-03-09T17:30:01.863 INFO:tasks.workunit.client.1.vm09.stdout:0/65: rename d6/ca to d6/de/c10 0 2026-03-09T17:30:01.864 INFO:tasks.workunit.client.1.vm09.stdout:0/66: dread - d6/de/ff zero size 2026-03-09T17:30:01.866 INFO:tasks.workunit.client.1.vm09.stdout:0/67: link l2 d6/l11 0 2026-03-09T17:30:01.867 INFO:tasks.workunit.client.1.vm09.stdout:0/68: read d6/f9 [59447,43482] 0 2026-03-09T17:30:01.890 INFO:tasks.workunit.client.1.vm09.stdout:0/69: sync 2026-03-09T17:30:01.891 INFO:tasks.workunit.client.1.vm09.stdout:0/70: readlink l5 0 2026-03-09T17:30:01.891 INFO:tasks.workunit.client.1.vm09.stdout:0/71: chown d6 14 1 2026-03-09T17:30:01.893 INFO:tasks.workunit.client.1.vm09.stdout:7/36: rmdir da 39 2026-03-09T17:30:01.893 INFO:tasks.workunit.client.1.vm09.stdout:0/72: dread d6/f8 [0,4194304] 0 2026-03-09T17:30:01.895 INFO:tasks.workunit.client.1.vm09.stdout:0/73: creat d6/f12 x:0 0 0 2026-03-09T17:30:01.896 INFO:tasks.workunit.client.1.vm09.stdout:0/74: fsync d6/f8 0 2026-03-09T17:30:01.897 INFO:tasks.workunit.client.1.vm09.stdout:0/75: fsync d6/f8 0 2026-03-09T17:30:01.897 INFO:tasks.workunit.client.1.vm09.stdout:7/37: sync 2026-03-09T17:30:01.898 INFO:tasks.workunit.client.1.vm09.stdout:7/38: truncate da/fc 
207526 0 2026-03-09T17:30:01.898 INFO:tasks.workunit.client.1.vm09.stdout:7/39: rename da to da/dd 22 2026-03-09T17:30:01.899 INFO:tasks.workunit.client.1.vm09.stdout:7/40: read - da/fb zero size 2026-03-09T17:30:01.900 INFO:tasks.workunit.client.1.vm09.stdout:0/76: link c4 d6/de/c13 0 2026-03-09T17:30:01.901 INFO:tasks.workunit.client.1.vm09.stdout:7/41: mknod da/ce 0 2026-03-09T17:30:01.902 INFO:tasks.workunit.client.1.vm09.stdout:0/77: dread d6/f7 [0,4194304] 0 2026-03-09T17:30:01.903 INFO:tasks.workunit.client.1.vm09.stdout:7/42: write da/fb [530096,4574] 0 2026-03-09T17:30:01.933 INFO:tasks.workunit.client.1.vm09.stdout:7/43: fsync da/fb 0 2026-03-09T17:30:01.937 INFO:tasks.workunit.client.1.vm09.stdout:7/44: dwrite f9 [0,4194304] 0 2026-03-09T17:30:01.940 INFO:tasks.workunit.client.1.vm09.stdout:7/45: dread da/fc [0,4194304] 0 2026-03-09T17:30:01.944 INFO:tasks.workunit.client.1.vm09.stdout:7/46: creat da/ff x:0 0 0 2026-03-09T17:30:01.946 INFO:tasks.workunit.client.1.vm09.stdout:7/47: dread da/fb [0,4194304] 0 2026-03-09T17:30:01.947 INFO:tasks.workunit.client.1.vm09.stdout:7/48: creat da/f10 x:0 0 0 2026-03-09T17:30:01.947 INFO:tasks.workunit.client.1.vm09.stdout:7/49: stat da/fc 0 2026-03-09T17:30:01.948 INFO:tasks.workunit.client.1.vm09.stdout:7/50: write da/fb [1581919,2972] 0 2026-03-09T17:30:01.988 INFO:tasks.workunit.client.1.vm09.stdout:8/25: dwrite d1/f3 [0,4194304] 0 2026-03-09T17:30:01.990 INFO:tasks.workunit.client.1.vm09.stdout:4/53: getdents d11 0 2026-03-09T17:30:01.990 INFO:tasks.workunit.client.1.vm09.stdout:8/26: mknod d1/c6 0 2026-03-09T17:30:01.991 INFO:tasks.workunit.client.1.vm09.stdout:4/54: rmdir d11 39 2026-03-09T17:30:01.994 INFO:tasks.workunit.client.1.vm09.stdout:4/55: rename l6 to d11/l17 0 2026-03-09T17:30:01.995 INFO:tasks.workunit.client.1.vm09.stdout:4/56: chown l0 1585 1 2026-03-09T17:30:01.996 INFO:tasks.workunit.client.1.vm09.stdout:8/27: dwrite d1/f5 [0,4194304] 0 2026-03-09T17:30:01.997 
INFO:tasks.workunit.client.1.vm09.stdout:4/57: creat d11/f18 x:0 0 0 2026-03-09T17:30:02.007 INFO:tasks.workunit.client.1.vm09.stdout:8/28: creat d1/f7 x:0 0 0 2026-03-09T17:30:02.010 INFO:tasks.workunit.client.1.vm09.stdout:4/58: creat d11/f19 x:0 0 0 2026-03-09T17:30:02.010 INFO:tasks.workunit.client.1.vm09.stdout:8/29: dread d1/f3 [0,4194304] 0 2026-03-09T17:30:02.018 INFO:tasks.workunit.client.1.vm09.stdout:8/30: creat d1/f8 x:0 0 0 2026-03-09T17:30:02.019 INFO:tasks.workunit.client.1.vm09.stdout:4/59: dread f2 [4194304,4194304] 0 2026-03-09T17:30:02.020 INFO:tasks.workunit.client.1.vm09.stdout:8/31: mknod d1/c9 0 2026-03-09T17:30:02.023 INFO:tasks.workunit.client.0.vm06.stdout:6/776: dwrite d6/d12/d53/fc6 [0,4194304] 0 2026-03-09T17:30:02.031 INFO:tasks.workunit.client.0.vm06.stdout:5/847: dwrite d4/d52/db4/dc2/f100 [0,4194304] 0 2026-03-09T17:30:02.036 INFO:tasks.workunit.client.0.vm06.stdout:5/848: mknod d4/da4/c131 0 2026-03-09T17:30:02.086 INFO:tasks.workunit.client.1.vm09.stdout:5/52: truncate d0/d2/f6 1637319 0 2026-03-09T17:30:02.088 INFO:tasks.workunit.client.1.vm09.stdout:3/34: truncate f1 3582468 0 2026-03-09T17:30:02.090 INFO:tasks.workunit.client.1.vm09.stdout:3/35: creat d5/d6/fb x:0 0 0 2026-03-09T17:30:02.091 INFO:tasks.workunit.client.1.vm09.stdout:5/53: dwrite d0/f1 [0,4194304] 0 2026-03-09T17:30:02.092 INFO:tasks.workunit.client.1.vm09.stdout:3/36: fdatasync f1 0 2026-03-09T17:30:02.092 INFO:tasks.workunit.client.1.vm09.stdout:5/54: chown d0/d2/lb 5464 1 2026-03-09T17:30:02.095 INFO:tasks.workunit.client.1.vm09.stdout:5/55: dread d0/f1 [0,4194304] 0 2026-03-09T17:30:02.100 INFO:tasks.workunit.client.1.vm09.stdout:5/56: rename d0/d9/ca to d0/c10 0 2026-03-09T17:30:02.101 INFO:tasks.workunit.client.1.vm09.stdout:5/57: dread d0/f1 [0,4194304] 0 2026-03-09T17:30:02.102 INFO:tasks.workunit.client.1.vm09.stdout:5/58: truncate d0/f1 5163177 0 2026-03-09T17:30:02.104 INFO:tasks.workunit.client.1.vm09.stdout:5/59: dread d0/f1 [4194304,4194304] 0 
2026-03-09T17:30:02.116 INFO:tasks.workunit.client.1.vm09.stdout:1/75: truncate f2 1577359 0 2026-03-09T17:30:02.117 INFO:tasks.workunit.client.1.vm09.stdout:1/76: stat d9 0 2026-03-09T17:30:02.117 INFO:tasks.workunit.client.1.vm09.stdout:5/60: sync 2026-03-09T17:30:02.118 INFO:tasks.workunit.client.1.vm09.stdout:5/61: rename d0/f1 to d0/de/f11 0 2026-03-09T17:30:02.119 INFO:tasks.workunit.client.0.vm06.stdout:2/757: rename d3/d4/d22/d72/d8f/fec to d3/d4/d12/d2b/fef 0 2026-03-09T17:30:02.119 INFO:tasks.workunit.client.0.vm06.stdout:2/758: readlink d3/d4/d46/l80 0 2026-03-09T17:30:02.120 INFO:tasks.workunit.client.1.vm09.stdout:1/77: dwrite f6 [0,4194304] 0 2026-03-09T17:30:02.122 INFO:tasks.workunit.client.0.vm06.stdout:4/908: rmdir db/d59/d5f/d5d 39 2026-03-09T17:30:02.124 INFO:tasks.workunit.client.0.vm06.stdout:2/759: dwrite d3/d4/d12/da7/fe8 [0,4194304] 0 2026-03-09T17:30:02.136 INFO:tasks.workunit.client.0.vm06.stdout:2/760: rmdir d3/d4/d12 39 2026-03-09T17:30:02.137 INFO:tasks.workunit.client.1.vm09.stdout:1/78: rename d9/la to d9/dc/dd/l14 0 2026-03-09T17:30:02.140 INFO:tasks.workunit.client.1.vm09.stdout:1/79: mkdir d9/dc/d15 0 2026-03-09T17:30:02.148 INFO:tasks.workunit.client.1.vm09.stdout:1/80: dwrite f6 [0,4194304] 0 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:4/909: rename db/d1d/d21/d25/f38 to db/d59/d5f/d6d/dfc/f140 0 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:4/910: creat db/d1d/d21/d25/d4b/d85/d137/f141 x:0 0 0 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:4/911: dread - db/ff5 zero size 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:4/912: symlink db/d59/d5f/d6d/dfc/l142 0 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:0/990: unlink d7/d11/d89/da8/db2/lbb 0 2026-03-09T17:30:02.149 INFO:tasks.workunit.client.0.vm06.stdout:0/991: chown d7/d11/d19/d1d/d87 410 1 2026-03-09T17:30:02.150 INFO:tasks.workunit.client.0.vm06.stdout:4/913: creat 
db/d1d/d21/d26/d125/f143 x:0 0 0 2026-03-09T17:30:02.150 INFO:tasks.workunit.client.0.vm06.stdout:4/914: read - db/ff5 zero size 2026-03-09T17:30:02.151 INFO:tasks.workunit.client.0.vm06.stdout:0/992: creat d7/d11/d89/da8/db2/d13a/f153 x:0 0 0 2026-03-09T17:30:02.155 INFO:tasks.workunit.client.0.vm06.stdout:0/993: fsync d7/d11/d19/f131 0 2026-03-09T17:30:02.157 INFO:tasks.workunit.client.0.vm06.stdout:0/994: mkdir d7/d11/d19/d3c/df3/d11c/d154 0 2026-03-09T17:30:02.158 INFO:tasks.workunit.client.0.vm06.stdout:4/915: rmdir db/d1d/d21/d26/d89/dab 0 2026-03-09T17:30:02.159 INFO:tasks.workunit.client.0.vm06.stdout:4/916: chown db/d1d/f1f 1258 1 2026-03-09T17:30:02.161 INFO:tasks.workunit.client.0.vm06.stdout:0/995: dwrite d7/d102/f143 [0,4194304] 0 2026-03-09T17:30:02.165 INFO:tasks.workunit.client.0.vm06.stdout:0/996: chown d7/d11/d89/da8/db2/dea/c115 54364 1 2026-03-09T17:30:02.293 INFO:tasks.workunit.client.0.vm06.stdout:9/938: dwrite d3/d15/d48/fda [0,4194304] 0 2026-03-09T17:30:02.295 INFO:tasks.workunit.client.0.vm06.stdout:9/939: stat d3/d26/d35/f6f 0 2026-03-09T17:30:02.295 INFO:tasks.workunit.client.0.vm06.stdout:9/940: write d3/d15/d16/f5c [3791854,15560] 0 2026-03-09T17:30:02.322 INFO:tasks.workunit.client.0.vm06.stdout:8/842: write d15/d16/d1a/d47/f9c [899078,126245] 0 2026-03-09T17:30:02.328 INFO:tasks.workunit.client.1.vm09.stdout:2/56: rmdir d13/d15 39 2026-03-09T17:30:02.331 INFO:tasks.workunit.client.1.vm09.stdout:2/57: dread - d13/d15/f18 zero size 2026-03-09T17:30:02.333 INFO:tasks.workunit.client.0.vm06.stdout:3/901: dwrite dd/d59/da1/fa4 [0,4194304] 0 2026-03-09T17:30:02.333 INFO:tasks.workunit.client.0.vm06.stdout:8/843: dwrite d15/d31/dc5/df1/d2b/f63 [0,4194304] 0 2026-03-09T17:30:02.334 INFO:tasks.workunit.client.1.vm09.stdout:2/58: symlink d13/d15/l19 0 2026-03-09T17:30:02.336 INFO:tasks.workunit.client.1.vm09.stdout:9/33: link d5/la d5/ld 0 2026-03-09T17:30:02.336 INFO:tasks.workunit.client.1.vm09.stdout:6/23: truncate f2 2056131 0 
2026-03-09T17:30:02.338 INFO:tasks.workunit.client.1.vm09.stdout:6/24: unlink d3/l4 0
2026-03-09T17:30:02.339 INFO:tasks.workunit.client.1.vm09.stdout:9/34: unlink d5/la 0
2026-03-09T17:30:02.339 INFO:tasks.workunit.client.1.vm09.stdout:2/59: dwrite fd [0,4194304] 0
2026-03-09T17:30:02.340 INFO:tasks.workunit.client.1.vm09.stdout:2/60: readlink l11 0
2026-03-09T17:30:02.340 INFO:tasks.workunit.client.1.vm09.stdout:0/78: write d6/f8 [750985,84726] 0
2026-03-09T17:30:02.364 INFO:tasks.workunit.client.0.vm06.stdout:3/902: dread dd/f38 [0,4194304] 0
2026-03-09T17:30:02.375 INFO:tasks.workunit.client.1.vm09.stdout:7/51: rmdir da 39
2026-03-09T17:30:02.375 INFO:tasks.workunit.client.1.vm09.stdout:4/60: getdents d11 0
2026-03-09T17:30:02.375 INFO:tasks.workunit.client.1.vm09.stdout:9/35: mkdir d5/de 0
2026-03-09T17:30:02.376 INFO:tasks.workunit.client.1.vm09.stdout:4/61: write fd [118247,16688] 0
2026-03-09T17:30:02.377 INFO:tasks.workunit.client.1.vm09.stdout:9/36: dread f2 [0,4194304] 0
2026-03-09T17:30:02.378 INFO:tasks.workunit.client.1.vm09.stdout:9/37: read f2 [1877272,107120] 0
2026-03-09T17:30:02.381 INFO:tasks.workunit.client.1.vm09.stdout:8/32: rmdir d1 39
2026-03-09T17:30:02.390 INFO:tasks.workunit.client.1.vm09.stdout:0/79: dwrite d6/f9 [0,4194304] 0
2026-03-09T17:30:02.391 INFO:tasks.workunit.client.1.vm09.stdout:0/80: write d6/f8 [844863,101483] 0
2026-03-09T17:30:02.391 INFO:tasks.workunit.client.0.vm06.stdout:3/903: creat dd/d19/d25/df0/f130 x:0 0 0
2026-03-09T17:30:02.391 INFO:tasks.workunit.client.0.vm06.stdout:6/777: write d6/d12/d53/f5b [2086082,48089] 0
2026-03-09T17:30:02.391 INFO:tasks.workunit.client.0.vm06.stdout:1/835: mkdir d11/d14/d1d/d42/d46/d92/dc0/d118 0
2026-03-09T17:30:02.392 INFO:tasks.workunit.client.0.vm06.stdout:5/849: dwrite d4/d50/d18/d3d/f54 [0,4194304] 0
2026-03-09T17:30:02.396 INFO:tasks.workunit.client.0.vm06.stdout:3/904: creat dd/d1d/f131 x:0 0 0
2026-03-09T17:30:02.396 INFO:tasks.workunit.client.0.vm06.stdout:3/905: chown dd/d1d/f9f 17357262 1
2026-03-09T17:30:02.397 INFO:tasks.workunit.client.1.vm09.stdout:9/38: sync
2026-03-09T17:30:02.399 INFO:tasks.workunit.client.1.vm09.stdout:5/62: fsync d0/d2/f6 0
2026-03-09T17:30:02.402 INFO:tasks.workunit.client.1.vm09.stdout:1/81: dread f2 [0,4194304] 0
2026-03-09T17:30:02.403 INFO:tasks.workunit.client.1.vm09.stdout:1/82: fsync d9/dc/dd/fe 0
2026-03-09T17:30:02.405 INFO:tasks.workunit.client.0.vm06.stdout:1/836: truncate d11/d14/d1d/d1e/d2a/d34/d58/fb9 1656344 0
2026-03-09T17:30:02.405 INFO:tasks.workunit.client.1.vm09.stdout:3/37: dwrite f4 [0,4194304] 0
2026-03-09T17:30:02.405 INFO:tasks.workunit.client.1.vm09.stdout:3/38: read f3 [26495,28448] 0
2026-03-09T17:30:02.405 INFO:tasks.workunit.client.1.vm09.stdout:4/62: fdatasync f2 0
2026-03-09T17:30:02.409 INFO:tasks.workunit.client.0.vm06.stdout:2/761: dwrite d3/d4/d12/d2b/d36/fde [0,4194304] 0
2026-03-09T17:30:02.426 INFO:tasks.workunit.client.1.vm09.stdout:7/52: mkdir da/d11 0
2026-03-09T17:30:02.428 INFO:tasks.workunit.client.0.vm06.stdout:1/837: creat d11/d14/d1d/d1e/d2a/d34/d64/f119 x:0 0 0
2026-03-09T17:30:02.429 INFO:tasks.workunit.client.0.vm06.stdout:1/838: stat d11/d14/d1d/d42/d46/d92/dc0/f21 0
2026-03-09T17:30:02.435 INFO:tasks.workunit.client.0.vm06.stdout:2/762: mkdir d3/d4/d12/d71/daa/d77/d81/d64/de5/df0 0
2026-03-09T17:30:02.442 INFO:tasks.workunit.client.0.vm06.stdout:1/839: symlink d11/d14/d1d/d4a/l11a 0
2026-03-09T17:30:02.443 INFO:tasks.workunit.client.0.vm06.stdout:0/997: write d7/d11/d5d/d64/fc9 [2854760,34506] 0
2026-03-09T17:30:02.443 INFO:tasks.workunit.client.0.vm06.stdout:4/917: dwrite db/d1d/d21/d44/fb7 [0,4194304] 0
2026-03-09T17:30:02.448 INFO:tasks.workunit.client.1.vm09.stdout:5/63: dwrite d0/de/f11 [4194304,4194304] 0
2026-03-09T17:30:02.449 INFO:tasks.workunit.client.0.vm06.stdout:4/918: chown db/d1d/d21/d37/d69/c7c 1038448 1
2026-03-09T17:30:02.454 INFO:tasks.workunit.client.0.vm06.stdout:3/906: truncate dd/f15 8716800 0
2026-03-09T17:30:02.459 INFO:tasks.workunit.client.1.vm09.stdout:1/83: rmdir d9/dc 39
2026-03-09T17:30:02.459 INFO:tasks.workunit.client.0.vm06.stdout:3/907: chown dd/d1d/d6e/f9a 393015265 1
2026-03-09T17:30:02.459 INFO:tasks.workunit.client.0.vm06.stdout:6/778: getdents d6/d12/d53/d91 0
2026-03-09T17:30:02.460 INFO:tasks.workunit.client.0.vm06.stdout:2/763: dwrite d3/d4/d22/d72/d8f/fb7 [0,4194304] 0
2026-03-09T17:30:02.473 INFO:tasks.workunit.client.0.vm06.stdout:4/919: symlink db/de2/l144 0
2026-03-09T17:30:02.473 INFO:tasks.workunit.client.0.vm06.stdout:1/840: chown d11/l32 149 1
2026-03-09T17:30:02.474 INFO:tasks.workunit.client.0.vm06.stdout:4/920: chown db/d59/d5f/d6d/fd9 1 1
2026-03-09T17:30:02.481 INFO:tasks.workunit.client.1.vm09.stdout:4/63: mknod d11/c1a 0
2026-03-09T17:30:02.482 INFO:tasks.workunit.client.0.vm06.stdout:3/908: creat dd/d19/d25/d2d/f132 x:0 0 0
2026-03-09T17:30:02.491 INFO:tasks.workunit.client.1.vm09.stdout:8/33: mkdir d1/da 0
2026-03-09T17:30:02.491 INFO:tasks.workunit.client.1.vm09.stdout:8/34: stat d1/l4 0
2026-03-09T17:30:02.491 INFO:tasks.workunit.client.1.vm09.stdout:8/35: fsync d1/f7 0
2026-03-09T17:30:02.494 INFO:tasks.workunit.client.1.vm09.stdout:7/53: rename da/fc to da/f12 0
2026-03-09T17:30:02.494 INFO:tasks.workunit.client.0.vm06.stdout:1/841: mkdir d11/d14/d1d/d94/d11b 0
2026-03-09T17:30:02.498 INFO:tasks.workunit.client.0.vm06.stdout:9/941: dwrite d3/d15/d36/d4d/fd9 [0,4194304] 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.1.vm09.stdout:6/25: rmdir d3 39
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.1.vm09.stdout:9/39: mknod d5/de/cf 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.1.vm09.stdout:5/64: rename d0/d2/f6 to d0/d2/f12 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.1.vm09.stdout:2/61: write f9 [236931,36243] 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.1.vm09.stdout:2/62: stat fd 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.0.vm06.stdout:8/844: truncate d15/d31/d58/d9b/ff4 3922558 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.0.vm06.stdout:8/845: fsync d15/d16/d1e/fa9 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.0.vm06.stdout:2/764: mkdir d3/df1 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.0.vm06.stdout:1/842: dwrite d11/d14/d1d/d42/dff/f107 [0,4194304] 0
2026-03-09T17:30:02.527 INFO:tasks.workunit.client.0.vm06.stdout:3/909: getdents dd/d59/da1/d11e 0
2026-03-09T17:30:02.532 INFO:tasks.workunit.client.0.vm06.stdout:1/843: creat d11/d14/d1d/d4a/df7/d106/d112/d114/f11c x:0 0 0
2026-03-09T17:30:02.532 INFO:tasks.workunit.client.0.vm06.stdout:3/910: mkdir dd/d19/d25/d44/d12f/d133 0
2026-03-09T17:30:02.534 INFO:tasks.workunit.client.0.vm06.stdout:3/911: creat dd/d19/d2c/f134 x:0 0 0
2026-03-09T17:30:02.537 INFO:tasks.workunit.client.0.vm06.stdout:3/912: link dd/d19/d2c/fad dd/d5b/d65/f135 0
2026-03-09T17:30:02.537 INFO:tasks.workunit.client.0.vm06.stdout:3/913: chown dd/f51 8012 1
2026-03-09T17:30:02.547 INFO:tasks.workunit.client.0.vm06.stdout:3/914: getdents dd/d19/d25/d44 0
2026-03-09T17:30:02.550 INFO:tasks.workunit.client.0.vm06.stdout:3/915: creat dd/d19/d1e/f136 x:0 0 0
2026-03-09T17:30:02.550 INFO:tasks.workunit.client.0.vm06.stdout:3/916: dread - dd/d19/d1e/f136 zero size
2026-03-09T17:30:02.560 INFO:tasks.workunit.client.0.vm06.stdout:3/917: mknod dd/d19/d25/d2d/c137 0
2026-03-09T17:30:02.567 INFO:tasks.workunit.client.0.vm06.stdout:2/765: dread d3/d4/d46/da5/fa8 [0,4194304] 0
2026-03-09T17:30:02.567 INFO:tasks.workunit.client.0.vm06.stdout:0/998: sync
2026-03-09T17:30:02.570 INFO:tasks.workunit.client.0.vm06.stdout:2/766: mknod d3/d4/d12/da7/cf2 0
2026-03-09T17:30:02.571 INFO:tasks.workunit.client.0.vm06.stdout:0/999: unlink d7/d11/d19/d8b/da4/d85/f12b 0
2026-03-09T17:30:02.572 INFO:tasks.workunit.client.0.vm06.stdout:2/767: mkdir d3/d4/d12/d2b/db0/df3 0
2026-03-09T17:30:02.575 INFO:tasks.workunit.client.1.vm09.stdout:7/54: symlink da/l13 0
2026-03-09T17:30:02.577 INFO:tasks.workunit.client.0.vm06.stdout:2/768: link d3/d4/d12/d2b/d2d/f2a d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/ff4 0
2026-03-09T17:30:02.579 INFO:tasks.workunit.client.1.vm09.stdout:5/65: rename d0/de/f11 to d0/d9/f13 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.0.vm06.stdout:2/769: creat d3/d4/d22/d72/ff5 x:0 0 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:1/84: mknod d9/dc/dd/c16 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:6/26: dwrite f2 [0,4194304] 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:1/85: readlink l7 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:8/36: creat d1/da/fb x:0 0 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:7/55: rename da/l13 to da/l14 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:8/37: dread - d1/f7 zero size
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:0/81: link l2 d6/de/l14 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:0/82: stat d6/f9 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:9/40: dread f2 [0,4194304] 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:8/38: chown l0 3 1
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:1/86: write f2 [448454,25723] 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:7/56: creat da/f15 x:0 0 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:7/57: dread - da/f15 zero size
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:9/41: read f2 [2397377,122899] 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:2/63: creat d13/f1a x:0 0 0
2026-03-09T17:30:02.588 INFO:tasks.workunit.client.1.vm09.stdout:6/27: symlink d3/l6 0
2026-03-09T17:30:02.589 INFO:tasks.workunit.client.1.vm09.stdout:0/83: unlink d6/cb 0
2026-03-09T17:30:02.589 INFO:tasks.workunit.client.1.vm09.stdout:8/39: rename d1/da/fb to d1/da/fc 0
2026-03-09T17:30:02.589 INFO:tasks.workunit.client.1.vm09.stdout:8/40: fdatasync d1/f5 0
2026-03-09T17:30:02.589 INFO:tasks.workunit.client.1.vm09.stdout:6/28: chown f2 84033 1
2026-03-09T17:30:02.591 INFO:tasks.workunit.client.1.vm09.stdout:7/58: creat da/f16 x:0 0 0
2026-03-09T17:30:02.594 INFO:tasks.workunit.client.1.vm09.stdout:1/87: truncate f3 2963755 0
2026-03-09T17:30:02.595 INFO:tasks.workunit.client.1.vm09.stdout:8/41: mkdir d1/da/dd 0
2026-03-09T17:30:02.595 INFO:tasks.workunit.client.1.vm09.stdout:0/84: sync
2026-03-09T17:30:02.598 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:02 vm06.local ceph-mon[57307]: pgmap v164: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 32 MiB/s rd, 77 MiB/s wr, 286 op/s
2026-03-09T17:30:02.599 INFO:tasks.workunit.client.1.vm09.stdout:9/42: dwrite f2 [0,4194304] 0
2026-03-09T17:30:02.599 INFO:tasks.workunit.client.1.vm09.stdout:9/43: write f2 [1430223,102099] 0
2026-03-09T17:30:02.607 INFO:tasks.workunit.client.1.vm09.stdout:7/59: symlink da/d11/l17 0
2026-03-09T17:30:02.607 INFO:tasks.workunit.client.1.vm09.stdout:1/88: creat d9/d10/f17 x:0 0 0
2026-03-09T17:30:02.607 INFO:tasks.workunit.client.1.vm09.stdout:8/42: mknod d1/da/dd/ce 0
2026-03-09T17:30:02.607 INFO:tasks.workunit.client.1.vm09.stdout:1/89: stat d9/dc 0
2026-03-09T17:30:02.608 INFO:tasks.workunit.client.1.vm09.stdout:1/90: chown d9/d10/f17 1017 1
2026-03-09T17:30:02.608 INFO:tasks.workunit.client.1.vm09.stdout:8/43: write d1/f8 [23912,26281] 0
2026-03-09T17:30:02.609 INFO:tasks.workunit.client.1.vm09.stdout:7/60: unlink l1 0
2026-03-09T17:30:02.613 INFO:tasks.workunit.client.1.vm09.stdout:7/61: dwrite da/ff [0,4194304] 0
2026-03-09T17:30:02.613 INFO:tasks.workunit.client.1.vm09.stdout:7/62: write da/fb [1482196,105827] 0
2026-03-09T17:30:02.616 INFO:tasks.workunit.client.1.vm09.stdout:7/63: rename da to da/d18 22
2026-03-09T17:30:02.617 INFO:tasks.workunit.client.1.vm09.stdout:1/91: symlink d9/dc/l18 0
2026-03-09T17:30:02.618 INFO:tasks.workunit.client.1.vm09.stdout:1/92: truncate d9/dc/dd/fe 547623 0
2026-03-09T17:30:02.618 INFO:tasks.workunit.client.1.vm09.stdout:8/44: symlink d1/da/dd/lf 0
2026-03-09T17:30:02.619 INFO:tasks.workunit.client.1.vm09.stdout:8/45: dread - d1/f7 zero size
2026-03-09T17:30:02.620 INFO:tasks.workunit.client.1.vm09.stdout:8/46: write d1/f8 [992107,18798] 0
2026-03-09T17:30:02.620 INFO:tasks.workunit.client.1.vm09.stdout:7/64: creat da/d11/f19 x:0 0 0
2026-03-09T17:30:02.621 INFO:tasks.workunit.client.1.vm09.stdout:7/65: write da/f10 [496379,113625] 0
2026-03-09T17:30:02.621 INFO:tasks.workunit.client.1.vm09.stdout:0/85: link c4 d6/c15 0
2026-03-09T17:30:02.625 INFO:tasks.workunit.client.1.vm09.stdout:7/66: creat da/d11/f1a x:0 0 0
2026-03-09T17:30:02.628 INFO:tasks.workunit.client.1.vm09.stdout:8/47: link d1/f8 d1/da/dd/f10 0
2026-03-09T17:30:02.628 INFO:tasks.workunit.client.1.vm09.stdout:8/48: dread - d1/da/fc zero size
2026-03-09T17:30:02.629 INFO:tasks.workunit.client.1.vm09.stdout:7/67: mknod da/c1b 0
2026-03-09T17:30:02.629 INFO:tasks.workunit.client.1.vm09.stdout:0/86: dread d6/f8 [0,4194304] 0
2026-03-09T17:30:02.630 INFO:tasks.workunit.client.1.vm09.stdout:7/68: write da/f10 [739255,113609] 0
2026-03-09T17:30:02.630 INFO:tasks.workunit.client.1.vm09.stdout:8/49: write d1/da/dd/f10 [1071102,11489] 0
2026-03-09T17:30:02.634 INFO:tasks.workunit.client.1.vm09.stdout:7/69: creat da/f1c x:0 0 0
2026-03-09T17:30:02.634 INFO:tasks.workunit.client.1.vm09.stdout:8/50: mknod d1/c11 0
2026-03-09T17:30:02.635 INFO:tasks.workunit.client.1.vm09.stdout:8/51: creat d1/da/f12 x:0 0 0
2026-03-09T17:30:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:02 vm09.local ceph-mon[62061]: pgmap v164: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 32 MiB/s rd, 77 MiB/s wr, 286 op/s
2026-03-09T17:30:02.696 INFO:tasks.workunit.client.0.vm06.stdout:5/850: dwrite d4/d50/d35/d40/d95/db8/dda/fdd [0,4194304] 0
2026-03-09T17:30:02.704 INFO:tasks.workunit.client.1.vm09.stdout:8/52: dread d1/f8 [0,4194304] 0
2026-03-09T17:30:02.708 INFO:tasks.workunit.client.1.vm09.stdout:8/53: mkdir d1/da/d13 0
2026-03-09T17:30:02.708 INFO:tasks.workunit.client.0.vm06.stdout:5/851: link d4/d50/d18/l34 d4/dbb/d127/l132 0
2026-03-09T17:30:02.711 INFO:tasks.workunit.client.1.vm09.stdout:8/54: mkdir d1/d14 0
2026-03-09T17:30:02.718 INFO:tasks.workunit.client.0.vm06.stdout:5/852: mknod d4/dbb/d127/c133 0
2026-03-09T17:30:02.723 INFO:tasks.workunit.client.1.vm09.stdout:8/55: symlink d1/da/dd/l15 0
2026-03-09T17:30:02.738 INFO:tasks.workunit.client.0.vm06.stdout:5/853: symlink d4/d50/d35/d40/d96/dfe/l134 0
2026-03-09T17:30:02.739 INFO:tasks.workunit.client.1.vm09.stdout:8/56: dwrite d1/da/dd/f10 [0,4194304] 0
2026-03-09T17:30:02.739 INFO:tasks.workunit.client.1.vm09.stdout:8/57: unlink d1/f5 0
2026-03-09T17:30:02.742 INFO:tasks.workunit.client.1.vm09.stdout:8/58: dwrite d1/f7 [0,4194304] 0
2026-03-09T17:30:02.755 INFO:tasks.workunit.client.1.vm09.stdout:8/59: unlink d1/c9 0
2026-03-09T17:30:02.757 INFO:tasks.workunit.client.1.vm09.stdout:8/60: creat d1/f16 x:0 0 0
2026-03-09T17:30:02.758 INFO:tasks.workunit.client.1.vm09.stdout:8/61: truncate d1/f16 649487 0
2026-03-09T17:30:02.760 INFO:tasks.workunit.client.1.vm09.stdout:8/62: dread d1/f7 [0,4194304] 0
2026-03-09T17:30:02.763 INFO:tasks.workunit.client.1.vm09.stdout:8/63: rmdir d1/da/dd 39
2026-03-09T17:30:02.767 INFO:tasks.workunit.client.1.vm09.stdout:8/64: mknod d1/c17 0
2026-03-09T17:30:02.793 INFO:tasks.workunit.client.1.vm09.stdout:4/64: getdents d11 0
2026-03-09T17:30:02.794 INFO:tasks.workunit.client.1.vm09.stdout:4/65: rmdir d11 39
2026-03-09T17:30:02.794 INFO:tasks.workunit.client.1.vm09.stdout:4/66: write fe [1368861,30202] 0
2026-03-09T17:30:02.823 INFO:tasks.workunit.client.0.vm06.stdout:4/921: dwrite db/d59/d90/f12f [0,4194304] 0
2026-03-09T17:30:02.829 INFO:tasks.workunit.client.0.vm06.stdout:4/922: dread db/f55 [0,4194304] 0
2026-03-09T17:30:02.829 INFO:tasks.workunit.client.0.vm06.stdout:4/923: chown db/d1d/d21/d44/d8a/la4 2808635 1
2026-03-09T17:30:02.837 INFO:tasks.workunit.client.0.vm06.stdout:4/924: symlink db/d1d/d21/d37/d69/d78/da0/db6/d12c/l145 0
2026-03-09T17:30:02.837 INFO:tasks.workunit.client.0.vm06.stdout:4/925: write db/d1d/d21/d44/fb7 [2679286,7273] 0
2026-03-09T17:30:02.838 INFO:tasks.workunit.client.0.vm06.stdout:4/926: chown db/d1d/d21/d37/d69/d78/feb 3571 1
2026-03-09T17:30:02.838 INFO:tasks.workunit.client.0.vm06.stdout:4/927: chown db/d1d/d21/d44/d8a/fa7 12297 1
2026-03-09T17:30:02.839 INFO:tasks.workunit.client.0.vm06.stdout:4/928: chown db/d59/d5f/d45/d10a/dba 32 1
2026-03-09T17:30:02.843 INFO:tasks.workunit.client.0.vm06.stdout:4/929: truncate db/d59/d5f/d6d/ddb/f115 836987 0
2026-03-09T17:30:02.847 INFO:tasks.workunit.client.0.vm06.stdout:4/930: symlink db/d1d/d21/d108/l146 0
2026-03-09T17:30:02.847 INFO:tasks.workunit.client.0.vm06.stdout:4/931: mkdir db/d1d/d21/d37/d69/d78/db4/d147 0
2026-03-09T17:30:02.854 INFO:tasks.workunit.client.1.vm09.stdout:5/66: getdents d0/d2 0
2026-03-09T17:30:02.856 INFO:tasks.workunit.client.1.vm09.stdout:5/67: symlink d0/l14 0
2026-03-09T17:30:02.859 INFO:tasks.workunit.client.1.vm09.stdout:5/68: mkdir d0/d2/d15 0
2026-03-09T17:30:02.859 INFO:tasks.workunit.client.0.vm06.stdout:6/779: truncate d6/d47/d96/d40/fb4 1848086 0
2026-03-09T17:30:02.860 INFO:tasks.workunit.client.0.vm06.stdout:6/780: truncate d6/d12/d53/dd0/fe3 518849 0
2026-03-09T17:30:02.865 INFO:tasks.workunit.client.1.vm09.stdout:5/69: mkdir d0/d9/d16 0
2026-03-09T17:30:02.865 INFO:tasks.workunit.client.0.vm06.stdout:9/942: write d3/d6d/d9a/d9c/fdf [4336774,72100] 0
2026-03-09T17:30:02.875 INFO:tasks.workunit.client.0.vm06.stdout:8/846: dwrite d15/d16/f66 [0,4194304] 0
2026-03-09T17:30:02.882 INFO:tasks.workunit.client.1.vm09.stdout:5/70: chown d0/c10 20721356 1
2026-03-09T17:30:02.882 INFO:tasks.workunit.client.1.vm09.stdout:5/71: chown d0/d2/f12 9696280 1
2026-03-09T17:30:02.885 INFO:tasks.workunit.client.0.vm06.stdout:8/847: creat d15/d39/d67/d77/d97/f117 x:0 0 0
2026-03-09T17:30:02.888 INFO:tasks.workunit.client.1.vm09.stdout:5/72: mkdir d0/de/d17 0
2026-03-09T17:30:02.889 INFO:tasks.workunit.client.0.vm06.stdout:1/844: write d11/d14/d1d/d1e/d2a/d34/d64/fea [373283,4942] 0
2026-03-09T17:30:02.893 INFO:tasks.workunit.client.0.vm06.stdout:8/848: mkdir d15/d39/d67/d77/d97/dac/dcb/d118 0
2026-03-09T17:30:02.900 INFO:tasks.workunit.client.0.vm06.stdout:1/845: creat d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/f11d x:0 0 0
2026-03-09T17:30:02.906 INFO:tasks.workunit.client.0.vm06.stdout:6/781: getdents d6/d12/d53/d91 0
2026-03-09T17:30:02.909 INFO:tasks.workunit.client.1.vm09.stdout:5/73: rmdir d0/d2/d8 0
2026-03-09T17:30:02.915 INFO:tasks.workunit.client.1.vm09.stdout:5/74: symlink d0/dc/l18 0
2026-03-09T17:30:02.918 INFO:tasks.workunit.client.0.vm06.stdout:1/846: rename d11/d14/d1d/d1e/l71 to d11/d14/d1d/d4a/df7/d106/d112/d114/l11e 0
2026-03-09T17:30:02.921 INFO:tasks.workunit.client.0.vm06.stdout:8/849: link d15/d31/d58/d9b/fb1 d15/d31/dc5/df1/d3d/d5f/d83/f119 0
2026-03-09T17:30:02.926 INFO:tasks.workunit.client.0.vm06.stdout:1/847: truncate d11/d14/d1d/d42/d46/f55 207405 0
2026-03-09T17:30:02.927 INFO:tasks.workunit.client.0.vm06.stdout:1/848: stat d11/d14/d1d/d1e/d2a/d99/ff3 0
2026-03-09T17:30:02.929 INFO:tasks.workunit.client.1.vm09.stdout:5/75: mknod d0/d9/d16/c19 0
2026-03-09T17:30:02.929 INFO:tasks.workunit.client.0.vm06.stdout:8/850: truncate d15/d16/f51 5739927 0
2026-03-09T17:30:02.931 INFO:tasks.workunit.client.1.vm09.stdout:5/76: dread d0/d9/f13 [0,4194304] 0
2026-03-09T17:30:02.931 INFO:tasks.workunit.client.1.vm09.stdout:5/77: readlink d0/dc/l18 0
2026-03-09T17:30:02.931 INFO:tasks.workunit.client.1.vm09.stdout:5/78: readlink d0/l5 0
2026-03-09T17:30:02.932 INFO:tasks.workunit.client.1.vm09.stdout:5/79: write d0/ff [973039,27243] 0
2026-03-09T17:30:02.932 INFO:tasks.workunit.client.1.vm09.stdout:5/80: readlink d0/l3 0
2026-03-09T17:30:02.933 INFO:tasks.workunit.client.0.vm06.stdout:3/918: dwrite dd/d1d/d6e/d70/fbd [0,4194304] 0
2026-03-09T17:30:02.934 INFO:tasks.workunit.client.1.vm09.stdout:5/81: dread d0/d9/f13 [0,4194304] 0
2026-03-09T17:30:02.942 INFO:tasks.workunit.client.0.vm06.stdout:8/851: fdatasync d15/d16/d1a/d47/faf 0
2026-03-09T17:30:02.947 INFO:tasks.workunit.client.1.vm09.stdout:5/82: fsync d0/d2/f12 0
2026-03-09T17:30:02.948 INFO:tasks.workunit.client.0.vm06.stdout:3/919: symlink dd/d19/d25/d2d/d9b/df1/l138 0
2026-03-09T17:30:02.949 INFO:tasks.workunit.client.0.vm06.stdout:1/849: dread d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0
2026-03-09T17:30:02.964 INFO:tasks.workunit.client.0.vm06.stdout:8/852: read d15/d31/dc5/df1/d71/ffb [3776418,9417] 0
2026-03-09T17:30:02.968 INFO:tasks.workunit.client.0.vm06.stdout:8/853: truncate d15/d16/d1e/d30/d55/def/df3/f102 2463980 0
2026-03-09T17:30:02.969 INFO:tasks.workunit.client.0.vm06.stdout:8/854: chown d15/d16/l1c 131 1
2026-03-09T17:30:02.992 INFO:tasks.workunit.client.1.vm09.stdout:3/39: truncate f1 1393866 0
2026-03-09T17:30:02.995 INFO:tasks.workunit.client.1.vm09.stdout:0/87: rmdir d6/de 39
2026-03-09T17:30:03.008 INFO:tasks.workunit.client.1.vm09.stdout:5/83: rename d0/d9/f13 to d0/de/f1a 0
2026-03-09T17:30:03.010 INFO:tasks.workunit.client.1.vm09.stdout:9/44: rename d5/c8 to d5/de/c10 0
2026-03-09T17:30:03.012 INFO:tasks.workunit.client.1.vm09.stdout:9/45: dread f2 [0,4194304] 0
2026-03-09T17:30:03.013 INFO:tasks.workunit.client.1.vm09.stdout:0/88: sync
2026-03-09T17:30:03.013 INFO:tasks.workunit.client.1.vm09.stdout:0/89: fsync d6/f9 0
2026-03-09T17:30:03.014 INFO:tasks.workunit.client.0.vm06.stdout:2/770: dwrite d3/f3b [0,4194304] 0
2026-03-09T17:30:03.015 INFO:tasks.workunit.client.1.vm09.stdout:9/46: dread f2 [0,4194304] 0
2026-03-09T17:30:03.026 INFO:tasks.workunit.client.0.vm06.stdout:2/771: mknod d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/cf6 0
2026-03-09T17:30:03.027 INFO:tasks.workunit.client.0.vm06.stdout:2/772: symlink d3/d4/d12/d2b/d9f/lf7 0
2026-03-09T17:30:03.028 INFO:tasks.workunit.client.1.vm09.stdout:0/90: creat d6/f16 x:0 0 0
2026-03-09T17:30:03.029 INFO:tasks.workunit.client.1.vm09.stdout:9/47: creat d5/f11 x:0 0 0
2026-03-09T17:30:03.030 INFO:tasks.workunit.client.0.vm06.stdout:2/773: creat d3/d4/d12/da7/de3/ff8 x:0 0 0
2026-03-09T17:30:03.032 INFO:tasks.workunit.client.1.vm09.stdout:9/48: dwrite d5/f11 [0,4194304] 0
2026-03-09T17:30:03.039 INFO:tasks.workunit.client.0.vm06.stdout:2/774: link d3/l61 d3/d4/d12/d2b/d36/lf9 0
2026-03-09T17:30:03.039 INFO:tasks.workunit.client.1.vm09.stdout:9/49: chown c1 664598 1
2026-03-09T17:30:03.039 INFO:tasks.workunit.client.1.vm09.stdout:0/91: dwrite d6/f7 [0,4194304] 0
2026-03-09T17:30:03.053 INFO:tasks.workunit.client.1.vm09.stdout:6/29: getdents d3 0
2026-03-09T17:30:03.056 INFO:tasks.workunit.client.1.vm09.stdout:9/50: dwrite f2 [0,4194304] 0
2026-03-09T17:30:03.057 INFO:tasks.workunit.client.1.vm09.stdout:6/30: mkdir d3/d7 0
2026-03-09T17:30:03.057 INFO:tasks.workunit.client.1.vm09.stdout:6/31: read f2 [4004571,69894] 0
2026-03-09T17:30:03.059 INFO:tasks.workunit.client.1.vm09.stdout:0/92: rename d6/l11 to d6/de/l17 0
2026-03-09T17:30:03.065 INFO:tasks.workunit.client.1.vm09.stdout:2/64: truncate f8 1955848 0
2026-03-09T17:30:03.065 INFO:tasks.workunit.client.1.vm09.stdout:2/65: chown d13/f14 3 1
2026-03-09T17:30:03.066 INFO:tasks.workunit.client.1.vm09.stdout:8/65: symlink d1/da/dd/l18 0
2026-03-09T17:30:03.067 INFO:tasks.workunit.client.1.vm09.stdout:8/66: chown l0 1783869 1
2026-03-09T17:30:03.069 INFO:tasks.workunit.client.1.vm09.stdout:2/66: mknod d13/c1b 0
2026-03-09T17:30:03.069 INFO:tasks.workunit.client.1.vm09.stdout:6/32: link d3/l6 d3/d7/l8 0
2026-03-09T17:30:03.069 INFO:tasks.workunit.client.1.vm09.stdout:2/67: dread - d13/d15/f18 zero size
2026-03-09T17:30:03.071 INFO:tasks.workunit.client.1.vm09.stdout:8/67: dread d1/f3 [0,4194304] 0
2026-03-09T17:30:03.071 INFO:tasks.workunit.client.1.vm09.stdout:2/68: fdatasync f0 0
2026-03-09T17:30:03.073 INFO:tasks.workunit.client.1.vm09.stdout:6/33: dread f2 [0,4194304] 0
2026-03-09T17:30:03.073 INFO:tasks.workunit.client.1.vm09.stdout:9/51: link d5/lc d5/l12 0
2026-03-09T17:30:03.073 INFO:tasks.workunit.client.1.vm09.stdout:7/70: fsync da/f1c 0
2026-03-09T17:30:03.076 INFO:tasks.workunit.client.1.vm09.stdout:6/34: dread f2 [0,4194304] 0
2026-03-09T17:30:03.077 INFO:tasks.workunit.client.1.vm09.stdout:9/52: chown d5/lb 1989006487 1
2026-03-09T17:30:03.077 INFO:tasks.workunit.client.1.vm09.stdout:0/93: sync
2026-03-09T17:30:03.077 INFO:tasks.workunit.client.1.vm09.stdout:2/69: dread f0 [0,4194304] 0
2026-03-09T17:30:03.078 INFO:tasks.workunit.client.0.vm06.stdout:5/854: dwrite d4/d50/fd7 [0,4194304] 0
2026-03-09T17:30:03.082 INFO:tasks.workunit.client.1.vm09.stdout:8/68: dwrite d1/da/f12 [0,4194304] 0
2026-03-09T17:30:03.085 INFO:tasks.workunit.client.1.vm09.stdout:2/70: dread f9 [0,4194304] 0
2026-03-09T17:30:03.087 INFO:tasks.workunit.client.1.vm09.stdout:8/69: dread d1/f7 [0,4194304] 0
2026-03-09T17:30:03.095 INFO:tasks.workunit.client.0.vm06.stdout:5/855: symlink d4/d50/d35/d40/l135 0
2026-03-09T17:30:03.100 INFO:tasks.workunit.client.1.vm09.stdout:7/71: rename da/ce to da/c1d 0
2026-03-09T17:30:03.103 INFO:tasks.workunit.client.1.vm09.stdout:7/72: readlink da/d11/l17 0
2026-03-09T17:30:03.107 INFO:tasks.workunit.client.1.vm09.stdout:4/67: write f2 [1795917,104686] 0
2026-03-09T17:30:03.108 INFO:tasks.workunit.client.1.vm09.stdout:7/73: creat da/d11/f1e x:0 0 0
2026-03-09T17:30:03.108 INFO:tasks.workunit.client.1.vm09.stdout:7/74: dread - da/f16 zero size
2026-03-09T17:30:03.112 INFO:tasks.workunit.client.1.vm09.stdout:7/75: dwrite da/f15 [0,4194304] 0
2026-03-09T17:30:03.117 INFO:tasks.workunit.client.0.vm06.stdout:4/932: write fa [1767541,93701] 0
2026-03-09T17:30:03.117 INFO:tasks.workunit.client.1.vm09.stdout:7/76: stat da/d11/f1a 0
2026-03-09T17:30:03.117 INFO:tasks.workunit.client.1.vm09.stdout:7/77: dwrite da/d11/f1a [0,4194304] 0
2026-03-09T17:30:03.118 INFO:tasks.workunit.client.1.vm09.stdout:7/78: fsync da/f16 0
2026-03-09T17:30:03.128 INFO:tasks.workunit.client.0.vm06.stdout:5/856: dread d4/d52/f8f [0,4194304] 0
2026-03-09T17:30:03.135 INFO:tasks.workunit.client.0.vm06.stdout:4/933: truncate db/d59/d5f/d45/d10a/dcc/de0/f104 3014786 0
2026-03-09T17:30:03.137 INFO:tasks.workunit.client.1.vm09.stdout:9/53: creat d5/f13 x:0 0 0
2026-03-09T17:30:03.144 INFO:tasks.workunit.client.0.vm06.stdout:9/943: dwrite d3/d11/d65/faa [0,4194304] 0
2026-03-09T17:30:03.153 INFO:tasks.workunit.client.0.vm06.stdout:4/934: dread db/d59/d5f/d6d/dfc/f140 [0,4194304] 0
2026-03-09T17:30:03.162 INFO:tasks.workunit.client.1.vm09.stdout:4/68: rmdir d11 39
2026-03-09T17:30:03.166 INFO:tasks.workunit.client.0.vm06.stdout:9/944: mkdir d3/d26/d12f 0
2026-03-09T17:30:03.167 INFO:tasks.workunit.client.1.vm09.stdout:8/70: link d1/da/dd/l18 d1/l19 0
2026-03-09T17:30:03.169 INFO:tasks.workunit.client.1.vm09.stdout:8/71: dread d1/f16 [0,4194304] 0
2026-03-09T17:30:03.169 INFO:tasks.workunit.client.0.vm06.stdout:6/782: dwrite d6/d4f/d3e/d52/d8c/db0/fdb [0,4194304] 0
2026-03-09T17:30:03.177 INFO:tasks.workunit.client.0.vm06.stdout:4/935: mknod db/d1d/d21/d88/dfd/c148 0
2026-03-09T17:30:03.184 INFO:tasks.workunit.client.0.vm06.stdout:9/945: truncate d3/d15/d16/f105 53400 0
2026-03-09T17:30:03.190 INFO:tasks.workunit.client.0.vm06.stdout:4/936: dread - db/d1d/d21/d37/d69/d78/feb zero size
2026-03-09T17:30:03.190 INFO:tasks.workunit.client.0.vm06.stdout:3/920: write dd/d19/d28/feb [1837970,1133] 0
2026-03-09T17:30:03.192 INFO:tasks.workunit.client.0.vm06.stdout:1/850: write d11/d14/d1d/d1e/d2a/d99/ff3 [882493,11807] 0
2026-03-09T17:30:03.198 INFO:tasks.workunit.client.1.vm09.stdout:4/69: mknod d11/c1b 0
2026-03-09T17:30:03.198 INFO:tasks.workunit.client.0.vm06.stdout:8/855: write d15/f7d [1317101,35135] 0
2026-03-09T17:30:03.202 INFO:tasks.workunit.client.1.vm09.stdout:3/40: write f3 [11384,121867] 0
2026-03-09T17:30:03.204 INFO:tasks.workunit.client.0.vm06.stdout:4/937: dread db/d59/d5f/d6d/ddb/f115 [0,4194304] 0
2026-03-09T17:30:03.209 INFO:tasks.workunit.client.1.vm09.stdout:5/84: fsync d0/de/f1a 0
2026-03-09T17:30:03.209 INFO:tasks.workunit.client.1.vm09.stdout:9/54: creat d5/f14 x:0 0 0
2026-03-09T17:30:03.211 INFO:tasks.workunit.client.0.vm06.stdout:3/921: creat dd/d19/d25/d44/d12f/f139 x:0 0 0
2026-03-09T17:30:03.211 INFO:tasks.workunit.client.1.vm09.stdout:5/85: dread d0/ff [0,4194304] 0
2026-03-09T17:30:03.212 INFO:tasks.workunit.client.1.vm09.stdout:5/86: write d0/de/f1a [9090689,112403] 0
2026-03-09T17:30:03.212 INFO:tasks.workunit.client.1.vm09.stdout:4/70: creat d11/f1c x:0 0 0
2026-03-09T17:30:03.212 INFO:tasks.workunit.client.0.vm06.stdout:1/851: creat d11/d14/d1d/d42/d46/d92/dc0/ddf/f11f x:0 0 0
2026-03-09T17:30:03.213 INFO:tasks.workunit.client.1.vm09.stdout:4/71: chown d11/f16 2558 1
2026-03-09T17:30:03.213 INFO:tasks.workunit.client.1.vm09.stdout:5/87: write d0/ff [1284058,26739] 0
2026-03-09T17:30:03.220 INFO:tasks.workunit.client.1.vm09.stdout:3/41: fdatasync f1 0
2026-03-09T17:30:03.223 INFO:tasks.workunit.client.0.vm06.stdout:6/783: creat d6/d4f/fee x:0 0 0
2026-03-09T17:30:03.225 INFO:tasks.workunit.client.1.vm09.stdout:9/55: readlink d5/lc 0
2026-03-09T17:30:03.225 INFO:tasks.workunit.client.1.vm09.stdout:9/56: truncate d5/f14 450957 0
2026-03-09T17:30:03.226 INFO:tasks.workunit.client.0.vm06.stdout:2/775: write f2 [525794,71623] 0
2026-03-09T17:30:03.227 INFO:tasks.workunit.client.0.vm06.stdout:2/776: readlink d3/d4/d12/d2b/d36/l3e 0
2026-03-09T17:30:03.231 INFO:tasks.workunit.client.1.vm09.stdout:6/35: getdents d3 0
2026-03-09T17:30:03.233 INFO:tasks.workunit.client.1.vm09.stdout:6/36: dread f2 [0,4194304] 0
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.1.vm09.stdout:4/72: creat d11/f1d x:0 0 0
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.1.vm09.stdout:6/37: chown d3/c5 101463 1
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.1.vm09.stdout:4/73: chown d11/c1b 28 1
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.0.vm06.stdout:4/938: dread db/d1d/d21/f2f [0,4194304] 0
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.1.vm09.stdout:1/93: write f3 [3735134,14783] 0
2026-03-09T17:30:03.234 INFO:tasks.workunit.client.1.vm09.stdout:4/74: dread - d11/f15 zero size
2026-03-09T17:30:03.235 INFO:tasks.workunit.client.1.vm09.stdout:1/94: dread - d9/d10/f17 zero size
2026-03-09T17:30:03.237 INFO:tasks.workunit.client.1.vm09.stdout:4/75: dread f2 [4194304,4194304] 0
2026-03-09T17:30:03.238 INFO:tasks.workunit.client.1.vm09.stdout:4/76: write d11/f15 [917468,93318] 0
2026-03-09T17:30:03.238 INFO:tasks.workunit.client.0.vm06.stdout:3/922: symlink dd/d81/da3/dae/d110/l13a 0
2026-03-09T17:30:03.244 INFO:tasks.workunit.client.0.vm06.stdout:3/923: dread dd/d19/d25/d44/d80/dd7/fe6 [0,4194304] 0
2026-03-09T17:30:03.245 INFO:tasks.workunit.client.0.vm06.stdout:1/852: mkdir d11/d14/d1d/d1e/d2a/d34/d64/df6/d120 0
2026-03-09T17:30:03.249 INFO:tasks.workunit.client.1.vm09.stdout:8/72: getdents d1/da/dd 0
2026-03-09T17:30:03.250 INFO:tasks.workunit.client.1.vm09.stdout:8/73: write d1/da/dd/f10 [587374,13221] 0
2026-03-09T17:30:03.250 INFO:tasks.workunit.client.1.vm09.stdout:8/74: chown d1/f8 13930267 1
2026-03-09T17:30:03.254 INFO:tasks.workunit.client.0.vm06.stdout:2/777: rmdir d3/d4/d46/da5 39
2026-03-09T17:30:03.260 INFO:tasks.workunit.client.1.vm09.stdout:7/79: fsync da/d11/f1e 0
2026-03-09T17:30:03.261 INFO:tasks.workunit.client.1.vm09.stdout:0/94: truncate d6/f7 2290141 0
2026-03-09T17:30:03.261 INFO:tasks.workunit.client.1.vm09.stdout:7/80: dread - da/f1c zero size
2026-03-09T17:30:03.261 INFO:tasks.workunit.client.1.vm09.stdout:2/71: truncate fb 167941 0
2026-03-09T17:30:03.262 INFO:tasks.workunit.client.1.vm09.stdout:2/72: chown l10 48502276 1
2026-03-09T17:30:03.262 INFO:tasks.workunit.client.1.vm09.stdout:2/73: stat d13/d15/f18 0
2026-03-09T17:30:03.263 INFO:tasks.workunit.client.1.vm09.stdout:7/81: dread f3 [0,4194304] 0
2026-03-09T17:30:03.263 INFO:tasks.workunit.client.1.vm09.stdout:2/74: write d13/f1a [498370,84828] 0
2026-03-09T17:30:03.268 INFO:tasks.workunit.client.1.vm09.stdout:4/77: unlink d11/f1d 0
2026-03-09T17:30:03.271 INFO:tasks.workunit.client.0.vm06.stdout:5/857: dwrite d4/d52/f8f [0,4194304] 0
2026-03-09T17:30:03.282 INFO:tasks.workunit.client.1.vm09.stdout:8/75: mknod d1/c1a 0
2026-03-09T17:30:03.282 INFO:tasks.workunit.client.1.vm09.stdout:8/76: stat d1/c1a 0
2026-03-09T17:30:03.283 INFO:tasks.workunit.client.1.vm09.stdout:9/57: mknod d5/c15 0
2026-03-09T17:30:03.283 INFO:tasks.workunit.client.1.vm09.stdout:9/58: write d5/f13 [954081,43132] 0
2026-03-09T17:30:03.285 INFO:tasks.workunit.client.0.vm06.stdout:9/946: dwrite d3/d15/d48/fb7 [0,4194304] 0
2026-03-09T17:30:03.298 INFO:tasks.workunit.client.0.vm06.stdout:8/856: dwrite d15/d16/d1e/f64 [0,4194304] 0
2026-03-09T17:30:03.307 INFO:tasks.workunit.client.0.vm06.stdout:4/939: write db/d1d/d21/d25/f80 [1397808,32451] 0
2026-03-09T17:30:03.312 INFO:tasks.workunit.client.1.vm09.stdout:7/82: creat da/d11/f1f x:0 0 0
2026-03-09T17:30:03.316 INFO:tasks.workunit.client.1.vm09.stdout:6/38: dwrite f2 [0,4194304] 0
2026-03-09T17:30:03.318 INFO:tasks.workunit.client.0.vm06.stdout:3/924: dwrite dd/f1b [0,4194304] 0
2026-03-09T17:30:03.320 INFO:tasks.workunit.client.1.vm09.stdout:2/75: unlink f0 0
2026-03-09T17:30:03.320 INFO:tasks.workunit.client.1.vm09.stdout:2/76: chown d13/d15/f18 12746 1
2026-03-09T17:30:03.320 INFO:tasks.workunit.client.1.vm09.stdout:2/77: chown d13 0 1
2026-03-09T17:30:03.327 INFO:tasks.workunit.client.0.vm06.stdout:1/853: dwrite d11/d14/d1d/d94/fc8 [0,4194304] 0
2026-03-09T17:30:03.337 INFO:tasks.workunit.client.1.vm09.stdout:4/78: fsync d11/f16 0
2026-03-09T17:30:03.337 INFO:tasks.workunit.client.1.vm09.stdout:4/79: chown d11/f16 1789971 1
2026-03-09T17:30:03.338 INFO:tasks.workunit.client.1.vm09.stdout:4/80: chown d11/f19 1939903987 1
2026-03-09T17:30:03.338 INFO:tasks.workunit.client.1.vm09.stdout:4/81: chown d11/f18 829125 1
2026-03-09T17:30:03.339 INFO:tasks.workunit.client.1.vm09.stdout:4/82: write fe [2282858,25395] 0
2026-03-09T17:30:03.339 INFO:tasks.workunit.client.1.vm09.stdout:4/83: read - d11/f18 zero size
2026-03-09T17:30:03.341 INFO:tasks.workunit.client.1.vm09.stdout:5/88: link d0/d2/lb d0/d9/d16/l1b 0
2026-03-09T17:30:03.342 INFO:tasks.workunit.client.1.vm09.stdout:4/84: dwrite f3 [0,4194304] 0
2026-03-09T17:30:03.366 INFO:tasks.workunit.client.1.vm09.stdout:7/83: chown da/l14 9734416 1
2026-03-09T17:30:03.377 INFO:tasks.workunit.client.1.vm09.stdout:1/95: truncate f8 1387146 0
2026-03-09T17:30:03.381 INFO:tasks.workunit.client.1.vm09.stdout:5/89: unlink d0/d2/f12 0
2026-03-09T17:30:03.384 INFO:tasks.workunit.client.1.vm09.stdout:3/42: getdents d5 0
2026-03-09T17:30:03.385 INFO:tasks.workunit.client.1.vm09.stdout:8/77: mkdir d1/d14/d1b 0
2026-03-09T17:30:03.391 INFO:tasks.workunit.client.1.vm09.stdout:7/84: symlink da/d11/l20 0
2026-03-09T17:30:03.392 INFO:tasks.workunit.client.1.vm09.stdout:6/39: symlink d3/d7/l9 0
2026-03-09T17:30:03.394 INFO:tasks.workunit.client.1.vm09.stdout:6/40: dread f2 [0,4194304] 0
2026-03-09T17:30:03.401 INFO:tasks.workunit.client.1.vm09.stdout:1/96: rmdir d9/dc 39
2026-03-09T17:30:03.403 INFO:tasks.workunit.client.1.vm09.stdout:4/85: unlink fc 0
2026-03-09T17:30:03.403 INFO:tasks.workunit.client.1.vm09.stdout:4/86: dread - d11/f12 zero size
2026-03-09T17:30:03.403 INFO:tasks.workunit.client.1.vm09.stdout:4/87: chown fe 32066126 1
2026-03-09T17:30:03.405 INFO:tasks.workunit.client.1.vm09.stdout:3/43: creat d5/d9/fc x:0 0 0
2026-03-09T17:30:03.406 INFO:tasks.workunit.client.1.vm09.stdout:3/44: dread - d5/d9/fc zero size
2026-03-09T17:30:03.408 INFO:tasks.workunit.client.1.vm09.stdout:4/88: dwrite d11/f18 [0,4194304] 0
2026-03-09T17:30:03.411 INFO:tasks.workunit.client.1.vm09.stdout:3/45: dwrite f4 [0,4194304] 0
2026-03-09T17:30:03.412 INFO:tasks.workunit.client.1.vm09.stdout:3/46: write d5/d6/fb [922464,38911] 0
2026-03-09T17:30:03.427 INFO:tasks.workunit.client.1.vm09.stdout:0/95: getdents d6 0
2026-03-09T17:30:03.427 INFO:tasks.workunit.client.1.vm09.stdout:1/97: sync
2026-03-09T17:30:03.428 INFO:tasks.workunit.client.1.vm09.stdout:0/96: dread d6/f9 [0,4194304] 0
2026-03-09T17:30:03.432 INFO:tasks.workunit.client.1.vm09.stdout:7/85: rename f9 to da/f21 0
2026-03-09T17:30:03.432 INFO:tasks.workunit.client.1.vm09.stdout:7/86: truncate da/f1c 339316 0
2026-03-09T17:30:03.438 INFO:tasks.workunit.client.1.vm09.stdout:6/41: mknod d3/ca 0
2026-03-09T17:30:03.441 INFO:tasks.workunit.client.1.vm09.stdout:5/90: creat d0/d2/d15/f1c x:0 0 0
2026-03-09T17:30:03.445 INFO:tasks.workunit.client.0.vm06.stdout:3/925: symlink dd/d1d/d2e/d67/def/l13b 0
2026-03-09T17:30:03.447 INFO:tasks.workunit.client.1.vm09.stdout:4/89: mkdir d11/d1e 0
2026-03-09T17:30:03.448 INFO:tasks.workunit.client.1.vm09.stdout:3/47: rmdir d5 39
2026-03-09T17:30:03.450 INFO:tasks.workunit.client.1.vm09.stdout:8/78: creat d1/d14/d1b/f1c x:0 0 0
2026-03-09T17:30:03.452 INFO:tasks.workunit.client.1.vm09.stdout:9/59: getdents d5 0
2026-03-09T17:30:03.453 INFO:tasks.workunit.client.1.vm09.stdout:8/79: dread d1/f3 [0,4194304] 0
2026-03-09T17:30:03.453 INFO:tasks.workunit.client.0.vm06.stdout:2/778: mkdir d3/d4/d12/dfa 0
2026-03-09T17:30:03.457 INFO:tasks.workunit.client.1.vm09.stdout:0/97: rename d6/f12 to d6/f18 0
2026-03-09T17:30:03.461 INFO:tasks.workunit.client.0.vm06.stdout:5/858: unlink d4/dbb/d127/l132 0
2026-03-09T17:30:03.463 INFO:tasks.workunit.client.0.vm06.stdout:5/859: read d4/d22/d64/f7d [536032,57464] 0
2026-03-09T17:30:03.474 INFO:tasks.workunit.client.0.vm06.stdout:8/857: mkdir d15/d39/d11a 0
2026-03-09T17:30:03.476 INFO:tasks.workunit.client.0.vm06.stdout:4/940: mkdir db/d1d/d21/d149 0
2026-03-09T17:30:03.478 INFO:tasks.workunit.client.1.vm09.stdout:6/42: creat d3/fb x:0 0 0
2026-03-09T17:30:03.479 INFO:tasks.workunit.client.0.vm06.stdout:1/854: symlink d11/l121 0
2026-03-09T17:30:03.481 INFO:tasks.workunit.client.0.vm06.stdout:2/779: rename d3/d4/fea to d3/d4/d12/d71/ffb 0
2026-03-09T17:30:03.487 INFO:tasks.workunit.client.1.vm09.stdout:4/90: dwrite d11/f13 [0,4194304] 0
2026-03-09T17:30:03.490 INFO:tasks.workunit.client.1.vm09.stdout:4/91: fdatasync d11/f12 0
2026-03-09T17:30:03.494 INFO:tasks.workunit.client.0.vm06.stdout:5/860: creat d4/d50/d35/d40/d109/f136 x:0 0 0
2026-03-09T17:30:03.500 INFO:tasks.workunit.client.1.vm09.stdout:2/78: read fb [53094,130382] 0
2026-03-09T17:30:03.500 INFO:tasks.workunit.client.0.vm06.stdout:4/941: fdatasync db/d1d/d21/d88/fd2 0
2026-03-09T17:30:03.500 INFO:tasks.workunit.client.0.vm06.stdout:1/855: fsync d11/d14/d1d/d42/f44 0
2026-03-09T17:30:03.501 INFO:tasks.workunit.client.1.vm09.stdout:2/79: dwrite d13/d15/f18 [0,4194304] 0
2026-03-09T17:30:03.504 INFO:tasks.workunit.client.0.vm06.stdout:6/784: dwrite d6/d4f/d3e/d52/d80/faa [0,4194304] 0
2026-03-09T17:30:03.512 INFO:tasks.workunit.client.1.vm09.stdout:0/98: stat c4 0
2026-03-09T17:30:03.515 INFO:tasks.workunit.client.1.vm09.stdout:7/87: rename c8 to da/d11/c22 0
2026-03-09T17:30:03.515 INFO:tasks.workunit.client.1.vm09.stdout:7/88: dread - da/f16 zero size
2026-03-09T17:30:03.516 INFO:tasks.workunit.client.1.vm09.stdout:7/89: dread - da/d11/f1f zero size
2026-03-09T17:30:03.520 INFO:tasks.workunit.client.0.vm06.stdout:2/780: mkdir d3/d4/d12/da7/dfc 0
2026-03-09T17:30:03.521 INFO:tasks.workunit.client.1.vm09.stdout:6/43: creat d3/fc x:0 0 0
2026-03-09T17:30:03.521 INFO:tasks.workunit.client.1.vm09.stdout:6/44: write d3/fc [1038429,46845] 0
2026-03-09T17:30:03.522 INFO:tasks.workunit.client.0.vm06.stdout:5/861: rmdir d4/d52/d112 39
2026-03-09T17:30:03.525 INFO:tasks.workunit.client.0.vm06.stdout:9/947: dwrite d3/d6d/d9a/fb4 [0,4194304] 0
2026-03-09T17:30:03.530 INFO:tasks.workunit.client.0.vm06.stdout:8/858: symlink d15/d16/l11b 0
2026-03-09T17:30:03.538 INFO:tasks.workunit.client.0.vm06.stdout:1/856: mkdir d11/d14/d1d/dd1/d122 0
2026-03-09T17:30:03.538 INFO:tasks.workunit.client.0.vm06.stdout:6/785: mknod d6/d47/d4d/d6d/cef 0
2026-03-09T17:30:03.539 INFO:tasks.workunit.client.0.vm06.stdout:2/781: rmdir d3/d4/d12/d71/daa/d77 39
2026-03-09T17:30:03.541 INFO:tasks.workunit.client.0.vm06.stdout:6/786: dread d6/d12/d53/f87 [0,4194304] 0
2026-03-09T17:30:03.547 INFO:tasks.workunit.client.0.vm06.stdout:9/948: fdatasync d3/d15/f74 0
2026-03-09T17:30:03.550 INFO:tasks.workunit.client.0.vm06.stdout:1/857: readlink d11/d14/d1d/d1e/l3c 0
2026-03-09T17:30:03.550 INFO:tasks.workunit.client.0.vm06.stdout:1/858: dread - d11/d14/d1d/d42/d46/fcd zero size
2026-03-09T17:30:03.551 INFO:tasks.workunit.client.0.vm06.stdout:1/859: stat d11/d14/d1d/d42/d46/d92/dc0/ddf/f11f 0
2026-03-09T17:30:03.552 INFO:tasks.workunit.client.0.vm06.stdout:1/860: stat d11/d14/d1d 0
2026-03-09T17:30:03.553 INFO:tasks.workunit.client.0.vm06.stdout:8/859: dwrite d15/d16/d1a/d47/f76 [0,4194304] 0
2026-03-09T17:30:03.555 INFO:tasks.workunit.client.0.vm06.stdout:8/860: write d15/d16/d1e/f4e [7879870,89035] 0
2026-03-09T17:30:03.555 INFO:tasks.workunit.client.0.vm06.stdout:3/926: sync
2026-03-09T17:30:03.556 INFO:tasks.workunit.client.0.vm06.stdout:3/927: chown dd/d19/d1e 776 1
2026-03-09T17:30:03.556 INFO:tasks.workunit.client.1.vm09.stdout:2/80: sync
2026-03-09T17:30:03.556 INFO:tasks.workunit.client.1.vm09.stdout:2/81: chown fd 52 1
2026-03-09T17:30:03.557 INFO:tasks.workunit.client.1.vm09.stdout:2/82: chown l11 1197592 1
2026-03-09T17:30:03.557 INFO:tasks.workunit.client.0.vm06.stdout:4/942: read db/d1d/d21/d25/d4b/df7/ffe [2447323,49573] 0
2026-03-09T17:30:03.583 INFO:tasks.workunit.client.0.vm06.stdout:5/862: dread d4/d22/d46/dec/f105 [0,4194304] 0
2026-03-09T17:30:03.588 INFO:tasks.workunit.client.0.vm06.stdout:8/861: dread d15/d16/d1a/d7c/f9e [0,4194304] 0
2026-03-09T17:30:03.593 INFO:tasks.workunit.client.0.vm06.stdout:3/928: creat
dd/d19/d25/d2d/d9b/f13c x:0 0 0 2026-03-09T17:30:03.597 INFO:tasks.workunit.client.0.vm06.stdout:2/782: mkdir d3/d4/d22/d72/dfd 0 2026-03-09T17:30:03.599 INFO:tasks.workunit.client.0.vm06.stdout:6/787: symlink d6/d12/lf0 0 2026-03-09T17:30:03.599 INFO:tasks.workunit.client.0.vm06.stdout:9/949: symlink d3/d15/d36/d12a/l130 0 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr fail", "who": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 
vm09.local ceph-mon[62061]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "mgr fail", "who": "vm06.pbgzei"}]': finished 2026-03-09T17:30:03.600 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:03 vm09.local ceph-mon[62061]: mgrmap e20: vm09.lqzvkh(active, starting, since 0.0910244s) 2026-03-09T17:30:03.601 INFO:tasks.workunit.client.0.vm06.stdout:8/862: truncate d15/d39/d3c/d6c/fbf 630069 0 2026-03-09T17:30:03.608 INFO:tasks.workunit.client.0.vm06.stdout:5/863: dread d4/d22/d64/df3/f102 [0,4194304] 0 2026-03-09T17:30:03.609 INFO:tasks.workunit.client.0.vm06.stdout:9/950: rename d3/d11/d65/d80/fd2 to d3/d15/d48/da8/db9/de8/d108/f131 0 2026-03-09T17:30:03.609 INFO:tasks.workunit.client.0.vm06.stdout:9/951: readlink d3/d6d/d9a/d9c/lb8 0 2026-03-09T17:30:03.610 INFO:tasks.workunit.client.0.vm06.stdout:8/863: creat d15/d16/d1a/f11c x:0 0 0 2026-03-09T17:30:03.617 INFO:tasks.workunit.client.0.vm06.stdout:3/929: getdents dd/d1d/d6e/d70/d11d 0 2026-03-09T17:30:03.617 INFO:tasks.workunit.client.0.vm06.stdout:3/930: chown dd/fdd 0 1 2026-03-09T17:30:03.618 INFO:tasks.workunit.client.0.vm06.stdout:3/931: chown dd/d19/d25/d44/c49 44414 1 2026-03-09T17:30:03.621 INFO:tasks.workunit.client.0.vm06.stdout:9/952: rmdir d3/d11 39 2026-03-09T17:30:03.622 INFO:tasks.workunit.client.0.vm06.stdout:8/864: fdatasync d15/d39/d67/d77/fc3 0 2026-03-09T17:30:03.623 INFO:tasks.workunit.client.0.vm06.stdout:4/943: getdents db/d1d/d21/d25/d4b/d85/d137 0 2026-03-09T17:30:03.625 INFO:tasks.workunit.client.0.vm06.stdout:3/932: mkdir dd/d81/d97/df5/d13d 0 2026-03-09T17:30:03.626 INFO:tasks.workunit.client.0.vm06.stdout:5/864: truncate d4/d22/d46/f58 693865 0 2026-03-09T17:30:03.635 INFO:tasks.workunit.client.0.vm06.stdout:1/861: read d11/d14/d1d/d1e/d2a/fba [729428,26840] 0 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 
2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr fail", "who": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: from='mgr.14221 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "mgr fail", "who": "vm06.pbgzei"}]': finished 2026-03-09T17:30:03.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:03 vm06.local ceph-mon[57307]: mgrmap e20: vm09.lqzvkh(active, starting, since 0.0910244s) 2026-03-09T17:30:03.643 INFO:tasks.workunit.client.0.vm06.stdout:8/865: dread - d15/d39/d3c/dd5/fde zero size 2026-03-09T17:30:03.656 
INFO:tasks.workunit.client.0.vm06.stdout:4/944: dread - db/d1d/d21/d26/d7a/fda zero size 2026-03-09T17:30:03.665 INFO:tasks.workunit.client.0.vm06.stdout:3/933: dread dd/d81/da3/fbc [0,4194304] 0 2026-03-09T17:30:03.668 INFO:tasks.workunit.client.0.vm06.stdout:1/862: mknod d11/d14/d1d/d4a/df7/c123 0 2026-03-09T17:30:03.669 INFO:tasks.workunit.client.0.vm06.stdout:8/866: mknod d15/d39/d67/d77/d97/c11d 0 2026-03-09T17:30:03.673 INFO:tasks.workunit.client.0.vm06.stdout:8/867: dread d15/f7d [0,4194304] 0 2026-03-09T17:30:03.675 INFO:tasks.workunit.client.0.vm06.stdout:4/945: symlink db/d59/d5f/d45/d10a/dcc/de0/l14a 0 2026-03-09T17:30:03.676 INFO:tasks.workunit.client.0.vm06.stdout:4/946: stat db/d1d/c1e 0 2026-03-09T17:30:03.676 INFO:tasks.workunit.client.0.vm06.stdout:6/788: dwrite d6/d47/d96/d40/fdd [0,4194304] 0 2026-03-09T17:30:03.684 INFO:tasks.workunit.client.0.vm06.stdout:2/783: truncate d3/d4/d12/f92 554814 0 2026-03-09T17:30:03.689 INFO:tasks.workunit.client.0.vm06.stdout:4/947: mkdir db/d59/d5f/d45/d10a/dcc/de0/d14b 0 2026-03-09T17:30:03.693 INFO:tasks.workunit.client.0.vm06.stdout:9/953: dwrite d3/d26/d35/f6f [0,4194304] 0 2026-03-09T17:30:03.704 INFO:tasks.workunit.client.0.vm06.stdout:9/954: dread d3/d26/f76 [0,4194304] 0 2026-03-09T17:30:03.706 INFO:tasks.workunit.client.0.vm06.stdout:6/789: creat d6/d12/d2d/db3/ff1 x:0 0 0 2026-03-09T17:30:03.709 INFO:tasks.workunit.client.0.vm06.stdout:1/863: symlink d11/d14/d1d/d1e/d2a/d34/d64/dfa/l124 0 2026-03-09T17:30:03.711 INFO:tasks.workunit.client.0.vm06.stdout:2/784: readlink d3/d4/d12/d71/daa/d77/d81/d64/laf 0 2026-03-09T17:30:03.717 INFO:tasks.workunit.client.0.vm06.stdout:4/948: fsync db/d59/d5f/d45/d10a/dba/ff1 0 2026-03-09T17:30:03.721 INFO:tasks.workunit.client.0.vm06.stdout:9/955: creat d3/d15/d36/d83/df8/d103/f132 x:0 0 0 2026-03-09T17:30:03.725 INFO:tasks.workunit.client.0.vm06.stdout:1/864: symlink d11/d14/d1d/d8c/l125 0 2026-03-09T17:30:03.732 INFO:tasks.workunit.client.0.vm06.stdout:5/865: dwrite 
d4/d22/d64/df3/f102 [4194304,4194304] 0 2026-03-09T17:30:03.733 INFO:tasks.workunit.client.0.vm06.stdout:3/934: write dd/d19/f10b [125962,104661] 0 2026-03-09T17:30:03.739 INFO:tasks.workunit.client.0.vm06.stdout:8/868: dwrite d15/d31/dc5/df1/d3d/d5f/dd4/fd6 [0,4194304] 0 2026-03-09T17:30:03.754 INFO:tasks.workunit.client.0.vm06.stdout:2/785: symlink d3/d4/d12/d2b/db0/dc1/lfe 0 2026-03-09T17:30:03.758 INFO:tasks.workunit.client.0.vm06.stdout:6/790: symlink d6/d12/d53/lf2 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.1.vm09.stdout:8/80: creat d1/da/d13/f1d x:0 0 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.1.vm09.stdout:8/81: fsync d1/da/dd/f10 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.1.vm09.stdout:8/82: read d1/f3 [3810769,13015] 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.0.vm06.stdout:3/935: mknod dd/d19/d1e/db8/df3/c13e 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.0.vm06.stdout:2/786: truncate d3/d4/d12/da7/db3/fc2 3230965 0 2026-03-09T17:30:03.773 INFO:tasks.workunit.client.0.vm06.stdout:6/791: symlink d6/d47/d96/da1/lf3 0 2026-03-09T17:30:03.775 INFO:tasks.workunit.client.1.vm09.stdout:9/60: rename d5/lb to d5/de/l16 0 2026-03-09T17:30:03.776 INFO:tasks.workunit.client.1.vm09.stdout:9/61: chown c1 133960 1 2026-03-09T17:30:03.778 INFO:tasks.workunit.client.1.vm09.stdout:9/62: dread f2 [0,4194304] 0 2026-03-09T17:30:03.778 INFO:tasks.workunit.client.1.vm09.stdout:7/90: mknod da/c23 0 2026-03-09T17:30:03.780 INFO:tasks.workunit.client.0.vm06.stdout:2/787: truncate d3/d4/d12/f2e 988548 0 2026-03-09T17:30:03.781 INFO:tasks.workunit.client.0.vm06.stdout:2/788: chown d3/d4/d12 0 1 2026-03-09T17:30:03.785 INFO:tasks.workunit.client.0.vm06.stdout:4/949: rename db/d1d/d21/d25/d4b/d85/fbe to db/d1d/d21/d37/f14c 0 2026-03-09T17:30:03.786 INFO:tasks.workunit.client.1.vm09.stdout:5/91: link d0/d9/d16/c19 d0/d2/d15/c1d 0 2026-03-09T17:30:03.790 INFO:tasks.workunit.client.0.vm06.stdout:2/789: unlink d3/d4/d22/d72/ff5 0 
2026-03-09T17:30:03.791 INFO:tasks.workunit.client.1.vm09.stdout:1/98: getdents d9 0 2026-03-09T17:30:03.794 INFO:tasks.workunit.client.0.vm06.stdout:1/865: rename d11/d14/d1d/d1e/d2a/d34/d64/fea to d11/d14/d1d/dd1/de2/f126 0 2026-03-09T17:30:03.801 INFO:tasks.workunit.client.1.vm09.stdout:3/48: rename f4 to d5/fd 0 2026-03-09T17:30:03.802 INFO:tasks.workunit.client.1.vm09.stdout:3/49: dread - d5/d9/fc zero size 2026-03-09T17:30:03.802 INFO:tasks.workunit.client.1.vm09.stdout:3/50: write f3 [231482,34287] 0 2026-03-09T17:30:03.806 INFO:tasks.workunit.client.0.vm06.stdout:9/956: rename d3/d15/d16/lde to d3/d11/d65/d124/l133 0 2026-03-09T17:30:03.808 INFO:tasks.workunit.client.0.vm06.stdout:5/866: dwrite d4/d22/d46/f82 [0,4194304] 0 2026-03-09T17:30:03.809 INFO:tasks.workunit.client.1.vm09.stdout:3/51: sync 2026-03-09T17:30:03.816 INFO:tasks.workunit.client.0.vm06.stdout:1/866: creat d11/d14/d1c/d3a/f127 x:0 0 0 2026-03-09T17:30:03.816 INFO:tasks.workunit.client.0.vm06.stdout:1/867: write d11/d14/d1d/d1e/d2a/d99/ff3 [1318670,30183] 0 2026-03-09T17:30:03.824 INFO:tasks.workunit.client.1.vm09.stdout:2/83: symlink d13/l1c 0 2026-03-09T17:30:03.824 INFO:tasks.workunit.client.1.vm09.stdout:5/92: symlink d0/d9/d16/l1e 0 2026-03-09T17:30:03.825 INFO:tasks.workunit.client.0.vm06.stdout:8/869: write d15/d31/dc5/df1/f26 [1238685,127665] 0 2026-03-09T17:30:03.831 INFO:tasks.workunit.client.1.vm09.stdout:8/83: link d1/f3 d1/da/dd/f1e 0 2026-03-09T17:30:03.831 INFO:tasks.workunit.client.1.vm09.stdout:8/84: stat d1/f8 0 2026-03-09T17:30:03.834 INFO:tasks.workunit.client.1.vm09.stdout:6/45: dwrite f2 [0,4194304] 0 2026-03-09T17:30:03.835 INFO:tasks.workunit.client.1.vm09.stdout:6/46: fsync d3/fc 0 2026-03-09T17:30:03.836 INFO:tasks.workunit.client.0.vm06.stdout:1/868: mknod d11/de0/c128 0 2026-03-09T17:30:03.837 INFO:tasks.workunit.client.1.vm09.stdout:6/47: sync 2026-03-09T17:30:03.837 INFO:tasks.workunit.client.1.vm09.stdout:6/48: fsync d3/fb 0 2026-03-09T17:30:03.838 
INFO:tasks.workunit.client.1.vm09.stdout:6/49: read - d3/fb zero size 2026-03-09T17:30:03.838 INFO:tasks.workunit.client.1.vm09.stdout:6/50: dread - d3/fb zero size 2026-03-09T17:30:03.840 INFO:tasks.workunit.client.1.vm09.stdout:6/51: sync 2026-03-09T17:30:03.841 INFO:tasks.workunit.client.0.vm06.stdout:2/790: dwrite d3/d4/d12/d2b/fef [0,4194304] 0 2026-03-09T17:30:03.856 INFO:tasks.workunit.client.0.vm06.stdout:9/957: mkdir d3/d6d/d9a/d134 0 2026-03-09T17:30:03.857 INFO:tasks.workunit.client.0.vm06.stdout:9/958: read - d3/f5f zero size 2026-03-09T17:30:03.857 INFO:tasks.workunit.client.1.vm09.stdout:4/92: rename f2 to d11/f1f 0 2026-03-09T17:30:03.857 INFO:tasks.workunit.client.1.vm09.stdout:4/93: fsync d11/f13 0 2026-03-09T17:30:03.860 INFO:tasks.workunit.client.1.vm09.stdout:3/52: fsync d5/fd 0 2026-03-09T17:30:03.868 INFO:tasks.workunit.client.1.vm09.stdout:3/53: chown d5/d9 128 1 2026-03-09T17:30:03.868 INFO:tasks.workunit.client.0.vm06.stdout:2/791: symlink d3/d4/d22/d72/d8f/dda/lff 0 2026-03-09T17:30:03.869 INFO:tasks.workunit.client.0.vm06.stdout:2/792: readlink d3/d4/d22/le6 0 2026-03-09T17:30:03.869 INFO:tasks.workunit.client.0.vm06.stdout:3/936: rename dd/d19/d28/ffe to dd/d19/f13f 0 2026-03-09T17:30:03.873 INFO:tasks.workunit.client.1.vm09.stdout:8/85: symlink d1/l1f 0 2026-03-09T17:30:03.878 INFO:tasks.workunit.client.0.vm06.stdout:3/937: dread dd/f5f [0,4194304] 0 2026-03-09T17:30:03.878 INFO:tasks.workunit.client.1.vm09.stdout:0/99: link l5 d6/de/l19 0 2026-03-09T17:30:03.881 INFO:tasks.workunit.client.0.vm06.stdout:6/792: rename d6/d12/d17/f6b to d6/d47/d4d/d9a/da2/ff4 0 2026-03-09T17:30:03.882 INFO:tasks.workunit.client.0.vm06.stdout:6/793: chown d6/d4f/l5a 74 1 2026-03-09T17:30:03.883 INFO:tasks.workunit.client.0.vm06.stdout:1/869: dread d11/d14/d1d/d42/d46/d92/dc0/fc9 [0,4194304] 0 2026-03-09T17:30:03.886 INFO:tasks.workunit.client.1.vm09.stdout:6/52: rename d3/d7/l9 to d3/d7/ld 0 2026-03-09T17:30:03.887 
INFO:tasks.workunit.client.1.vm09.stdout:6/53: dread - d3/fb zero size 2026-03-09T17:30:03.890 INFO:tasks.workunit.client.0.vm06.stdout:5/867: dwrite d4/d50/d18/f101 [0,4194304] 0 2026-03-09T17:30:03.890 INFO:tasks.workunit.client.0.vm06.stdout:5/868: chown d4/d22/d46/f82 13188 1 2026-03-09T17:30:03.896 INFO:tasks.workunit.client.1.vm09.stdout:7/91: truncate da/f10 508407 0 2026-03-09T17:30:03.897 INFO:tasks.workunit.client.0.vm06.stdout:8/870: dwrite d15/d31/d58/d9b/fb1 [0,4194304] 0 2026-03-09T17:30:03.908 INFO:tasks.workunit.client.0.vm06.stdout:9/959: symlink d3/d15/l135 0 2026-03-09T17:30:03.912 INFO:tasks.workunit.client.0.vm06.stdout:3/938: creat dd/d19/d25/d44/d12f/f140 x:0 0 0 2026-03-09T17:30:03.912 INFO:tasks.workunit.client.1.vm09.stdout:3/54: creat d5/d6/fe x:0 0 0 2026-03-09T17:30:03.912 INFO:tasks.workunit.client.1.vm09.stdout:3/55: chown d5/d9/fc 292614478 1 2026-03-09T17:30:03.912 INFO:tasks.workunit.client.1.vm09.stdout:2/84: truncate f8 723886 0 2026-03-09T17:30:03.913 INFO:tasks.workunit.client.1.vm09.stdout:2/85: dwrite d13/f1a [0,4194304] 0 2026-03-09T17:30:03.916 INFO:tasks.workunit.client.0.vm06.stdout:6/794: creat d6/d47/d4d/ff5 x:0 0 0 2026-03-09T17:30:03.922 INFO:tasks.workunit.client.1.vm09.stdout:1/99: link d9/dc/l18 d9/dc/l19 0 2026-03-09T17:30:03.925 INFO:tasks.workunit.client.0.vm06.stdout:1/870: creat d11/d14/d1d/d4a/df7/f129 x:0 0 0 2026-03-09T17:30:03.926 INFO:tasks.workunit.client.1.vm09.stdout:8/86: unlink d1/da/fc 0 2026-03-09T17:30:03.929 INFO:tasks.workunit.client.1.vm09.stdout:0/100: dread d6/f9 [0,4194304] 0 2026-03-09T17:30:03.929 INFO:tasks.workunit.client.0.vm06.stdout:5/869: mknod d4/d52/d55/c137 0 2026-03-09T17:30:03.932 INFO:tasks.workunit.client.1.vm09.stdout:6/54: creat d3/d7/fe x:0 0 0 2026-03-09T17:30:03.935 INFO:tasks.workunit.client.1.vm09.stdout:7/92: symlink da/d11/l24 0 2026-03-09T17:30:03.942 INFO:tasks.workunit.client.0.vm06.stdout:2/793: creat d3/d4/d12/d2b/d2d/f100 x:0 0 0 2026-03-09T17:30:03.942 
INFO:tasks.workunit.client.0.vm06.stdout:3/939: fdatasync dd/d81/da3/dae/fbb 0 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:7/93: chown da/d11/f1e 7265221 1 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:4/94: truncate d11/f1f 4081153 0 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:4/95: write d11/f16 [2836136,30143] 0 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:4/96: chown d11/f15 5335 1 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:3/56: chown d5/d9/la 13324496 1 2026-03-09T17:30:03.943 INFO:tasks.workunit.client.1.vm09.stdout:3/57: dwrite f1 [0,4194304] 0 2026-03-09T17:30:03.944 INFO:tasks.workunit.client.1.vm09.stdout:6/55: sync 2026-03-09T17:30:03.945 INFO:tasks.workunit.client.1.vm09.stdout:6/56: chown d3/c5 113308351 1 2026-03-09T17:30:03.945 INFO:tasks.workunit.client.1.vm09.stdout:6/57: write d3/fb [784799,39699] 0 2026-03-09T17:30:03.946 INFO:tasks.workunit.client.0.vm06.stdout:4/950: rename db/d59/d5f/c12b to db/d1d/dc4/c14d 0 2026-03-09T17:30:03.950 INFO:tasks.workunit.client.1.vm09.stdout:3/58: dwrite d5/d6/fe [0,4194304] 0 2026-03-09T17:30:03.951 INFO:tasks.workunit.client.0.vm06.stdout:1/871: symlink d11/d14/d1c/d5f/l12a 0 2026-03-09T17:30:03.952 INFO:tasks.workunit.client.1.vm09.stdout:9/63: creat d5/f17 x:0 0 0 2026-03-09T17:30:03.952 INFO:tasks.workunit.client.1.vm09.stdout:6/58: dwrite d3/d7/fe [0,4194304] 0 2026-03-09T17:30:03.965 INFO:tasks.workunit.client.1.vm09.stdout:8/87: mknod d1/da/dd/c20 0 2026-03-09T17:30:03.972 INFO:tasks.workunit.client.1.vm09.stdout:0/101: mknod d6/de/c1a 0 2026-03-09T17:30:03.983 INFO:tasks.workunit.client.1.vm09.stdout:0/102: truncate d6/de/ff 980674 0 2026-03-09T17:30:03.983 INFO:tasks.workunit.client.1.vm09.stdout:3/59: mknod d5/d9/cf 0 2026-03-09T17:30:03.983 INFO:tasks.workunit.client.1.vm09.stdout:3/60: chown d5/d6/fb 65102 1 2026-03-09T17:30:03.983 INFO:tasks.workunit.client.1.vm09.stdout:9/64: write d5/f13 
[569032,34268] 0 2026-03-09T17:30:03.988 INFO:tasks.workunit.client.1.vm09.stdout:1/100: link f2 d9/dc/d15/f1a 0 2026-03-09T17:30:03.989 INFO:tasks.workunit.client.0.vm06.stdout:9/960: link d3/d15/f1a d3/d15/d36/d4d/f136 0 2026-03-09T17:30:03.989 INFO:tasks.workunit.client.0.vm06.stdout:9/961: chown d3/d11/d65/d124 74 1 2026-03-09T17:30:03.990 INFO:tasks.workunit.client.0.vm06.stdout:2/794: dread d3/d4/d12/d71/daa/d77/d81/d64/fce [0,4194304] 0 2026-03-09T17:30:03.994 INFO:tasks.workunit.client.0.vm06.stdout:2/795: dwrite d3/d4/d12/d2b/d2d/f100 [0,4194304] 0 2026-03-09T17:30:03.997 INFO:tasks.workunit.client.1.vm09.stdout:8/88: creat d1/da/d13/f21 x:0 0 0 2026-03-09T17:30:03.999 INFO:tasks.workunit.client.1.vm09.stdout:8/89: dread d1/da/dd/f1e [0,4194304] 0 2026-03-09T17:30:04.000 INFO:tasks.workunit.client.0.vm06.stdout:6/795: link d6/d12/fbc d6/d4f/d3e/d52/d8c/db0/ff6 0 2026-03-09T17:30:04.000 INFO:tasks.workunit.client.1.vm09.stdout:8/90: write d1/da/f12 [3503043,130457] 0 2026-03-09T17:30:04.000 INFO:tasks.workunit.client.1.vm09.stdout:0/103: chown d6/c15 2 1 2026-03-09T17:30:04.001 INFO:tasks.workunit.client.1.vm09.stdout:8/91: readlink d1/l1f 0 2026-03-09T17:30:04.006 INFO:tasks.workunit.client.1.vm09.stdout:3/61: symlink d5/d6/l10 0 2026-03-09T17:30:04.006 INFO:tasks.workunit.client.1.vm09.stdout:3/62: write f1 [2889466,42283] 0 2026-03-09T17:30:04.008 INFO:tasks.workunit.client.0.vm06.stdout:9/962: unlink d3/d15/f17 0 2026-03-09T17:30:04.010 INFO:tasks.workunit.client.1.vm09.stdout:0/104: sync 2026-03-09T17:30:04.011 INFO:tasks.workunit.client.0.vm06.stdout:9/963: dread d3/d26/f76 [0,4194304] 0 2026-03-09T17:30:04.016 INFO:tasks.workunit.client.1.vm09.stdout:9/65: creat d5/de/f18 x:0 0 0 2026-03-09T17:30:04.018 INFO:tasks.workunit.client.1.vm09.stdout:6/59: chown d3/d7/ld 0 1 2026-03-09T17:30:04.022 INFO:tasks.workunit.client.1.vm09.stdout:5/93: truncate d0/de/f1a 5990523 0 2026-03-09T17:30:04.023 INFO:tasks.workunit.client.1.vm09.stdout:1/101: rename 
d9/dc/l19 to d9/dc/l1b 0 2026-03-09T17:30:04.030 INFO:tasks.workunit.client.0.vm06.stdout:9/964: chown d3/d15/d36/d83/fb1 6279977 1 2026-03-09T17:30:04.030 INFO:tasks.workunit.client.1.vm09.stdout:5/94: dwrite d0/d2/d15/f1c [0,4194304] 0 2026-03-09T17:30:04.032 INFO:tasks.workunit.client.1.vm09.stdout:8/92: fdatasync d1/da/dd/f1e 0 2026-03-09T17:30:04.034 INFO:tasks.workunit.client.1.vm09.stdout:6/60: fdatasync d3/d7/fe 0 2026-03-09T17:30:04.036 INFO:tasks.workunit.client.0.vm06.stdout:8/871: dwrite d15/d39/d67/de3/fe9 [0,4194304] 0 2026-03-09T17:30:04.040 INFO:tasks.workunit.client.0.vm06.stdout:2/796: symlink d3/d4/d22/l101 0 2026-03-09T17:30:04.044 INFO:tasks.workunit.client.0.vm06.stdout:3/940: link dd/d5b/d65/f135 dd/d19/d1e/d100/d11b/f141 0 2026-03-09T17:30:04.050 INFO:tasks.workunit.client.1.vm09.stdout:3/63: symlink d5/l11 0 2026-03-09T17:30:04.052 INFO:tasks.workunit.client.0.vm06.stdout:1/872: getdents d11/d14/d1d/d1e/d2a/d99/de9 0 2026-03-09T17:30:04.057 INFO:tasks.workunit.client.0.vm06.stdout:8/872: mkdir d15/d39/d67/d77/d99/d11e 0 2026-03-09T17:30:04.064 INFO:tasks.workunit.client.0.vm06.stdout:8/873: read d15/d39/f45 [3294028,115661] 0 2026-03-09T17:30:04.068 INFO:tasks.workunit.client.1.vm09.stdout:9/66: fsync d5/f17 0 2026-03-09T17:30:04.068 INFO:tasks.workunit.client.1.vm09.stdout:9/67: chown d5/f17 0 1 2026-03-09T17:30:04.070 INFO:tasks.workunit.client.0.vm06.stdout:5/870: dwrite d4/d50/d18/f8c [4194304,4194304] 0 2026-03-09T17:30:04.074 INFO:tasks.workunit.client.0.vm06.stdout:4/951: dwrite db/d1d/d21/d44/d8a/dec/f11e [0,4194304] 0 2026-03-09T17:30:04.078 INFO:tasks.workunit.client.0.vm06.stdout:6/796: link d6/d47/ld6 d6/d47/d96/da1/lf7 0 2026-03-09T17:30:04.080 INFO:tasks.workunit.client.1.vm09.stdout:7/94: getdents da 0 2026-03-09T17:30:04.080 INFO:tasks.workunit.client.0.vm06.stdout:1/873: read d11/d14/d1d/d42/d46/d92/dc0/f21 [1659623,18364] 0 2026-03-09T17:30:04.080 INFO:tasks.workunit.client.1.vm09.stdout:7/95: write da/d11/f1e 
[254612,61439] 0 2026-03-09T17:30:04.081 INFO:tasks.workunit.client.1.vm09.stdout:7/96: truncate da/d11/f1a 5132203 0 2026-03-09T17:30:04.083 INFO:tasks.workunit.client.0.vm06.stdout:2/797: mkdir d3/d4/d12/d71/daa/d77/d102 0 2026-03-09T17:30:04.083 INFO:tasks.workunit.client.0.vm06.stdout:2/798: chown d3/d4/d12/d2b/db0/dc1 23346059 1 2026-03-09T17:30:04.090 INFO:tasks.workunit.client.1.vm09.stdout:8/93: creat d1/da/dd/f22 x:0 0 0 2026-03-09T17:30:04.092 INFO:tasks.workunit.client.1.vm09.stdout:6/61: rmdir d3 39 2026-03-09T17:30:04.093 INFO:tasks.workunit.client.0.vm06.stdout:8/874: rename d15/d16/d1e/d30/c4d to d15/d16/d1e/d30/db8/c11f 0 2026-03-09T17:30:04.094 INFO:tasks.workunit.client.1.vm09.stdout:4/97: link c7 d11/c20 0 2026-03-09T17:30:04.095 INFO:tasks.workunit.client.1.vm09.stdout:4/98: fsync d11/f19 0 2026-03-09T17:30:04.095 INFO:tasks.workunit.client.1.vm09.stdout:4/99: fsync f3 0 2026-03-09T17:30:04.096 INFO:tasks.workunit.client.1.vm09.stdout:8/94: dwrite d1/da/dd/f10 [4194304,4194304] 0 2026-03-09T17:30:04.096 INFO:tasks.workunit.client.1.vm09.stdout:3/64: mkdir d5/d6/d12 0 2026-03-09T17:30:04.096 INFO:tasks.workunit.client.0.vm06.stdout:5/871: truncate d4/d50/d18/fa8 3425924 0 2026-03-09T17:30:04.097 INFO:tasks.workunit.client.1.vm09.stdout:3/65: chown d5 67825 1 2026-03-09T17:30:04.097 INFO:tasks.workunit.client.1.vm09.stdout:3/66: fsync d5/d6/fb 0 2026-03-09T17:30:04.098 INFO:tasks.workunit.client.1.vm09.stdout:3/67: write f1 [1640148,115436] 0 2026-03-09T17:30:04.109 INFO:tasks.workunit.client.0.vm06.stdout:4/952: symlink db/d1d/d21/d37/d69/d78/db4/l14e 0 2026-03-09T17:30:04.109 INFO:tasks.workunit.client.1.vm09.stdout:0/105: dwrite d6/f7 [0,4194304] 0 2026-03-09T17:30:04.109 INFO:tasks.workunit.client.0.vm06.stdout:4/953: readlink db/d59/d5f/lac 0 2026-03-09T17:30:04.118 INFO:tasks.workunit.client.1.vm09.stdout:7/97: dread da/d11/f1e [0,4194304] 0 2026-03-09T17:30:04.118 INFO:tasks.workunit.client.1.vm09.stdout:1/102: fsync d9/dc/d15/f1a 0 
2026-03-09T17:30:04.118 INFO:tasks.workunit.client.1.vm09.stdout:1/103: fdatasync f6 0 2026-03-09T17:30:04.130 INFO:tasks.workunit.client.1.vm09.stdout:5/95: creat d0/de/d17/f1f x:0 0 0 2026-03-09T17:30:04.141 INFO:tasks.workunit.client.0.vm06.stdout:2/799: creat d3/d4/d12/da7/db3/f103 x:0 0 0 2026-03-09T17:30:04.141 INFO:tasks.workunit.client.1.vm09.stdout:6/62: write d3/d7/fe [251619,70680] 0 2026-03-09T17:30:04.141 INFO:tasks.workunit.client.1.vm09.stdout:6/63: write f2 [898309,71723] 0 2026-03-09T17:30:04.145 INFO:tasks.workunit.client.0.vm06.stdout:5/872: creat d4/d22/d64/f138 x:0 0 0 2026-03-09T17:30:04.146 INFO:tasks.workunit.client.1.vm09.stdout:8/95: dread d1/f16 [0,4194304] 0 2026-03-09T17:30:04.148 INFO:tasks.workunit.client.0.vm06.stdout:6/797: dread d6/d12/d17/d85/f9c [0,4194304] 0 2026-03-09T17:30:04.153 INFO:tasks.workunit.client.0.vm06.stdout:6/798: dwrite d6/d12/d2d/db3/ff1 [0,4194304] 0 2026-03-09T17:30:04.162 INFO:tasks.workunit.client.1.vm09.stdout:3/68: symlink d5/l13 0 2026-03-09T17:30:04.162 INFO:tasks.workunit.client.1.vm09.stdout:2/86: link f8 d13/d15/f1d 0 2026-03-09T17:30:04.162 INFO:tasks.workunit.client.0.vm06.stdout:5/873: rmdir d4/d50/d35/d40/d95/db8 39 2026-03-09T17:30:04.164 INFO:tasks.workunit.client.1.vm09.stdout:7/98: creat da/d11/f25 x:0 0 0 2026-03-09T17:30:04.164 INFO:tasks.workunit.client.0.vm06.stdout:6/799: unlink d6/d12/d53/f87 0 2026-03-09T17:30:04.165 INFO:tasks.workunit.client.0.vm06.stdout:1/874: link d11/d14/d1c/l30 d11/d14/d1d/d1e/d2a/d34/d58/l12b 0 2026-03-09T17:30:04.166 INFO:tasks.workunit.client.0.vm06.stdout:1/875: chown d11/d14/d1d/d1e/d2a/d34/d58/cbd 1 1 2026-03-09T17:30:04.167 INFO:tasks.workunit.client.0.vm06.stdout:5/874: mknod d4/dca/c139 0 2026-03-09T17:30:04.168 INFO:tasks.workunit.client.1.vm09.stdout:9/68: symlink d5/l19 0 2026-03-09T17:30:04.169 INFO:tasks.workunit.client.0.vm06.stdout:5/875: truncate d4/d52/d55/dee/f115 289914 0 2026-03-09T17:30:04.170 INFO:tasks.workunit.client.1.vm09.stdout:1/104: 
dwrite d9/dc/d15/f1a [0,4194304] 0 2026-03-09T17:30:04.170 INFO:tasks.workunit.client.0.vm06.stdout:6/800: mkdir d6/d47/dd7/df8 0 2026-03-09T17:30:04.176 INFO:tasks.workunit.client.1.vm09.stdout:5/96: mkdir d0/d2/d15/d20 0 2026-03-09T17:30:04.177 INFO:tasks.workunit.client.1.vm09.stdout:6/64: rename d3/fb to d3/d7/ff 0 2026-03-09T17:30:04.178 INFO:tasks.workunit.client.1.vm09.stdout:4/100: mknod d11/d1e/c21 0 2026-03-09T17:30:04.182 INFO:tasks.workunit.client.1.vm09.stdout:6/65: dwrite d3/d7/fe [4194304,4194304] 0 2026-03-09T17:30:04.183 INFO:tasks.workunit.client.1.vm09.stdout:6/66: chown d3/d7/ff 0 1 2026-03-09T17:30:04.184 INFO:tasks.workunit.client.1.vm09.stdout:6/67: stat d3/d7/ld 0 2026-03-09T17:30:04.184 INFO:tasks.workunit.client.1.vm09.stdout:6/68: write d3/fc [1039977,14266] 0 2026-03-09T17:30:04.187 INFO:tasks.workunit.client.0.vm06.stdout:6/801: creat d6/d12/ff9 x:0 0 0 2026-03-09T17:30:04.188 INFO:tasks.workunit.client.0.vm06.stdout:6/802: stat d6/d47/ce1 0 2026-03-09T17:30:04.194 INFO:tasks.workunit.client.1.vm09.stdout:7/99: creat da/f26 x:0 0 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.1.vm09.stdout:9/69: readlink d5/l12 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.0.vm06.stdout:9/965: dwrite d3/d15/d16/f105 [0,4194304] 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.0.vm06.stdout:3/941: write dd/d19/d1e/f10a [679657,6849] 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.0.vm06.stdout:9/966: dwrite d3/d26/f33 [0,4194304] 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.0.vm06.stdout:5/876: getdents d4/d50/d35/d40/d109/d11f 0 2026-03-09T17:30:04.216 INFO:tasks.workunit.client.0.vm06.stdout:3/942: dread dd/d19/d25/d48/d93/fb9 [0,4194304] 0 2026-03-09T17:30:04.217 INFO:tasks.workunit.client.0.vm06.stdout:6/803: creat d6/d12/d53/d91/dbf/ffa x:0 0 0 2026-03-09T17:30:04.218 INFO:tasks.workunit.client.0.vm06.stdout:6/804: write d6/d47/d96/d40/fdd [4126706,19549] 0 2026-03-09T17:30:04.226 
INFO:tasks.workunit.client.1.vm09.stdout:5/97: mkdir d0/dc/d21 0 2026-03-09T17:30:04.226 INFO:tasks.workunit.client.0.vm06.stdout:9/967: creat d3/d6d/d9a/d9c/d116/f137 x:0 0 0 2026-03-09T17:30:04.227 INFO:tasks.workunit.client.0.vm06.stdout:9/968: fdatasync d3/d26/d35/fb5 0 2026-03-09T17:30:04.228 INFO:tasks.workunit.client.0.vm06.stdout:1/876: dread d11/d14/d1d/d42/d46/d92/dc0/f68 [4194304,4194304] 0 2026-03-09T17:30:04.234 INFO:tasks.workunit.client.0.vm06.stdout:3/943: truncate dd/d19/d28/f58 1140979 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:6/805: truncate d6/d47/f49 546218 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:6/806: chown d6/d47/d96/da1/cb6 739775 1 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:9/969: chown d3/c7e 2 1 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:9/970: chown d3/d15/fce 2 1 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:1/877: creat d11/d14/d1d/d4a/f12c x:0 0 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:6/807: symlink d6/d12/d53/d91/dbf/lfb 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:9/971: fsync d3/d15/d36/fcf 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:9/972: write d3/d11/f87 [126291,40993] 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:9/973: dwrite d3/d15/d36/d4d/f60 [4194304,4194304] 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.0.vm06.stdout:3/944: mknod dd/d19/d25/d44/d80/dd7/d120/c142 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:8/96: mkdir d1/da/d23 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:6/69: creat d3/d7/f10 x:0 0 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:7/100: creat da/f27 x:0 0 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:1/105: mknod d9/dc/d15/c1c 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:1/106: write 
d9/dc/dd/fe [88896,110512] 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:0/106: getdents d6/de 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:8/97: symlink d1/da/dd/l24 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:8/98: write d1/d14/d1b/f1c [788037,2540] 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:8/99: readlink d1/da/dd/l18 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:9/70: rename c1 to d5/c1a 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:5/98: creat d0/f22 x:0 0 0 2026-03-09T17:30:04.272 INFO:tasks.workunit.client.1.vm09.stdout:5/99: read d0/d2/d15/f1c [1458434,45615] 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:5/100: chown d0/d9 224365 1 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:5/101: readlink d0/dc/ld 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:9/71: rmdir d5/de 39 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:1/107: rename d9/d10 to d9/dc/d15/d1d 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:8/100: symlink d1/da/l25 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:0/107: getdents d6 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:9/72: dread f2 [0,4194304] 0 2026-03-09T17:30:04.273 INFO:tasks.workunit.client.1.vm09.stdout:8/101: dwrite d1/da/d13/f21 [0,4194304] 0 2026-03-09T17:30:04.274 INFO:tasks.workunit.client.1.vm09.stdout:0/108: rename d6/f18 to d6/f1b 0 2026-03-09T17:30:04.275 INFO:tasks.workunit.client.1.vm09.stdout:8/102: write d1/da/d13/f21 [3329336,7203] 0 2026-03-09T17:30:04.276 INFO:tasks.workunit.client.1.vm09.stdout:8/103: chown d1/da/l25 20146 1 2026-03-09T17:30:04.277 INFO:tasks.workunit.client.1.vm09.stdout:9/73: truncate f2 2352976 0 2026-03-09T17:30:04.282 INFO:tasks.workunit.client.0.vm06.stdout:1/878: dread d11/d14/d1d/f8f [0,4194304] 0 2026-03-09T17:30:04.283 
INFO:tasks.workunit.client.1.vm09.stdout:8/104: mkdir d1/d26 0 2026-03-09T17:30:04.285 INFO:tasks.workunit.client.1.vm09.stdout:0/109: mknod d6/de/c1c 0 2026-03-09T17:30:04.286 INFO:tasks.workunit.client.1.vm09.stdout:8/105: dread d1/f3 [0,4194304] 0 2026-03-09T17:30:04.286 INFO:tasks.workunit.client.1.vm09.stdout:9/74: rename d5/f17 to d5/f1b 0 2026-03-09T17:30:04.287 INFO:tasks.workunit.client.1.vm09.stdout:0/110: mkdir d6/d1d 0 2026-03-09T17:30:04.288 INFO:tasks.workunit.client.1.vm09.stdout:8/106: dread d1/da/f12 [0,4194304] 0 2026-03-09T17:30:04.289 INFO:tasks.workunit.client.1.vm09.stdout:8/107: fdatasync d1/da/dd/f22 0 2026-03-09T17:30:04.291 INFO:tasks.workunit.client.1.vm09.stdout:8/108: unlink d1/da/dd/l18 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/111: dwrite d6/f16 [0,4194304] 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:9/75: link d5/f11 d5/f1c 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/112: creat d6/d1d/f1e x:0 0 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:9/76: unlink d5/de/cf 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/113: dread d6/f16 [0,4194304] 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/114: dread d6/f16 [0,4194304] 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/115: symlink d6/d1d/l1f 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:9/77: creat d5/f1d x:0 0 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/116: write d6/f9 [2033582,119635] 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:0/117: truncate d6/de/ff 1946117 0 2026-03-09T17:30:04.323 INFO:tasks.workunit.client.1.vm09.stdout:9/78: creat d5/f1e x:0 0 0 2026-03-09T17:30:04.325 INFO:tasks.workunit.client.0.vm06.stdout:9/974: rename d3/d26/d6c/d68/l125 to d3/d11/d65/l138 0 2026-03-09T17:30:04.327 INFO:tasks.workunit.client.0.vm06.stdout:6/808: creat 
d6/d12/d53/d91/dcb/ffc x:0 0 0 2026-03-09T17:30:04.329 INFO:tasks.workunit.client.1.vm09.stdout:9/79: stat d5/f14 0 2026-03-09T17:30:04.330 INFO:tasks.workunit.client.0.vm06.stdout:3/945: mkdir dd/d19/d1e/db8/d143 0 2026-03-09T17:30:04.330 INFO:tasks.workunit.client.0.vm06.stdout:3/946: read dd/d19/d25/d2d/f6d [3373133,128343] 0 2026-03-09T17:30:04.331 INFO:tasks.workunit.client.0.vm06.stdout:2/800: sync 2026-03-09T17:30:04.336 INFO:tasks.workunit.client.1.vm09.stdout:9/80: unlink d5/l12 0 2026-03-09T17:30:04.337 INFO:tasks.workunit.client.0.vm06.stdout:9/975: mknod d3/d15/d36/d4d/c139 0 2026-03-09T17:30:04.338 INFO:tasks.workunit.client.1.vm09.stdout:9/81: dread d5/f1c [0,4194304] 0 2026-03-09T17:30:04.343 INFO:tasks.workunit.client.1.vm09.stdout:0/118: rename l2 to d6/l20 0 2026-03-09T17:30:04.349 INFO:tasks.workunit.client.1.vm09.stdout:3/69: sync 2026-03-09T17:30:04.358 INFO:tasks.workunit.client.1.vm09.stdout:0/119: creat d6/f21 x:0 0 0 2026-03-09T17:30:04.358 INFO:tasks.workunit.client.1.vm09.stdout:0/120: write d6/f9 [717594,7807] 0 2026-03-09T17:30:04.359 INFO:tasks.workunit.client.1.vm09.stdout:0/121: chown d6/de/c1c 25832 1 2026-03-09T17:30:04.361 INFO:tasks.workunit.client.0.vm06.stdout:3/947: truncate dd/d1d/f99 578428 0 2026-03-09T17:30:04.362 INFO:tasks.workunit.client.1.vm09.stdout:4/101: sync 2026-03-09T17:30:04.365 INFO:tasks.workunit.client.1.vm09.stdout:6/70: fsync f2 0 2026-03-09T17:30:04.371 INFO:tasks.workunit.client.1.vm09.stdout:3/70: link d5/d9/la d5/l14 0 2026-03-09T17:30:04.372 INFO:tasks.workunit.client.1.vm09.stdout:3/71: write d5/fd [4149176,83823] 0 2026-03-09T17:30:04.372 INFO:tasks.workunit.client.1.vm09.stdout:4/102: stat c7 0 2026-03-09T17:30:04.375 INFO:tasks.workunit.client.1.vm09.stdout:6/71: creat d3/d7/f11 x:0 0 0 2026-03-09T17:30:04.378 INFO:tasks.workunit.client.1.vm09.stdout:0/122: rename l5 to d6/d1d/l22 0 2026-03-09T17:30:04.383 INFO:tasks.workunit.client.1.vm09.stdout:6/72: mkdir d3/d7/d12 0 2026-03-09T17:30:04.385 
INFO:tasks.workunit.client.1.vm09.stdout:4/103: link f3 d11/d1e/f22 0 2026-03-09T17:30:04.386 INFO:tasks.workunit.client.1.vm09.stdout:4/104: read d11/f18 [2429415,77507] 0 2026-03-09T17:30:04.386 INFO:tasks.workunit.client.1.vm09.stdout:4/105: write d11/f18 [4478211,81350] 0 2026-03-09T17:30:04.407 INFO:tasks.workunit.client.1.vm09.stdout:4/106: readlink l0 0 2026-03-09T17:30:04.407 INFO:tasks.workunit.client.1.vm09.stdout:4/107: chown d11/d1e/f22 9833 1 2026-03-09T17:30:04.407 INFO:tasks.workunit.client.1.vm09.stdout:6/73: unlink d3/l6 0 2026-03-09T17:30:04.407 INFO:tasks.workunit.client.1.vm09.stdout:6/74: unlink d3/c5 0 2026-03-09T17:30:04.414 INFO:tasks.workunit.client.1.vm09.stdout:3/72: sync 2026-03-09T17:30:04.420 INFO:tasks.workunit.client.0.vm06.stdout:3/948: dread dd/d19/d1e/f10a [0,4194304] 0 2026-03-09T17:30:04.427 INFO:tasks.workunit.client.1.vm09.stdout:3/73: creat d5/d6/d12/f15 x:0 0 0 2026-03-09T17:30:04.430 INFO:tasks.workunit.client.1.vm09.stdout:3/74: write d5/d9/fc [972675,43922] 0 2026-03-09T17:30:04.431 INFO:tasks.workunit.client.0.vm06.stdout:3/949: dread dd/d1d/d2e/fec [0,4194304] 0 2026-03-09T17:30:04.438 INFO:tasks.workunit.client.1.vm09.stdout:3/75: mkdir d5/d16 0 2026-03-09T17:30:04.441 INFO:tasks.workunit.client.0.vm06.stdout:3/950: symlink dd/d59/da1/d11e/l144 0 2026-03-09T17:30:04.446 INFO:tasks.workunit.client.0.vm06.stdout:3/951: getdents dd/d81/d97/df5/d13d 0 2026-03-09T17:30:04.448 INFO:tasks.workunit.client.0.vm06.stdout:3/952: rename dd/d1d/f34 to dd/d81/d97/df5/f145 0 2026-03-09T17:30:04.450 INFO:tasks.workunit.client.0.vm06.stdout:8/875: write d15/d39/d67/d77/fc3 [1184752,10698] 0 2026-03-09T17:30:04.451 INFO:tasks.workunit.client.0.vm06.stdout:8/876: readlink d15/d39/d67/d77/la7 0 2026-03-09T17:30:04.452 INFO:tasks.workunit.client.0.vm06.stdout:3/953: dread dd/d19/d25/d44/d80/dd7/fe6 [0,4194304] 0 2026-03-09T17:30:04.454 INFO:tasks.workunit.client.1.vm09.stdout:3/76: dread d5/d9/fc [0,4194304] 0 2026-03-09T17:30:04.465 
INFO:tasks.workunit.client.1.vm09.stdout:3/77: dread d5/d6/fb [0,4194304] 0 2026-03-09T17:30:04.470 INFO:tasks.workunit.client.1.vm09.stdout:3/78: creat d5/d16/f17 x:0 0 0 2026-03-09T17:30:04.470 INFO:tasks.workunit.client.0.vm06.stdout:4/954: write db/d1d/d21/f2f [1841659,78654] 0 2026-03-09T17:30:04.476 INFO:tasks.workunit.client.1.vm09.stdout:3/79: creat d5/d6/d12/f18 x:0 0 0 2026-03-09T17:30:04.476 INFO:tasks.workunit.client.1.vm09.stdout:3/80: write d5/d6/d12/f18 [709698,47472] 0 2026-03-09T17:30:04.476 INFO:tasks.workunit.client.1.vm09.stdout:3/81: readlink d5/l11 0 2026-03-09T17:30:04.480 INFO:tasks.workunit.client.0.vm06.stdout:8/877: getdents d15/d16/d1e/d30/d55 0 2026-03-09T17:30:04.483 INFO:tasks.workunit.client.0.vm06.stdout:8/878: dwrite d15/d16/d1e/f4e [4194304,4194304] 0 2026-03-09T17:30:04.510 INFO:tasks.workunit.client.1.vm09.stdout:3/82: creat d5/d6/d12/f19 x:0 0 0 2026-03-09T17:30:04.521 INFO:tasks.workunit.client.0.vm06.stdout:5/877: write d4/d50/d18/f73 [988126,52008] 0 2026-03-09T17:30:04.529 INFO:tasks.workunit.client.1.vm09.stdout:3/83: truncate d5/d6/fb 889470 0 2026-03-09T17:30:04.535 INFO:tasks.workunit.client.1.vm09.stdout:2/87: truncate d13/f1a 1782836 0 2026-03-09T17:30:04.536 INFO:tasks.workunit.client.0.vm06.stdout:8/879: creat d15/d16/d1e/d30/f120 x:0 0 0 2026-03-09T17:30:04.538 INFO:tasks.workunit.client.0.vm06.stdout:5/878: symlink d4/d22/d46/dec/l13a 0 2026-03-09T17:30:04.539 INFO:tasks.workunit.client.0.vm06.stdout:5/879: chown d4/d50/d35/d40/d109/d11f/f122 1 1 2026-03-09T17:30:04.540 INFO:tasks.workunit.client.1.vm09.stdout:2/88: symlink d13/l1e 0 2026-03-09T17:30:04.541 INFO:tasks.workunit.client.0.vm06.stdout:8/880: truncate d15/d39/d67/d77/de7/f10a 398936 0 2026-03-09T17:30:04.543 INFO:tasks.workunit.client.1.vm09.stdout:2/89: dread d13/d15/f18 [0,4194304] 0 2026-03-09T17:30:04.547 INFO:tasks.workunit.client.1.vm09.stdout:2/90: write d13/f14 [463144,8208] 0 2026-03-09T17:30:04.547 
INFO:tasks.workunit.client.1.vm09.stdout:7/101: getdents da 0 2026-03-09T17:30:04.550 INFO:tasks.workunit.client.1.vm09.stdout:7/102: dwrite da/ff [0,4194304] 0 2026-03-09T17:30:04.560 INFO:tasks.workunit.client.0.vm06.stdout:5/880: link d4/dbb/ff9 d4/d50/d35/d40/d95/db8/f13b 0 2026-03-09T17:30:04.565 INFO:tasks.workunit.client.1.vm09.stdout:2/91: mknod d13/c1f 0 2026-03-09T17:30:04.567 INFO:tasks.workunit.client.1.vm09.stdout:2/92: creat d13/d15/f20 x:0 0 0 2026-03-09T17:30:04.569 INFO:tasks.workunit.client.1.vm09.stdout:2/93: mkdir d13/d15/d21 0 2026-03-09T17:30:04.580 INFO:tasks.workunit.client.0.vm06.stdout:8/881: sync 2026-03-09T17:30:04.583 INFO:tasks.workunit.client.0.vm06.stdout:8/882: truncate d15/d39/d3c/dd5/fde 847442 0 2026-03-09T17:30:04.607 INFO:tasks.workunit.client.1.vm09.stdout:1/108: creat d9/dc/d15/d1d/f1e x:0 0 0 2026-03-09T17:30:04.609 INFO:tasks.workunit.client.1.vm09.stdout:1/109: unlink d9/cb 0 2026-03-09T17:30:04.612 INFO:tasks.workunit.client.1.vm09.stdout:1/110: unlink d9/dc/dd/c16 0 2026-03-09T17:30:04.636 INFO:tasks.workunit.client.1.vm09.stdout:8/109: write d1/f16 [119106,9139] 0 2026-03-09T17:30:04.652 INFO:tasks.workunit.client.0.vm06.stdout:1/879: write f10 [4080966,32429] 0 2026-03-09T17:30:04.655 INFO:tasks.workunit.client.1.vm09.stdout:9/82: getdents d5 0 2026-03-09T17:30:04.656 INFO:tasks.workunit.client.0.vm06.stdout:1/880: symlink d11/d14/d1d/d1e/d2a/d34/l12d 0 2026-03-09T17:30:04.656 INFO:tasks.workunit.client.1.vm09.stdout:9/83: mknod d5/c1f 0 2026-03-09T17:30:04.661 INFO:tasks.workunit.client.1.vm09.stdout:0/123: rename d6/l20 to d6/de/l23 0 2026-03-09T17:30:04.663 INFO:tasks.workunit.client.1.vm09.stdout:9/84: link d5/f11 d5/de/f20 0 2026-03-09T17:30:04.664 INFO:tasks.workunit.client.1.vm09.stdout:9/85: truncate d5/de/f18 606954 0 2026-03-09T17:30:04.665 INFO:tasks.workunit.client.1.vm09.stdout:4/108: rename ff to d11/f23 0 2026-03-09T17:30:04.665 INFO:tasks.workunit.client.1.vm09.stdout:3/84: rename d5/d6 to d5/d6/d12/d1a 
22 2026-03-09T17:30:04.667 INFO:tasks.workunit.client.1.vm09.stdout:3/85: symlink d5/d9/l1b 0 2026-03-09T17:30:04.668 INFO:tasks.workunit.client.1.vm09.stdout:3/86: fsync f1 0 2026-03-09T17:30:04.668 INFO:tasks.workunit.client.1.vm09.stdout:7/103: rename da/l14 to da/d11/l28 0 2026-03-09T17:30:04.668 INFO:tasks.workunit.client.1.vm09.stdout:9/86: mkdir d5/d21 0 2026-03-09T17:30:04.671 INFO:tasks.workunit.client.1.vm09.stdout:3/87: chown d5/d6/fe 493267 1 2026-03-09T17:30:04.675 INFO:tasks.workunit.client.0.vm06.stdout:9/976: dwrite d3/f4b [0,4194304] 0 2026-03-09T17:30:04.676 INFO:tasks.workunit.client.1.vm09.stdout:3/88: dread - d5/d6/d12/f15 zero size 2026-03-09T17:30:04.677 INFO:tasks.workunit.client.1.vm09.stdout:4/109: dwrite fe [0,4194304] 0 2026-03-09T17:30:04.681 INFO:tasks.workunit.client.1.vm09.stdout:0/124: dread d6/de/ff [0,4194304] 0 2026-03-09T17:30:04.695 INFO:tasks.workunit.client.1.vm09.stdout:3/89: mkdir d5/d6/d12/d1c 0 2026-03-09T17:30:04.698 INFO:tasks.workunit.client.1.vm09.stdout:9/87: rename d5/c1a to d5/c22 0 2026-03-09T17:30:04.703 INFO:tasks.workunit.client.0.vm06.stdout:9/977: dread d3/d15/d16/f72 [0,4194304] 0 2026-03-09T17:30:04.704 INFO:tasks.workunit.client.1.vm09.stdout:3/90: dwrite d5/d6/d12/f15 [0,4194304] 0 2026-03-09T17:30:04.713 INFO:tasks.workunit.client.1.vm09.stdout:4/110: sync 2026-03-09T17:30:04.716 INFO:tasks.workunit.client.0.vm06.stdout:9/978: dread d3/d11/f1f [4194304,4194304] 0 2026-03-09T17:30:04.718 INFO:tasks.workunit.client.0.vm06.stdout:6/809: truncate d6/d12/d53/dd0/fe3 181562 0 2026-03-09T17:30:04.729 INFO:tasks.workunit.client.1.vm09.stdout:9/88: mknod d5/de/c23 0 2026-03-09T17:30:04.729 INFO:tasks.workunit.client.1.vm09.stdout:3/91: creat d5/d6/d12/f1d x:0 0 0 2026-03-09T17:30:04.729 INFO:tasks.workunit.client.1.vm09.stdout:4/111: rmdir d11/d1e 39 2026-03-09T17:30:04.729 INFO:tasks.workunit.client.1.vm09.stdout:9/89: symlink d5/de/l24 0 2026-03-09T17:30:04.729 INFO:tasks.workunit.client.1.vm09.stdout:3/92: 
dwrite d5/d6/d12/f15 [0,4194304] 0 2026-03-09T17:30:04.739 INFO:tasks.workunit.client.1.vm09.stdout:3/93: creat d5/d9/f1e x:0 0 0 2026-03-09T17:30:04.740 INFO:tasks.workunit.client.1.vm09.stdout:9/90: mknod d5/d21/c25 0 2026-03-09T17:30:04.742 INFO:tasks.workunit.client.1.vm09.stdout:4/112: creat d11/f24 x:0 0 0 2026-03-09T17:30:04.743 INFO:tasks.workunit.client.1.vm09.stdout:3/94: symlink d5/d6/d12/l1f 0 2026-03-09T17:30:04.743 INFO:tasks.workunit.client.1.vm09.stdout:3/95: readlink d5/d6/l10 0 2026-03-09T17:30:04.744 INFO:tasks.workunit.client.1.vm09.stdout:9/91: creat d5/d21/f26 x:0 0 0 2026-03-09T17:30:04.746 INFO:tasks.workunit.client.1.vm09.stdout:9/92: symlink d5/de/l27 0 2026-03-09T17:30:04.748 INFO:tasks.workunit.client.1.vm09.stdout:9/93: mknod d5/de/c28 0 2026-03-09T17:30:04.750 INFO:tasks.workunit.client.1.vm09.stdout:9/94: readlink d5/ld 0 2026-03-09T17:30:04.762 INFO:tasks.workunit.client.1.vm09.stdout:4/113: sync 2026-03-09T17:30:04.762 INFO:tasks.workunit.client.1.vm09.stdout:4/114: readlink l0 0 2026-03-09T17:30:04.764 INFO:tasks.workunit.client.1.vm09.stdout:4/115: fdatasync d11/f18 0 2026-03-09T17:30:04.767 INFO:tasks.workunit.client.0.vm06.stdout:2/801: write d3/d4/d12/d2b/d2d/f1b [7506,21041] 0 2026-03-09T17:30:04.768 INFO:tasks.workunit.client.1.vm09.stdout:4/116: dwrite d11/f1c [0,4194304] 0 2026-03-09T17:30:04.768 INFO:tasks.workunit.client.0.vm06.stdout:2/802: write d3/d4/d12/da7/db3/fbe [3917725,113756] 0 2026-03-09T17:30:04.769 INFO:tasks.workunit.client.1.vm09.stdout:4/117: fsync d11/f16 0 2026-03-09T17:30:04.771 INFO:tasks.workunit.client.1.vm09.stdout:4/118: sync 2026-03-09T17:30:04.785 INFO:tasks.workunit.client.1.vm09.stdout:6/75: fsync d3/d7/f11 0 2026-03-09T17:30:04.788 INFO:tasks.workunit.client.0.vm06.stdout:2/803: creat d3/d4/d46/da5/f104 x:0 0 0 2026-03-09T17:30:04.789 INFO:tasks.workunit.client.1.vm09.stdout:6/76: dwrite d3/d7/fe [8388608,4194304] 0 2026-03-09T17:30:04.790 INFO:tasks.workunit.client.0.vm06.stdout:2/804: rename 
d3/d4/d22/f67 to d3/d4/d12/d71/daa/d77/d102/f105 0 2026-03-09T17:30:04.797 INFO:tasks.workunit.client.1.vm09.stdout:4/119: creat d11/f25 x:0 0 0 2026-03-09T17:30:04.797 INFO:tasks.workunit.client.1.vm09.stdout:6/77: chown d3/d7 0 1 2026-03-09T17:30:04.797 INFO:tasks.workunit.client.1.vm09.stdout:4/120: dread - d11/f24 zero size 2026-03-09T17:30:04.800 INFO:tasks.workunit.client.1.vm09.stdout:6/78: mknod d3/c13 0 2026-03-09T17:30:04.804 INFO:tasks.workunit.client.1.vm09.stdout:6/79: dwrite d3/d7/fe [8388608,4194304] 0 2026-03-09T17:30:04.813 INFO:tasks.workunit.client.1.vm09.stdout:4/121: dwrite d11/d1e/f22 [0,4194304] 0 2026-03-09T17:30:04.813 INFO:tasks.workunit.client.1.vm09.stdout:4/122: fdatasync d11/f25 0 2026-03-09T17:30:04.813 INFO:tasks.workunit.client.1.vm09.stdout:4/123: read - d11/f12 zero size 2026-03-09T17:30:04.813 INFO:tasks.workunit.client.1.vm09.stdout:4/124: readlink l0 0 2026-03-09T17:30:04.821 INFO:tasks.workunit.client.1.vm09.stdout:6/80: mknod d3/c14 0 2026-03-09T17:30:04.825 INFO:tasks.workunit.client.1.vm09.stdout:4/125: creat d11/f26 x:0 0 0 2026-03-09T17:30:04.826 INFO:tasks.workunit.client.1.vm09.stdout:4/126: write d11/f25 [888744,8746] 0 2026-03-09T17:30:04.837 INFO:tasks.workunit.client.1.vm09.stdout:4/127: symlink d11/d1e/l27 0 2026-03-09T17:30:04.845 INFO:tasks.workunit.client.1.vm09.stdout:4/128: creat d11/d1e/f28 x:0 0 0 2026-03-09T17:30:04.845 INFO:tasks.workunit.client.1.vm09.stdout:4/129: read d11/f16 [2744919,66237] 0 2026-03-09T17:30:04.847 INFO:tasks.workunit.client.1.vm09.stdout:6/81: sync 2026-03-09T17:30:04.848 INFO:tasks.workunit.client.1.vm09.stdout:6/82: chown d3/fc 1518677589 1 2026-03-09T17:30:04.848 INFO:tasks.workunit.client.1.vm09.stdout:6/83: dread - d3/d7/f11 zero size 2026-03-09T17:30:04.852 INFO:tasks.workunit.client.1.vm09.stdout:4/130: sync 2026-03-09T17:30:04.853 INFO:tasks.workunit.client.1.vm09.stdout:4/131: chown d11/f24 1425 1 2026-03-09T17:30:04.857 INFO:tasks.workunit.client.1.vm09.stdout:6/84: dwrite 
d3/d7/ff [0,4194304] 0 2026-03-09T17:30:04.858 INFO:tasks.workunit.client.1.vm09.stdout:6/85: chown d3/d7/ld 159898618 1 2026-03-09T17:30:04.858 INFO:tasks.workunit.client.1.vm09.stdout:6/86: fsync d3/d7/f10 0 2026-03-09T17:30:04.859 INFO:tasks.workunit.client.1.vm09.stdout:4/132: mkdir d11/d1e/d29 0 2026-03-09T17:30:04.874 INFO:tasks.workunit.client.1.vm09.stdout:6/87: symlink d3/l15 0 2026-03-09T17:30:04.878 INFO:tasks.workunit.client.1.vm09.stdout:4/133: mknod d11/d1e/c2a 0 2026-03-09T17:30:04.887 INFO:tasks.workunit.client.1.vm09.stdout:4/134: rename d11/c1b to d11/d1e/c2b 0 2026-03-09T17:30:04.888 INFO:tasks.workunit.client.0.vm06.stdout:3/954: write dd/f4a [4930643,28299] 0 2026-03-09T17:30:04.899 INFO:tasks.workunit.client.1.vm09.stdout:3/96: getdents d5/d16 0 2026-03-09T17:30:04.905 INFO:tasks.workunit.client.0.vm06.stdout:4/955: write db/d59/d5f/d5d/fc2 [721335,97031] 0 2026-03-09T17:30:04.908 INFO:tasks.workunit.client.1.vm09.stdout:3/97: getdents d5/d6/d12/d1c 0 2026-03-09T17:30:04.914 INFO:tasks.workunit.client.1.vm09.stdout:3/98: mknod d5/d6/c20 0 2026-03-09T17:30:04.919 INFO:tasks.workunit.client.1.vm09.stdout:5/102: write d0/de/f1a [5017818,51340] 0 2026-03-09T17:30:04.919 INFO:tasks.workunit.client.1.vm09.stdout:3/99: chown d5/d6/fb 75295542 1 2026-03-09T17:30:04.921 INFO:tasks.workunit.client.1.vm09.stdout:3/100: dread - d5/d6/d12/f1d zero size 2026-03-09T17:30:04.923 INFO:tasks.workunit.client.0.vm06.stdout:4/956: link db/d1d/d21/d26/d89/l129 db/d1d/d21/l14f 0 2026-03-09T17:30:04.926 INFO:tasks.workunit.client.1.vm09.stdout:5/103: link d0/d2/d15/c1d d0/d9/d16/c23 0 2026-03-09T17:30:04.927 INFO:tasks.workunit.client.1.vm09.stdout:5/104: write d0/ff [320848,103237] 0 2026-03-09T17:30:04.927 INFO:tasks.workunit.client.1.vm09.stdout:2/94: truncate d13/f1a 2515532 0 2026-03-09T17:30:04.928 INFO:tasks.workunit.client.0.vm06.stdout:4/957: read db/d59/d5f/d45/d10a/dcc/fd0 [1783984,119708] 0 2026-03-09T17:30:04.937 
INFO:tasks.workunit.client.1.vm09.stdout:5/105: rename d0/de/f1a to d0/d2/d15/d20/f24 0 2026-03-09T17:30:04.940 INFO:tasks.workunit.client.0.vm06.stdout:4/958: dread - db/d1d/d21/d44/d8a/f136 zero size 2026-03-09T17:30:04.941 INFO:tasks.workunit.client.0.vm06.stdout:4/959: readlink db/d1d/d21/d26/d89/lc5 0 2026-03-09T17:30:04.941 INFO:tasks.workunit.client.0.vm06.stdout:4/960: chown db/d1d/d21/d37/c94 351815 1 2026-03-09T17:30:04.948 INFO:tasks.workunit.client.0.vm06.stdout:4/961: rmdir db/d57 39 2026-03-09T17:30:04.953 INFO:tasks.workunit.client.1.vm09.stdout:2/95: mkdir d13/d22 0 2026-03-09T17:30:04.955 INFO:tasks.workunit.client.0.vm06.stdout:5/881: dwrite d4/d22/d46/dec/f116 [0,4194304] 0 2026-03-09T17:30:04.968 INFO:tasks.workunit.client.0.vm06.stdout:5/882: unlink d4/d22/fcb 0 2026-03-09T17:30:04.969 INFO:tasks.workunit.client.0.vm06.stdout:8/883: write d15/d31/f114 [4285178,130323] 0 2026-03-09T17:30:04.979 INFO:tasks.workunit.client.1.vm09.stdout:1/111: truncate f3 2331052 0 2026-03-09T17:30:04.982 INFO:tasks.workunit.client.0.vm06.stdout:4/962: link db/d59/d90/fe5 db/d1d/d21/f150 0 2026-03-09T17:30:04.982 INFO:tasks.workunit.client.0.vm06.stdout:8/884: mknod d15/d16/c121 0 2026-03-09T17:30:04.987 INFO:tasks.workunit.client.1.vm09.stdout:1/112: mkdir d9/d1f 0 2026-03-09T17:30:04.991 INFO:tasks.workunit.client.1.vm09.stdout:1/113: dwrite d9/dc/dd/fe [0,4194304] 0 2026-03-09T17:30:04.994 INFO:tasks.workunit.client.1.vm09.stdout:1/114: mknod d9/dc/d15/d1d/c20 0 2026-03-09T17:30:04.995 INFO:tasks.workunit.client.1.vm09.stdout:1/115: chown d9/dc 2 1 2026-03-09T17:30:05.007 INFO:tasks.workunit.client.0.vm06.stdout:4/963: dread db/d59/d90/f12f [0,4194304] 0 2026-03-09T17:30:05.019 INFO:tasks.workunit.client.0.vm06.stdout:4/964: mkdir db/d1d/d21/d37/d69/d78/da0/db6/d151 0 2026-03-09T17:30:05.024 INFO:tasks.workunit.client.0.vm06.stdout:4/965: symlink db/d1d/d21/d37/d69/d11f/l152 0 2026-03-09T17:30:05.038 INFO:tasks.workunit.client.0.vm06.stdout:1/881: write 
d11/d14/d1d/d1e/d2a/d99/f10e [714292,11163] 0 2026-03-09T17:30:05.041 INFO:tasks.workunit.client.1.vm09.stdout:8/110: dwrite d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:05.051 INFO:tasks.workunit.client.0.vm06.stdout:1/882: write d11/d14/d1d/dd1/de2/f126 [1159819,46669] 0 2026-03-09T17:30:05.059 INFO:tasks.workunit.client.1.vm09.stdout:8/111: dread d1/da/dd/f10 [4194304,4194304] 0 2026-03-09T17:30:05.064 INFO:tasks.workunit.client.1.vm09.stdout:9/95: getdents d5 0 2026-03-09T17:30:05.064 INFO:tasks.workunit.client.1.vm09.stdout:9/96: truncate d5/f1b 625650 0 2026-03-09T17:30:05.064 INFO:tasks.workunit.client.1.vm09.stdout:0/125: truncate d6/f7 1133487 0 2026-03-09T17:30:05.064 INFO:tasks.workunit.client.1.vm09.stdout:7/104: dwrite da/d11/f1e [0,4194304] 0 2026-03-09T17:30:05.065 INFO:tasks.workunit.client.1.vm09.stdout:8/112: creat d1/da/dd/f27 x:0 0 0 2026-03-09T17:30:05.065 INFO:tasks.workunit.client.1.vm09.stdout:7/105: dread da/f12 [0,4194304] 0 2026-03-09T17:30:05.066 INFO:tasks.workunit.client.1.vm09.stdout:9/97: mkdir d5/de/d29 0 2026-03-09T17:30:05.066 INFO:tasks.workunit.client.1.vm09.stdout:0/126: mkdir d6/d1d/d24 0 2026-03-09T17:30:05.069 INFO:tasks.workunit.client.1.vm09.stdout:3/101: dwrite d5/d6/d12/f15 [4194304,4194304] 0 2026-03-09T17:30:05.069 INFO:tasks.workunit.client.1.vm09.stdout:9/98: chown d5/f14 26 1 2026-03-09T17:30:05.070 INFO:tasks.workunit.client.1.vm09.stdout:7/106: readlink da/d11/l28 0 2026-03-09T17:30:05.082 INFO:tasks.workunit.client.1.vm09.stdout:9/99: creat d5/d21/f2a x:0 0 0 2026-03-09T17:30:05.082 INFO:tasks.workunit.client.1.vm09.stdout:3/102: unlink d5/d6/c20 0 2026-03-09T17:30:05.082 INFO:tasks.workunit.client.1.vm09.stdout:7/107: mkdir da/d11/d29 0 2026-03-09T17:30:05.082 INFO:tasks.workunit.client.1.vm09.stdout:3/103: chown f3 3851430 1 2026-03-09T17:30:05.083 INFO:tasks.workunit.client.1.vm09.stdout:7/108: dread - da/d11/f19 zero size 2026-03-09T17:30:05.083 INFO:tasks.workunit.client.1.vm09.stdout:7/109: chown da/d11/f25 
0 1 2026-03-09T17:30:05.084 INFO:tasks.workunit.client.1.vm09.stdout:7/110: read da/f21 [3570274,124722] 0 2026-03-09T17:30:05.086 INFO:tasks.workunit.client.1.vm09.stdout:7/111: write da/f26 [66032,116146] 0 2026-03-09T17:30:05.087 INFO:tasks.workunit.client.1.vm09.stdout:9/100: creat d5/d21/f2b x:0 0 0 2026-03-09T17:30:05.090 INFO:tasks.workunit.client.1.vm09.stdout:3/104: dwrite f3 [0,4194304] 0 2026-03-09T17:30:05.091 INFO:tasks.workunit.client.1.vm09.stdout:9/101: rmdir d5 39 2026-03-09T17:30:05.095 INFO:tasks.workunit.client.1.vm09.stdout:8/113: dwrite d1/f8 [0,4194304] 0 2026-03-09T17:30:05.098 INFO:tasks.workunit.client.0.vm06.stdout:1/883: sync 2026-03-09T17:30:05.106 INFO:tasks.workunit.client.1.vm09.stdout:7/112: mknod da/d11/d29/c2a 0 2026-03-09T17:30:05.106 INFO:tasks.workunit.client.1.vm09.stdout:7/113: stat da/f10 0 2026-03-09T17:30:05.109 INFO:tasks.workunit.client.1.vm09.stdout:8/114: write d1/d14/d1b/f1c [329297,21350] 0 2026-03-09T17:30:05.110 INFO:tasks.workunit.client.1.vm09.stdout:8/115: chown d1/da/dd/lf 15 1 2026-03-09T17:30:05.113 INFO:tasks.workunit.client.1.vm09.stdout:3/105: mknod d5/d6/d12/d1c/c21 0 2026-03-09T17:30:05.114 INFO:tasks.workunit.client.1.vm09.stdout:7/114: dwrite da/f21 [0,4194304] 0 2026-03-09T17:30:05.127 INFO:tasks.workunit.client.1.vm09.stdout:9/102: rename d5/de/l16 to d5/de/d29/l2c 0 2026-03-09T17:30:05.132 INFO:tasks.workunit.client.1.vm09.stdout:9/103: dwrite d5/f1d [0,4194304] 0 2026-03-09T17:30:05.133 INFO:tasks.workunit.client.1.vm09.stdout:9/104: stat d5/f13 0 2026-03-09T17:30:05.134 INFO:tasks.workunit.client.1.vm09.stdout:7/115: creat da/d11/d29/f2b x:0 0 0 2026-03-09T17:30:05.134 INFO:tasks.workunit.client.1.vm09.stdout:3/106: fdatasync d5/d6/fb 0 2026-03-09T17:30:05.135 INFO:tasks.workunit.client.1.vm09.stdout:3/107: dread d5/d9/fc [0,4194304] 0 2026-03-09T17:30:05.136 INFO:tasks.workunit.client.1.vm09.stdout:3/108: dread d5/d6/fb [0,4194304] 0 2026-03-09T17:30:05.141 
INFO:tasks.workunit.client.1.vm09.stdout:8/116: creat d1/f28 x:0 0 0 2026-03-09T17:30:05.141 INFO:tasks.workunit.client.1.vm09.stdout:8/117: chown d1/da/d13/f1d 18291474 1 2026-03-09T17:30:05.147 INFO:tasks.workunit.client.1.vm09.stdout:7/116: unlink c5 0 2026-03-09T17:30:05.148 INFO:tasks.workunit.client.1.vm09.stdout:7/117: write da/ff [770358,44864] 0 2026-03-09T17:30:05.148 INFO:tasks.workunit.client.1.vm09.stdout:7/118: dread - da/d11/d29/f2b zero size 2026-03-09T17:30:05.149 INFO:tasks.workunit.client.1.vm09.stdout:7/119: dread - da/d11/f19 zero size 2026-03-09T17:30:05.150 INFO:tasks.workunit.client.1.vm09.stdout:3/109: unlink f1 0 2026-03-09T17:30:05.154 INFO:tasks.workunit.client.1.vm09.stdout:7/120: dwrite da/d11/f1a [4194304,4194304] 0 2026-03-09T17:30:05.160 INFO:tasks.workunit.client.1.vm09.stdout:3/110: creat d5/f22 x:0 0 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:3/111: stat d5/d6/d12/f15 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:7/121: mkdir da/d11/d2c 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:7/122: read da/d11/f1e [1491446,36602] 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:7/123: mkdir da/d11/d2d 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:7/124: dwrite da/d11/f1f [0,4194304] 0 2026-03-09T17:30:05.179 INFO:tasks.workunit.client.1.vm09.stdout:7/125: dread - da/d11/f19 zero size 2026-03-09T17:30:05.180 INFO:tasks.workunit.client.1.vm09.stdout:7/126: mknod da/d11/d2c/c2e 0 2026-03-09T17:30:05.181 INFO:tasks.workunit.client.1.vm09.stdout:7/127: write da/d11/f1e [465698,121113] 0 2026-03-09T17:30:05.182 INFO:tasks.workunit.client.1.vm09.stdout:7/128: symlink da/d11/l2f 0 2026-03-09T17:30:05.183 INFO:tasks.workunit.client.1.vm09.stdout:7/129: chown da/d11/d2c 6286352 1 2026-03-09T17:30:05.184 INFO:tasks.workunit.client.1.vm09.stdout:7/130: creat da/d11/d2c/f30 x:0 0 0 2026-03-09T17:30:05.185 
INFO:tasks.workunit.client.1.vm09.stdout:7/131: symlink da/d11/d2d/l31 0 2026-03-09T17:30:05.186 INFO:tasks.workunit.client.1.vm09.stdout:7/132: dread - da/d11/d29/f2b zero size 2026-03-09T17:30:05.189 INFO:tasks.workunit.client.1.vm09.stdout:7/133: rename da/f10 to da/d11/d2d/f32 0 2026-03-09T17:30:05.189 INFO:tasks.workunit.client.1.vm09.stdout:7/134: stat da 0 2026-03-09T17:30:05.190 INFO:tasks.workunit.client.1.vm09.stdout:7/135: creat da/d11/d29/f33 x:0 0 0 2026-03-09T17:30:05.194 INFO:tasks.workunit.client.1.vm09.stdout:7/136: dwrite da/d11/f25 [0,4194304] 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.1.vm09.stdout:7/137: mknod da/d11/c34 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/979: write d3/d2c/ffc [3837055,72833] 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/980: creat d3/dad/f13a x:0 0 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/981: mknod d3/d15/d48/da8/c13b 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/982: readlink d3/l4 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/983: mkdir d3/d11/d13c 0 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/984: dread - d3/d6d/f7a zero size 2026-03-09T17:30:05.205 INFO:tasks.workunit.client.0.vm06.stdout:9/985: stat d3/d6d/d9a/cfa 0 2026-03-09T17:30:05.207 INFO:tasks.workunit.client.1.vm09.stdout:8/118: sync 2026-03-09T17:30:05.207 INFO:tasks.workunit.client.1.vm09.stdout:3/112: sync 2026-03-09T17:30:05.208 INFO:tasks.workunit.client.1.vm09.stdout:3/113: symlink d5/d9/l23 0 2026-03-09T17:30:05.209 INFO:tasks.workunit.client.1.vm09.stdout:3/114: chown d5 0 1 2026-03-09T17:30:05.209 INFO:tasks.workunit.client.1.vm09.stdout:8/119: fsync d1/f3 0 2026-03-09T17:30:05.210 INFO:tasks.workunit.client.1.vm09.stdout:3/115: mknod d5/d16/c24 0 2026-03-09T17:30:05.214 INFO:tasks.workunit.client.1.vm09.stdout:3/116: dwrite d5/d9/f1e [0,4194304] 0 2026-03-09T17:30:05.215 
INFO:tasks.workunit.client.0.vm06.stdout:9/986: getdents d3/d6d 0 2026-03-09T17:30:05.216 INFO:tasks.workunit.client.0.vm06.stdout:9/987: readlink d3/d6d/l70 0 2026-03-09T17:30:05.218 INFO:tasks.workunit.client.0.vm06.stdout:9/988: truncate d3/d15/d48/da8/db9/de8/f12d 888287 0 2026-03-09T17:30:05.222 INFO:tasks.workunit.client.0.vm06.stdout:9/989: creat d3/d15/d36/d12a/f13d x:0 0 0 2026-03-09T17:30:05.229 INFO:tasks.workunit.client.0.vm06.stdout:9/990: rename d3/d15/d36/d4c/d6a/d8a/fe0 to d3/d26/d6c/d68/f13e 0 2026-03-09T17:30:05.236 INFO:tasks.workunit.client.0.vm06.stdout:9/991: creat d3/de1/f13f x:0 0 0 2026-03-09T17:30:05.245 INFO:tasks.workunit.client.0.vm06.stdout:6/810: dread d6/d12/d53/dd0/fe3 [0,4194304] 0 2026-03-09T17:30:05.246 INFO:tasks.workunit.client.0.vm06.stdout:6/811: dread - d6/d12/d53/fd5 zero size 2026-03-09T17:30:05.247 INFO:tasks.workunit.client.0.vm06.stdout:6/812: chown d6/d47/f61 2054561 1 2026-03-09T17:30:05.252 INFO:tasks.workunit.client.0.vm06.stdout:6/813: dwrite d6/d12/d53/dd0/fe3 [0,4194304] 0 2026-03-09T17:30:05.257 INFO:tasks.workunit.client.0.vm06.stdout:9/992: dread d3/d11/d65/d80/f11f [0,4194304] 0 2026-03-09T17:30:05.266 INFO:tasks.workunit.client.1.vm09.stdout:7/138: fdatasync da/d11/f1a 0 2026-03-09T17:30:05.266 INFO:tasks.workunit.client.1.vm09.stdout:7/139: dread - da/d11/d29/f33 zero size 2026-03-09T17:30:05.268 INFO:tasks.workunit.client.0.vm06.stdout:9/993: mkdir d3/d15/d48/da8/d140 0 2026-03-09T17:30:05.269 INFO:tasks.workunit.client.1.vm09.stdout:7/140: chown da/d11/c22 107540295 1 2026-03-09T17:30:05.274 INFO:tasks.workunit.client.1.vm09.stdout:7/141: creat da/d11/d2c/f35 x:0 0 0 2026-03-09T17:30:05.283 INFO:tasks.workunit.client.1.vm09.stdout:7/142: dwrite da/d11/d29/f33 [0,4194304] 0 2026-03-09T17:30:05.287 INFO:tasks.workunit.client.1.vm09.stdout:7/143: creat da/f36 x:0 0 0 2026-03-09T17:30:05.291 INFO:tasks.workunit.client.1.vm09.stdout:7/144: write da/f27 [296165,38386] 0 2026-03-09T17:30:05.291 
INFO:tasks.workunit.client.1.vm09.stdout:7/145: readlink da/d11/l24 0 2026-03-09T17:30:05.291 INFO:tasks.workunit.client.1.vm09.stdout:7/146: readlink da/d11/l28 0 2026-03-09T17:30:05.291 INFO:tasks.workunit.client.1.vm09.stdout:7/147: write da/f15 [4285088,126051] 0 2026-03-09T17:30:05.342 INFO:tasks.workunit.client.0.vm06.stdout:2/805: dwrite d3/fc7 [0,4194304] 0 2026-03-09T17:30:05.400 INFO:tasks.workunit.client.1.vm09.stdout:6/88: chown d3/d7/d12 1430 1 2026-03-09T17:30:05.401 INFO:tasks.workunit.client.1.vm09.stdout:6/89: readlink d3/l15 0 2026-03-09T17:30:05.402 INFO:tasks.workunit.client.1.vm09.stdout:6/90: symlink d3/d7/l16 0 2026-03-09T17:30:05.417 INFO:tasks.workunit.client.0.vm06.stdout:3/955: write dd/d1d/f9f [697151,47513] 0 2026-03-09T17:30:05.418 INFO:tasks.workunit.client.0.vm06.stdout:3/956: chown dd/f15 5 1 2026-03-09T17:30:05.421 INFO:tasks.workunit.client.0.vm06.stdout:3/957: mkdir dd/d146 0 2026-03-09T17:30:05.422 INFO:tasks.workunit.client.0.vm06.stdout:3/958: rmdir dd/d1d/d4e 39 2026-03-09T17:30:05.424 INFO:tasks.workunit.client.1.vm09.stdout:4/135: truncate fe 3634629 0 2026-03-09T17:30:05.425 INFO:tasks.workunit.client.0.vm06.stdout:3/959: rename dd/d19/d2c/lcd to dd/d19/d25/l147 0 2026-03-09T17:30:05.426 INFO:tasks.workunit.client.0.vm06.stdout:3/960: truncate dd/d19/d25/d44/d12f/f139 815776 0 2026-03-09T17:30:05.427 INFO:tasks.workunit.client.0.vm06.stdout:3/961: dread - dd/d19/d1e/f136 zero size 2026-03-09T17:30:05.428 INFO:tasks.workunit.client.1.vm09.stdout:4/136: dread d11/d1e/f22 [0,4194304] 0 2026-03-09T17:30:05.428 INFO:tasks.workunit.client.0.vm06.stdout:3/962: rmdir dd/d19/d25/d44/d80/dd7/d120 39 2026-03-09T17:30:05.428 INFO:tasks.workunit.client.1.vm09.stdout:4/137: chown d11/f18 16301 1 2026-03-09T17:30:05.429 INFO:tasks.workunit.client.0.vm06.stdout:3/963: mknod dd/d1d/d6e/c148 0 2026-03-09T17:30:05.431 INFO:tasks.workunit.client.0.vm06.stdout:3/964: mkdir dd/d19/d25/d2d/d9b/df1/d149 0 2026-03-09T17:30:05.431 
INFO:tasks.workunit.client.0.vm06.stdout:3/965: dread - dd/d19/d25/d44/f57 zero size 2026-03-09T17:30:05.436 INFO:tasks.workunit.client.0.vm06.stdout:3/966: rename dd/d1d/d6e/d70/f73 to dd/d81/da3/dae/df8/f14a 0 2026-03-09T17:30:05.436 INFO:tasks.workunit.client.0.vm06.stdout:3/967: read dd/d19/d1e/f10a [297710,120371] 0 2026-03-09T17:30:05.439 INFO:tasks.workunit.client.1.vm09.stdout:4/138: creat d11/d1e/d29/f2c x:0 0 0 2026-03-09T17:30:05.440 INFO:tasks.workunit.client.1.vm09.stdout:4/139: write d11/d1e/d29/f2c [573934,61151] 0 2026-03-09T17:30:05.450 INFO:tasks.workunit.client.0.vm06.stdout:3/968: creat dd/d19/d25/f14b x:0 0 0 2026-03-09T17:30:05.453 INFO:tasks.workunit.client.0.vm06.stdout:3/969: mknod dd/d19/d2c/c14c 0 2026-03-09T17:30:05.454 INFO:tasks.workunit.client.1.vm09.stdout:2/96: dread d13/f1a [0,4194304] 0 2026-03-09T17:30:05.454 INFO:tasks.workunit.client.0.vm06.stdout:3/970: symlink dd/d1d/d6e/d70/l14d 0 2026-03-09T17:30:05.459 INFO:tasks.workunit.client.1.vm09.stdout:5/106: truncate d0/d2/d15/d20/f24 1271704 0 2026-03-09T17:30:05.460 INFO:tasks.workunit.client.1.vm09.stdout:5/107: truncate d0/f22 219865 0 2026-03-09T17:30:05.461 INFO:tasks.workunit.client.1.vm09.stdout:2/97: rmdir d13/d22 0 2026-03-09T17:30:05.461 INFO:tasks.workunit.client.1.vm09.stdout:5/108: write d0/f22 [504536,27391] 0 2026-03-09T17:30:05.461 INFO:tasks.workunit.client.1.vm09.stdout:5/109: dread - d0/de/d17/f1f zero size 2026-03-09T17:30:05.466 INFO:tasks.workunit.client.1.vm09.stdout:2/98: truncate d13/d15/f18 2262802 0 2026-03-09T17:30:05.468 INFO:tasks.workunit.client.1.vm09.stdout:5/110: creat d0/d2/d15/d20/f25 x:0 0 0 2026-03-09T17:30:05.471 INFO:tasks.workunit.client.1.vm09.stdout:5/111: mkdir d0/dc/d21/d26 0 2026-03-09T17:30:05.473 INFO:tasks.workunit.client.1.vm09.stdout:5/112: chown d0/d9/d16/c23 2023028 1 2026-03-09T17:30:05.489 INFO:tasks.workunit.client.0.vm06.stdout:5/883: write d4/d50/d35/f4d [4367758,119128] 0 2026-03-09T17:30:05.493 
INFO:tasks.workunit.client.0.vm06.stdout:5/884: rmdir d4/d50/d35/d40/d96/dfe 39 2026-03-09T17:30:05.495 INFO:tasks.workunit.client.0.vm06.stdout:5/885: creat d4/d52/d55/dee/f13c x:0 0 0 2026-03-09T17:30:05.496 INFO:tasks.workunit.client.0.vm06.stdout:5/886: readlink d4/d50/db2/ld5 0 2026-03-09T17:30:05.500 INFO:tasks.workunit.client.0.vm06.stdout:5/887: mknod d4/d50/d18/c13d 0 2026-03-09T17:30:05.506 INFO:tasks.workunit.client.0.vm06.stdout:5/888: fsync d4/d50/f43 0 2026-03-09T17:30:05.509 INFO:tasks.workunit.client.0.vm06.stdout:8/885: dwrite d15/d39/f7b [0,4194304] 0 2026-03-09T17:30:05.512 INFO:tasks.workunit.client.0.vm06.stdout:5/889: rename d4/dbb to d4/d52/d55/d13e 0 2026-03-09T17:30:05.516 INFO:tasks.workunit.client.0.vm06.stdout:5/890: rmdir d4/da4 39 2026-03-09T17:30:05.532 INFO:tasks.workunit.client.0.vm06.stdout:4/966: write db/d1d/d21/f6e [543005,22586] 0 2026-03-09T17:30:05.560 INFO:tasks.workunit.client.0.vm06.stdout:4/967: sync 2026-03-09T17:30:05.568 INFO:tasks.workunit.client.1.vm09.stdout:0/127: rmdir d6/d1d 39 2026-03-09T17:30:05.570 INFO:tasks.workunit.client.0.vm06.stdout:4/968: truncate db/f51 4704392 0 2026-03-09T17:30:05.570 INFO:tasks.workunit.client.0.vm06.stdout:4/969: chown db/d1d/d21/d25/d4b/c128 0 1 2026-03-09T17:30:05.571 INFO:tasks.workunit.client.1.vm09.stdout:7/148: dwrite da/f12 [0,4194304] 0 2026-03-09T17:30:05.573 INFO:tasks.workunit.client.1.vm09.stdout:0/128: readlink d6/de/l17 0 2026-03-09T17:30:05.590 INFO:tasks.workunit.client.1.vm09.stdout:7/149: unlink da/d11/d2d/l31 0 2026-03-09T17:30:05.592 INFO:tasks.workunit.client.1.vm09.stdout:7/150: dread da/f21 [0,4194304] 0 2026-03-09T17:30:05.593 INFO:tasks.workunit.client.1.vm09.stdout:0/129: mknod d6/d1d/c25 0 2026-03-09T17:30:05.594 INFO:tasks.workunit.client.1.vm09.stdout:0/130: fdatasync d6/f9 0 2026-03-09T17:30:05.595 INFO:tasks.workunit.client.1.vm09.stdout:7/151: symlink da/d11/d29/l37 0 2026-03-09T17:30:05.600 INFO:tasks.workunit.client.0.vm06.stdout:1/884: truncate 
d11/d14/d1c/d3a/fbf 274197 0 2026-03-09T17:30:05.600 INFO:tasks.workunit.client.0.vm06.stdout:1/885: chown d11/d14/d1d/d8c/ld3 20111587 1 2026-03-09T17:30:05.618 INFO:tasks.workunit.client.1.vm09.stdout:3/117: rename d5/d6/d12/d1c to d5/d16/d25 0 2026-03-09T17:30:05.622 INFO:tasks.workunit.client.1.vm09.stdout:9/105: truncate d5/f13 707558 0 2026-03-09T17:30:05.627 INFO:tasks.workunit.client.1.vm09.stdout:5/113: rename d0/d9/d16/l1e to d0/d2/d15/d20/l27 0 2026-03-09T17:30:05.630 INFO:tasks.workunit.client.1.vm09.stdout:3/118: symlink d5/d6/d12/l26 0 2026-03-09T17:30:05.633 INFO:tasks.workunit.client.0.vm06.stdout:1/886: getdents d11/d14/d1d/d1e/d2a/d99 0 2026-03-09T17:30:05.635 INFO:tasks.workunit.client.1.vm09.stdout:7/152: rename da/ff to da/d11/d2c/f38 0 2026-03-09T17:30:05.635 INFO:tasks.workunit.client.0.vm06.stdout:1/887: chown d11/d14/d1c/fe3 1251593 1 2026-03-09T17:30:05.635 INFO:tasks.workunit.client.1.vm09.stdout:7/153: write da/d11/d2c/f30 [334875,48366] 0 2026-03-09T17:30:05.642 INFO:tasks.workunit.client.0.vm06.stdout:1/888: dread d11/d14/d1d/d1e/d2a/f40 [0,4194304] 0 2026-03-09T17:30:05.653 INFO:tasks.workunit.client.1.vm09.stdout:5/114: truncate d0/d2/d15/f1c 2638459 0 2026-03-09T17:30:05.653 INFO:tasks.workunit.client.0.vm06.stdout:1/889: symlink d11/d14/d1d/d1e/dc2/l12e 0 2026-03-09T17:30:05.655 INFO:tasks.workunit.client.0.vm06.stdout:1/890: creat d11/d14/d1d/d4a/f12f x:0 0 0 2026-03-09T17:30:05.657 INFO:tasks.workunit.client.0.vm06.stdout:1/891: dread d11/d69/fad [0,4194304] 0 2026-03-09T17:30:05.658 INFO:tasks.workunit.client.0.vm06.stdout:1/892: read d11/d14/d1d/d1e/d2a/d99/de9/feb [47768,61482] 0 2026-03-09T17:30:05.659 INFO:tasks.workunit.client.1.vm09.stdout:7/154: creat da/d11/d29/f39 x:0 0 0 2026-03-09T17:30:05.660 INFO:tasks.workunit.client.1.vm09.stdout:7/155: creat da/f3a x:0 0 0 2026-03-09T17:30:05.660 INFO:tasks.workunit.client.1.vm09.stdout:7/156: fdatasync da/d11/f1a 0 2026-03-09T17:30:05.661 
INFO:tasks.workunit.client.1.vm09.stdout:7/157: chown da/d11/d2c/f35 0 1 2026-03-09T17:30:05.665 INFO:tasks.workunit.client.1.vm09.stdout:7/158: dwrite da/d11/d29/f2b [0,4194304] 0 2026-03-09T17:30:05.680 INFO:tasks.workunit.client.1.vm09.stdout:8/120: truncate d1/f16 342821 0 2026-03-09T17:30:05.680 INFO:tasks.workunit.client.1.vm09.stdout:8/121: write d1/da/d13/f1d [256005,29644] 0 2026-03-09T17:30:05.681 INFO:tasks.workunit.client.1.vm09.stdout:7/159: mknod da/c3b 0 2026-03-09T17:30:05.683 INFO:tasks.workunit.client.1.vm09.stdout:7/160: dread da/f27 [0,4194304] 0 2026-03-09T17:30:05.684 INFO:tasks.workunit.client.1.vm09.stdout:8/122: rename d1/l1f to d1/da/d13/l29 0 2026-03-09T17:30:05.685 INFO:tasks.workunit.client.0.vm06.stdout:1/893: dread d11/d14/d1d/d42/f52 [0,4194304] 0 2026-03-09T17:30:05.686 INFO:tasks.workunit.client.1.vm09.stdout:7/161: symlink da/d11/l3c 0 2026-03-09T17:30:05.687 INFO:tasks.workunit.client.1.vm09.stdout:7/162: write da/d11/d29/f33 [1361242,109382] 0 2026-03-09T17:30:05.688 INFO:tasks.workunit.client.0.vm06.stdout:1/894: read d11/d14/d1d/d1e/d2a/d34/d64/f8a [2876667,116178] 0 2026-03-09T17:30:05.688 INFO:tasks.workunit.client.1.vm09.stdout:8/123: dwrite d1/da/dd/f10 [4194304,4194304] 0 2026-03-09T17:30:05.692 INFO:tasks.workunit.client.0.vm06.stdout:1/895: dwrite d11/f105 [0,4194304] 0 2026-03-09T17:30:05.694 INFO:tasks.workunit.client.1.vm09.stdout:8/124: mkdir d1/d14/d2a 0 2026-03-09T17:30:05.699 INFO:tasks.workunit.client.0.vm06.stdout:1/896: unlink d11/l32 0 2026-03-09T17:30:05.704 INFO:tasks.workunit.client.1.vm09.stdout:8/125: rename d1/da/dd/f10 to d1/d14/d2a/f2b 0 2026-03-09T17:30:05.710 INFO:tasks.workunit.client.1.vm09.stdout:8/126: mknod d1/d14/d2a/c2c 0 2026-03-09T17:30:05.710 INFO:tasks.workunit.client.1.vm09.stdout:8/127: chown d1/d14/d2a 158 1 2026-03-09T17:30:05.710 INFO:tasks.workunit.client.1.vm09.stdout:8/128: stat d1/da/dd 0 2026-03-09T17:30:05.711 INFO:tasks.workunit.client.1.vm09.stdout:8/129: read d1/da/d13/f21 
[1714540,28551] 0 2026-03-09T17:30:05.716 INFO:tasks.workunit.client.1.vm09.stdout:8/130: chown d1/d26 231083 1 2026-03-09T17:30:05.724 INFO:tasks.workunit.client.1.vm09.stdout:8/131: mknod d1/da/d23/c2d 0 2026-03-09T17:30:05.732 INFO:tasks.workunit.client.1.vm09.stdout:8/132: write d1/f7 [2209523,12978] 0 2026-03-09T17:30:05.735 INFO:tasks.workunit.client.1.vm09.stdout:8/133: link d1/d14/d2a/f2b d1/d14/d2a/f2e 0 2026-03-09T17:30:05.735 INFO:tasks.workunit.client.1.vm09.stdout:8/134: creat d1/d14/f2f x:0 0 0 2026-03-09T17:30:05.736 INFO:tasks.workunit.client.1.vm09.stdout:8/135: symlink d1/d26/l30 0 2026-03-09T17:30:05.769 INFO:tasks.workunit.client.1.vm09.stdout:8/136: sync 2026-03-09T17:30:05.769 INFO:tasks.workunit.client.1.vm09.stdout:8/137: chown d1/da/dd/l15 307258 1 2026-03-09T17:30:05.797 INFO:tasks.workunit.client.0.vm06.stdout:6/814: dwrite d6/d12/d17/f32 [4194304,4194304] 0 2026-03-09T17:30:05.806 INFO:tasks.workunit.client.0.vm06.stdout:6/815: fsync d6/d4f/d3e/f51 0 2026-03-09T17:30:05.808 INFO:tasks.workunit.client.0.vm06.stdout:9/994: write d3/f50 [608563,113961] 0 2026-03-09T17:30:05.809 INFO:tasks.workunit.client.1.vm09.stdout:7/163: fsync da/d11/d2c/f35 0 2026-03-09T17:30:05.809 INFO:tasks.workunit.client.0.vm06.stdout:9/995: chown d3/d6d/d9a/d9c/dcd/l102 6626 1 2026-03-09T17:30:05.809 INFO:tasks.workunit.client.0.vm06.stdout:9/996: readlink d3/d15/d36/d4d/l63 0 2026-03-09T17:30:05.810 INFO:tasks.workunit.client.0.vm06.stdout:9/997: chown d3/d15/d36/d83 1962544151 1 2026-03-09T17:30:05.815 INFO:tasks.workunit.client.1.vm09.stdout:7/164: mkdir da/d11/d29/d3d 0 2026-03-09T17:30:05.819 INFO:tasks.workunit.client.0.vm06.stdout:9/998: creat d3/d26/d35/f141 x:0 0 0 2026-03-09T17:30:05.821 INFO:tasks.workunit.client.0.vm06.stdout:9/999: truncate d3/d6d/d9a/feb 4597689 0 2026-03-09T17:30:05.823 INFO:tasks.workunit.client.0.vm06.stdout:2/806: write d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/fe4 [1039044,47076] 0 2026-03-09T17:30:05.824 
INFO:tasks.workunit.client.0.vm06.stdout:2/807: creat d3/d4/d12/da7/f106 x:0 0 0 2026-03-09T17:30:05.829 INFO:tasks.workunit.client.1.vm09.stdout:6/91: dwrite d3/d7/fe [0,4194304] 0 2026-03-09T17:30:05.831 INFO:tasks.workunit.client.1.vm09.stdout:6/92: symlink d3/l17 0 2026-03-09T17:30:05.836 INFO:tasks.workunit.client.0.vm06.stdout:6/816: dread d6/d47/d96/da1/fb7 [0,4194304] 0 2026-03-09T17:30:05.847 INFO:tasks.workunit.client.0.vm06.stdout:2/808: dread d3/d4/f52 [0,4194304] 0 2026-03-09T17:30:05.847 INFO:tasks.workunit.client.0.vm06.stdout:2/809: truncate d3/f3b 4315008 0 2026-03-09T17:30:05.852 INFO:tasks.workunit.client.0.vm06.stdout:6/817: symlink d6/d4f/d3e/d52/d8c/db0/lfd 0 2026-03-09T17:30:05.854 INFO:tasks.workunit.client.1.vm09.stdout:6/93: creat d3/d7/f18 x:0 0 0 2026-03-09T17:30:05.854 INFO:tasks.workunit.client.1.vm09.stdout:4/140: write d11/f1f [2522924,7324] 0 2026-03-09T17:30:05.857 INFO:tasks.workunit.client.0.vm06.stdout:2/810: truncate d3/d4/f9c 1417293 0 2026-03-09T17:30:05.863 INFO:tasks.workunit.client.1.vm09.stdout:6/94: readlink d3/d7/ld 0 2026-03-09T17:30:05.863 INFO:tasks.workunit.client.0.vm06.stdout:3/971: write dd/d81/da3/dae/fbb [1381740,6297] 0 2026-03-09T17:30:05.876 INFO:tasks.workunit.client.1.vm09.stdout:2/99: truncate d13/f14 4108181 0 2026-03-09T17:30:05.880 INFO:tasks.workunit.client.1.vm09.stdout:6/95: getdents d3/d7/d12 0 2026-03-09T17:30:05.882 INFO:tasks.workunit.client.0.vm06.stdout:6/818: rename d6/c41 to d6/d12/d17/d85/cfe 0 2026-03-09T17:30:05.882 INFO:tasks.workunit.client.0.vm06.stdout:6/819: readlink d6/d47/l58 0 2026-03-09T17:30:05.883 INFO:tasks.workunit.client.0.vm06.stdout:6/820: read d6/f5c [366946,108129] 0 2026-03-09T17:30:05.886 INFO:tasks.workunit.client.1.vm09.stdout:4/141: link d11/f18 d11/f2d 0 2026-03-09T17:30:05.888 INFO:tasks.workunit.client.0.vm06.stdout:3/972: creat dd/d81/da3/dae/df8/dff/f14e x:0 0 0 2026-03-09T17:30:05.891 INFO:tasks.workunit.client.1.vm09.stdout:1/116: truncate f3 637195 0 
2026-03-09T17:30:05.891 INFO:tasks.workunit.client.1.vm09.stdout:2/100: write f6 [1876211,35796] 0 2026-03-09T17:30:05.891 INFO:tasks.workunit.client.1.vm09.stdout:1/117: fdatasync d9/dc/d15/f1a 0 2026-03-09T17:30:05.893 INFO:tasks.workunit.client.1.vm09.stdout:1/118: dread d9/dc/dd/fe [0,4194304] 0 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: Active manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: Activating manager daemon vm09.lqzvkh 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: mgrmap e21: vm09.lqzvkh(active, starting, since 0.0122543s) 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local 
ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: 
from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:05 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:05.895 INFO:tasks.workunit.client.0.vm06.stdout:2/811: creat d3/d4/d46/f107 x:0 0 0 2026-03-09T17:30:05.902 INFO:tasks.workunit.client.1.vm09.stdout:6/96: chown d3/d7/f10 7694533 1 
2026-03-09T17:30:05.906 INFO:tasks.workunit.client.1.vm09.stdout:6/97: dwrite f2 [0,4194304] 0 2026-03-09T17:30:05.906 INFO:tasks.workunit.client.1.vm09.stdout:6/98: write d3/fc [91922,15271] 0 2026-03-09T17:30:05.906 INFO:tasks.workunit.client.0.vm06.stdout:8/886: dwrite f7 [0,4194304] 0 2026-03-09T17:30:05.918 INFO:tasks.workunit.client.1.vm09.stdout:4/142: creat d11/d1e/d29/f2e x:0 0 0 2026-03-09T17:30:05.921 INFO:tasks.workunit.client.0.vm06.stdout:5/891: dwrite d4/d50/d18/f4a [0,4194304] 0 2026-03-09T17:30:05.926 INFO:tasks.workunit.client.0.vm06.stdout:4/970: dwrite db/d1d/d21/d108/f126 [4194304,4194304] 0 2026-03-09T17:30:05.943 INFO:tasks.workunit.client.0.vm06.stdout:3/973: dread dd/d1d/f99 [0,4194304] 0 2026-03-09T17:30:05.944 INFO:tasks.workunit.client.0.vm06.stdout:2/812: symlink d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/l108 0 2026-03-09T17:30:05.947 INFO:tasks.workunit.client.1.vm09.stdout:2/101: fsync d13/f1a 0 2026-03-09T17:30:05.965 INFO:tasks.workunit.client.0.vm06.stdout:8/887: creat d15/d39/d67/d77/d97/f122 x:0 0 0 2026-03-09T17:30:05.968 INFO:tasks.workunit.client.0.vm06.stdout:4/971: read db/d1d/d21/d37/fbd [526619,126803] 0 2026-03-09T17:30:05.970 INFO:tasks.workunit.client.1.vm09.stdout:1/119: mkdir d9/dc/d15/d21 0 2026-03-09T17:30:05.971 INFO:tasks.workunit.client.1.vm09.stdout:7/165: getdents da/d11/d2c 0 2026-03-09T17:30:05.979 INFO:tasks.workunit.client.0.vm06.stdout:3/974: rmdir dd/d5b 39 2026-03-09T17:30:05.979 INFO:tasks.workunit.client.0.vm06.stdout:3/975: chown dd/d19/d2c/fe9 9 1 2026-03-09T17:30:05.979 INFO:tasks.workunit.client.1.vm09.stdout:1/120: dwrite d9/dc/d15/d1d/f1e [0,4194304] 0 2026-03-09T17:30:05.979 INFO:tasks.workunit.client.1.vm09.stdout:9/106: write f2 [332713,12107] 0 2026-03-09T17:30:05.979 INFO:tasks.workunit.client.1.vm09.stdout:9/107: fdatasync d5/de/f18 0 2026-03-09T17:30:05.980 INFO:tasks.workunit.client.1.vm09.stdout:4/143: creat d11/d1e/d29/f2f x:0 0 0 2026-03-09T17:30:05.980 
INFO:tasks.workunit.client.1.vm09.stdout:7/166: dwrite da/f3a [0,4194304] 0 2026-03-09T17:30:05.981 INFO:tasks.workunit.client.1.vm09.stdout:0/131: unlink d6/f8 0 2026-03-09T17:30:05.982 INFO:tasks.workunit.client.1.vm09.stdout:0/132: write d6/f21 [426171,113819] 0 2026-03-09T17:30:05.999 INFO:tasks.workunit.client.0.vm06.stdout:5/892: creat d4/da4/f13f x:0 0 0 2026-03-09T17:30:06.003 INFO:tasks.workunit.client.0.vm06.stdout:1/897: dwrite d11/d14/d1d/d1e/d2a/d34/f3b [4194304,4194304] 0 2026-03-09T17:30:06.006 INFO:tasks.workunit.client.0.vm06.stdout:3/976: fdatasync dd/d19/d1e/f3f 0 2026-03-09T17:30:06.008 INFO:tasks.workunit.client.0.vm06.stdout:8/888: fdatasync d15/d16/d1e/f59 0 2026-03-09T17:30:06.010 INFO:tasks.workunit.client.1.vm09.stdout:8/138: rename d1/d26 to d1/d14/d31 0 2026-03-09T17:30:06.014 INFO:tasks.workunit.client.1.vm09.stdout:1/121: mkdir d9/dc/d15/d22 0 2026-03-09T17:30:06.016 INFO:tasks.workunit.client.1.vm09.stdout:1/122: read f2 [1543723,101095] 0 2026-03-09T17:30:06.017 INFO:tasks.workunit.client.0.vm06.stdout:4/972: sync 2026-03-09T17:30:06.018 INFO:tasks.workunit.client.1.vm09.stdout:1/123: dread d9/dc/d15/f1a [0,4194304] 0 2026-03-09T17:30:06.020 INFO:tasks.workunit.client.1.vm09.stdout:9/108: dwrite d5/f1b [0,4194304] 0 2026-03-09T17:30:06.020 INFO:tasks.workunit.client.1.vm09.stdout:1/124: stat d9/dc/d15/d22 0 2026-03-09T17:30:06.021 INFO:tasks.workunit.client.1.vm09.stdout:4/144: mkdir d11/d1e/d30 0 2026-03-09T17:30:06.065 INFO:tasks.workunit.client.0.vm06.stdout:3/977: dread dd/d81/da3/dae/fcb [0,4194304] 0 2026-03-09T17:30:06.076 INFO:tasks.workunit.client.1.vm09.stdout:7/167: mkdir da/d11/d3e 0 2026-03-09T17:30:06.077 INFO:tasks.workunit.client.1.vm09.stdout:2/102: getdents d13/d15/d21 0 2026-03-09T17:30:06.085 INFO:tasks.workunit.client.1.vm09.stdout:6/99: rmdir d3/d7/d12 0 2026-03-09T17:30:06.094 INFO:tasks.workunit.client.0.vm06.stdout:6/821: dwrite d6/d47/f49 [0,4194304] 0 2026-03-09T17:30:06.097 
INFO:tasks.workunit.client.1.vm09.stdout:5/115: write d0/d2/d15/f1c [3377111,103319] 0 2026-03-09T17:30:06.100 INFO:tasks.workunit.client.0.vm06.stdout:2/813: dwrite d3/d4/d22/d72/d8f/f95 [0,4194304] 0 2026-03-09T17:30:06.120 INFO:tasks.workunit.client.1.vm09.stdout:8/139: mkdir d1/d14/d1b/d32 0 2026-03-09T17:30:06.120 INFO:tasks.workunit.client.0.vm06.stdout:8/889: truncate d15/d31/dc5/df1/d3d/f6a 267927 0 2026-03-09T17:30:06.124 INFO:tasks.workunit.client.0.vm06.stdout:3/978: truncate dd/d19/d28/f6f 1635884 0 2026-03-09T17:30:06.126 INFO:tasks.workunit.client.1.vm09.stdout:9/109: creat d5/de/f2d x:0 0 0 2026-03-09T17:30:06.128 INFO:tasks.workunit.client.1.vm09.stdout:4/145: mkdir d11/d1e/d31 0 2026-03-09T17:30:06.131 INFO:tasks.workunit.client.0.vm06.stdout:6/822: chown d6/d12/d17/f78 1353003982 1 2026-03-09T17:30:06.143 INFO:tasks.workunit.client.0.vm06.stdout:1/898: readlink d11/d14/d1d/d4a/ld9 0 2026-03-09T17:30:06.143 INFO:tasks.workunit.client.1.vm09.stdout:0/133: symlink d6/l26 0 2026-03-09T17:30:06.143 INFO:tasks.workunit.client.1.vm09.stdout:0/134: truncate d6/f21 686530 0 2026-03-09T17:30:06.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: Active manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:06.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: Activating manager daemon vm09.lqzvkh 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.? 
192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: mgrmap e21: vm09.lqzvkh(active, starting, since 0.0122543s) 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:06.144 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local 
ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:06.144 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:05 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:06.147 INFO:tasks.workunit.client.1.vm09.stdout:6/100: dwrite d3/d7/ff [0,4194304] 0 2026-03-09T17:30:06.151 INFO:tasks.workunit.client.0.vm06.stdout:6/823: dread d6/d47/d96/f7e [0,4194304] 0 2026-03-09T17:30:06.153 INFO:tasks.workunit.client.0.vm06.stdout:8/890: fdatasync d15/d31/dc5/df1/d3d/f4c 0 2026-03-09T17:30:06.156 INFO:tasks.workunit.client.0.vm06.stdout:3/979: symlink dd/d19/d1e/db8/d143/l14f 0 2026-03-09T17:30:06.160 INFO:tasks.workunit.client.1.vm09.stdout:4/146: creat d11/d1e/d29/f32 x:0 0 0 2026-03-09T17:30:06.166 INFO:tasks.workunit.client.0.vm06.stdout:8/891: link d15/d39/d67/d77/d99/f115 d15/d39/d67/de3/f123 0 2026-03-09T17:30:06.168 INFO:tasks.workunit.client.1.vm09.stdout:5/116: creat d0/dc/d21/d26/f28 x:0 0 0 2026-03-09T17:30:06.168 INFO:tasks.workunit.client.0.vm06.stdout:6/824: creat d6/d12/d53/fff x:0 0 0 2026-03-09T17:30:06.169 
INFO:tasks.workunit.client.1.vm09.stdout:5/117: write d0/ff [2064214,44394] 0
2026-03-09T17:30:06.169 INFO:tasks.workunit.client.1.vm09.stdout:6/101: dwrite d3/d7/fe [0,4194304] 0
2026-03-09T17:30:06.170 INFO:tasks.workunit.client.0.vm06.stdout:8/892: fdatasync d15/d16/d1e/d30/f3b 0
2026-03-09T17:30:06.171 INFO:tasks.workunit.client.1.vm09.stdout:6/102: write d3/fc [898693,24398] 0
2026-03-09T17:30:06.175 INFO:tasks.workunit.client.0.vm06.stdout:6/825: creat d6/d12/d53/dd0/f100 x:0 0 0
2026-03-09T17:30:06.180 INFO:tasks.workunit.client.0.vm06.stdout:8/893: symlink d15/d39/d67/d77/d97/dac/l124 0
2026-03-09T17:30:06.181 INFO:tasks.workunit.client.0.vm06.stdout:6/826: symlink d6/d12/d53/d8f/l101 0
2026-03-09T17:30:06.182 INFO:tasks.workunit.client.0.vm06.stdout:6/827: fsync d6/d4f/d3e/d52/d8c/db0/fdb 0
2026-03-09T17:30:06.183 INFO:tasks.workunit.client.0.vm06.stdout:6/828: truncate d6/d12/d53/d91/dbf/ffa 407772 0
2026-03-09T17:30:06.192 INFO:tasks.workunit.client.0.vm06.stdout:5/893: creat d4/d50/d18/f140 x:0 0 0
2026-03-09T17:30:06.196 INFO:tasks.workunit.client.0.vm06.stdout:8/894: creat d15/d39/d67/d77/d99/f125 x:0 0 0
2026-03-09T17:30:06.197 INFO:tasks.workunit.client.1.vm09.stdout:5/118: sync
2026-03-09T17:30:06.199 INFO:tasks.workunit.client.0.vm06.stdout:6/829: symlink d6/d12/d53/dd0/l102 0
2026-03-09T17:30:06.202 INFO:tasks.workunit.client.0.vm06.stdout:6/830: dwrite d6/d12/ff9 [0,4194304] 0
2026-03-09T17:30:06.206 INFO:tasks.workunit.client.0.vm06.stdout:6/831: dwrite d6/d12/d2d/f39 [0,4194304] 0
2026-03-09T17:30:06.220 INFO:tasks.workunit.client.1.vm09.stdout:6/103: creat d3/f19 x:0 0 0
2026-03-09T17:30:06.228 INFO:tasks.workunit.client.1.vm09.stdout:7/168: getdents da 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:5/119: creat d0/dc/d21/f29 x:0 0 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:8/140: getdents d1/d14 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:7/169: dread da/d11/f1f [0,4194304] 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:8/141: fsync d1/d14/d1b/f1c 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:8/142: dwrite d1/f7 [0,4194304] 0
2026-03-09T17:30:06.235 INFO:tasks.workunit.client.1.vm09.stdout:8/143: chown d1/d14/d31 414739 1
2026-03-09T17:30:06.242 INFO:tasks.workunit.client.1.vm09.stdout:6/104: creat d3/f1a x:0 0 0
2026-03-09T17:30:06.242 INFO:tasks.workunit.client.1.vm09.stdout:6/105: fsync d3/d7/f10 0
2026-03-09T17:30:06.242 INFO:tasks.workunit.client.1.vm09.stdout:6/106: chown d3 2524 1
2026-03-09T17:30:06.243 INFO:tasks.workunit.client.1.vm09.stdout:4/147: creat d11/f33 x:0 0 0
2026-03-09T17:30:06.243 INFO:tasks.workunit.client.0.vm06.stdout:8/895: link d15/ce0 d15/d31/d58/dc9/c126 0
2026-03-09T17:30:06.246 INFO:tasks.workunit.client.1.vm09.stdout:5/120: creat d0/d2/f2a x:0 0 0
2026-03-09T17:30:06.246 INFO:tasks.workunit.client.1.vm09.stdout:5/121: readlink d0/dc/ld 0
2026-03-09T17:30:06.247 INFO:tasks.workunit.client.1.vm09.stdout:5/122: read - d0/dc/d21/d26/f28 zero size
2026-03-09T17:30:06.247 INFO:tasks.workunit.client.1.vm09.stdout:6/107: mknod d3/c1b 0
2026-03-09T17:30:06.254 INFO:tasks.workunit.client.1.vm09.stdout:5/123: mknod d0/d2/d15/c2b 0
2026-03-09T17:30:06.254 INFO:tasks.workunit.client.1.vm09.stdout:5/124: stat d0/l3 0
2026-03-09T17:30:06.254 INFO:tasks.workunit.client.1.vm09.stdout:7/170: sync
2026-03-09T17:30:06.255 INFO:tasks.workunit.client.1.vm09.stdout:4/148: sync
2026-03-09T17:30:06.258 INFO:tasks.workunit.client.1.vm09.stdout:5/125: symlink d0/d9/d16/l2c 0
2026-03-09T17:30:06.261 INFO:tasks.workunit.client.1.vm09.stdout:5/126: dread d0/ff [0,4194304] 0
2026-03-09T17:30:06.261 INFO:tasks.workunit.client.1.vm09.stdout:5/127: stat d0/c10 0
2026-03-09T17:30:06.262 INFO:tasks.workunit.client.1.vm09.stdout:5/128: truncate d0/d2/f2a 682800 0
2026-03-09T17:30:06.263 INFO:tasks.workunit.client.1.vm09.stdout:6/108: mknod d3/d7/c1c 0
2026-03-09T17:30:06.267 INFO:tasks.workunit.client.1.vm09.stdout:8/144: getdents d1 0
2026-03-09T17:30:06.268 INFO:tasks.workunit.client.1.vm09.stdout:1/125: dwrite f3 [0,4194304] 0
2026-03-09T17:30:06.274 INFO:tasks.workunit.client.1.vm09.stdout:7/171: getdents da/d11/d29/d3d 0
2026-03-09T17:30:06.275 INFO:tasks.workunit.client.1.vm09.stdout:7/172: chown da/f15 25276 1
2026-03-09T17:30:06.275 INFO:tasks.workunit.client.1.vm09.stdout:7/173: read da/f26 [132068,5167] 0
2026-03-09T17:30:06.279 INFO:tasks.workunit.client.1.vm09.stdout:4/149: creat d11/d1e/d30/f34 x:0 0 0
2026-03-09T17:30:06.293 INFO:tasks.workunit.client.1.vm09.stdout:1/126: symlink d9/dc/d15/l23 0
2026-03-09T17:30:06.293 INFO:tasks.workunit.client.1.vm09.stdout:1/127: readlink l7 0
2026-03-09T17:30:06.293 INFO:tasks.workunit.client.1.vm09.stdout:8/145: dwrite d1/da/f12 [0,4194304] 0
2026-03-09T17:30:06.295 INFO:tasks.workunit.client.0.vm06.stdout:8/896: read d15/d16/d6d/f89 [3165602,52627] 0
2026-03-09T17:30:06.302 INFO:tasks.workunit.client.1.vm09.stdout:7/174: unlink da/c1d 0
2026-03-09T17:30:06.306 INFO:tasks.workunit.client.1.vm09.stdout:4/150: mkdir d11/d1e/d30/d35 0
2026-03-09T17:30:06.312 INFO:tasks.workunit.client.0.vm06.stdout:8/897: rename d15/d39/d3c/c69 to d15/d31/dc5/df1/d3d/d5f/dd4/c127 0
2026-03-09T17:30:06.321 INFO:tasks.workunit.client.0.vm06.stdout:2/814: write d3/d4/d22/f28 [1476039,114376] 0
2026-03-09T17:30:06.326 INFO:tasks.workunit.client.1.vm09.stdout:8/146: chown d1/da/d13/l29 8476 1
2026-03-09T17:30:06.327 INFO:tasks.workunit.client.1.vm09.stdout:7/175: creat da/d11/f3f x:0 0 0
2026-03-09T17:30:06.327 INFO:tasks.workunit.client.1.vm09.stdout:8/147: fdatasync d1/da/dd/f22 0
2026-03-09T17:30:06.329 INFO:tasks.workunit.client.1.vm09.stdout:4/151: mkdir d11/d1e/d29/d36 0
2026-03-09T17:30:06.330 INFO:tasks.workunit.client.0.vm06.stdout:8/898: symlink d15/d31/dc5/df1/d3d/d5f/d83/dc1/l128 0
2026-03-09T17:30:06.330 INFO:tasks.workunit.client.1.vm09.stdout:7/176: dwrite da/d11/f1a [4194304,4194304] 0
2026-03-09T17:30:06.333 INFO:tasks.workunit.client.1.vm09.stdout:7/177: write da/f1c [1333166,58717] 0
2026-03-09T17:30:06.341 INFO:tasks.workunit.client.1.vm09.stdout:8/148: creat d1/f33 x:0 0 0
2026-03-09T17:30:06.341 INFO:tasks.workunit.client.1.vm09.stdout:6/109: link d3/d7/c1c d3/d7/c1d 0
2026-03-09T17:30:06.342 INFO:tasks.workunit.client.1.vm09.stdout:6/110: write f2 [3523046,109075] 0
2026-03-09T17:30:06.342 INFO:tasks.workunit.client.0.vm06.stdout:8/899: symlink d15/d31/de2/l129 0
2026-03-09T17:30:06.343 INFO:tasks.workunit.client.1.vm09.stdout:4/152: read fe [3528319,58132] 0
2026-03-09T17:30:06.343 INFO:tasks.workunit.client.1.vm09.stdout:4/153: readlink d11/d1e/l27 0
2026-03-09T17:30:06.344 INFO:tasks.workunit.client.1.vm09.stdout:4/154: write d11/f13 [188230,17996] 0
2026-03-09T17:30:06.344 INFO:tasks.workunit.client.1.vm09.stdout:7/178: mknod da/d11/c40 0
2026-03-09T17:30:06.346 INFO:tasks.workunit.client.0.vm06.stdout:8/900: dwrite d15/d16/d1e/d30/db8/d5e/f98 [0,4194304] 0
2026-03-09T17:30:06.347 INFO:tasks.workunit.client.0.vm06.stdout:8/901: stat d15/d31/de2/l129 0
2026-03-09T17:30:06.351 INFO:tasks.workunit.client.1.vm09.stdout:6/111: mkdir d3/d1e 0
2026-03-09T17:30:06.359 INFO:tasks.workunit.client.1.vm09.stdout:6/112: creat d3/f1f x:0 0 0
2026-03-09T17:30:06.359 INFO:tasks.workunit.client.1.vm09.stdout:7/179: getdents da/d11/d3e 0
2026-03-09T17:30:06.360 INFO:tasks.workunit.client.0.vm06.stdout:4/973: rmdir db/d59 39
2026-03-09T17:30:06.362 INFO:tasks.workunit.client.0.vm06.stdout:8/902: mkdir d15/d16/d12a 0
2026-03-09T17:30:06.364 INFO:tasks.workunit.client.1.vm09.stdout:7/180: rename da/d11/d2c to da/d11/d41 0
2026-03-09T17:30:06.366 INFO:tasks.workunit.client.1.vm09.stdout:7/181: dread da/d11/d29/f2b [0,4194304] 0
2026-03-09T17:30:06.367 INFO:tasks.workunit.client.1.vm09.stdout:7/182: truncate da/d11/f19 537004 0
2026-03-09T17:30:06.367 INFO:tasks.workunit.client.1.vm09.stdout:7/183: write da/fb [1185757,11373] 0
2026-03-09T17:30:06.368 INFO:tasks.workunit.client.1.vm09.stdout:6/113: creat d3/d1e/f20 x:0 0 0
2026-03-09T17:30:06.368 INFO:tasks.workunit.client.1.vm09.stdout:6/114: stat d3/f1a 0
2026-03-09T17:30:06.370 INFO:tasks.workunit.client.0.vm06.stdout:1/899: write d11/d14/d1c/d3a/fc5 [987706,38283] 0
2026-03-09T17:30:06.372 INFO:tasks.workunit.client.1.vm09.stdout:6/115: dwrite d3/d7/fe [4194304,4194304] 0
2026-03-09T17:30:06.376 INFO:tasks.workunit.client.0.vm06.stdout:1/900: read - d11/de0/f10c zero size
2026-03-09T17:30:06.378 INFO:tasks.workunit.client.0.vm06.stdout:1/901: truncate f7 3558262 0
2026-03-09T17:30:06.379 INFO:tasks.workunit.client.1.vm09.stdout:0/135: write d6/de/ff [2660027,3504] 0
2026-03-09T17:30:06.382 INFO:tasks.workunit.client.1.vm09.stdout:7/184: fdatasync da/f1c 0
2026-03-09T17:30:06.382 INFO:tasks.workunit.client.1.vm09.stdout:0/136: rename d6/f16 to d6/f27 0
2026-03-09T17:30:06.383 INFO:tasks.workunit.client.1.vm09.stdout:7/185: write f3 [1518851,81339] 0
2026-03-09T17:30:06.384 INFO:tasks.workunit.client.1.vm09.stdout:7/186: creat da/d11/d29/f42 x:0 0 0
2026-03-09T17:30:06.384 INFO:tasks.workunit.client.1.vm09.stdout:7/187: chown da/d11/d29/l37 127207 1
2026-03-09T17:30:06.386 INFO:tasks.workunit.client.1.vm09.stdout:7/188: mknod da/c43 0
2026-03-09T17:30:06.387 INFO:tasks.workunit.client.1.vm09.stdout:7/189: fsync da/d11/d2d/f32 0
2026-03-09T17:30:06.404 INFO:tasks.workunit.client.1.vm09.stdout:9/110: dwrite d5/f13 [0,4194304] 0
2026-03-09T17:30:06.406 INFO:tasks.workunit.client.1.vm09.stdout:9/111: mkdir d5/d2e 0
2026-03-09T17:30:06.409 INFO:tasks.workunit.client.1.vm09.stdout:9/112: link d5/f1c d5/d21/f2f 0
2026-03-09T17:30:06.409 INFO:tasks.workunit.client.1.vm09.stdout:9/113: chown d5/f1e 27805 1
2026-03-09T17:30:06.410 INFO:tasks.workunit.client.1.vm09.stdout:9/114: chown d5/d21/f2b 233651 1
2026-03-09T17:30:06.410 INFO:tasks.workunit.client.1.vm09.stdout:9/115: stat f2 0
2026-03-09T17:30:06.410 INFO:tasks.workunit.client.1.vm09.stdout:9/116: truncate d5/d21/f26 490614 0
2026-03-09T17:30:06.411 INFO:tasks.workunit.client.1.vm09.stdout:9/117: write d5/d21/f2a [926072,16319] 0
2026-03-09T17:30:06.412 INFO:tasks.workunit.client.1.vm09.stdout:9/118: creat d5/d21/f30 x:0 0 0
2026-03-09T17:30:06.414 INFO:tasks.workunit.client.1.vm09.stdout:9/119: mknod d5/c31 0
2026-03-09T17:30:06.414 INFO:tasks.workunit.client.1.vm09.stdout:9/120: write d5/f13 [2916283,116919] 0
2026-03-09T17:30:06.424 INFO:tasks.workunit.client.1.vm09.stdout:9/121: getdents d5/d2e 0
2026-03-09T17:30:06.424 INFO:tasks.workunit.client.0.vm06.stdout:3/980: creat dd/d19/d25/d44/f150 x:0 0 0
2026-03-09T17:30:06.424 INFO:tasks.workunit.client.1.vm09.stdout:9/122: chown f2 11652398 1
2026-03-09T17:30:06.426 INFO:tasks.workunit.client.1.vm09.stdout:9/123: symlink d5/l32 0
2026-03-09T17:30:06.431 INFO:tasks.workunit.client.1.vm09.stdout:9/124: dwrite d5/f1e [0,4194304] 0
2026-03-09T17:30:06.440 INFO:tasks.workunit.client.1.vm09.stdout:9/125: mkdir d5/de/d29/d33 0
2026-03-09T17:30:06.441 INFO:tasks.workunit.client.0.vm06.stdout:4/974: dread db/d59/d5f/d45/f8e [0,4194304] 0
2026-03-09T17:30:06.444 INFO:tasks.workunit.client.0.vm06.stdout:4/975: mknod db/d1d/d21/d37/d69/d78/da0/db6/d12c/c153 0
2026-03-09T17:30:06.449 INFO:tasks.workunit.client.1.vm09.stdout:9/126: dread d5/de/f20 [0,4194304] 0
2026-03-09T17:30:06.452 INFO:tasks.workunit.client.1.vm09.stdout:9/127: rename d5/f1c to d5/f34 0
2026-03-09T17:30:06.452 INFO:tasks.workunit.client.1.vm09.stdout:9/128: write d5/f1d [2608033,11583] 0
2026-03-09T17:30:06.453 INFO:tasks.workunit.client.1.vm09.stdout:9/129: write d5/f1e [3618519,92980] 0
2026-03-09T17:30:06.458 INFO:tasks.workunit.client.1.vm09.stdout:9/130: creat d5/de/d29/f35 x:0 0 0
2026-03-09T17:30:06.462 INFO:tasks.workunit.client.1.vm09.stdout:9/131: creat d5/de/d29/f36 x:0 0 0
2026-03-09T17:30:06.485 INFO:tasks.workunit.client.0.vm06.stdout:5/894: write d4/f49 [1318620,81042] 0
2026-03-09T17:30:06.486 INFO:tasks.workunit.client.0.vm06.stdout:5/895: chown d4/d22/d46/f93 238929 1
2026-03-09T17:30:06.487 INFO:tasks.workunit.client.0.vm06.stdout:5/896: dread - d4/d50/dd6/ffa zero size
2026-03-09T17:30:06.491 INFO:tasks.workunit.client.0.vm06.stdout:6/832: write d6/d47/d4d/d9a/da2/ff4 [356829,39028] 0
2026-03-09T17:30:06.493 INFO:tasks.workunit.client.0.vm06.stdout:6/833: symlink d6/d47/dd7/l103 0
2026-03-09T17:30:06.497 INFO:tasks.workunit.client.1.vm09.stdout:2/103: rename f8 to d13/f23 0
2026-03-09T17:30:06.498 INFO:tasks.workunit.client.1.vm09.stdout:2/104: write fd [550412,52355] 0
2026-03-09T17:30:06.499 INFO:tasks.workunit.client.1.vm09.stdout:2/105: stat d13/c16 0
2026-03-09T17:30:06.502 INFO:tasks.workunit.client.1.vm09.stdout:3/119: rename d5/l13 to d5/d16/l27 0
2026-03-09T17:30:06.504 INFO:tasks.workunit.client.1.vm09.stdout:3/120: dread - d5/d6/d12/f1d zero size
2026-03-09T17:30:06.508 INFO:tasks.workunit.client.1.vm09.stdout:5/129: rename d0/l5 to d0/d9/l2d 0
2026-03-09T17:30:06.509 INFO:tasks.workunit.client.1.vm09.stdout:0/137: rename d6/de/c1c to d6/d1d/d24/c28 0
2026-03-09T17:30:06.510 INFO:tasks.workunit.client.1.vm09.stdout:0/138: chown d6/de/c1a 87902 1
2026-03-09T17:30:06.513 INFO:tasks.workunit.client.1.vm09.stdout:9/132: rename d5/de/f18 to d5/de/d29/f37 0
2026-03-09T17:30:06.513 INFO:tasks.workunit.client.1.vm09.stdout:9/133: fdatasync d5/d21/f2b 0
2026-03-09T17:30:06.515 INFO:tasks.workunit.client.1.vm09.stdout:0/139: chown d6/d1d/l22 4214 1
2026-03-09T17:30:06.516 INFO:tasks.workunit.client.1.vm09.stdout:0/140: dread - d6/f1b zero size
2026-03-09T17:30:06.517 INFO:tasks.workunit.client.1.vm09.stdout:9/134: creat d5/d21/f38 x:0 0 0
2026-03-09T17:30:06.519 INFO:tasks.workunit.client.1.vm09.stdout:5/130: link d0/dc/l18 d0/dc/l2e 0
2026-03-09T17:30:06.522 INFO:tasks.workunit.client.1.vm09.stdout:9/135: dwrite d5/d21/f26 [0,4194304] 0
2026-03-09T17:30:06.523 INFO:tasks.workunit.client.1.vm09.stdout:0/141: mknod d6/c29 0
2026-03-09T17:30:06.525 INFO:tasks.workunit.client.1.vm09.stdout:9/136: mknod d5/de/c39 0
2026-03-09T17:30:06.526 INFO:tasks.workunit.client.1.vm09.stdout:5/131: dwrite d0/d2/f2a [0,4194304] 0
2026-03-09T17:30:06.528 INFO:tasks.workunit.client.1.vm09.stdout:0/142: write d6/f27 [3007630,22561] 0
2026-03-09T17:30:06.536 INFO:tasks.workunit.client.1.vm09.stdout:9/137: mknod d5/d2e/c3a 0
2026-03-09T17:30:06.538 INFO:tasks.workunit.client.1.vm09.stdout:0/143: mkdir d6/d2a 0
2026-03-09T17:30:06.539 INFO:tasks.workunit.client.1.vm09.stdout:0/144: symlink d6/d1d/d24/l2b 0
2026-03-09T17:30:06.548 INFO:tasks.workunit.client.1.vm09.stdout:1/128: write f3 [4683148,104992] 0
2026-03-09T17:30:06.551 INFO:tasks.workunit.client.1.vm09.stdout:1/129: dwrite d9/dc/d15/f1a [0,4194304] 0
2026-03-09T17:30:06.554 INFO:tasks.workunit.client.1.vm09.stdout:1/130: rename d9/dc/d15/c1c to d9/dc/d15/c24 0
2026-03-09T17:30:06.554 INFO:tasks.workunit.client.1.vm09.stdout:1/131: chown d9/dc/d15/d1d/c20 1198 1
2026-03-09T17:30:06.558 INFO:tasks.workunit.client.1.vm09.stdout:1/132: creat d9/dc/d15/f25 x:0 0 0
2026-03-09T17:30:06.558 INFO:tasks.workunit.client.1.vm09.stdout:1/133: mknod d9/dc/d15/d22/c26 0
2026-03-09T17:30:06.558 INFO:tasks.workunit.client.1.vm09.stdout:1/134: symlink d9/dc/d15/l27 0
2026-03-09T17:30:06.561 INFO:tasks.workunit.client.1.vm09.stdout:1/135: dread d9/dc/d15/f1a [0,4194304] 0
2026-03-09T17:30:06.562 INFO:tasks.workunit.client.1.vm09.stdout:1/136: creat d9/dc/dd/f28 x:0 0 0
2026-03-09T17:30:06.567 INFO:tasks.workunit.client.1.vm09.stdout:1/137: creat d9/f29 x:0 0 0
2026-03-09T17:30:06.571 INFO:tasks.workunit.client.1.vm09.stdout:1/138: dwrite d9/f29 [0,4194304] 0
2026-03-09T17:30:06.582 INFO:tasks.workunit.client.1.vm09.stdout:1/139: fdatasync d9/f11 0
2026-03-09T17:30:06.583 INFO:tasks.workunit.client.1.vm09.stdout:1/140: dread - d9/dc/dd/f28 zero size
2026-03-09T17:30:06.584 INFO:tasks.workunit.client.1.vm09.stdout:1/141: symlink d9/dc/d15/l2a 0
2026-03-09T17:30:06.618 INFO:tasks.workunit.client.1.vm09.stdout:1/142: sync
2026-03-09T17:30:06.619 INFO:tasks.workunit.client.1.vm09.stdout:1/143: readlink d9/dc/d15/l2a 0
2026-03-09T17:30:06.623 INFO:tasks.workunit.client.1.vm09.stdout:1/144: dwrite d9/dc/d15/f25 [0,4194304] 0
2026-03-09T17:30:06.627 INFO:tasks.workunit.client.1.vm09.stdout:1/145: dread d9/dc/d15/f1a [0,4194304] 0
2026-03-09T17:30:06.634 INFO:tasks.workunit.client.1.vm09.stdout:1/146: rename d9/dc/d15/f25 to d9/dc/d15/d22/f2b 0
2026-03-09T17:30:06.639 INFO:tasks.workunit.client.1.vm09.stdout:1/147: symlink d9/dc/dd/l2c 0
2026-03-09T17:30:06.644 INFO:tasks.workunit.client.1.vm09.stdout:1/148: fdatasync f3 0
2026-03-09T17:30:06.644 INFO:tasks.workunit.client.1.vm09.stdout:1/149: dread d9/dc/dd/fe [0,4194304] 0
2026-03-09T17:30:06.646 INFO:tasks.workunit.client.1.vm09.stdout:1/150: getdents d9/dc 0
2026-03-09T17:30:06.647 INFO:tasks.workunit.client.1.vm09.stdout:1/151: mknod d9/dc/d15/d22/c2d 0
2026-03-09T17:30:06.652 INFO:tasks.workunit.client.1.vm09.stdout:1/152: link f2 d9/d1f/f2e 0
2026-03-09T17:30:06.652 INFO:tasks.workunit.client.1.vm09.stdout:1/153: truncate d9/f29 4623683 0
2026-03-09T17:30:06.656 INFO:tasks.workunit.client.1.vm09.stdout:1/154: rename d9/cf to d9/dc/c2f 0
2026-03-09T17:30:06.657 INFO:tasks.workunit.client.1.vm09.stdout:1/155: readlink d9/dc/d15/l23 0
2026-03-09T17:30:06.657 INFO:tasks.workunit.client.1.vm09.stdout:1/156: truncate d9/dc/dd/f28 695625 0
2026-03-09T17:30:06.658 INFO:tasks.workunit.client.1.vm09.stdout:1/157: mknod d9/dc/d15/c30 0
2026-03-09T17:30:06.665 INFO:tasks.workunit.client.0.vm06.stdout:2/815: dwrite d3/d4/f52 [0,4194304] 0
2026-03-09T17:30:06.667 INFO:tasks.workunit.client.1.vm09.stdout:4/155: rmdir d11/d1e 39
2026-03-09T17:30:06.671 INFO:tasks.workunit.client.1.vm09.stdout:6/116: fsync d3/d1e/f20 0
2026-03-09T17:30:06.671 INFO:tasks.workunit.client.1.vm09.stdout:6/117: truncate d3/f1a 214048 0
2026-03-09T17:30:06.673 INFO:tasks.workunit.client.0.vm06.stdout:8/903: dwrite d15/d31/dc5/df1/d2b/fe6 [0,4194304] 0
2026-03-09T17:30:06.685 INFO:tasks.workunit.client.0.vm06.stdout:1/902: write d11/d14/d1d/d1e/d2a/f43 [8574911,39773] 0
2026-03-09T17:30:06.686 INFO:tasks.workunit.client.0.vm06.stdout:1/903: write d11/d14/d1d/d1e/d2a/d99/ff3 [142329,21591] 0
2026-03-09T17:30:06.688 INFO:tasks.workunit.client.1.vm09.stdout:6/118: dread d3/fc [0,4194304] 0
2026-03-09T17:30:06.691 INFO:tasks.workunit.client.1.vm09.stdout:8/149: truncate d1/f28 58018 0
2026-03-09T17:30:06.699 INFO:tasks.workunit.client.1.vm09.stdout:8/150: truncate d1/da/dd/f27 614842 0
2026-03-09T17:30:06.699 INFO:tasks.workunit.client.1.vm09.stdout:6/119: mkdir d3/d21 0
2026-03-09T17:30:06.699 INFO:tasks.workunit.client.1.vm09.stdout:6/120: dread - d3/d7/f10 zero size
2026-03-09T17:30:06.699 INFO:tasks.workunit.client.1.vm09.stdout:4/156: rename l0 to d11/d1e/d29/l37 0
2026-03-09T17:30:06.700 INFO:tasks.workunit.client.0.vm06.stdout:8/904: dread d15/d39/d3c/d6c/fbf [0,4194304] 0
2026-03-09T17:30:06.701 INFO:tasks.workunit.client.0.vm06.stdout:8/905: write d15/d39/d67/d77/d97/f117 [85282,19557] 0
2026-03-09T17:30:06.706 INFO:tasks.workunit.client.1.vm09.stdout:4/157: creat d11/d1e/d30/f38 x:0 0 0
2026-03-09T17:30:06.706 INFO:tasks.workunit.client.1.vm09.stdout:4/158: write d11/d1e/d29/f32 [406960,32234] 0
2026-03-09T17:30:06.708 INFO:tasks.workunit.client.1.vm09.stdout:4/159: truncate d11/d1e/f28 923036 0
2026-03-09T17:30:06.711 INFO:tasks.workunit.client.0.vm06.stdout:8/906: read - d15/d39/dd2/fbe zero size
2026-03-09T17:30:06.711 INFO:tasks.workunit.client.1.vm09.stdout:6/121: creat d3/d21/f22 x:0 0 0
2026-03-09T17:30:06.714 INFO:tasks.workunit.client.0.vm06.stdout:8/907: dread d15/d16/d1e/f59 [4194304,4194304] 0
2026-03-09T17:30:06.718 INFO:tasks.workunit.client.0.vm06.stdout:8/908: read d15/d31/dc5/df1/f61 [1978624,28298] 0
2026-03-09T17:30:06.718 INFO:tasks.workunit.client.0.vm06.stdout:8/909: chown d15/d31/d58 1775052036 1
2026-03-09T17:30:06.719 INFO:tasks.workunit.client.0.vm06.stdout:8/910: dread - d15/d39/dd2/fbe zero size
2026-03-09T17:30:06.719 INFO:tasks.workunit.client.0.vm06.stdout:8/911: dread - d15/d39/dd2/fbe zero size
2026-03-09T17:30:06.722 INFO:tasks.workunit.client.1.vm09.stdout:4/160: symlink d11/d1e/d31/l39 0
2026-03-09T17:30:06.722 INFO:tasks.workunit.client.1.vm09.stdout:7/190: write da/f21 [3589799,69465] 0
2026-03-09T17:30:06.723 INFO:tasks.workunit.client.1.vm09.stdout:7/191: write da/d11/d29/f33 [1391929,53847] 0
2026-03-09T17:30:06.725 INFO:tasks.workunit.client.0.vm06.stdout:8/912: rename d15/d16/d1e/f4e to d15/d39/d67/d86/ddd/f12b 0
2026-03-09T17:30:06.730 INFO:tasks.workunit.client.1.vm09.stdout:4/161: creat d11/d1e/d31/f3a x:0 0 0
2026-03-09T17:30:06.731 INFO:tasks.workunit.client.1.vm09.stdout:6/122: creat d3/d7/f23 x:0 0 0
2026-03-09T17:30:06.732 INFO:tasks.workunit.client.1.vm09.stdout:6/123: stat d3/d7/f18 0
2026-03-09T17:30:06.732 INFO:tasks.workunit.client.0.vm06.stdout:3/981: write dd/d19/d25/d48/fd3 [2195904,41939] 0
2026-03-09T17:30:06.735 INFO:tasks.workunit.client.0.vm06.stdout:8/913: dread - d15/d39/d67/d77/de7/feb zero size
2026-03-09T17:30:06.738 INFO:tasks.workunit.client.0.vm06.stdout:4/976: dwrite db/d1d/d21/d88/dc3/f139 [0,4194304] 0
2026-03-09T17:30:06.749 INFO:tasks.workunit.client.0.vm06.stdout:5/897: dwrite d4/d22/d64/f9f [0,4194304] 0
2026-03-09T17:30:06.750 INFO:tasks.workunit.client.1.vm09.stdout:3/121: getdents d5 0
2026-03-09T17:30:06.750 INFO:tasks.workunit.client.0.vm06.stdout:6/834: dwrite d6/d47/d96/d40/fb4 [0,4194304] 0
2026-03-09T17:30:06.762 INFO:tasks.workunit.client.1.vm09.stdout:2/106: dwrite d13/d15/f1d [0,4194304] 0
2026-03-09T17:30:06.762 INFO:tasks.workunit.client.0.vm06.stdout:8/914: creat d15/d16/d1e/d30/d55/def/df3/f12c x:0 0 0
2026-03-09T17:30:06.762 INFO:tasks.workunit.client.0.vm06.stdout:4/977: rename db/d1d/f3a to db/d1d/d21/d37/d69/d78/da0/f154 0
2026-03-09T17:30:06.765 INFO:tasks.workunit.client.1.vm09.stdout:3/122: write d5/d9/fc [1089419,119580] 0
2026-03-09T17:30:06.785 INFO:tasks.workunit.client.0.vm06.stdout:6/835: mknod d6/d47/d4d/d9a/da2/db1/c104 0
2026-03-09T17:30:06.790 INFO:tasks.workunit.client.1.vm09.stdout:7/192: link da/d11/d29/f42 da/d11/d29/f44 0
2026-03-09T17:30:06.791 INFO:tasks.workunit.client.1.vm09.stdout:7/193: chown da/f36 1848 1
2026-03-09T17:30:06.795 INFO:tasks.workunit.client.0.vm06.stdout:3/982: dread dd/d5b/d65/f6a [0,4194304] 0
2026-03-09T17:30:06.796 INFO:tasks.workunit.client.0.vm06.stdout:5/898: rename d4/d50/dd6 to d4/d50/d18/de1/d141 0
2026-03-09T17:30:06.801 INFO:tasks.workunit.client.0.vm06.stdout:3/983: dread - dd/d1d/d2e/d67/fee zero size
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.0.vm06.stdout:8/915: rename d15/d16/d1e/d30/f3b to d15/d39/d67/de3/f12d 0
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.0.vm06.stdout:8/916: chown d15/d39/d3c/d6c/fbf 24905 1
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.0.vm06.stdout:5/899: dread d4/d22/d46/f58 [0,4194304] 0
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.0.vm06.stdout:5/900: write d4/d50/d35/d40/d109/f136 [1035521,100761] 0
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.0.vm06.stdout:8/917: symlink d15/d39/d67/d77/de7/l12e 0
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.1.vm09.stdout:2/107: creat d13/d15/d21/f24 x:0 0 0
2026-03-09T17:30:06.836 INFO:tasks.workunit.client.1.vm09.stdout:2/108: stat d13/d15/f1d 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/124: creat d3/d7/f24 x:0 0 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/125: dwrite d3/d7/ff [0,4194304] 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:7/194: creat da/d11/d2d/f45 x:0 0 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:2/109: mknod d13/d15/c25 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/126: mkdir d3/d21/d25 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/127: write d3/d7/ff [776984,65014] 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/128: stat d3/c14 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:6/129: read - d3/d1e/f20 zero size
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:7/195: symlink da/d11/d29/l46 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:7/196: unlink da/f12 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:3/123: getdents d5/d6/d12 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:7/197: mkdir da/d11/d47 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:2/110: creat d13/f26 x:0 0 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:7/198: fdatasync da/f36 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:2/111: chown l10 59 1
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:2/112: dread d13/f1a [0,4194304] 0
2026-03-09T17:30:06.837 INFO:tasks.workunit.client.1.vm09.stdout:2/113: creat d13/d15/d21/f27 x:0 0 0
2026-03-09T17:30:06.838 INFO:tasks.workunit.client.1.vm09.stdout:2/114: read d13/f1a [586480,74623] 0
2026-03-09T17:30:06.839 INFO:tasks.workunit.client.1.vm09.stdout:3/124: link d5/f22 d5/d16/d25/f28 0
2026-03-09T17:30:06.840 INFO:tasks.workunit.client.1.vm09.stdout:7/199: symlink da/d11/d29/d3d/l48 0
2026-03-09T17:30:06.841 INFO:tasks.workunit.client.1.vm09.stdout:7/200: fdatasync da/d11/f1a 0
2026-03-09T17:30:06.841 INFO:tasks.workunit.client.1.vm09.stdout:3/125: dread d5/d6/fb [0,4194304] 0
2026-03-09T17:30:06.842 INFO:tasks.workunit.client.1.vm09.stdout:7/201: mkdir da/d11/d2d/d49 0
2026-03-09T17:30:06.849 INFO:tasks.workunit.client.1.vm09.stdout:3/126: rename d5/d9/cf to d5/c29 0
2026-03-09T17:30:06.852 INFO:tasks.workunit.client.1.vm09.stdout:3/127: dwrite d5/d6/d12/f15 [0,4194304] 0
2026-03-09T17:30:06.853 INFO:tasks.workunit.client.1.vm09.stdout:2/115: sync
2026-03-09T17:30:06.853 INFO:tasks.workunit.client.1.vm09.stdout:3/128: write d5/d6/d12/f15 [706569,15117] 0
2026-03-09T17:30:06.864 INFO:tasks.workunit.client.1.vm09.stdout:2/116: link f6 d13/d15/d21/f28 0
2026-03-09T17:30:06.864 INFO:tasks.workunit.client.0.vm06.stdout:4/978: sync
2026-03-09T17:30:06.865 INFO:tasks.workunit.client.0.vm06.stdout:3/984: sync
2026-03-09T17:30:06.865 INFO:tasks.workunit.client.0.vm06.stdout:4/979: stat db/d1d/fd3 0
2026-03-09T17:30:06.868 INFO:tasks.workunit.client.1.vm09.stdout:2/117: mknod d13/d15/c29 0
2026-03-09T17:30:06.870 INFO:tasks.workunit.client.0.vm06.stdout:3/985: dread - dd/d118/f128 zero size
2026-03-09T17:30:06.870 INFO:tasks.workunit.client.0.vm06.stdout:3/986: dread - dd/d19/d25/f14b zero size
2026-03-09T17:30:06.876 INFO:tasks.workunit.client.0.vm06.stdout:3/987: dwrite dd/d19/f10b [0,4194304] 0
2026-03-09T17:30:06.876 INFO:tasks.workunit.client.1.vm09.stdout:2/118: dwrite d13/d15/d21/f24 [0,4194304] 0
2026-03-09T17:30:06.876 INFO:tasks.workunit.client.1.vm09.stdout:2/119: dread - d13/f26 zero size
2026-03-09T17:30:06.878 INFO:tasks.workunit.client.1.vm09.stdout:2/120: sync
2026-03-09T17:30:06.885 INFO:tasks.workunit.client.0.vm06.stdout:6/836: fsync d6/d47/d96/d40/fb4 0
2026-03-09T17:30:06.902 INFO:tasks.workunit.client.1.vm09.stdout:9/138: getdents d5/d2e 0
2026-03-09T17:30:06.903 INFO:tasks.workunit.client.1.vm09.stdout:9/139: write d5/d21/f26 [848562,77401] 0
2026-03-09T17:30:06.903 INFO:tasks.workunit.client.1.vm09.stdout:5/132: dwrite d0/d2/f2a [4194304,4194304] 0
2026-03-09T17:30:06.904 INFO:tasks.workunit.client.1.vm09.stdout:9/140: fsync d5/f1d 0
2026-03-09T17:30:06.904 INFO:tasks.workunit.client.1.vm09.stdout:0/145: truncate d6/f27 3795795 0
2026-03-09T17:30:06.910 INFO:tasks.workunit.client.0.vm06.stdout:3/988: mknod dd/d19/d25/d44/d12f/c151 0
2026-03-09T17:30:06.923 INFO:tasks.workunit.client.1.vm09.stdout:5/133: creat d0/dc/f2f x:0 0 0
2026-03-09T17:30:06.925 INFO:tasks.workunit.client.1.vm09.stdout:0/146: mknod d6/d1d/d24/c2c 0
2026-03-09T17:30:06.926 INFO:tasks.workunit.client.1.vm09.stdout:9/141: creat d5/de/d29/d33/f3b x:0 0 0
2026-03-09T17:30:06.927 INFO:tasks.workunit.client.1.vm09.stdout:5/134: stat d0/d9/l2d 0
2026-03-09T17:30:06.928 INFO:tasks.workunit.client.1.vm09.stdout:0/147: unlink d6/de/l17 0
2026-03-09T17:30:06.930 INFO:tasks.workunit.client.1.vm09.stdout:9/142: rmdir d5/d21 39
2026-03-09T17:30:06.935 INFO:tasks.workunit.client.1.vm09.stdout:5/135: rename d0/d2/lb to d0/d9/l30 0
2026-03-09T17:30:06.935 INFO:tasks.workunit.client.1.vm09.stdout:5/136: readlink d0/d9/d16/l2c 0
2026-03-09T17:30:06.935 INFO:tasks.workunit.client.1.vm09.stdout:9/143: getdents d5/d2e 0
2026-03-09T17:30:06.936 INFO:tasks.workunit.client.1.vm09.stdout:0/148: dwrite d6/f1b [0,4194304] 0
2026-03-09T17:30:06.937 INFO:tasks.workunit.client.0.vm06.stdout:3/989: sync
2026-03-09T17:30:06.940 INFO:tasks.workunit.client.1.vm09.stdout:5/137: creat d0/d2/f31 x:0 0 0
2026-03-09T17:30:06.942 INFO:tasks.workunit.client.0.vm06.stdout:3/990: mknod dd/d81/d97/c152 0
2026-03-09T17:30:06.944 INFO:tasks.workunit.client.1.vm09.stdout:5/138: dwrite d0/d2/f31 [0,4194304] 0
2026-03-09T17:30:06.951 INFO:tasks.workunit.client.1.vm09.stdout:5/139: write d0/d2/f31 [2739604,33724] 0
2026-03-09T17:30:06.953 INFO:tasks.workunit.client.1.vm09.stdout:5/140: sync
2026-03-09T17:30:06.971 INFO:tasks.workunit.client.1.vm09.stdout:5/141: creat d0/d2/d15/d20/f32 x:0 0 0
2026-03-09T17:30:06.974 INFO:tasks.workunit.client.1.vm09.stdout:5/142: dwrite d0/dc/d21/f29 [0,4194304] 0
2026-03-09T17:30:06.974 INFO:tasks.workunit.client.1.vm09.stdout:0/149: creat d6/f2d x:0 0 0
2026-03-09T17:30:06.976 INFO:tasks.workunit.client.1.vm09.stdout:0/150: chown d6/de/l19 0 1
2026-03-09T17:30:06.976 INFO:tasks.workunit.client.1.vm09.stdout:0/151: write d6/f9 [3614090,33435] 0
2026-03-09T17:30:06.976 INFO:tasks.workunit.client.1.vm09.stdout:5/143: read d0/d2/f2a [5156272,122974] 0
2026-03-09T17:30:06.977 INFO:tasks.workunit.client.1.vm09.stdout:5/144: write d0/de/d17/f1f [821552,77663] 0
2026-03-09T17:30:06.978 INFO:tasks.workunit.client.1.vm09.stdout:5/145: write d0/dc/f2f [283261,30580] 0
2026-03-09T17:30:06.979 INFO:tasks.workunit.client.1.vm09.stdout:5/146: write d0/d2/f2a [118440,27941] 0
2026-03-09T17:30:06.989 INFO:tasks.workunit.client.1.vm09.stdout:5/147: read d0/d2/d15/d20/f24 [829581,35568] 0
2026-03-09T17:30:06.995 INFO:tasks.workunit.client.0.vm06.stdout:3/991: dread dd/d81/da3/fc6 [0,4194304] 0
2026-03-09T17:30:06.996 INFO:tasks.workunit.client.1.vm09.stdout:5/148: getdents d0/de 0
2026-03-09T17:30:06.996 INFO:tasks.workunit.client.1.vm09.stdout:5/149: fdatasync d0/f22 0
2026-03-09T17:30:07.011 INFO:tasks.workunit.client.1.vm09.stdout:1/158: write d9/dc/dd/fe [2034740,130914] 0
2026-03-09T17:30:07.012 INFO:tasks.workunit.client.1.vm09.stdout:4/162: readlink d11/d1e/d29/l37 0
2026-03-09T17:30:07.012 INFO:tasks.workunit.client.0.vm06.stdout:1/904: write d11/d14/d1d/f56 [641837,57743] 0
2026-03-09T17:30:07.013 INFO:tasks.workunit.client.1.vm09.stdout:1/159: dread d9/dc/dd/f28 [0,4194304] 0
2026-03-09T17:30:07.014 INFO:tasks.workunit.client.0.vm06.stdout:2/816: dwrite d3/d4/d12/f66 [0,4194304] 0
2026-03-09T17:30:07.016 INFO:tasks.workunit.client.1.vm09.stdout:8/151: dwrite d1/f3 [0,4194304] 0
2026-03-09T17:30:07.026 INFO:tasks.workunit.client.1.vm09.stdout:8/152: chown d1/da/dd 69 1
2026-03-09T17:30:07.026 INFO:tasks.workunit.client.1.vm09.stdout:8/153: write d1/f33 [585052,29796] 0
2026-03-09T17:30:07.028 INFO:tasks.workunit.client.1.vm09.stdout:5/150: getdents d0/de 0
2026-03-09T17:30:07.029 INFO:tasks.workunit.client.1.vm09.stdout:8/154: fdatasync d1/f16 0
2026-03-09T17:30:07.032 INFO:tasks.workunit.client.1.vm09.stdout:5/151: rename d0/de/d17 to d0/dc/d21/d33 0
2026-03-09T17:30:07.032 INFO:tasks.workunit.client.1.vm09.stdout:5/152: fdatasync d0/d2/d15/f1c 0
2026-03-09T17:30:07.033 INFO:tasks.workunit.client.1.vm09.stdout:8/155: dread d1/f3 [0,4194304] 0
2026-03-09T17:30:07.034 INFO:tasks.workunit.client.1.vm09.stdout:0/152: fdatasync d6/f9 0
2026-03-09T17:30:07.046 INFO:tasks.workunit.client.1.vm09.stdout:5/153: creat d0/d9/f34 x:0 0 0
2026-03-09T17:30:07.048 INFO:tasks.workunit.client.0.vm06.stdout:1/905: symlink d11/d14/d1d/d1e/d2a/d34/d64/l130 0
2026-03-09T17:30:07.055 INFO:tasks.workunit.client.1.vm09.stdout:0/153: creat d6/de/f2e x:0 0 0
2026-03-09T17:30:07.060 INFO:tasks.workunit.client.0.vm06.stdout:1/906: creat d11/d14/d1d/d1e/d2a/d34/d64/f131 x:0 0 0
2026-03-09T17:30:07.060 INFO:tasks.workunit.client.1.vm09.stdout:0/154: dwrite d6/f21 [0,4194304] 0
2026-03-09T17:30:07.061 INFO:tasks.workunit.client.1.vm09.stdout:8/156: mkdir d1/da/d23/d34 0
2026-03-09T17:30:07.061 INFO:tasks.workunit.client.1.vm09.stdout:5/154: rmdir d0/d9/d16 39
2026-03-09T17:30:07.061 INFO:tasks.workunit.client.1.vm09.stdout:5/155: truncate d0/d9/f34 580643 0
2026-03-09T17:30:07.074 INFO:tasks.workunit.client.0.vm06.stdout:1/907: read d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/fe5 [3103711,102960] 0
2026-03-09T17:30:07.088 INFO:tasks.workunit.client.1.vm09.stdout:0/155: rename d6/d1d/d24/l2b to d6/de/l2f 0
2026-03-09T17:30:07.096 INFO:tasks.workunit.client.1.vm09.stdout:5/156: rename d0/dc/f2f to d0/dc/d21/d33/f35 0
2026-03-09T17:30:07.096 INFO:tasks.workunit.client.1.vm09.stdout:5/157: write d0/f22 [690176,117177] 0
2026-03-09T17:30:07.096 INFO:tasks.workunit.client.1.vm09.stdout:0/156: creat d6/d1d/f30 x:0 0 0
2026-03-09T17:30:07.097 INFO:tasks.workunit.client.0.vm06.stdout:1/908: chown d11/d14/d1d/d4a/df7/d106/c102 391427860 1
2026-03-09T17:30:07.100 INFO:tasks.workunit.client.1.vm09.stdout:8/157: creat d1/da/f35 x:0 0 0
2026-03-09T17:30:07.107 INFO:tasks.workunit.client.1.vm09.stdout:5/158: rename d0/d2/d15/d20/f24 to d0/dc/d21/d26/f36 0
2026-03-09T17:30:07.110 INFO:tasks.workunit.client.1.vm09.stdout:8/158: write d1/da/dd/f1e [2235017,25107] 0
2026-03-09T17:30:07.116 INFO:tasks.workunit.client.1.vm09.stdout:8/159: read d1/f28 [55615,109520] 0
2026-03-09T17:30:07.122 INFO:tasks.workunit.client.0.vm06.stdout:1/909: getdents d11/d14/d1d/d42/dff 0
2026-03-09T17:30:07.123 INFO:tasks.workunit.client.1.vm09.stdout:8/160: creat d1/da/d13/f36 x:0 0 0
2026-03-09T17:30:07.129 INFO:tasks.workunit.client.0.vm06.stdout:1/910: symlink d11/d14/d1d/d4a/df7/d106/d112/d114/l132 0
2026-03-09T17:30:07.134 INFO:tasks.workunit.client.1.vm09.stdout:8/161: mknod d1/da/d23/d34/c37 0
2026-03-09T17:30:07.136 INFO:tasks.workunit.client.0.vm06.stdout:1/911: rmdir d11/d14/d1d/d42/d46/d92/dc0/daf 0
2026-03-09T17:30:07.136 INFO:tasks.workunit.client.1.vm09.stdout:8/162: dread d1/f16 [0,4194304] 0
2026-03-09T17:30:07.139 INFO:tasks.workunit.client.0.vm06.stdout:1/912: creat d11/d14/d1d/d42/f133 x:0 0 0
2026-03-09T17:30:07.141 INFO:tasks.workunit.client.1.vm09.stdout:8/163: unlink d1/da/dd/c20 0
2026-03-09T17:30:07.153 INFO:tasks.workunit.client.0.vm06.stdout:5/901: write d4/d50/d18/f3c [9111196,111881] 0
2026-03-09T17:30:07.155 INFO:tasks.workunit.client.0.vm06.stdout:8/918: dwrite d15/d16/f52 [0,4194304] 0
2026-03-09T17:30:07.156 INFO:tasks.workunit.client.0.vm06.stdout:8/919: readlink d15/d39/d67/d77/la7 0
2026-03-09T17:30:07.167 INFO:tasks.workunit.client.0.vm06.stdout:5/902: fdatasync d4/da4/fc5 0
2026-03-09T17:30:07.168 INFO:tasks.workunit.client.1.vm09.stdout:6/130: truncate d3/d7/fe 6112791 0
2026-03-09T17:30:07.169 INFO:tasks.workunit.client.1.vm09.stdout:6/131: chown d3/d7/l16 1675 1
2026-03-09T17:30:07.173 INFO:tasks.workunit.client.0.vm06.stdout:8/920: symlink d15/d16/d1e/d30/d55/def/df3/l12f 0
2026-03-09T17:30:07.178 INFO:tasks.workunit.client.1.vm09.stdout:6/132: mkdir d3/d21/d25/d26 0
2026-03-09T17:30:07.178 INFO:tasks.workunit.client.0.vm06.stdout:8/921: rmdir d15/d31/d58/dc9 39
2026-03-09T17:30:07.178 INFO:tasks.workunit.client.1.vm09.stdout:6/133: chown d3/f1a 1 1
2026-03-09T17:30:07.178 INFO:tasks.workunit.client.0.vm06.stdout:8/922: chown d15/l5a 63799155 1
2026-03-09T17:30:07.179 INFO:tasks.workunit.client.0.vm06.stdout:8/923: read d15/d16/d1e/f59 [5267451,88107] 0
2026-03-09T17:30:07.184 INFO:tasks.workunit.client.1.vm09.stdout:6/134: dread d3/fc [0,4194304] 0
2026-03-09T17:30:07.188 INFO:tasks.workunit.client.1.vm09.stdout:6/135: dread d3/d7/ff [0,4194304] 0
2026-03-09T17:30:07.194 INFO:tasks.workunit.client.1.vm09.stdout:6/136: dwrite d3/d7/f10 [0,4194304] 0
2026-03-09T17:30:07.198 INFO:tasks.workunit.client.1.vm09.stdout:7/202: rmdir da/d11/d2d 39
2026-03-09T17:30:07.201 INFO:tasks.workunit.client.1.vm09.stdout:6/137: read - d3/d7/f11 zero size
2026-03-09T17:30:07.203 INFO:tasks.workunit.client.1.vm09.stdout:6/138: chown d3/d7/ld 1 1
2026-03-09T17:30:07.203 INFO:tasks.workunit.client.1.vm09.stdout:7/203: rename da/d11/d41/c2e to da/d11/d29/c4a 0
2026-03-09T17:30:07.209 INFO:tasks.workunit.client.1.vm09.stdout:7/204: symlink da/d11/l4b 0
2026-03-09T17:30:07.209 INFO:tasks.workunit.client.1.vm09.stdout:6/139: creat d3/d21/d25/d26/f27 x:0 0 0
2026-03-09T17:30:07.209 INFO:tasks.workunit.client.1.vm09.stdout:6/140: dread - d3/f1f zero size
2026-03-09T17:30:07.210 INFO:tasks.workunit.client.1.vm09.stdout:7/205: write da/f27 [167284,102657] 0
2026-03-09T17:30:07.210 INFO:tasks.workunit.client.1.vm09.stdout:6/141: creat d3/d21/f28 x:0 0 0
2026-03-09T17:30:07.211 INFO:tasks.workunit.client.1.vm09.stdout:7/206: mkdir da/d11/d29/d4c 0
2026-03-09T17:30:07.214 INFO:tasks.workunit.client.1.vm09.stdout:6/142: unlink d3/d7/l8 0
2026-03-09T17:30:07.216 INFO:tasks.workunit.client.1.vm09.stdout:7/207: symlink da/d11/d2d/l4d 0
2026-03-09T17:30:07.219 INFO:tasks.workunit.client.1.vm09.stdout:7/208: dwrite da/f16 [0,4194304] 0
2026-03-09T17:30:07.220 INFO:tasks.workunit.client.1.vm09.stdout:7/209: write f3 [1632977,46077] 0
2026-03-09T17:30:07.233 INFO:tasks.workunit.client.1.vm09.stdout:3/129: write d5/fd [2359777,98635] 0
2026-03-09T17:30:07.234 INFO:tasks.workunit.client.1.vm09.stdout:3/130: truncate d5/d6/d12/f19 804378 0
2026-03-09T17:30:07.237 INFO:tasks.workunit.client.1.vm09.stdout:3/131: dwrite d5/d6/d12/f18 [0,4194304] 0
2026-03-09T17:30:07.280 INFO:tasks.workunit.client.0.vm06.stdout:4/980: dwrite db/f6f [0,4194304] 0
2026-03-09T17:30:07.281 INFO:tasks.workunit.client.0.vm06.stdout:4/981: chown db/d59/d90/ff4 9054 1
2026-03-09T17:30:07.287 INFO:tasks.workunit.client.0.vm06.stdout:4/982: getdents db/d57 0
2026-03-09T17:30:07.320 INFO:tasks.workunit.client.1.vm09.stdout:2/121: dwrite f9 [0,4194304] 0
2026-03-09T17:30:07.353 INFO:tasks.workunit.client.1.vm09.stdout:2/122: fsync f9 0
2026-03-09T17:30:07.353 INFO:tasks.workunit.client.0.vm06.stdout:6/837: dwrite d6/d4f/d3e/f51 [0,4194304] 0
2026-03-09T17:30:07.362 INFO:tasks.workunit.client.1.vm09.stdout:9/144: rmdir d5/de 39
2026-03-09T17:30:07.367 INFO:tasks.workunit.client.0.vm06.stdout:6/838: mknod d6/d47/d96/c105 0
2026-03-09T17:30:07.370 INFO:tasks.workunit.client.0.vm06.stdout:6/839: dread - d6/d4f/d3e/d52/fd2 zero size
2026-03-09T17:30:07.380 INFO:tasks.workunit.client.1.vm09.stdout:9/145: rename d5/d21/f26 to d5/de/f3c 0
2026-03-09T17:30:07.381 INFO:tasks.workunit.client.1.vm09.stdout:2/123: getdents d13 0
2026-03-09T17:30:07.383 INFO:tasks.workunit.client.1.vm09.stdout:9/146: symlink d5/de/d29/d33/l3d 0
2026-03-09T17:30:07.384 INFO:tasks.workunit.client.1.vm09.stdout:9/147: readlink d5/l19 0
2026-03-09T17:30:07.388 INFO:tasks.workunit.client.1.vm09.stdout:9/148: dread d5/f14 [0,4194304] 0
2026-03-09T17:30:07.390 INFO:tasks.workunit.client.1.vm09.stdout:9/149: mknod d5/de/d29/d33/c3e 0
2026-03-09T17:30:07.390 INFO:tasks.workunit.client.1.vm09.stdout:9/150: read d5/de/f3c [2950929,37794] 0
2026-03-09T17:30:07.392 INFO:tasks.workunit.client.1.vm09.stdout:9/151: symlink d5/de/d29/l3f 0
2026-03-09T17:30:07.393 INFO:tasks.workunit.client.1.vm09.stdout:9/152: read d5/f1e [2173177,93338] 0
2026-03-09T17:30:07.395 INFO:tasks.workunit.client.1.vm09.stdout:9/153: symlink d5/de/d29/l40 0
2026-03-09T17:30:07.417 INFO:tasks.workunit.client.1.vm09.stdout:4/163: write d11/f18 [2637878,84412] 0
2026-03-09T17:30:07.418 INFO:tasks.workunit.client.0.vm06.stdout:3/992:
dwrite dd/d19/d25/d2d/d9b/df1/f66 [0,4194304] 0 2026-03-09T17:30:07.436 INFO:tasks.workunit.client.1.vm09.stdout:1/160: write d9/dc/d15/f1a [44606,93455] 0 2026-03-09T17:30:07.439 INFO:tasks.workunit.client.0.vm06.stdout:2/817: truncate d3/d4/d12/d2b/d2d/f1b 40248 0 2026-03-09T17:30:07.445 INFO:tasks.workunit.client.1.vm09.stdout:4/164: creat d11/d1e/d29/f3b x:0 0 0 2026-03-09T17:30:07.446 INFO:tasks.workunit.client.0.vm06.stdout:3/993: truncate dd/d19/d25/d2d/d9b/fdb 262627 0 2026-03-09T17:30:07.447 INFO:tasks.workunit.client.1.vm09.stdout:1/161: mknod d9/dc/dd/c31 0 2026-03-09T17:30:07.447 INFO:tasks.workunit.client.1.vm09.stdout:1/162: chown d9/dc/d15/d22/c2d 13 1 2026-03-09T17:30:07.448 INFO:tasks.workunit.client.0.vm06.stdout:2/818: mkdir d3/d4/d12/d71/daa/d77/d102/d109 0 2026-03-09T17:30:07.450 INFO:tasks.workunit.client.0.vm06.stdout:3/994: readlink dd/d19/d2c/l3d 0 2026-03-09T17:30:07.451 INFO:tasks.workunit.client.1.vm09.stdout:1/163: dwrite f3 [4194304,4194304] 0 2026-03-09T17:30:07.452 INFO:tasks.workunit.client.1.vm09.stdout:1/164: write f6 [3937436,38520] 0 2026-03-09T17:30:07.454 INFO:tasks.workunit.client.1.vm09.stdout:0/157: truncate d6/f1b 130950 0 2026-03-09T17:30:07.454 INFO:tasks.workunit.client.1.vm09.stdout:0/158: dread - d6/de/f2e zero size 2026-03-09T17:30:07.456 INFO:tasks.workunit.client.1.vm09.stdout:4/165: link d11/d1e/d29/f32 d11/d1e/f3c 0 2026-03-09T17:30:07.456 INFO:tasks.workunit.client.1.vm09.stdout:0/159: truncate d6/de/f2e 377183 0 2026-03-09T17:30:07.459 INFO:tasks.workunit.client.1.vm09.stdout:1/165: mknod d9/dc/c32 0 2026-03-09T17:30:07.460 INFO:tasks.workunit.client.1.vm09.stdout:1/166: truncate d9/dc/d15/d1d/f17 408642 0 2026-03-09T17:30:07.466 INFO:tasks.workunit.client.1.vm09.stdout:0/160: creat d6/de/f31 x:0 0 0 2026-03-09T17:30:07.467 INFO:tasks.workunit.client.0.vm06.stdout:3/995: rmdir dd/d1d/d4e 39 2026-03-09T17:30:07.470 INFO:tasks.workunit.client.0.vm06.stdout:2/819: link d3/d4/d12/d2b/d36/dd4/fd7 
d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/f10a 0 2026-03-09T17:30:07.477 INFO:tasks.workunit.client.1.vm09.stdout:0/161: readlink d6/de/l2f 0 2026-03-09T17:30:07.477 INFO:tasks.workunit.client.1.vm09.stdout:0/162: write d6/d1d/f1e [398672,73038] 0 2026-03-09T17:30:07.480 INFO:tasks.workunit.client.0.vm06.stdout:2/820: dread d3/d4/d12/d71/daa/d77/d81/d64/d6a/f6d [0,4194304] 0 2026-03-09T17:30:07.484 INFO:tasks.workunit.client.1.vm09.stdout:4/166: creat d11/d1e/d29/d36/f3d x:0 0 0 2026-03-09T17:30:07.487 INFO:tasks.workunit.client.1.vm09.stdout:1/167: creat d9/dc/d15/d21/f33 x:0 0 0 2026-03-09T17:30:07.488 INFO:tasks.workunit.client.1.vm09.stdout:5/159: truncate d0/d2/f2a 5092972 0 2026-03-09T17:30:07.488 INFO:tasks.workunit.client.1.vm09.stdout:1/168: chown d9/dc/d15/c30 82 1 2026-03-09T17:30:07.495 INFO:tasks.workunit.client.1.vm09.stdout:4/167: unlink d11/d1e/d29/f2c 0 2026-03-09T17:30:07.498 INFO:tasks.workunit.client.0.vm06.stdout:1/913: write d11/d14/d1c/d3a/fbe [1128606,116196] 0 2026-03-09T17:30:07.499 INFO:tasks.workunit.client.1.vm09.stdout:8/164: truncate d1/da/dd/f27 345899 0 2026-03-09T17:30:07.499 INFO:tasks.workunit.client.1.vm09.stdout:8/165: dread - d1/da/f35 zero size 2026-03-09T17:30:07.500 INFO:tasks.workunit.client.0.vm06.stdout:3/996: creat dd/d19/d25/d44/d80/dd7/f153 x:0 0 0 2026-03-09T17:30:07.503 INFO:tasks.workunit.client.0.vm06.stdout:2/821: dread d3/d4/f1f [0,4194304] 0 2026-03-09T17:30:07.504 INFO:tasks.workunit.client.1.vm09.stdout:5/160: unlink d0/l14 0 2026-03-09T17:30:07.505 INFO:tasks.workunit.client.1.vm09.stdout:5/161: dread d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:07.513 INFO:tasks.workunit.client.0.vm06.stdout:3/997: mkdir dd/d81/d97/d154 0 2026-03-09T17:30:07.518 INFO:tasks.workunit.client.0.vm06.stdout:5/903: dwrite d4/d22/f3f [0,4194304] 0 2026-03-09T17:30:07.531 INFO:tasks.workunit.client.0.vm06.stdout:5/904: dread d4/d22/d64/fcc [0,4194304] 0 2026-03-09T17:30:07.531 INFO:tasks.workunit.client.0.vm06.stdout:8/924: dwrite 
d15/d39/d3c/d6c/f8b [0,4194304] 0 2026-03-09T17:30:07.532 INFO:tasks.workunit.client.0.vm06.stdout:5/905: chown d4/d22/ce8 5164125 1 2026-03-09T17:30:07.539 INFO:tasks.workunit.client.0.vm06.stdout:3/998: truncate dd/d1d/d2e/d67/def/f123 749957 0 2026-03-09T17:30:07.544 INFO:tasks.workunit.client.0.vm06.stdout:2/822: mkdir d3/d4/d12/d71/daa/d10b 0 2026-03-09T17:30:07.544 INFO:tasks.workunit.client.1.vm09.stdout:4/168: mknod d11/d1e/d30/c3e 0 2026-03-09T17:30:07.544 INFO:tasks.workunit.client.1.vm09.stdout:4/169: write d11/f18 [988519,94116] 0 2026-03-09T17:30:07.544 INFO:tasks.workunit.client.1.vm09.stdout:4/170: readlink d11/l17 0 2026-03-09T17:30:07.544 INFO:tasks.workunit.client.1.vm09.stdout:4/171: fsync d11/f13 0 2026-03-09T17:30:07.547 INFO:tasks.workunit.client.1.vm09.stdout:5/162: creat d0/dc/f37 x:0 0 0 2026-03-09T17:30:07.552 INFO:tasks.workunit.client.0.vm06.stdout:8/925: dread d15/d16/f66 [0,4194304] 0 2026-03-09T17:30:07.562 INFO:tasks.workunit.client.1.vm09.stdout:7/210: rmdir da/d11/d29 39 2026-03-09T17:30:07.562 INFO:tasks.workunit.client.1.vm09.stdout:6/143: rmdir d3/d21/d25/d26 39 2026-03-09T17:30:07.563 INFO:tasks.workunit.client.1.vm09.stdout:6/144: chown d3/d7/ff 46919325 1 2026-03-09T17:30:07.565 INFO:tasks.workunit.client.0.vm06.stdout:2/823: rename d3/d4/d12/d2b/d2d/f48 to d3/d4/d12/da7/dfc/f10c 0 2026-03-09T17:30:07.568 INFO:tasks.workunit.client.1.vm09.stdout:1/169: creat d9/f34 x:0 0 0 2026-03-09T17:30:07.571 INFO:tasks.workunit.client.1.vm09.stdout:8/166: symlink d1/d14/d1b/d32/l38 0 2026-03-09T17:30:07.572 INFO:tasks.workunit.client.1.vm09.stdout:8/167: read d1/f16 [157130,74298] 0 2026-03-09T17:30:07.573 INFO:tasks.workunit.client.1.vm09.stdout:5/163: stat d0/d9/d16/l1b 0 2026-03-09T17:30:07.573 INFO:tasks.workunit.client.1.vm09.stdout:5/164: chown d0/d2/d15/d20/f32 3449 1 2026-03-09T17:30:07.575 INFO:tasks.workunit.client.1.vm09.stdout:8/168: dwrite d1/f3 [0,4194304] 0 2026-03-09T17:30:07.576 
INFO:tasks.workunit.client.1.vm09.stdout:7/211: rename da/d11/d29 to da/d11/d41/d4e 0 2026-03-09T17:30:07.577 INFO:tasks.workunit.client.1.vm09.stdout:7/212: chown da/d11/f1f 28 1 2026-03-09T17:30:07.578 INFO:tasks.workunit.client.1.vm09.stdout:6/145: symlink d3/d21/d25/l29 0 2026-03-09T17:30:07.578 INFO:tasks.workunit.client.1.vm09.stdout:7/213: chown da/d11/c34 245477 1 2026-03-09T17:30:07.578 INFO:tasks.workunit.client.1.vm09.stdout:8/169: dread d1/da/d13/f1d [0,4194304] 0 2026-03-09T17:30:07.580 INFO:tasks.workunit.client.1.vm09.stdout:7/214: dwrite da/f15 [0,4194304] 0 2026-03-09T17:30:07.584 INFO:tasks.workunit.client.1.vm09.stdout:4/172: creat d11/f3f x:0 0 0 2026-03-09T17:30:07.586 INFO:tasks.workunit.client.1.vm09.stdout:5/165: rename d0/d9/d16/l1b to d0/dc/d21/d26/l38 0 2026-03-09T17:30:07.590 INFO:tasks.workunit.client.1.vm09.stdout:3/132: dwrite d5/d6/fb [0,4194304] 0 2026-03-09T17:30:07.606 INFO:tasks.workunit.client.1.vm09.stdout:6/146: link d3/d7/fe d3/d21/d25/d26/f2a 0 2026-03-09T17:30:07.613 INFO:tasks.workunit.client.1.vm09.stdout:6/147: truncate d3/d7/f23 268090 0 2026-03-09T17:30:07.613 INFO:tasks.workunit.client.1.vm09.stdout:6/148: symlink d3/d21/d25/d26/l2b 0 2026-03-09T17:30:07.613 INFO:tasks.workunit.client.1.vm09.stdout:6/149: symlink d3/d7/l2c 0 2026-03-09T17:30:07.613 INFO:tasks.workunit.client.1.vm09.stdout:6/150: unlink d3/c1b 0 2026-03-09T17:30:07.613 INFO:tasks.workunit.client.1.vm09.stdout:6/151: read - d3/d1e/f20 zero size 2026-03-09T17:30:07.619 INFO:tasks.workunit.client.1.vm09.stdout:4/173: dread d11/f23 [0,4194304] 0 2026-03-09T17:30:07.619 INFO:tasks.workunit.client.1.vm09.stdout:4/174: truncate d11/d1e/d29/d36/f3d 924696 0 2026-03-09T17:30:07.620 INFO:tasks.workunit.client.1.vm09.stdout:4/175: write fd [843376,93379] 0 2026-03-09T17:30:07.621 INFO:tasks.workunit.client.1.vm09.stdout:4/176: rmdir d11/d1e 39 2026-03-09T17:30:07.623 INFO:tasks.workunit.client.1.vm09.stdout:4/177: creat d11/d1e/d29/d36/f40 x:0 0 0 
2026-03-09T17:30:07.624 INFO:tasks.workunit.client.1.vm09.stdout:4/178: write d11/d1e/d29/f2e [115733,37216] 0 2026-03-09T17:30:07.625 INFO:tasks.workunit.client.1.vm09.stdout:4/179: rename d11/d1e/l27 to d11/d1e/d29/l41 0 2026-03-09T17:30:07.626 INFO:tasks.workunit.client.1.vm09.stdout:4/180: dread - d11/d1e/d29/f2f zero size 2026-03-09T17:30:07.626 INFO:tasks.workunit.client.1.vm09.stdout:4/181: readlink d11/d1e/d31/l39 0 2026-03-09T17:30:07.626 INFO:tasks.workunit.client.1.vm09.stdout:4/182: readlink d11/l17 0 2026-03-09T17:30:07.633 INFO:tasks.workunit.client.1.vm09.stdout:4/183: dread d11/f15 [0,4194304] 0 2026-03-09T17:30:07.635 INFO:tasks.workunit.client.1.vm09.stdout:4/184: rename d11/c1a to d11/d1e/d29/d36/c42 0 2026-03-09T17:30:07.639 INFO:tasks.workunit.client.1.vm09.stdout:4/185: dwrite d11/f26 [0,4194304] 0 2026-03-09T17:30:07.642 INFO:tasks.workunit.client.1.vm09.stdout:7/215: sync 2026-03-09T17:30:07.643 INFO:tasks.workunit.client.1.vm09.stdout:8/170: sync 2026-03-09T17:30:07.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: Manager daemon vm09.lqzvkh is now available 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: Migrating agent root cert to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: Migrating agent root key to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: Checking for cert/key for grafana.vm06 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local 
ceph-mon[57307]: Migrating grafana.vm06 cert to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: Migrating grafana.vm06 key to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/trash_purge_schedule"}]: 
dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:07 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Manager daemon vm09.lqzvkh is now available 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Migrating agent root cert to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Migrating agent root key to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Checking for cert/key for grafana.vm06 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Migrating grafana.vm06 cert to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: Migrating grafana.vm06 key to cert store 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:07.644 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:07.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:07.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:07 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm09.lqzvkh/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:07.650 INFO:tasks.workunit.client.1.vm09.stdout:8/171: symlink d1/da/l39 0 2026-03-09T17:30:07.651 INFO:tasks.workunit.client.1.vm09.stdout:7/216: dwrite da/f15 [0,4194304] 0 2026-03-09T17:30:07.653 INFO:tasks.workunit.client.1.vm09.stdout:7/217: chown da 1799274 1 2026-03-09T17:30:07.654 INFO:tasks.workunit.client.1.vm09.stdout:4/186: dwrite d11/d1e/d29/f32 [0,4194304] 0 2026-03-09T17:30:07.656 
INFO:tasks.workunit.client.1.vm09.stdout:4/187: dread - d11/d1e/d29/f3b zero size 2026-03-09T17:30:07.657 INFO:tasks.workunit.client.1.vm09.stdout:4/188: dread - d11/f19 zero size 2026-03-09T17:30:07.657 INFO:tasks.workunit.client.1.vm09.stdout:8/172: dwrite d1/f7 [0,4194304] 0 2026-03-09T17:30:07.669 INFO:tasks.workunit.client.1.vm09.stdout:7/218: getdents da/d11/d41/d4e/d4c 0 2026-03-09T17:30:07.669 INFO:tasks.workunit.client.1.vm09.stdout:4/189: symlink d11/d1e/d30/d35/l43 0 2026-03-09T17:30:07.669 INFO:tasks.workunit.client.1.vm09.stdout:7/219: write da/d11/f3f [862997,41136] 0 2026-03-09T17:30:07.671 INFO:tasks.workunit.client.1.vm09.stdout:4/190: symlink d11/d1e/d29/d36/l44 0 2026-03-09T17:30:07.679 INFO:tasks.workunit.client.1.vm09.stdout:4/191: truncate d11/f16 4680601 0 2026-03-09T17:30:07.679 INFO:tasks.workunit.client.1.vm09.stdout:7/220: rename da/d11/l2f to da/d11/d41/d4e/d4c/l4f 0 2026-03-09T17:30:07.679 INFO:tasks.workunit.client.1.vm09.stdout:4/192: mkdir d11/d1e/d45 0 2026-03-09T17:30:07.683 INFO:tasks.workunit.client.1.vm09.stdout:7/221: link da/f21 da/d11/d41/d4e/d3d/f50 0 2026-03-09T17:30:07.684 INFO:tasks.workunit.client.1.vm09.stdout:4/193: creat d11/f46 x:0 0 0 2026-03-09T17:30:07.685 INFO:tasks.workunit.client.1.vm09.stdout:7/222: chown da/d11/d41/d4e/f33 167496 1 2026-03-09T17:30:07.694 INFO:tasks.workunit.client.1.vm09.stdout:7/223: creat da/d11/d41/d4e/d3d/f51 x:0 0 0 2026-03-09T17:30:07.697 INFO:tasks.workunit.client.1.vm09.stdout:7/224: write da/f3a [4656696,55] 0 2026-03-09T17:30:07.703 INFO:tasks.workunit.client.1.vm09.stdout:7/225: creat da/d11/d2d/d49/f52 x:0 0 0 2026-03-09T17:30:07.703 INFO:tasks.workunit.client.1.vm09.stdout:4/194: dwrite d11/f13 [0,4194304] 0 2026-03-09T17:30:07.703 INFO:tasks.workunit.client.1.vm09.stdout:7/226: read - da/f36 zero size 2026-03-09T17:30:07.707 INFO:tasks.workunit.client.1.vm09.stdout:8/173: fdatasync d1/da/dd/f1e 0 2026-03-09T17:30:07.728 INFO:tasks.workunit.client.0.vm06.stdout:4/983: write 
db/d59/f76 [301280,43084] 0 2026-03-09T17:30:07.736 INFO:tasks.workunit.client.0.vm06.stdout:4/984: fsync db/d57/fc7 0 2026-03-09T17:30:07.736 INFO:tasks.workunit.client.0.vm06.stdout:4/985: creat db/d1d/d21/f155 x:0 0 0 2026-03-09T17:30:07.736 INFO:tasks.workunit.client.1.vm09.stdout:8/174: mkdir d1/da/d3a 0 2026-03-09T17:30:07.736 INFO:tasks.workunit.client.1.vm09.stdout:7/227: getdents da/d11/d2d 0 2026-03-09T17:30:07.736 INFO:tasks.workunit.client.1.vm09.stdout:8/175: dwrite d1/da/f12 [4194304,4194304] 0 2026-03-09T17:30:07.742 INFO:tasks.workunit.client.1.vm09.stdout:4/195: sync 2026-03-09T17:30:07.743 INFO:tasks.workunit.client.1.vm09.stdout:4/196: write d11/d1e/d29/f3b [164167,47864] 0 2026-03-09T17:30:07.746 INFO:tasks.workunit.client.1.vm09.stdout:8/176: dread d1/da/dd/f1e [0,4194304] 0 2026-03-09T17:30:07.747 INFO:tasks.workunit.client.0.vm06.stdout:4/986: getdents db/de2/d132 0 2026-03-09T17:30:07.748 INFO:tasks.workunit.client.0.vm06.stdout:4/987: symlink db/d59/d5f/d45/d10a/dcc/l156 0 2026-03-09T17:30:07.749 INFO:tasks.workunit.client.1.vm09.stdout:8/177: mknod d1/d14/d1b/d32/c3b 0 2026-03-09T17:30:07.751 INFO:tasks.workunit.client.1.vm09.stdout:8/178: write d1/d14/f2f [20316,10770] 0 2026-03-09T17:30:07.752 INFO:tasks.workunit.client.0.vm06.stdout:4/988: mknod db/c157 0 2026-03-09T17:30:07.753 INFO:tasks.workunit.client.1.vm09.stdout:4/197: dwrite d11/f3f [0,4194304] 0 2026-03-09T17:30:07.755 INFO:tasks.workunit.client.0.vm06.stdout:4/989: mkdir db/d1d/d21/d25/d4b/df7/d138/d158 0 2026-03-09T17:30:07.758 INFO:tasks.workunit.client.0.vm06.stdout:4/990: unlink db/c1b 0 2026-03-09T17:30:07.759 INFO:tasks.workunit.client.1.vm09.stdout:8/179: getdents d1/da/dd 0 2026-03-09T17:30:07.763 INFO:tasks.workunit.client.1.vm09.stdout:4/198: dwrite fd [0,4194304] 0 2026-03-09T17:30:07.767 INFO:tasks.workunit.client.1.vm09.stdout:8/180: dread d1/da/d13/f21 [0,4194304] 0 2026-03-09T17:30:07.783 INFO:tasks.workunit.client.1.vm09.stdout:4/199: symlink d11/d1e/d31/l47 0 
2026-03-09T17:30:07.783 INFO:tasks.workunit.client.1.vm09.stdout:8/181: rmdir d1/d14/d31 39 2026-03-09T17:30:07.815 INFO:tasks.workunit.client.1.vm09.stdout:4/200: sync 2026-03-09T17:30:07.816 INFO:tasks.workunit.client.1.vm09.stdout:4/201: stat d11/f12 0 2026-03-09T17:30:07.822 INFO:tasks.workunit.client.1.vm09.stdout:4/202: mknod d11/d1e/d29/d36/c48 0 2026-03-09T17:30:07.829 INFO:tasks.workunit.client.1.vm09.stdout:4/203: sync 2026-03-09T17:30:07.830 INFO:tasks.workunit.client.1.vm09.stdout:4/204: truncate d11/f1c 4600123 0 2026-03-09T17:30:07.834 INFO:tasks.workunit.client.1.vm09.stdout:4/205: dwrite d11/d1e/f28 [0,4194304] 0 2026-03-09T17:30:07.866 INFO:tasks.workunit.client.0.vm06.stdout:6/840: write d6/d12/d17/f7a [720919,120735] 0 2026-03-09T17:30:07.869 INFO:tasks.workunit.client.1.vm09.stdout:9/154: getdents d5/de 0 2026-03-09T17:30:07.872 INFO:tasks.workunit.client.1.vm09.stdout:2/124: dwrite d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:07.872 INFO:tasks.workunit.client.1.vm09.stdout:9/155: creat d5/de/d29/f41 x:0 0 0 2026-03-09T17:30:07.874 INFO:tasks.workunit.client.1.vm09.stdout:2/125: fdatasync d13/f14 0 2026-03-09T17:30:07.874 INFO:tasks.workunit.client.1.vm09.stdout:9/156: creat d5/de/d29/d33/f42 x:0 0 0 2026-03-09T17:30:07.876 INFO:tasks.workunit.client.1.vm09.stdout:2/126: creat d13/d15/f2a x:0 0 0 2026-03-09T17:30:07.876 INFO:tasks.workunit.client.1.vm09.stdout:9/157: mknod d5/d2e/c43 0 2026-03-09T17:30:07.879 INFO:tasks.workunit.client.1.vm09.stdout:9/158: read d5/de/f20 [2854671,1024] 0 2026-03-09T17:30:07.881 INFO:tasks.workunit.client.1.vm09.stdout:2/127: dwrite d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:07.884 INFO:tasks.workunit.client.1.vm09.stdout:2/128: creat d13/d15/f2b x:0 0 0 2026-03-09T17:30:07.885 INFO:tasks.workunit.client.1.vm09.stdout:2/129: chown c3 15 1 2026-03-09T17:30:07.885 INFO:tasks.workunit.client.1.vm09.stdout:9/159: dwrite f2 [0,4194304] 0 2026-03-09T17:30:07.886 INFO:tasks.workunit.client.1.vm09.stdout:9/160: 
stat d5/d21/c25 0 2026-03-09T17:30:07.891 INFO:tasks.workunit.client.1.vm09.stdout:2/130: dread d13/f1a [0,4194304] 0 2026-03-09T17:30:07.895 INFO:tasks.workunit.client.1.vm09.stdout:2/131: dwrite d13/d15/d21/f24 [4194304,4194304] 0 2026-03-09T17:30:07.899 INFO:tasks.workunit.client.1.vm09.stdout:2/132: truncate d13/f1a 2748633 0 2026-03-09T17:30:07.899 INFO:tasks.workunit.client.1.vm09.stdout:2/133: unlink d13/l1e 0 2026-03-09T17:30:07.915 INFO:tasks.workunit.client.1.vm09.stdout:9/161: truncate d5/f1e 1815543 0 2026-03-09T17:30:07.917 INFO:tasks.workunit.client.1.vm09.stdout:9/162: unlink d5/de/c10 0 2026-03-09T17:30:07.942 INFO:tasks.workunit.client.1.vm09.stdout:9/163: sync 2026-03-09T17:30:07.943 INFO:tasks.workunit.client.1.vm09.stdout:9/164: mknod d5/de/c44 0 2026-03-09T17:30:07.944 INFO:tasks.workunit.client.1.vm09.stdout:9/165: symlink d5/d21/l45 0 2026-03-09T17:30:07.944 INFO:tasks.workunit.client.1.vm09.stdout:9/166: write d5/d21/f2b [164082,9435] 0 2026-03-09T17:30:07.946 INFO:tasks.workunit.client.1.vm09.stdout:9/167: dread d5/de/f20 [0,4194304] 0 2026-03-09T17:30:07.951 INFO:tasks.workunit.client.1.vm09.stdout:9/168: link d5/d21/f30 d5/d21/f46 0 2026-03-09T17:30:07.952 INFO:tasks.workunit.client.1.vm09.stdout:9/169: dread f2 [0,4194304] 0 2026-03-09T17:30:07.963 INFO:tasks.workunit.client.1.vm09.stdout:1/170: rmdir d9/dc/dd 39 2026-03-09T17:30:07.964 INFO:tasks.workunit.client.1.vm09.stdout:1/171: dread - d9/f34 zero size 2026-03-09T17:30:07.985 INFO:tasks.workunit.client.0.vm06.stdout:1/914: dwrite d11/d14/d1c/f2e [0,4194304] 0 2026-03-09T17:30:07.992 INFO:tasks.workunit.client.1.vm09.stdout:0/163: write d6/f7 [2113194,75370] 0 2026-03-09T17:30:07.996 INFO:tasks.workunit.client.0.vm06.stdout:1/915: rename d11/l121 to d11/d14/d1d/d1e/d2a/d34/d64/dfa/l134 0 2026-03-09T17:30:07.997 INFO:tasks.workunit.client.0.vm06.stdout:1/916: write d11/d14/d1d/d4a/df7/d106/d112/d114/f11c [191136,62727] 0 2026-03-09T17:30:07.999 
INFO:tasks.workunit.client.0.vm06.stdout:1/917: dread d11/d69/fad [0,4194304] 0 2026-03-09T17:30:08.003 INFO:tasks.workunit.client.1.vm09.stdout:9/170: link d5/de/d29/f36 d5/f47 0 2026-03-09T17:30:08.006 INFO:tasks.workunit.client.1.vm09.stdout:9/171: write d5/f13 [3947030,130152] 0 2026-03-09T17:30:08.007 INFO:tasks.workunit.client.1.vm09.stdout:1/172: dread f8 [0,4194304] 0 2026-03-09T17:30:08.012 INFO:tasks.workunit.client.1.vm09.stdout:9/172: dwrite d5/f47 [0,4194304] 0 2026-03-09T17:30:08.012 INFO:tasks.workunit.client.0.vm06.stdout:3/999: dwrite dd/d1d/d2e/d67/fcf [0,4194304] 0 2026-03-09T17:30:08.013 INFO:tasks.workunit.client.0.vm06.stdout:5/906: dwrite d4/f2d [0,4194304] 0 2026-03-09T17:30:08.019 INFO:tasks.workunit.client.1.vm09.stdout:1/173: mkdir d9/dc/d15/d21/d35 0 2026-03-09T17:30:08.021 INFO:tasks.workunit.client.0.vm06.stdout:5/907: dwrite d4/d50/d35/d40/d95/db8/dda/fdd [0,4194304] 0 2026-03-09T17:30:08.037 INFO:tasks.workunit.client.0.vm06.stdout:2/824: write d3/d4/d22/d72/d8f/fbf [1500864,17406] 0 2026-03-09T17:30:08.039 INFO:tasks.workunit.client.0.vm06.stdout:8/926: write d15/d16/d1a/d47/fa5 [1092981,108819] 0 2026-03-09T17:30:08.051 INFO:tasks.workunit.client.1.vm09.stdout:9/173: dwrite d5/de/f3c [0,4194304] 0 2026-03-09T17:30:08.054 INFO:tasks.workunit.client.1.vm09.stdout:9/174: read d5/d21/f2b [141830,66592] 0 2026-03-09T17:30:08.055 INFO:tasks.workunit.client.1.vm09.stdout:9/175: readlink d5/de/l27 0 2026-03-09T17:30:08.059 INFO:tasks.workunit.client.1.vm09.stdout:9/176: write d5/f13 [3658876,102197] 0 2026-03-09T17:30:08.065 INFO:tasks.workunit.client.1.vm09.stdout:5/166: rmdir d0/d9 39 2026-03-09T17:30:08.066 INFO:tasks.workunit.client.1.vm09.stdout:5/167: read d0/d2/f31 [3376267,37357] 0 2026-03-09T17:30:08.067 INFO:tasks.workunit.client.1.vm09.stdout:5/168: dread - d0/dc/f37 zero size 2026-03-09T17:30:08.067 INFO:tasks.workunit.client.1.vm09.stdout:5/169: write d0/f22 [899400,42197] 0 2026-03-09T17:30:08.076 
INFO:tasks.workunit.client.1.vm09.stdout:4/206: dwrite d11/f23 [0,4194304] 0 2026-03-09T17:30:08.080 INFO:tasks.workunit.client.0.vm06.stdout:5/908: mknod d4/d50/d35/d40/d96/dfe/c142 0 2026-03-09T17:30:08.081 INFO:tasks.workunit.client.1.vm09.stdout:0/164: getdents d6/d1d/d24 0 2026-03-09T17:30:08.083 INFO:tasks.workunit.client.0.vm06.stdout:1/918: getdents d11/d14/d1c 0 2026-03-09T17:30:08.088 INFO:tasks.workunit.client.1.vm09.stdout:0/165: dwrite d6/f2d [0,4194304] 0 2026-03-09T17:30:08.088 INFO:tasks.workunit.client.0.vm06.stdout:8/927: mkdir d15/d130 0 2026-03-09T17:30:08.091 INFO:tasks.workunit.client.1.vm09.stdout:5/170: stat d0/d2/d15/d20/l27 0 2026-03-09T17:30:08.100 INFO:tasks.workunit.client.1.vm09.stdout:1/174: fdatasync d9/dc/dd/f28 0 2026-03-09T17:30:08.104 INFO:tasks.workunit.client.1.vm09.stdout:1/175: dread - d9/dc/d15/d21/f33 zero size 2026-03-09T17:30:08.114 INFO:tasks.workunit.client.0.vm06.stdout:8/928: dread d15/d16/d1a/f1b [0,4194304] 0 2026-03-09T17:30:08.115 INFO:tasks.workunit.client.0.vm06.stdout:8/929: dread - d15/d16/d1e/d30/f120 zero size 2026-03-09T17:30:08.117 INFO:tasks.workunit.client.0.vm06.stdout:5/909: mknod d4/d50/d35/d40/d96/c143 0 2026-03-09T17:30:08.122 INFO:tasks.workunit.client.0.vm06.stdout:8/930: creat d15/d39/dd2/f131 x:0 0 0 2026-03-09T17:30:08.125 INFO:tasks.workunit.client.0.vm06.stdout:5/910: mknod d4/d52/d55/dee/c144 0 2026-03-09T17:30:08.130 INFO:tasks.workunit.client.1.vm09.stdout:0/166: rmdir d6 39 2026-03-09T17:30:08.133 INFO:tasks.workunit.client.0.vm06.stdout:5/911: creat d4/d50/d18/d3d/f145 x:0 0 0 2026-03-09T17:30:08.137 INFO:tasks.workunit.client.0.vm06.stdout:8/931: creat d15/d16/d1e/d30/d55/d10d/f132 x:0 0 0 2026-03-09T17:30:08.140 INFO:tasks.workunit.client.0.vm06.stdout:5/912: fsync d4/d50/f61 0 2026-03-09T17:30:08.140 INFO:tasks.workunit.client.0.vm06.stdout:5/913: chown f0 143 1 2026-03-09T17:30:08.144 INFO:tasks.workunit.client.0.vm06.stdout:8/932: creat d15/d31/dc5/df1/d71/f133 x:0 0 0 
2026-03-09T17:30:08.149 INFO:tasks.workunit.client.0.vm06.stdout:5/914: rename d4/d50/d35/d40/d109/d11f/f122 to d4/d52/d55/dee/f146 0
2026-03-09T17:30:08.154 INFO:tasks.workunit.client.0.vm06.stdout:5/915: rmdir d4/d22/d46/dec 39
2026-03-09T17:30:08.155 INFO:tasks.workunit.client.1.vm09.stdout:0/167: dread d6/d1d/f1e [0,4194304] 0
2026-03-09T17:30:08.159 INFO:tasks.workunit.client.1.vm09.stdout:1/176: symlink d9/dc/l36 0
2026-03-09T17:30:08.159 INFO:tasks.workunit.client.1.vm09.stdout:1/177: dread f8 [0,4194304] 0
2026-03-09T17:30:08.160 INFO:tasks.workunit.client.0.vm06.stdout:5/916: rename d4/d50/d35/d40/d95/cbd to d4/d22/dbe/c147 0
2026-03-09T17:30:08.165 INFO:tasks.workunit.client.1.vm09.stdout:3/133: dwrite d5/f22 [0,4194304] 0
2026-03-09T17:30:08.167 INFO:tasks.workunit.client.0.vm06.stdout:5/917: mknod d4/d22/dbe/dfb/c148 0
2026-03-09T17:30:08.167 INFO:tasks.workunit.client.1.vm09.stdout:9/177: rename d5/c22 to d5/c48 0
2026-03-09T17:30:08.179 INFO:tasks.workunit.client.1.vm09.stdout:3/134: dread d5/d6/fe [0,4194304] 0
2026-03-09T17:30:08.180 INFO:tasks.workunit.client.0.vm06.stdout:8/933: getdents d15/d16/d1a 0
2026-03-09T17:30:08.182 INFO:tasks.workunit.client.1.vm09.stdout:0/168: mkdir d6/d1d/d24/d32 0
2026-03-09T17:30:08.185 INFO:tasks.workunit.client.1.vm09.stdout:1/178: rename d9/d1f to d9/dc/d15/d22/d37 0
2026-03-09T17:30:08.201 INFO:tasks.workunit.client.1.vm09.stdout:6/152: getdents d3/d21/d25/d26 0
2026-03-09T17:30:08.201 INFO:tasks.workunit.client.1.vm09.stdout:6/153: dread - d3/f19 zero size
2026-03-09T17:30:08.202 INFO:tasks.workunit.client.1.vm09.stdout:6/154: fsync d3/f1f 0
2026-03-09T17:30:08.208 INFO:tasks.workunit.client.0.vm06.stdout:5/918: getdents d4/d52/d112 0
2026-03-09T17:30:08.210 INFO:tasks.workunit.client.1.vm09.stdout:9/178: dwrite f2 [0,4194304] 0
2026-03-09T17:30:08.211 INFO:tasks.workunit.client.1.vm09.stdout:0/169: mknod d6/d1d/c33 0
2026-03-09T17:30:08.214 INFO:tasks.workunit.client.0.vm06.stdout:5/919: mknod d4/d50/db2/c149 0
2026-03-09T17:30:08.217 INFO:tasks.workunit.client.0.vm06.stdout:5/920: write d4/d22/f5d [1959161,23307] 0
2026-03-09T17:30:08.219 INFO:tasks.workunit.client.1.vm09.stdout:6/155: symlink d3/d21/l2d 0
2026-03-09T17:30:08.220 INFO:tasks.workunit.client.0.vm06.stdout:5/921: mknod d4/d52/d55/c14a 0
2026-03-09T17:30:08.224 INFO:tasks.workunit.client.0.vm06.stdout:5/922: mknod d4/d52/d55/d13e/d127/c14b 0
2026-03-09T17:30:08.229 INFO:tasks.workunit.client.1.vm09.stdout:0/170: unlink d6/de/c13 0
2026-03-09T17:30:08.229 INFO:tasks.workunit.client.0.vm06.stdout:5/923: fsync d4/dca/f8b 0
2026-03-09T17:30:08.232 INFO:tasks.workunit.client.1.vm09.stdout:3/135: rename d5/l11 to d5/d6/d12/l2a 0
2026-03-09T17:30:08.234 INFO:tasks.workunit.client.1.vm09.stdout:1/179: mkdir d9/d38 0
2026-03-09T17:30:08.236 INFO:tasks.workunit.client.0.vm06.stdout:5/924: rename d4/d50/d35/d40/d96/dfe/l134 to d4/da4/dcf/l14c 0
2026-03-09T17:30:08.240 INFO:tasks.workunit.client.1.vm09.stdout:9/179: symlink d5/l49 0
2026-03-09T17:30:08.241 INFO:tasks.workunit.client.0.vm06.stdout:5/925: creat d4/d50/d35/d40/d109/f14d x:0 0 0
2026-03-09T17:30:08.242 INFO:tasks.workunit.client.1.vm09.stdout:9/180: dread d5/de/d29/f37 [0,4194304] 0
2026-03-09T17:30:08.243 INFO:tasks.workunit.client.1.vm09.stdout:0/171: write d6/f1b [907512,2303] 0
2026-03-09T17:30:08.243 INFO:tasks.workunit.client.1.vm09.stdout:0/172: stat d6/de/c10 0
2026-03-09T17:30:08.247 INFO:tasks.workunit.client.1.vm09.stdout:8/182: truncate d1/f3 371648 0
2026-03-09T17:30:08.250 INFO:tasks.workunit.client.0.vm06.stdout:5/926: symlink d4/d52/l14e 0
2026-03-09T17:30:08.252 INFO:tasks.workunit.client.0.vm06.stdout:5/927: rename d4/d52/db4/cb6 to d4/d52/d55/d13e/c14f 0
2026-03-09T17:30:08.253 INFO:tasks.workunit.client.0.vm06.stdout:5/928: chown d4/d50/d18/f74 79822 1
2026-03-09T17:30:08.255 INFO:tasks.workunit.client.1.vm09.stdout:9/181: creat d5/de/d29/d33/f4a x:0 0 0
2026-03-09T17:30:08.259 INFO:tasks.workunit.client.1.vm09.stdout:3/136: creat d5/d16/d25/f2b x:0 0 0
2026-03-09T17:30:08.260 INFO:tasks.workunit.client.1.vm09.stdout:1/180: symlink d9/d38/l39 0
2026-03-09T17:30:08.261 INFO:tasks.workunit.client.1.vm09.stdout:1/181: dread d9/dc/dd/f28 [0,4194304] 0
2026-03-09T17:30:08.262 INFO:tasks.workunit.client.1.vm09.stdout:1/182: write d9/f11 [169709,45800] 0
2026-03-09T17:30:08.263 INFO:tasks.workunit.client.1.vm09.stdout:7/228: dwrite da/d11/f1f [4194304,4194304] 0
2026-03-09T17:30:08.264 INFO:tasks.workunit.client.1.vm09.stdout:0/173: rename d6/de/l23 to d6/d1d/d24/d32/l34 0
2026-03-09T17:30:08.265 INFO:tasks.workunit.client.1.vm09.stdout:7/229: dread - da/d11/d2d/f45 zero size
2026-03-09T17:30:08.266 INFO:tasks.workunit.client.1.vm09.stdout:3/137: creat d5/d16/d25/f2c x:0 0 0
2026-03-09T17:30:08.267 INFO:tasks.workunit.client.1.vm09.stdout:3/138: write d5/d16/f17 [668871,60381] 0
2026-03-09T17:30:08.270 INFO:tasks.workunit.client.1.vm09.stdout:0/174: dwrite d6/de/f31 [0,4194304] 0
2026-03-09T17:30:08.276 INFO:tasks.workunit.client.1.vm09.stdout:0/175: dwrite d6/de/f31 [0,4194304] 0
2026-03-09T17:30:08.277 INFO:tasks.workunit.client.1.vm09.stdout:7/230: creat da/d11/d41/d4e/d3d/f53 x:0 0 0
2026-03-09T17:30:08.279 INFO:tasks.workunit.client.1.vm09.stdout:7/231: write da/d11/d41/d4e/f33 [1114514,75592] 0
2026-03-09T17:30:08.279 INFO:tasks.workunit.client.1.vm09.stdout:0/176: truncate d6/d1d/f30 214762 0
2026-03-09T17:30:08.284 INFO:tasks.workunit.client.1.vm09.stdout:3/139: symlink d5/d16/d25/l2d 0
2026-03-09T17:30:08.285 INFO:tasks.workunit.client.1.vm09.stdout:3/140: write d5/d6/d12/f1d [174291,12698] 0
2026-03-09T17:30:08.304 INFO:tasks.workunit.client.1.vm09.stdout:1/183: mkdir d9/d3a 0
2026-03-09T17:30:08.306 INFO:tasks.workunit.client.1.vm09.stdout:1/184: dread f2 [0,4194304] 0
2026-03-09T17:30:08.316 INFO:tasks.workunit.client.1.vm09.stdout:3/141: unlink d5/fd 0
2026-03-09T17:30:08.321 INFO:tasks.workunit.client.1.vm09.stdout:1/185: symlink d9/dc/d15/d22/l3b 0
2026-03-09T17:30:08.326 INFO:tasks.workunit.client.1.vm09.stdout:8/183: rename d1/da/d13/f21 to d1/d14/f3c 0
2026-03-09T17:30:08.327 INFO:tasks.workunit.client.1.vm09.stdout:8/184: truncate d1/da/f35 755653 0
2026-03-09T17:30:08.328 INFO:tasks.workunit.client.1.vm09.stdout:8/185: truncate d1/f7 4529549 0
2026-03-09T17:30:08.329 INFO:tasks.workunit.client.1.vm09.stdout:7/232: symlink da/d11/d3e/l54 0
2026-03-09T17:30:08.330 INFO:tasks.workunit.client.1.vm09.stdout:0/177: symlink d6/d2a/l35 0
2026-03-09T17:30:08.334 INFO:tasks.workunit.client.1.vm09.stdout:0/178: dwrite d6/de/f2e [0,4194304] 0
2026-03-09T17:30:08.336 INFO:tasks.workunit.client.1.vm09.stdout:3/142: stat d5/d9/fc 0
2026-03-09T17:30:08.339 INFO:tasks.workunit.client.1.vm09.stdout:0/179: dwrite d6/f21 [0,4194304] 0
2026-03-09T17:30:08.350 INFO:tasks.workunit.client.1.vm09.stdout:3/143: creat d5/d16/f2e x:0 0 0
2026-03-09T17:30:08.360 INFO:tasks.workunit.client.1.vm09.stdout:8/186: creat d1/d14/f3d x:0 0 0
2026-03-09T17:30:08.360 INFO:tasks.workunit.client.1.vm09.stdout:3/144: dwrite d5/d6/d12/f19 [0,4194304] 0
2026-03-09T17:30:08.360 INFO:tasks.workunit.client.1.vm09.stdout:8/187: dread d1/f7 [0,4194304] 0
2026-03-09T17:30:08.360 INFO:tasks.workunit.client.1.vm09.stdout:8/188: readlink d1/d14/d1b/d32/l38 0
2026-03-09T17:30:08.367 INFO:tasks.workunit.client.1.vm09.stdout:8/189: rename d1/da/dd/lf to d1/d14/d31/l3e 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:3/145: link d5/d16/d25/f28 d5/f2f 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:3/146: dread d5/f22 [0,4194304] 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:8/190: rename d1/d14/d1b to d1/da/dd/d3f 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:3/147: mkdir d5/d9/d30 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:8/191: symlink d1/da/dd/d3f/l40 0
2026-03-09T17:30:08.374 INFO:tasks.workunit.client.1.vm09.stdout:3/148: mkdir d5/d16/d31 0
2026-03-09T17:30:08.376 INFO:tasks.workunit.client.1.vm09.stdout:3/149: dread d5/f2f [0,4194304] 0
2026-03-09T17:30:08.378 INFO:tasks.workunit.client.1.vm09.stdout:3/150: mkdir d5/d6/d32 0
2026-03-09T17:30:08.404 INFO:tasks.workunit.client.1.vm09.stdout:0/180: sync
2026-03-09T17:30:08.405 INFO:tasks.workunit.client.1.vm09.stdout:3/151: sync
2026-03-09T17:30:08.417 INFO:tasks.workunit.client.0.vm06.stdout:4/991: dwrite db/d59/d5f/d45/d10a/dcc/f123 [0,4194304] 0
2026-03-09T17:30:08.430 INFO:tasks.workunit.client.1.vm09.stdout:0/181: dread d6/de/ff [0,4194304] 0
2026-03-09T17:30:08.433 INFO:tasks.workunit.client.1.vm09.stdout:3/152: creat d5/d6/d32/f33 x:0 0 0
2026-03-09T17:30:08.434 INFO:tasks.workunit.client.0.vm06.stdout:4/992: creat db/d1d/d21/d37/d69/d78/db4/f159 x:0 0 0
2026-03-09T17:30:08.445 INFO:tasks.workunit.client.1.vm09.stdout:0/182: dwrite d6/de/ff [0,4194304] 0
2026-03-09T17:30:08.445 INFO:tasks.workunit.client.1.vm09.stdout:0/183: link d6/d2a/l35 d6/d2a/l36 0
2026-03-09T17:30:08.445 INFO:tasks.workunit.client.0.vm06.stdout:4/993: mkdir db/d1d/d21/d37/d69/d78/d15a 0
2026-03-09T17:30:08.445 INFO:tasks.workunit.client.0.vm06.stdout:4/994: unlink db/d1d/d21/d25/d4b/de4/ff8 0
2026-03-09T17:30:08.446 INFO:tasks.workunit.client.1.vm09.stdout:0/184: dwrite d6/f2d [0,4194304] 0
2026-03-09T17:30:08.500 INFO:tasks.workunit.client.1.vm09.stdout:4/207: getdents d11/d1e/d31 0
2026-03-09T17:30:08.534 INFO:tasks.workunit.client.0.vm06.stdout:6/841: write d6/d12/d17/d65/f72 [883548,21194] 0
2026-03-09T17:30:08.534 INFO:tasks.workunit.client.1.vm09.stdout:9/182: fsync d5/de/d29/f41 0
2026-03-09T17:30:08.534 INFO:tasks.workunit.client.0.vm06.stdout:6/842: fsync d6/d47/f49 0
2026-03-09T17:30:08.537 INFO:tasks.workunit.client.1.vm09.stdout:9/183: dwrite d5/f1d [0,4194304] 0
2026-03-09T17:30:08.538 INFO:tasks.workunit.client.0.vm06.stdout:6/843: rmdir d6/d4f/d3e/d52/d8c 39
2026-03-09T17:30:08.562 INFO:tasks.workunit.client.0.vm06.stdout:6/844: dread d6/d4f/d3e/d52/f89 [0,4194304] 0
2026-03-09T17:30:08.570 INFO:tasks.workunit.client.0.vm06.stdout:6/845: dread d6/d47/d96/d40/fbd [0,4194304] 0
2026-03-09T17:30:08.570 INFO:tasks.workunit.client.0.vm06.stdout:6/846: read - d6/d12/d53/fd5 zero size
2026-03-09T17:30:08.572 INFO:tasks.workunit.client.0.vm06.stdout:6/847: creat d6/d12/d53/dd0/f106 x:0 0 0
2026-03-09T17:30:08.573 INFO:tasks.workunit.client.0.vm06.stdout:6/848: mknod d6/d12/d2d/c107 0
2026-03-09T17:30:08.584 INFO:tasks.workunit.client.0.vm06.stdout:6/849: dread d6/d12/d17/f29 [0,4194304] 0
2026-03-09T17:30:08.586 INFO:tasks.workunit.client.0.vm06.stdout:6/850: dread d6/d12/d17/f29 [0,4194304] 0
2026-03-09T17:30:08.587 INFO:tasks.workunit.client.0.vm06.stdout:6/851: readlink d6/d4f/d73/lb9 0
2026-03-09T17:30:08.587 INFO:tasks.workunit.client.0.vm06.stdout:6/852: chown d6/d4f/d3e 48 1
2026-03-09T17:30:08.589 INFO:tasks.workunit.client.0.vm06.stdout:6/853: dread d6/d12/fd4 [0,4194304] 0
2026-03-09T17:30:08.590 INFO:tasks.workunit.client.1.vm09.stdout:2/134: dread - d13/d15/f2a zero size
2026-03-09T17:30:08.595 INFO:tasks.workunit.client.0.vm06.stdout:2/825: chown d3/d4/d12/d2b/d2d/f1b 5413 1
2026-03-09T17:30:08.598 INFO:tasks.workunit.client.0.vm06.stdout:6/854: mkdir d6/d47/d8a/d108 0
2026-03-09T17:30:08.599 INFO:tasks.workunit.client.1.vm09.stdout:2/135: mkdir d13/d15/d2c 0
2026-03-09T17:30:08.600 INFO:tasks.workunit.client.1.vm09.stdout:2/136: chown l11 187 1
2026-03-09T17:30:08.600 INFO:tasks.workunit.client.1.vm09.stdout:2/137: write d13/d15/d21/f27 [850479,94667] 0
2026-03-09T17:30:08.601 INFO:tasks.workunit.client.0.vm06.stdout:2/826: mkdir d3/d4/d12/d71/daa/d77/d81/d64/d6a/d10d 0
2026-03-09T17:30:08.604 INFO:tasks.workunit.client.0.vm06.stdout:6/855: dread d6/d12/d17/d85/f9c [0,4194304] 0
2026-03-09T17:30:08.613 INFO:tasks.workunit.client.0.vm06.stdout:1/919: dwrite d11/d14/d1d/d1e/d2a/d34/d64/fec [0,4194304] 0
2026-03-09T17:30:08.618 INFO:tasks.workunit.client.0.vm06.stdout:5/929: fsync d4/d50/d18/d3d/f145 0
2026-03-09T17:30:08.628 INFO:tasks.workunit.client.0.vm06.stdout:2/827: dread d3/d4/d12/d2b/d36/d37/f3a [0,4194304] 0
2026-03-09T17:30:08.631 INFO:tasks.workunit.client.0.vm06.stdout:6/856: rename d6/d12/d53/d8f/lda to d6/d12/d53/d91/dcb/l109 0
2026-03-09T17:30:08.632 INFO:tasks.workunit.client.1.vm09.stdout:2/138: creat d13/d15/d2c/f2d x:0 0 0
2026-03-09T17:30:08.633 INFO:tasks.workunit.client.0.vm06.stdout:1/920: truncate d11/d14/d1d/f90 935038 0
2026-03-09T17:30:08.634 INFO:tasks.workunit.client.0.vm06.stdout:1/921: fdatasync d11/d14/d1d/f56 0
2026-03-09T17:30:08.636 INFO:tasks.workunit.client.1.vm09.stdout:2/139: mknod d13/d15/d21/c2e 0
2026-03-09T17:30:08.637 INFO:tasks.workunit.client.0.vm06.stdout:5/930: rename d4/d50/d35/d40/d6f/f8e to d4/d52/db4/dc2/f150 0
2026-03-09T17:30:08.638 INFO:tasks.workunit.client.0.vm06.stdout:5/931: fdatasync d4/d50/d35/d40/d95/db8/dda/fdd 0
2026-03-09T17:30:08.638 INFO:tasks.workunit.client.1.vm09.stdout:2/140: dread f9 [0,4194304] 0
2026-03-09T17:30:08.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:08 vm06.local ceph-mon[57307]: mgrmap e22: vm09.lqzvkh(active, since 1.70961s)
2026-03-09T17:30:08.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:08 vm06.local ceph-mon[57307]: Deploying cephadm binary to vm09
2026-03-09T17:30:08.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:08 vm06.local ceph-mon[57307]: pgmap v3: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail
2026-03-09T17:30:08.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:08 vm06.local ceph-mon[57307]: Deploying cephadm binary to vm06
2026-03-09T17:30:08.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:08 vm06.local ceph-mon[57307]: pgmap v4: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail
2026-03-09T17:30:08.643 INFO:tasks.workunit.client.0.vm06.stdout:8/934: dwrite d15/d16/d6d/f10b [0,4194304] 0
2026-03-09T17:30:08.645 INFO:tasks.workunit.client.0.vm06.stdout:8/935: fsync d15/d31/dc5/df1/d2b/f63 0
2026-03-09T17:30:08.657 INFO:tasks.workunit.client.1.vm09.stdout:0/185: getdents d6/d1d 0
2026-03-09T17:30:08.658 INFO:tasks.workunit.client.1.vm09.stdout:2/141: creat d13/d15/f2f x:0 0 0
2026-03-09T17:30:08.658 INFO:tasks.workunit.client.1.vm09.stdout:0/186: read d6/de/f2e [2860877,89371] 0
2026-03-09T17:30:08.659 INFO:tasks.workunit.client.0.vm06.stdout:2/828: symlink d3/d4/d12/d71/daa/d77/d102/d109/l10e 0
2026-03-09T17:30:08.662 INFO:tasks.workunit.client.0.vm06.stdout:5/932: symlink d4/d22/dbe/l151 0
2026-03-09T17:30:08.664 INFO:tasks.workunit.client.1.vm09.stdout:2/142: dwrite d13/d15/f2b [0,4194304] 0
2026-03-09T17:30:08.664 INFO:tasks.workunit.client.1.vm09.stdout:0/187: dwrite d6/de/f31 [0,4194304] 0
2026-03-09T17:30:08.666 INFO:tasks.workunit.client.0.vm06.stdout:5/933: dwrite d4/d22/d64/f9f [0,4194304] 0
2026-03-09T17:30:08.670 INFO:tasks.workunit.client.1.vm09.stdout:6/156: dwrite d3/fc [0,4194304] 0
2026-03-09T17:30:08.686 INFO:tasks.workunit.client.0.vm06.stdout:2/829: creat d3/d4/d46/da5/f10f x:0 0 0
2026-03-09T17:30:08.688 INFO:tasks.workunit.client.1.vm09.stdout:0/188: rmdir d6/d2a 39
2026-03-09T17:30:08.690 INFO:tasks.workunit.client.1.vm09.stdout:0/189: dread d6/de/ff [0,4194304] 0
2026-03-09T17:30:08.693 INFO:tasks.workunit.client.0.vm06.stdout:5/934: dread d4/da4/fc5 [0,4194304] 0
2026-03-09T17:30:08.695 INFO:tasks.workunit.client.1.vm09.stdout:2/143: creat d13/d15/d21/f30 x:0 0 0
2026-03-09T17:30:08.696 INFO:tasks.workunit.client.1.vm09.stdout:2/144: write d13/d15/d21/f30 [464406,72849] 0
2026-03-09T17:30:08.696 INFO:tasks.workunit.client.0.vm06.stdout:2/830: creat d3/d4/d12/d2b/d36/f110 x:0 0 0
2026-03-09T17:30:08.698 INFO:tasks.workunit.client.1.vm09.stdout:6/157: unlink d3/f1a 0
2026-03-09T17:30:08.698 INFO:tasks.workunit.client.0.vm06.stdout:5/935: truncate d4/d50/f80 1135815 0
2026-03-09T17:30:08.700 INFO:tasks.workunit.client.1.vm09.stdout:3/153: getdents d5/d16/d25 0
2026-03-09T17:30:08.708 INFO:tasks.workunit.client.1.vm09.stdout:2/145: dread d13/f1a [0,4194304] 0
2026-03-09T17:30:08.711 INFO:tasks.workunit.client.1.vm09.stdout:6/158: fdatasync d3/d7/fe 0
2026-03-09T17:30:08.713 INFO:tasks.workunit.client.1.vm09.stdout:6/159: dread d3/fc [0,4194304] 0
2026-03-09T17:30:08.715 INFO:tasks.workunit.client.1.vm09.stdout:0/190: rmdir d6/d2a 39
2026-03-09T17:30:08.718 INFO:tasks.workunit.client.1.vm09.stdout:6/160: dwrite d3/d7/f23 [0,4194304] 0
2026-03-09T17:30:08.725 INFO:tasks.workunit.client.1.vm09.stdout:2/146: dread fd [0,4194304] 0
2026-03-09T17:30:08.726 INFO:tasks.workunit.client.0.vm06.stdout:1/922: dread d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0
2026-03-09T17:30:08.729 INFO:tasks.workunit.client.1.vm09.stdout:1/186: dwrite f8 [0,4194304] 0
2026-03-09T17:30:08.730 INFO:tasks.workunit.client.0.vm06.stdout:1/923: truncate d11/d14/d1d/d4a/f12c 110783 0
2026-03-09T17:30:08.735 INFO:tasks.workunit.client.1.vm09.stdout:7/233: dwrite da/d11/d41/d4e/f2b [0,4194304] 0
2026-03-09T17:30:08.735 INFO:tasks.workunit.client.0.vm06.stdout:1/924: dread d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0
2026-03-09T17:30:08.746 INFO:tasks.workunit.client.1.vm09.stdout:0/191: sync
2026-03-09T17:30:08.747 INFO:tasks.workunit.client.1.vm09.stdout:6/161: creat d3/f2e x:0 0 0
2026-03-09T17:30:08.748 INFO:tasks.workunit.client.1.vm09.stdout:6/162: truncate d3/f2e 11618 0
2026-03-09T17:30:08.748 INFO:tasks.workunit.client.1.vm09.stdout:3/154: fsync d5/f22 0
2026-03-09T17:30:08.748 INFO:tasks.workunit.client.1.vm09.stdout:8/192: getdents d1/d14 0
2026-03-09T17:30:08.749 INFO:tasks.workunit.client.1.vm09.stdout:6/163: readlink d3/d21/d25/d26/l2b 0
2026-03-09T17:30:08.751 INFO:tasks.workunit.client.1.vm09.stdout:8/193: read d1/f33 [157366,93931] 0
2026-03-09T17:30:08.751 INFO:tasks.workunit.client.0.vm06.stdout:1/925: symlink d11/d14/d1d/d1e/dc2/d103/d110/l135 0
2026-03-09T17:30:08.753 INFO:tasks.workunit.client.1.vm09.stdout:1/187: symlink d9/dc/dd/l3c 0
2026-03-09T17:30:08.754 INFO:tasks.workunit.client.0.vm06.stdout:1/926: readlink d11/d14/d1d/d1e/l7e 0
2026-03-09T17:30:08.768 INFO:tasks.workunit.client.0.vm06.stdout:1/927: creat d11/d14/d1d/d1e/dc2/d103/f136 x:0 0 0
2026-03-09T17:30:08.768 INFO:tasks.workunit.client.0.vm06.stdout:4/995: dwrite db/d1d/d21/d26/d7a/fda [0,4194304] 0
2026-03-09T17:30:08.769 INFO:tasks.workunit.client.1.vm09.stdout:3/155: dwrite d5/d16/f2e [0,4194304] 0
2026-03-09T17:30:08.769 INFO:tasks.workunit.client.1.vm09.stdout:1/188: dread d9/dc/dd/fe [0,4194304] 0
2026-03-09T17:30:08.769 INFO:tasks.workunit.client.1.vm09.stdout:1/189: readlink d9/d38/l39 0
2026-03-09T17:30:08.769 INFO:tasks.workunit.client.1.vm09.stdout:3/156: dread d5/f22 [0,4194304] 0
2026-03-09T17:30:08.769 INFO:tasks.workunit.client.1.vm09.stdout:8/194: rename d1/da/d23/d34/c37 to d1/d14/d31/c41 0
2026-03-09T17:30:08.775 INFO:tasks.workunit.client.1.vm09.stdout:4/208: truncate d11/f26 3546700 0
2026-03-09T17:30:08.777 INFO:tasks.workunit.client.1.vm09.stdout:0/192: sync
2026-03-09T17:30:08.778 INFO:tasks.workunit.client.1.vm09.stdout:9/184: truncate d5/f1b 2032886 0
2026-03-09T17:30:08.787 INFO:tasks.workunit.client.1.vm09.stdout:6/164: sync
2026-03-09T17:30:08.788 INFO:tasks.workunit.client.1.vm09.stdout:8/195: mkdir d1/d14/d2a/d42 0
2026-03-09T17:30:08.792 INFO:tasks.workunit.client.1.vm09.stdout:6/165: dwrite d3/f19 [0,4194304] 0
2026-03-09T17:30:08.798 INFO:tasks.workunit.client.1.vm09.stdout:3/157: unlink d5/d16/l27 0
2026-03-09T17:30:08.798 INFO:tasks.workunit.client.1.vm09.stdout:0/193: dread d6/f27 [0,4194304] 0
2026-03-09T17:30:08.798 INFO:tasks.workunit.client.1.vm09.stdout:7/234: getdents da/d11/d2d/d49 0
2026-03-09T17:30:08.798 INFO:tasks.workunit.client.1.vm09.stdout:1/190: creat d9/dc/f3d x:0 0 0
2026-03-09T17:30:08.798 INFO:tasks.workunit.client.1.vm09.stdout:0/194: write d6/d1d/f30 [923924,27837] 0
2026-03-09T17:30:08.799 INFO:tasks.workunit.client.1.vm09.stdout:3/158: dread d5/f22 [0,4194304] 0
2026-03-09T17:30:08.802 INFO:tasks.workunit.client.1.vm09.stdout:3/159: chown d5/d16/d25/c21 1121242273 1
2026-03-09T17:30:08.804 INFO:tasks.workunit.client.1.vm09.stdout:1/191: dwrite f3 [0,4194304] 0
2026-03-09T17:30:08.806 INFO:tasks.workunit.client.1.vm09.stdout:1/192: fdatasync d9/dc/d15/d1d/f17 0
2026-03-09T17:30:08.806 INFO:tasks.workunit.client.1.vm09.stdout:8/196: mkdir d1/d14/d2a/d42/d43 0
2026-03-09T17:30:08.806 INFO:tasks.workunit.client.1.vm09.stdout:1/193: readlink d9/d38/l39 0
2026-03-09T17:30:08.813 INFO:tasks.workunit.client.1.vm09.stdout:8/197: dwrite d1/da/dd/d3f/f1c [0,4194304] 0
2026-03-09T17:30:08.815 INFO:tasks.workunit.client.1.vm09.stdout:9/185: creat d5/f4b x:0 0 0
2026-03-09T17:30:08.817 INFO:tasks.workunit.client.1.vm09.stdout:7/235: mknod da/d11/d41/c55 0
2026-03-09T17:30:08.818 INFO:tasks.workunit.client.1.vm09.stdout:4/209: link d11/d1e/c21 d11/c49 0
2026-03-09T17:30:08.819 INFO:tasks.workunit.client.1.vm09.stdout:4/210: write d11/f16 [3104218,102060] 0
2026-03-09T17:30:08.820 INFO:tasks.workunit.client.1.vm09.stdout:1/194: sync
2026-03-09T17:30:08.820 INFO:tasks.workunit.client.1.vm09.stdout:4/211: write d11/f24 [720587,51091] 0
2026-03-09T17:30:08.822 INFO:tasks.workunit.client.1.vm09.stdout:8/198: dwrite d1/da/f12 [0,4194304] 0
2026-03-09T17:30:08.824 INFO:tasks.workunit.client.1.vm09.stdout:8/199: chown d1/f33 777 1
2026-03-09T17:30:08.825 INFO:tasks.workunit.client.1.vm09.stdout:8/200: truncate d1/d14/f2f 848122 0
2026-03-09T17:30:08.835 INFO:tasks.workunit.client.1.vm09.stdout:6/166: rename d3/d21/d25/d26/f27 to d3/d21/d25/f2f 0
2026-03-09T17:30:08.841 INFO:tasks.workunit.client.1.vm09.stdout:4/212: write d11/f15 [596963,62909] 0
2026-03-09T17:30:08.841 INFO:tasks.workunit.client.1.vm09.stdout:4/213: dread d11/f1c [4194304,4194304] 0
2026-03-09T17:30:08.841 INFO:tasks.workunit.client.1.vm09.stdout:4/214: write d11/d1e/d29/f3b [641499,84443] 0
2026-03-09T17:30:08.841 INFO:tasks.workunit.client.1.vm09.stdout:9/186: sync
2026-03-09T17:30:08.846 INFO:tasks.workunit.client.1.vm09.stdout:0/195: rename d6/de/ff to d6/d1d/f37 0
2026-03-09T17:30:08.846 INFO:tasks.workunit.client.1.vm09.stdout:0/196: write d6/f21 [4587616,49065] 0
2026-03-09T17:30:08.852 INFO:tasks.workunit.client.1.vm09.stdout:6/167: mkdir d3/d1e/d30 0
2026-03-09T17:30:08.853 INFO:tasks.workunit.client.1.vm09.stdout:3/160: getdents d5/d9 0
2026-03-09T17:30:08.858 INFO:tasks.workunit.client.1.vm09.stdout:1/195: dread d9/dc/d15/d1d/f17 [0,4194304] 0
2026-03-09T17:30:08.859 INFO:tasks.workunit.client.1.vm09.stdout:7/236: rename da/d11/d41/d4e/d3d to da/d11/d2d/d56 0
2026-03-09T17:30:08.861 INFO:tasks.workunit.client.1.vm09.stdout:7/237: truncate da/d11/f19 644938 0
2026-03-09T17:30:08.862 INFO:tasks.workunit.client.1.vm09.stdout:7/238: write da/d11/d2d/f45 [637767,61488] 0
2026-03-09T17:30:08.864 INFO:tasks.workunit.client.1.vm09.stdout:1/196: dwrite f8 [0,4194304] 0
2026-03-09T17:30:08.866 INFO:tasks.workunit.client.1.vm09.stdout:4/215: symlink d11/l4a 0
2026-03-09T17:30:08.866 INFO:tasks.workunit.client.1.vm09.stdout:4/216: dread - d11/f12 zero size
2026-03-09T17:30:08.868 INFO:tasks.workunit.client.1.vm09.stdout:6/168: mknod d3/d21/d25/c31 0
2026-03-09T17:30:08.870 INFO:tasks.workunit.client.1.vm09.stdout:7/239: creat da/d11/d41/f57 x:0 0 0
2026-03-09T17:30:08.871 INFO:tasks.workunit.client.1.vm09.stdout:1/197: symlink d9/dc/d15/d21/d35/l3e 0
2026-03-09T17:30:08.873 INFO:tasks.workunit.client.1.vm09.stdout:6/169: mkdir d3/d1e/d30/d32 0
2026-03-09T17:30:08.876 INFO:tasks.workunit.client.1.vm09.stdout:7/240: mknod da/d11/c58 0
2026-03-09T17:30:08.877 INFO:tasks.workunit.client.1.vm09.stdout:7/241: truncate da/f1c 2360192 0
2026-03-09T17:30:08.878 INFO:tasks.workunit.client.1.vm09.stdout:7/242: chown da/d11/c22 177729956 1
2026-03-09T17:30:08.879 INFO:tasks.workunit.client.1.vm09.stdout:1/198: dread d9/dc/d15/f1a [0,4194304] 0
2026-03-09T17:30:08.881 INFO:tasks.workunit.client.1.vm09.stdout:6/170: dwrite d3/d1e/f20 [0,4194304] 0
2026-03-09T17:30:08.882 INFO:tasks.workunit.client.1.vm09.stdout:4/217: sync
2026-03-09T17:30:08.891 INFO:tasks.workunit.client.1.vm09.stdout:6/171: dwrite d3/f1f [0,4194304] 0
2026-03-09T17:30:08.893 INFO:tasks.workunit.client.1.vm09.stdout:7/243: creat da/d11/d2d/f59 x:0 0 0
2026-03-09T17:30:08.894 INFO:tasks.workunit.client.1.vm09.stdout:4/218: dwrite d11/d1e/d29/f2f [0,4194304] 0
2026-03-09T17:30:08.894 INFO:tasks.workunit.client.1.vm09.stdout:6/172: chown d3/l17 15 1
2026-03-09T17:30:08.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:08 vm09.local ceph-mon[62061]: mgrmap e22: vm09.lqzvkh(active, since 1.70961s)
2026-03-09T17:30:08.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:08 vm09.local ceph-mon[62061]: Deploying cephadm binary to vm09
2026-03-09T17:30:08.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:08 vm09.local ceph-mon[62061]: pgmap v3: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail
2026-03-09T17:30:08.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:08 vm09.local ceph-mon[62061]: Deploying cephadm binary to vm06
2026-03-09T17:30:08.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:08 vm09.local ceph-mon[62061]: pgmap v4: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail
2026-03-09T17:30:08.901 INFO:tasks.workunit.client.1.vm09.stdout:1/199: mkdir d9/dc/d15/d22/d37/d3f 0
2026-03-09T17:30:08.901 INFO:tasks.workunit.client.1.vm09.stdout:1/200: readlink d9/dc/dd/l2c 0
2026-03-09T17:30:08.905 INFO:tasks.workunit.client.0.vm06.stdout:6/857: write d6/d47/d96/d40/f9f [644578,116354] 0
2026-03-09T17:30:08.933 INFO:tasks.workunit.client.0.vm06.stdout:6/858: dread d6/d4f/d3e/d52/f84 [0,4194304] 0
2026-03-09T17:30:08.933 INFO:tasks.workunit.client.0.vm06.stdout:6/859: fsync d6/d12/d53/dd0/f106 0
2026-03-09T17:30:08.934 INFO:tasks.workunit.client.1.vm09.stdout:4/219: chown d11/c49 3941 1
2026-03-09T17:30:08.938 INFO:tasks.workunit.client.1.vm09.stdout:4/220: symlink d11/d1e/d30/l4b 0
2026-03-09T17:30:08.963 INFO:tasks.workunit.client.1.vm09.stdout:0/197: dread d6/f9 [0,4194304] 0
2026-03-09T17:30:08.963 INFO:tasks.workunit.client.1.vm09.stdout:2/147: read d13/f14 [322835,76299] 0
2026-03-09T17:30:08.968 INFO:tasks.workunit.client.1.vm09.stdout:2/148: link d13/f14 d13/d15/d21/f31 0
2026-03-09T17:30:08.968 INFO:tasks.workunit.client.1.vm09.stdout:2/149: write d13/d15/f20 [1005862,48319] 0
2026-03-09T17:30:08.971 INFO:tasks.workunit.client.1.vm09.stdout:2/150: creat d13/d15/d21/f32 x:0 0 0
2026-03-09T17:30:08.971 INFO:tasks.workunit.client.1.vm09.stdout:2/151: readlink lf 0
2026-03-09T17:30:08.975 INFO:tasks.workunit.client.0.vm06.stdout:8/936: write d15/d16/d1e/d30/d55/ffd [778152,26380] 0
2026-03-09T17:30:08.983 INFO:tasks.workunit.client.0.vm06.stdout:5/936: write d4/d50/d18/f3e [7409830,112760] 0
2026-03-09T17:30:08.983 INFO:tasks.workunit.client.0.vm06.stdout:2/831: dwrite d3/d4/d12/d2b/d36/fb9 [4194304,4194304] 0
2026-03-09T17:30:08.990 INFO:tasks.workunit.client.0.vm06.stdout:8/937: mknod d15/d130/c134 0
2026-03-09T17:30:08.991 INFO:tasks.workunit.client.0.vm06.stdout:2/832: truncate d3/d4/d12/f42 3995678 0
2026-03-09T17:30:08.994 INFO:tasks.workunit.client.0.vm06.stdout:8/938: dwrite d15/d31/dc5/df1/d2b/f63 [0,4194304] 0
2026-03-09T17:30:08.996 INFO:tasks.workunit.client.0.vm06.stdout:8/939: chown d15/d16/d6d/fd8 8 1
2026-03-09T17:30:08.998 INFO:tasks.workunit.client.0.vm06.stdout:8/940: dread d15/d16/f23 [0,4194304] 0
2026-03-09T17:30:09.002 INFO:tasks.workunit.client.1.vm09.stdout:2/152: sync
2026-03-09T17:30:09.008 INFO:tasks.workunit.client.0.vm06.stdout:8/941: dread - d15/d39/d67/de3/fe5 zero size
2026-03-09T17:30:09.012 INFO:tasks.workunit.client.0.vm06.stdout:8/942: chown d15/d130/c134 21664169 1
2026-03-09T17:30:09.015 INFO:tasks.workunit.client.1.vm09.stdout:2/153: write d13/f1a [784053,78472] 0
2026-03-09T17:30:09.017 INFO:tasks.workunit.client.0.vm06.stdout:2/833: dread d3/d4/d12/d2b/f89 [0,4194304] 0
2026-03-09T17:30:09.017 INFO:tasks.workunit.client.1.vm09.stdout:2/154: fdatasync f9 0
2026-03-09T17:30:09.021 INFO:tasks.workunit.client.0.vm06.stdout:8/943: symlink d15/d31/dc5/df1/l135 0
2026-03-09T17:30:09.023 INFO:tasks.workunit.client.0.vm06.stdout:5/937: dread d4/d50/d18/d3d/f54 [0,4194304] 0
2026-03-09T17:30:09.026 INFO:tasks.workunit.client.1.vm09.stdout:2/155: mkdir d13/d15/d33 0
2026-03-09T17:30:09.027 INFO:tasks.workunit.client.0.vm06.stdout:2/834: fdatasync d3/fad 0
2026-03-09T17:30:09.034 INFO:tasks.workunit.client.1.vm09.stdout:2/156: rename d13/d15/d33 to d13/d15/d34 0
2026-03-09T17:30:09.036 INFO:tasks.workunit.client.0.vm06.stdout:5/938: creat d4/d22/dbe/dfb/f152 x:0 0 0
2026-03-09T17:30:09.039 INFO:tasks.workunit.client.0.vm06.stdout:5/939: fsync d4/d50/f1d 0
2026-03-09T17:30:09.044 INFO:tasks.workunit.client.0.vm06.stdout:5/940: symlink d4/d52/d55/d13e/d127/l153 0
2026-03-09T17:30:09.051 INFO:tasks.workunit.client.1.vm09.stdout:3/161: write d5/f22 [4442748,101414] 0
2026-03-09T17:30:09.052 INFO:tasks.workunit.client.0.vm06.stdout:2/835: read d3/f29 [1845600,99024] 0
2026-03-09T17:30:09.056 INFO:tasks.workunit.client.0.vm06.stdout:2/836: readlink d3/d4/d12/d2b/d36/lf9 0
2026-03-09T17:30:09.058 INFO:tasks.workunit.client.1.vm09.stdout:6/173: truncate d3/d7/fe 6602323 0
2026-03-09T17:30:09.059 INFO:tasks.workunit.client.0.vm06.stdout:2/837: fdatasync d3/d4/d12/d2b/f7e 0
2026-03-09T17:30:09.061 INFO:tasks.workunit.client.1.vm09.stdout:6/174: mknod d3/d21/c33 0
2026-03-09T17:30:09.064 INFO:tasks.workunit.client.0.vm06.stdout:2/838: readlink d3/d4/d12/d71/daa/d77/d81/d64/l7a 0
2026-03-09T17:30:09.065 INFO:tasks.workunit.client.0.vm06.stdout:2/839: dread - d3/d4/d46/fc6 zero size
2026-03-09T17:30:09.067 INFO:tasks.workunit.client.1.vm09.stdout:6/175: sync
2026-03-09T17:30:09.071 INFO:tasks.workunit.client.1.vm09.stdout:6/176: dwrite d3/d7/f18 [0,4194304] 0
2026-03-09T17:30:09.076 INFO:tasks.workunit.client.1.vm09.stdout:6/177: mkdir d3/d21/d25/d26/d34 0
2026-03-09T17:30:09.080 INFO:tasks.workunit.client.1.vm09.stdout:6/178: dwrite d3/d7/f23 [0,4194304] 0
2026-03-09T17:30:09.102 INFO:tasks.workunit.client.0.vm06.stdout:1/928: dwrite d11/d14/d1d/d4a/fa7 [0,4194304] 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.1.vm09.stdout:5/171: dwrite d0/d2/f2a [4194304,4194304] 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.1.vm09.stdout:5/172: creat d0/dc/d21/d26/f39 x:0 0 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.1.vm09.stdout:5/173: read - d0/d2/d15/d20/f25 zero size
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.1.vm09.stdout:5/174: stat d0/dc/d21/f29 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.1.vm09.stdout:5/175: write d0/d2/d15/f1c [261094,59725] 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:1/929: mknod d11/d14/d1d/d1e/d2a/d34/c137 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:1/930: symlink d11/d14/d1d/d42/l138 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:1/931: chown d11/d14/d1c/dbc 31278613 1
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:4/996: dwrite db/de2/d132/fe7 [4194304,4194304] 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:1/932: rmdir d11/d14/d1d/d94/d11b 0
2026-03-09T17:30:09.128 INFO:tasks.workunit.client.0.vm06.stdout:1/933: fsync d11/d14/d1d/d1e/d2a/d99/de9/feb 0
2026-03-09T17:30:09.133 INFO:tasks.workunit.client.1.vm09.stdout:9/187: truncate d5/f1d 2987734 0
2026-03-09T17:30:09.135 INFO:tasks.workunit.client.1.vm09.stdout:9/188: rmdir d5/de/d29 39
2026-03-09T17:30:09.135 INFO:tasks.workunit.client.1.vm09.stdout:5/176: getdents d0 0
2026-03-09T17:30:09.138 INFO:tasks.workunit.client.1.vm09.stdout:9/189: symlink d5/de/d29/l4c 0
2026-03-09T17:30:09.140 INFO:tasks.workunit.client.1.vm09.stdout:5/177: link d0/dc/ld d0/d2/l3a 0
2026-03-09T17:30:09.144 INFO:tasks.workunit.client.0.vm06.stdout:4/997: dread db/d1d/f1f [0,4194304] 0
2026-03-09T17:30:09.145 INFO:tasks.workunit.client.0.vm06.stdout:4/998: symlink db/d1d/d21/d44/d8a/d134/l15b 0
2026-03-09T17:30:09.171 INFO:tasks.workunit.client.0.vm06.stdout:5/941: sync
2026-03-09T17:30:09.176 INFO:tasks.workunit.client.1.vm09.stdout:5/178: dread d0/dc/d21/d26/f36 [0,4194304] 0
2026-03-09T17:30:09.178 INFO:tasks.workunit.client.1.vm09.stdout:5/179: symlink d0/dc/d21/d26/l3b 0
2026-03-09T17:30:09.179 INFO:tasks.workunit.client.1.vm09.stdout:5/180: mkdir d0/d9/d16/d3c 0
2026-03-09T17:30:09.181 INFO:tasks.workunit.client.1.vm09.stdout:5/181: link d0/dc/d21/d26/f39 d0/dc/d21/d26/f3d 0
2026-03-09T17:30:09.182 INFO:tasks.workunit.client.1.vm09.stdout:5/182: creat d0/d9/f3e x:0 0 0
2026-03-09T17:30:09.210 INFO:tasks.workunit.client.1.vm09.stdout:5/183: sync
2026-03-09T17:30:09.211 INFO:tasks.workunit.client.1.vm09.stdout:5/184: mknod d0/d9/d16/c3f 0
2026-03-09T17:30:09.214 INFO:tasks.workunit.client.1.vm09.stdout:5/185: dwrite d0/d2/d15/f1c [0,4194304] 0
2026-03-09T17:30:09.216 INFO:tasks.workunit.client.1.vm09.stdout:5/186: fdatasync d0/dc/d21/d26/f28 0
2026-03-09T17:30:09.216 INFO:tasks.workunit.client.1.vm09.stdout:5/187: dread - d0/d2/d15/d20/f25 zero size
2026-03-09T17:30:09.222 INFO:tasks.workunit.client.1.vm09.stdout:5/188: dwrite d0/dc/d21/d33/f35 [0,4194304] 0
2026-03-09T17:30:09.225 INFO:tasks.workunit.client.1.vm09.stdout:5/189: write d0/f22 [858021,36048] 0
2026-03-09T17:30:09.230 INFO:tasks.workunit.client.1.vm09.stdout:5/190: dwrite d0/dc/f37 [0,4194304] 0
2026-03-09T17:30:09.237 INFO:tasks.workunit.client.1.vm09.stdout:5/191: dwrite d0/dc/d21/d33/f35 [0,4194304] 0
2026-03-09T17:30:09.240 INFO:tasks.workunit.client.1.vm09.stdout:6/179: getdents d3/d1e/d30 0
2026-03-09T17:30:09.243 INFO:tasks.workunit.client.1.vm09.stdout:1/201: rename d9/dc/d15 to d9/dc/dd/d40 0
2026-03-09T17:30:09.244 INFO:tasks.workunit.client.1.vm09.stdout:1/202: chown d9/dc/dd/d40/d21/d35 137941970 1
2026-03-09T17:30:09.244 INFO:tasks.workunit.client.1.vm09.stdout:6/180: dwrite d3/f2e [0,4194304] 0
2026-03-09T17:30:09.246 INFO:tasks.workunit.client.1.vm09.stdout:6/181: readlink d3/d7/l16 0
2026-03-09T17:30:09.246 INFO:tasks.workunit.client.1.vm09.stdout:6/182: read - d3/d21/d25/f2f zero size
2026-03-09T17:30:09.249 INFO:tasks.workunit.client.1.vm09.stdout:7/244: truncate da/fb 1394180 0
2026-03-09T17:30:09.250 INFO:tasks.workunit.client.1.vm09.stdout:4/221: write f10 [544377,121747] 0
2026-03-09T17:30:09.250 INFO:tasks.workunit.client.0.vm06.stdout:6/860: truncate d6/d12/d2d/f39 3028821 0
2026-03-09T17:30:09.256 INFO:tasks.workunit.client.1.vm09.stdout:8/201: dwrite d1/f3 [0,4194304] 0
2026-03-09T17:30:09.265 INFO:tasks.workunit.client.1.vm09.stdout:8/202: dread d1/d14/f2f [0,4194304] 0
2026-03-09T17:30:09.267 INFO:tasks.workunit.client.0.vm06.stdout:8/944: dwrite d15/d31/dc5/df1/d2b/d85/f9a [0,4194304] 0
2026-03-09T17:30:09.269 INFO:tasks.workunit.client.0.vm06.stdout:8/945: write d15/d16/d1e/d30/d55/ffd [1332880,110793] 0
2026-03-09T17:30:09.283 INFO:tasks.workunit.client.1.vm09.stdout:0/198: rename d6/l26 to d6/d1d/d24/l38 0
2026-03-09T17:30:09.286 INFO:tasks.workunit.client.1.vm09.stdout:6/183: mknod d3/d21/d25/d26/c35 0
2026-03-09T17:30:09.287 INFO:tasks.workunit.client.1.vm09.stdout:7/245: creat da/f5a x:0 0 0
2026-03-09T17:30:09.288 INFO:tasks.workunit.client.1.vm09.stdout:4/222: mkdir d11/d1e/d31/d4c 0
2026-03-09T17:30:09.288 INFO:tasks.workunit.client.1.vm09.stdout:7/246: chown da/d11/d2d/f59 1957683 1
2026-03-09T17:30:09.289 INFO:tasks.workunit.client.1.vm09.stdout:4/223: fsync d11/d1e/d29/d36/f3d 0
2026-03-09T17:30:09.289 INFO:tasks.workunit.client.1.vm09.stdout:7/247: write da/f15 [3437665,104076] 0
2026-03-09T17:30:09.290 INFO:tasks.workunit.client.1.vm09.stdout:7/248: dread - da/d11/d41/d4e/f42 zero size
2026-03-09T17:30:09.293 INFO:tasks.workunit.client.1.vm09.stdout:6/184: dread d3/fc [0,4194304] 0
2026-03-09T17:30:09.293 INFO:tasks.workunit.client.1.vm09.stdout:7/249: truncate da/d11/d41/f35 336455 0
2026-03-09T17:30:09.294 INFO:tasks.workunit.client.1.vm09.stdout:5/192: mknod d0/d9/c40 0
2026-03-09T17:30:09.298 INFO:tasks.workunit.client.1.vm09.stdout:3/162: rename d5/d16/f2e to d5/d16/d31/f34 0
2026-03-09T17:30:09.298 INFO:tasks.workunit.client.1.vm09.stdout:3/163: chown d5/d9/l23 2733873 1
2026-03-09T17:30:09.298 INFO:tasks.workunit.client.1.vm09.stdout:3/164: chown d5/d16 22287 1
2026-03-09T17:30:09.302 INFO:tasks.workunit.client.1.vm09.stdout:3/165: dwrite d5/d6/d12/f18 [0,4194304] 0
2026-03-09T17:30:09.303 INFO:tasks.workunit.client.1.vm09.stdout:3/166: read - d5/d16/d25/f2b zero size
2026-03-09T17:30:09.330 INFO:tasks.workunit.client.1.vm09.stdout:8/203: mkdir d1/d14/d2a/d42/d43/d44 0
2026-03-09T17:30:09.330 INFO:tasks.workunit.client.1.vm09.stdout:8/204: chown l0 1569463 1
2026-03-09T17:30:09.340 INFO:tasks.workunit.client.1.vm09.stdout:7/250: mkdir da/d11/d47/d5b 0
2026-03-09T17:30:09.343 INFO:tasks.workunit.client.1.vm09.stdout:8/205: creat d1/da/dd/f45 x:0 0 0
2026-03-09T17:30:09.345 INFO:tasks.workunit.client.1.vm09.stdout:7/251: dwrite da/d11/f3f [0,4194304] 0
2026-03-09T17:30:09.350 INFO:tasks.workunit.client.1.vm09.stdout:8/206: dread d1/da/dd/f1e [0,4194304] 0
2026-03-09T17:30:09.363 INFO:tasks.workunit.client.1.vm09.stdout:7/252: rmdir da/d11/d3e 39
2026-03-09T17:30:09.363 INFO:tasks.workunit.client.1.vm09.stdout:7/253: read - da/d11/d41/d4e/f44 zero size
2026-03-09T17:30:09.364 INFO:tasks.workunit.client.1.vm09.stdout:7/254: fsync da/d11/d41/f57 0
2026-03-09T17:30:09.364 INFO:tasks.workunit.client.1.vm09.stdout:7/255: fsync da/d11/d41/f57 0
2026-03-09T17:30:09.364 INFO:tasks.workunit.client.1.vm09.stdout:8/207: creat d1/d14/d2a/d42/f46 x:0 0 0
2026-03-09T17:30:09.367 INFO:tasks.workunit.client.1.vm09.stdout:3/167: creat d5/f35 x:0 0 0
2026-03-09T17:30:09.367 INFO:tasks.workunit.client.1.vm09.stdout:3/168: fsync d5/f22 0
2026-03-09T17:30:09.368 INFO:tasks.workunit.client.1.vm09.stdout:3/169: write d5/d6/d12/f19 [4115224,21282] 0
2026-03-09T17:30:09.369 INFO:tasks.workunit.client.1.vm09.stdout:4/224: dread d11/f25 [0,4194304] 0
2026-03-09T17:30:09.370 INFO:tasks.workunit.client.1.vm09.stdout:4/225: write d11/d1e/d29/f3b [1710990,91739] 0
2026-03-09T17:30:09.371 INFO:tasks.workunit.client.0.vm06.stdout:1/934: dwrite d11/d14/d1d/f90 [0,4194304] 0
2026-03-09T17:30:09.373 INFO:tasks.workunit.client.1.vm09.stdout:8/208: mkdir d1/da/dd/d47 0
2026-03-09T17:30:09.375 INFO:tasks.workunit.client.0.vm06.stdout:1/935: dread d11/d14/d1d/d1e/d2a/d34/d58/f6a [0,4194304] 0
2026-03-09T17:30:09.388 INFO:tasks.workunit.client.1.vm09.stdout:3/170: mknod d5/d16/d25/c36 0
2026-03-09T17:30:09.388 INFO:tasks.workunit.client.1.vm09.stdout:3/171: dwrite d5/d16/d25/f28 [0,4194304] 0
2026-03-09T17:30:09.389 INFO:tasks.workunit.client.1.vm09.stdout:3/172: stat d5/d16/d25/f2c 0
2026-03-09T17:30:09.389 INFO:tasks.workunit.client.1.vm09.stdout:3/173: chown d5/d16/d25/c36 3441 1
2026-03-09T17:30:09.390 INFO:tasks.workunit.client.1.vm09.stdout:3/174: truncate d5/d16/d25/f2b 289418 0
2026-03-09T17:30:09.397 INFO:tasks.workunit.client.0.vm06.stdout:1/936: fsync d11/d14/d1d/d42/d46/fcd 0
2026-03-09T17:30:09.397 INFO:tasks.workunit.client.0.vm06.stdout:6/861: rmdir d6/d47/d8a/d108 0
2026-03-09T17:30:09.397 INFO:tasks.workunit.client.0.vm06.stdout:8/946: fsync d15/d31/dc5/df1/d71/f82 0
2026-03-09T17:30:09.399 INFO:tasks.workunit.client.0.vm06.stdout:8/947: chown d15/d16/d1e/d30/db8/d5e/f98 3642 1
2026-03-09T17:30:09.400 INFO:tasks.workunit.client.1.vm09.stdout:8/209: mknod d1/da/d23/d34/c48 0
2026-03-09T17:30:09.409 INFO:tasks.workunit.client.1.vm09.stdout:4/226: rename d11/f2d to d11/f4d 0
2026-03-09T17:30:09.410 INFO:tasks.workunit.client.1.vm09.stdout:4/227: fdatasync d11/f16 0
2026-03-09T17:30:09.410 INFO:tasks.workunit.client.1.vm09.stdout:8/210: mkdir d1/d14/d2a/d49 0
2026-03-09T17:30:09.410 INFO:tasks.workunit.client.0.vm06.stdout:8/948:
fsync d15/d39/d67/d77/d99/f115 0 2026-03-09T17:30:09.411 INFO:tasks.workunit.client.1.vm09.stdout:4/228: write d11/f12 [87948,93839] 0 2026-03-09T17:30:09.416 INFO:tasks.workunit.client.1.vm09.stdout:3/175: getdents d5/d9 0 2026-03-09T17:30:09.416 INFO:tasks.workunit.client.0.vm06.stdout:1/937: creat d11/d14/d1d/d42/d46/d92/dc0/d57/de4/f139 x:0 0 0 2026-03-09T17:30:09.418 INFO:tasks.workunit.client.1.vm09.stdout:3/176: dread d5/d16/f17 [0,4194304] 0 2026-03-09T17:30:09.424 INFO:tasks.workunit.client.1.vm09.stdout:8/211: dread d1/f33 [0,4194304] 0 2026-03-09T17:30:09.429 INFO:tasks.workunit.client.0.vm06.stdout:6/862: dread d6/d12/d17/d65/f72 [0,4194304] 0 2026-03-09T17:30:09.429 INFO:tasks.workunit.client.1.vm09.stdout:2/157: write d13/f14 [212324,62545] 0 2026-03-09T17:30:09.432 INFO:tasks.workunit.client.1.vm09.stdout:3/177: mkdir d5/d16/d31/d37 0 2026-03-09T17:30:09.432 INFO:tasks.workunit.client.0.vm06.stdout:6/863: rmdir d6/d47/d96 39 2026-03-09T17:30:09.432 INFO:tasks.workunit.client.1.vm09.stdout:3/178: chown d5/d9 363177 1 2026-03-09T17:30:09.433 INFO:tasks.workunit.client.1.vm09.stdout:3/179: write d5/d6/d32/f33 [973526,33698] 0 2026-03-09T17:30:09.435 INFO:tasks.workunit.client.0.vm06.stdout:6/864: truncate d6/d4f/d3e/f62 1233366 0 2026-03-09T17:30:09.436 INFO:tasks.workunit.client.1.vm09.stdout:3/180: dwrite d5/f35 [0,4194304] 0 2026-03-09T17:30:09.437 INFO:tasks.workunit.client.1.vm09.stdout:8/212: symlink d1/d14/d31/l4a 0 2026-03-09T17:30:09.438 INFO:tasks.workunit.client.0.vm06.stdout:6/865: symlink d6/d12/d17/d65/l10a 0 2026-03-09T17:30:09.439 INFO:tasks.workunit.client.1.vm09.stdout:2/158: chown d13/c1b 136769838 1 2026-03-09T17:30:09.440 INFO:tasks.workunit.client.0.vm06.stdout:6/866: creat d6/d47/d4d/d6d/f10b x:0 0 0 2026-03-09T17:30:09.440 INFO:tasks.workunit.client.0.vm06.stdout:6/867: chown d6/d12/d17/d65/l10a 1231034 1 2026-03-09T17:30:09.441 INFO:tasks.workunit.client.1.vm09.stdout:2/159: dread f9 [0,4194304] 0 2026-03-09T17:30:09.449 
INFO:tasks.workunit.client.1.vm09.stdout:3/181: fsync d5/d6/fe 0 2026-03-09T17:30:09.449 INFO:tasks.workunit.client.1.vm09.stdout:3/182: stat d5/d9/d30 0 2026-03-09T17:30:09.452 INFO:tasks.workunit.client.1.vm09.stdout:3/183: dwrite d5/f22 [0,4194304] 0 2026-03-09T17:30:09.453 INFO:tasks.workunit.client.1.vm09.stdout:4/229: rmdir d11/d1e/d31/d4c 0 2026-03-09T17:30:09.454 INFO:tasks.workunit.client.1.vm09.stdout:3/184: write d5/d9/fc [1147457,102441] 0 2026-03-09T17:30:09.455 INFO:tasks.workunit.client.1.vm09.stdout:3/185: fsync d5/d6/fb 0 2026-03-09T17:30:09.455 INFO:tasks.workunit.client.1.vm09.stdout:3/186: chown d5/d9/l23 1096098 1 2026-03-09T17:30:09.465 INFO:tasks.workunit.client.1.vm09.stdout:2/160: creat d13/d15/d21/f35 x:0 0 0 2026-03-09T17:30:09.468 INFO:tasks.workunit.client.1.vm09.stdout:2/161: dwrite f6 [0,4194304] 0 2026-03-09T17:30:09.469 INFO:tasks.workunit.client.1.vm09.stdout:2/162: chown d13/d15/d21 519259669 1 2026-03-09T17:30:09.471 INFO:tasks.workunit.client.0.vm06.stdout:6/868: getdents d6/d47/d4d/d9a/da2/db1 0 2026-03-09T17:30:09.472 INFO:tasks.workunit.client.0.vm06.stdout:6/869: readlink d6/d12/d2d/db3/le7 0 2026-03-09T17:30:09.481 INFO:tasks.workunit.client.0.vm06.stdout:6/870: dwrite d6/d47/d4d/ff5 [0,4194304] 0 2026-03-09T17:30:09.483 INFO:tasks.workunit.client.0.vm06.stdout:6/871: chown d6/d4f/f25 478 1 2026-03-09T17:30:09.483 INFO:tasks.workunit.client.0.vm06.stdout:6/872: dread - d6/d12/d53/dd0/f100 zero size 2026-03-09T17:30:09.484 INFO:tasks.workunit.client.0.vm06.stdout:6/873: chown d6/d4f/d3e/ce9 92 1 2026-03-09T17:30:09.485 INFO:tasks.workunit.client.0.vm06.stdout:2/840: dwrite d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/ff4 [0,4194304] 0 2026-03-09T17:30:09.486 INFO:tasks.workunit.client.0.vm06.stdout:2/841: chown d3/d4/d12/d2b/d2d/lc4 0 1 2026-03-09T17:30:09.498 INFO:tasks.workunit.client.1.vm09.stdout:4/230: rename d11/d1e/d29/l37 to d11/d1e/d29/l4e 0 2026-03-09T17:30:09.501 INFO:tasks.workunit.client.0.vm06.stdout:6/874: mkdir 
d6/d12/d53/d8f/d10c 0 2026-03-09T17:30:09.507 INFO:tasks.workunit.client.1.vm09.stdout:4/231: dread d11/d1e/d29/f32 [0,4194304] 0 2026-03-09T17:30:09.510 INFO:tasks.workunit.client.0.vm06.stdout:2/842: rename d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/fe4 to d3/d4/d12/da7/de3/f111 0 2026-03-09T17:30:09.514 INFO:tasks.workunit.client.0.vm06.stdout:6/875: creat d6/d12/d53/dd0/f10d x:0 0 0 2026-03-09T17:30:09.518 INFO:tasks.workunit.client.0.vm06.stdout:2/843: truncate d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab 4678761 0 2026-03-09T17:30:09.522 INFO:tasks.workunit.client.1.vm09.stdout:2/163: unlink f6 0 2026-03-09T17:30:09.523 INFO:tasks.workunit.client.0.vm06.stdout:6/876: mknod d6/d47/d4d/d9a/da2/c10e 0 2026-03-09T17:30:09.533 INFO:tasks.workunit.client.1.vm09.stdout:9/190: rmdir d5 39 2026-03-09T17:30:09.538 INFO:tasks.workunit.client.0.vm06.stdout:4/999: dwrite db/d59/d90/ff4 [0,4194304] 0 2026-03-09T17:30:09.542 INFO:tasks.workunit.client.0.vm06.stdout:5/942: dwrite d4/d52/f8a [0,4194304] 0 2026-03-09T17:30:09.551 INFO:tasks.workunit.client.0.vm06.stdout:2/844: mkdir d3/d4/d12/d2b/d9f/d112 0 2026-03-09T17:30:09.551 INFO:tasks.workunit.client.0.vm06.stdout:6/877: creat d6/d47/d4d/d9a/da2/db1/f10f x:0 0 0 2026-03-09T17:30:09.554 INFO:tasks.workunit.client.0.vm06.stdout:2/845: dwrite d3/fc7 [0,4194304] 0 2026-03-09T17:30:09.556 INFO:tasks.workunit.client.0.vm06.stdout:2/846: dread d3/d4/d12/d71/daa/d77/d81/d64/d6a/f6d [0,4194304] 0 2026-03-09T17:30:09.561 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:09 vm06.local ceph-mon[57307]: mgrmap e23: vm09.lqzvkh(active, since 2s) 2026-03-09T17:30:09.563 INFO:tasks.workunit.client.1.vm09.stdout:3/187: mkdir d5/d38 0 2026-03-09T17:30:09.564 INFO:tasks.workunit.client.1.vm09.stdout:3/188: read d5/d16/d25/f28 [3563137,130568] 0 2026-03-09T17:30:09.564 INFO:tasks.workunit.client.1.vm09.stdout:3/189: read d5/d6/fe [3874012,127068] 0 2026-03-09T17:30:09.569 INFO:tasks.workunit.client.1.vm09.stdout:3/190: dwrite d5/d6/d32/f33 
[0,4194304] 0 2026-03-09T17:30:09.574 INFO:tasks.workunit.client.1.vm09.stdout:4/232: dread d11/d1e/f22 [0,4194304] 0 2026-03-09T17:30:09.579 INFO:tasks.workunit.client.1.vm09.stdout:4/233: dwrite d11/d1e/d29/f2e [0,4194304] 0 2026-03-09T17:30:09.583 INFO:tasks.workunit.client.0.vm06.stdout:2/847: fdatasync d3/d4/d12/d2b/d2d/fcd 0 2026-03-09T17:30:09.583 INFO:tasks.workunit.client.1.vm09.stdout:4/234: dread - d11/f46 zero size 2026-03-09T17:30:09.586 INFO:tasks.workunit.client.1.vm09.stdout:9/191: read - d5/de/f2d zero size 2026-03-09T17:30:09.587 INFO:tasks.workunit.client.1.vm09.stdout:9/192: chown d5/f34 15075751 1 2026-03-09T17:30:09.587 INFO:tasks.workunit.client.0.vm06.stdout:2/848: creat d3/d4/d12/d2b/d36/dd4/f113 x:0 0 0 2026-03-09T17:30:09.589 INFO:tasks.workunit.client.0.vm06.stdout:2/849: fsync d3/d4/d12/d2b/fb6 0 2026-03-09T17:30:09.597 INFO:tasks.workunit.client.1.vm09.stdout:7/256: read da/fb [934940,90730] 0 2026-03-09T17:30:09.601 INFO:tasks.workunit.client.1.vm09.stdout:0/199: chown d6/d1d/d24/l38 513787062 1 2026-03-09T17:30:09.605 INFO:tasks.workunit.client.1.vm09.stdout:1/203: dwrite d9/dc/dd/fe [0,4194304] 0 2026-03-09T17:30:09.606 INFO:tasks.workunit.client.1.vm09.stdout:0/200: dread d6/f21 [0,4194304] 0 2026-03-09T17:30:09.606 INFO:tasks.workunit.client.1.vm09.stdout:3/191: creat d5/d6/d12/f39 x:0 0 0 2026-03-09T17:30:09.610 INFO:tasks.workunit.client.1.vm09.stdout:2/164: fdatasync d13/d15/f18 0 2026-03-09T17:30:09.610 INFO:tasks.workunit.client.1.vm09.stdout:2/165: chown d13/d15/l19 5 1 2026-03-09T17:30:09.617 INFO:tasks.workunit.client.0.vm06.stdout:5/943: dread d4/d50/d18/f48 [4194304,4194304] 0 2026-03-09T17:30:09.617 INFO:tasks.workunit.client.1.vm09.stdout:4/235: symlink d11/d1e/d29/l4f 0 2026-03-09T17:30:09.621 INFO:tasks.workunit.client.0.vm06.stdout:5/944: write d4/d50/d18/f101 [2385949,87185] 0 2026-03-09T17:30:09.625 INFO:tasks.workunit.client.0.vm06.stdout:5/945: fdatasync d4/d50/d35/d40/d6f/fed 0 2026-03-09T17:30:09.628 
INFO:tasks.workunit.client.1.vm09.stdout:5/193: dwrite d0/ff [0,4194304] 0 2026-03-09T17:30:09.628 INFO:tasks.workunit.client.0.vm06.stdout:2/850: getdents d3 0 2026-03-09T17:30:09.628 INFO:tasks.workunit.client.0.vm06.stdout:2/851: write d3/d4/f52 [3061797,129628] 0 2026-03-09T17:30:09.629 INFO:tasks.workunit.client.1.vm09.stdout:5/194: truncate d0/d2/d15/d20/f25 921226 0 2026-03-09T17:30:09.631 INFO:tasks.workunit.client.1.vm09.stdout:1/204: creat d9/dc/dd/d40/d22/d37/f41 x:0 0 0 2026-03-09T17:30:09.633 INFO:tasks.workunit.client.1.vm09.stdout:0/201: fsync d6/de/f2e 0 2026-03-09T17:30:09.634 INFO:tasks.workunit.client.1.vm09.stdout:6/185: truncate d3/d1e/f20 345759 0 2026-03-09T17:30:09.640 INFO:tasks.workunit.client.1.vm09.stdout:4/236: creat d11/d1e/d29/f50 x:0 0 0 2026-03-09T17:30:09.642 INFO:tasks.workunit.client.1.vm09.stdout:7/257: symlink da/d11/d47/d5b/l5c 0 2026-03-09T17:30:09.642 INFO:tasks.workunit.client.1.vm09.stdout:4/237: rename d11/d1e/d29/d36 to d11/d1e/d29/d36/d51 22 2026-03-09T17:30:09.642 INFO:tasks.workunit.client.1.vm09.stdout:5/195: read d0/d2/f31 [1950843,24101] 0 2026-03-09T17:30:09.643 INFO:tasks.workunit.client.1.vm09.stdout:0/202: unlink d6/f1b 0 2026-03-09T17:30:09.648 INFO:tasks.workunit.client.1.vm09.stdout:7/258: creat da/d11/d2d/d49/f5d x:0 0 0 2026-03-09T17:30:09.649 INFO:tasks.workunit.client.1.vm09.stdout:1/205: mkdir d9/dc/dd/d40/d22/d37/d3f/d42 0 2026-03-09T17:30:09.650 INFO:tasks.workunit.client.1.vm09.stdout:0/203: dread d6/f7 [0,4194304] 0 2026-03-09T17:30:09.650 INFO:tasks.workunit.client.1.vm09.stdout:7/259: write da/d11/f1a [3222189,29139] 0 2026-03-09T17:30:09.651 INFO:tasks.workunit.client.1.vm09.stdout:5/196: chown d0/dc/ld 1286 1 2026-03-09T17:30:09.651 INFO:tasks.workunit.client.0.vm06.stdout:8/949: dwrite d15/d39/d67/d77/d97/fad [0,4194304] 0 2026-03-09T17:30:09.655 INFO:tasks.workunit.client.0.vm06.stdout:1/938: dwrite d11/d14/fa6 [4194304,4194304] 0 2026-03-09T17:30:09.656 
INFO:tasks.workunit.client.1.vm09.stdout:4/238: dwrite d11/f18 [0,4194304] 0 2026-03-09T17:30:09.658 INFO:tasks.workunit.client.1.vm09.stdout:4/239: read d11/f25 [460975,23822] 0 2026-03-09T17:30:09.663 INFO:tasks.workunit.client.1.vm09.stdout:7/260: mknod da/d11/d47/c5e 0 2026-03-09T17:30:09.670 INFO:tasks.workunit.client.1.vm09.stdout:6/186: sync 2026-03-09T17:30:09.670 INFO:tasks.workunit.client.0.vm06.stdout:8/950: symlink d15/d39/d67/d86/l136 0 2026-03-09T17:30:09.671 INFO:tasks.workunit.client.0.vm06.stdout:8/951: chown d15/d16/d1e/d30/db8/da4 144174 1 2026-03-09T17:30:09.672 INFO:tasks.workunit.client.0.vm06.stdout:8/952: read d15/d39/d67/d77/fa0 [782195,93848] 0 2026-03-09T17:30:09.673 INFO:tasks.workunit.client.0.vm06.stdout:8/953: chown d15/d16/d1e/d30/db8/d5e/fb9 7433100 1 2026-03-09T17:30:09.673 INFO:tasks.workunit.client.1.vm09.stdout:6/187: dwrite f2 [0,4194304] 0 2026-03-09T17:30:09.675 INFO:tasks.workunit.client.1.vm09.stdout:0/204: rename d6/de to d6/d1d/d39 0 2026-03-09T17:30:09.675 INFO:tasks.workunit.client.1.vm09.stdout:0/205: truncate d6/d1d/f30 1838285 0 2026-03-09T17:30:09.676 INFO:tasks.workunit.client.1.vm09.stdout:5/197: mknod d0/d9/d16/c41 0 2026-03-09T17:30:09.676 INFO:tasks.workunit.client.0.vm06.stdout:1/939: rmdir d11/d14/d1d/d1e 39 2026-03-09T17:30:09.677 INFO:tasks.workunit.client.1.vm09.stdout:8/213: write d1/da/d13/f1d [796990,20863] 0 2026-03-09T17:30:09.679 INFO:tasks.workunit.client.1.vm09.stdout:8/214: write d1/da/d13/f36 [572594,10127] 0 2026-03-09T17:30:09.680 INFO:tasks.workunit.client.1.vm09.stdout:1/206: symlink d9/d3a/l43 0 2026-03-09T17:30:09.681 INFO:tasks.workunit.client.1.vm09.stdout:4/240: mknod d11/d1e/d29/d36/c52 0 2026-03-09T17:30:09.682 INFO:tasks.workunit.client.0.vm06.stdout:1/940: mkdir d11/d14/d1d/dd1/de2/d13a 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.0.vm06.stdout:1/941: fdatasync d11/d69/fad 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.0.vm06.stdout:1/942: truncate 
d11/d14/d1d/d4a/df7/d106/d112/d114/f11c 451429 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.1.vm09.stdout:6/188: dwrite d3/d21/f28 [0,4194304] 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.1.vm09.stdout:7/261: mkdir da/d11/d41/d4e/d5f 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.1.vm09.stdout:5/198: dwrite d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.1.vm09.stdout:7/262: write da/d11/d2d/f45 [1348656,76073] 0 2026-03-09T17:30:09.690 INFO:tasks.workunit.client.1.vm09.stdout:5/199: dread d0/d2/f2a [4194304,4194304] 0 2026-03-09T17:30:09.700 INFO:tasks.workunit.client.1.vm09.stdout:1/207: mknod d9/dc/dd/d40/c44 0 2026-03-09T17:30:09.700 INFO:tasks.workunit.client.1.vm09.stdout:1/208: write f3 [1422699,22833] 0 2026-03-09T17:30:09.701 INFO:tasks.workunit.client.1.vm09.stdout:4/241: truncate d11/f1c 4929477 0 2026-03-09T17:30:09.704 INFO:tasks.workunit.client.1.vm09.stdout:4/242: dread d11/f12 [0,4194304] 0 2026-03-09T17:30:09.705 INFO:tasks.workunit.client.1.vm09.stdout:0/206: creat d6/d2a/f3a x:0 0 0 2026-03-09T17:30:09.708 INFO:tasks.workunit.client.1.vm09.stdout:6/189: mknod d3/d1e/d30/d32/c36 0 2026-03-09T17:30:09.708 INFO:tasks.workunit.client.1.vm09.stdout:4/243: creat d11/d1e/d30/f53 x:0 0 0 2026-03-09T17:30:09.708 INFO:tasks.workunit.client.1.vm09.stdout:7/263: creat da/d11/d3e/f60 x:0 0 0 2026-03-09T17:30:09.709 INFO:tasks.workunit.client.1.vm09.stdout:8/215: creat d1/da/f4b x:0 0 0 2026-03-09T17:30:09.710 INFO:tasks.workunit.client.1.vm09.stdout:8/216: chown d1/d14/d2a/d42/d43/d44 217191 1 2026-03-09T17:30:09.715 INFO:tasks.workunit.client.1.vm09.stdout:1/209: creat d9/dc/dd/d40/d22/d37/d3f/d42/f45 x:0 0 0 2026-03-09T17:30:09.718 INFO:tasks.workunit.client.1.vm09.stdout:1/210: dwrite d9/dc/dd/fe [4194304,4194304] 0 2026-03-09T17:30:09.721 INFO:tasks.workunit.client.1.vm09.stdout:4/244: symlink d11/d1e/d30/l54 0 2026-03-09T17:30:09.722 INFO:tasks.workunit.client.1.vm09.stdout:7/264: rename da/f5a to 
da/d11/d3e/f61 0 2026-03-09T17:30:09.726 INFO:tasks.workunit.client.1.vm09.stdout:1/211: dwrite d9/f11 [0,4194304] 0 2026-03-09T17:30:09.726 INFO:tasks.workunit.client.1.vm09.stdout:6/190: mkdir d3/d7/d37 0 2026-03-09T17:30:09.727 INFO:tasks.workunit.client.1.vm09.stdout:4/245: mknod d11/d1e/d30/d35/c55 0 2026-03-09T17:30:09.728 INFO:tasks.workunit.client.1.vm09.stdout:8/217: mkdir d1/da/dd/d47/d4c 0 2026-03-09T17:30:09.728 INFO:tasks.workunit.client.1.vm09.stdout:7/265: symlink da/d11/d2d/d56/l62 0 2026-03-09T17:30:09.729 INFO:tasks.workunit.client.1.vm09.stdout:4/246: chown d11/d1e/d29/d36/f40 5947 1 2026-03-09T17:30:09.729 INFO:tasks.workunit.client.1.vm09.stdout:1/212: chown d9/dc/f3d 0 1 2026-03-09T17:30:09.732 INFO:tasks.workunit.client.1.vm09.stdout:8/218: mknod d1/d14/d2a/d49/c4d 0 2026-03-09T17:30:09.735 INFO:tasks.workunit.client.0.vm06.stdout:6/878: write d6/d4f/d3e/d52/fd2 [147312,76140] 0 2026-03-09T17:30:09.737 INFO:tasks.workunit.client.0.vm06.stdout:6/879: read d6/d47/d4d/f50 [369539,112211] 0 2026-03-09T17:30:09.737 INFO:tasks.workunit.client.0.vm06.stdout:6/880: chown d6/d4f/f33 0 1 2026-03-09T17:30:09.739 INFO:tasks.workunit.client.1.vm09.stdout:6/191: link d3/d1e/d30/d32/c36 d3/d1e/c38 0 2026-03-09T17:30:09.740 INFO:tasks.workunit.client.1.vm09.stdout:6/192: dread - d3/d7/f11 zero size 2026-03-09T17:30:09.743 INFO:tasks.workunit.client.0.vm06.stdout:6/881: creat d6/d47/d96/f110 x:0 0 0 2026-03-09T17:30:09.743 INFO:tasks.workunit.client.1.vm09.stdout:4/247: sync 2026-03-09T17:30:09.746 INFO:tasks.workunit.client.0.vm06.stdout:6/882: rename d6/d12/c1f to d6/d47/d4d/c111 0 2026-03-09T17:30:09.747 INFO:tasks.workunit.client.0.vm06.stdout:6/883: fdatasync d6/d12/d53/dd0/f106 0 2026-03-09T17:30:09.751 INFO:tasks.workunit.client.1.vm09.stdout:6/193: creat d3/d1e/d30/f39 x:0 0 0 2026-03-09T17:30:09.751 INFO:tasks.workunit.client.1.vm09.stdout:4/248: mknod d11/d1e/d29/d36/c56 0 2026-03-09T17:30:09.752 INFO:tasks.workunit.client.1.vm09.stdout:4/249: chown 
d11/d1e/f28 196 1 2026-03-09T17:30:09.754 INFO:tasks.workunit.client.1.vm09.stdout:6/194: write d3/fc [1065163,29492] 0 2026-03-09T17:30:09.759 INFO:tasks.workunit.client.1.vm09.stdout:6/195: symlink d3/d21/d25/d26/d34/l3a 0 2026-03-09T17:30:09.763 INFO:tasks.workunit.client.1.vm09.stdout:6/196: mknod d3/d1e/d30/c3b 0 2026-03-09T17:30:09.763 INFO:tasks.workunit.client.1.vm09.stdout:6/197: truncate d3/d7/f24 311429 0 2026-03-09T17:30:09.763 INFO:tasks.workunit.client.1.vm09.stdout:6/198: creat d3/d21/f3c x:0 0 0 2026-03-09T17:30:09.764 INFO:tasks.workunit.client.1.vm09.stdout:4/250: sync 2026-03-09T17:30:09.771 INFO:tasks.workunit.client.1.vm09.stdout:4/251: dwrite d11/f25 [0,4194304] 0 2026-03-09T17:30:09.772 INFO:tasks.workunit.client.1.vm09.stdout:4/252: fsync d11/f33 0 2026-03-09T17:30:09.775 INFO:tasks.workunit.client.1.vm09.stdout:8/219: read d1/da/d13/f36 [2464,64811] 0 2026-03-09T17:30:09.777 INFO:tasks.workunit.client.1.vm09.stdout:4/253: mkdir d11/d1e/d29/d36/d57 0 2026-03-09T17:30:09.777 INFO:tasks.workunit.client.1.vm09.stdout:4/254: chown d11/d1e/d29/l4f 2 1 2026-03-09T17:30:09.777 INFO:tasks.workunit.client.1.vm09.stdout:8/220: write d1/f3 [2100926,129649] 0 2026-03-09T17:30:09.779 INFO:tasks.workunit.client.1.vm09.stdout:8/221: truncate d1/da/f4b 1047588 0 2026-03-09T17:30:09.781 INFO:tasks.workunit.client.1.vm09.stdout:4/255: dwrite d11/f16 [0,4194304] 0 2026-03-09T17:30:09.786 INFO:tasks.workunit.client.1.vm09.stdout:8/222: dwrite d1/da/f12 [0,4194304] 0 2026-03-09T17:30:09.793 INFO:tasks.workunit.client.1.vm09.stdout:4/256: mkdir d11/d1e/d30/d58 0 2026-03-09T17:30:09.797 INFO:tasks.workunit.client.1.vm09.stdout:4/257: dwrite fd [0,4194304] 0 2026-03-09T17:30:09.798 INFO:tasks.workunit.client.1.vm09.stdout:4/258: chown d11/f3f 2711 1 2026-03-09T17:30:09.799 INFO:tasks.workunit.client.1.vm09.stdout:8/223: sync 2026-03-09T17:30:09.885 INFO:tasks.workunit.client.1.vm09.stdout:7/266: dread da/d11/d2d/f45 [0,4194304] 0 2026-03-09T17:30:09.888 
INFO:tasks.workunit.client.1.vm09.stdout:7/267: dwrite da/d11/d2d/d49/f5d [0,4194304] 0 2026-03-09T17:30:09.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:09 vm09.local ceph-mon[62061]: mgrmap e23: vm09.lqzvkh(active, since 2s) 2026-03-09T17:30:09.898 INFO:tasks.workunit.client.1.vm09.stdout:7/268: creat da/d11/d41/d4e/f63 x:0 0 0 2026-03-09T17:30:09.907 INFO:tasks.workunit.client.1.vm09.stdout:7/269: chown da/f36 982783 1 2026-03-09T17:30:09.907 INFO:tasks.workunit.client.1.vm09.stdout:7/270: mkdir da/d11/d64 0 2026-03-09T17:30:09.907 INFO:tasks.workunit.client.1.vm09.stdout:7/271: truncate da/d11/d2d/f32 587879 0 2026-03-09T17:30:09.907 INFO:tasks.workunit.client.1.vm09.stdout:7/272: chown da/d11/d41/d4e/l37 17492484 1 2026-03-09T17:30:09.907 INFO:tasks.workunit.client.1.vm09.stdout:7/273: mknod da/d11/d2d/c65 0 2026-03-09T17:30:09.911 INFO:tasks.workunit.client.1.vm09.stdout:7/274: creat da/d11/d41/d4e/d4c/f66 x:0 0 0 2026-03-09T17:30:09.914 INFO:tasks.workunit.client.1.vm09.stdout:2/166: write d13/d15/f18 [320706,88728] 0 2026-03-09T17:30:09.914 INFO:tasks.workunit.client.1.vm09.stdout:9/193: write d5/f34 [1182708,27903] 0 2026-03-09T17:30:09.914 INFO:tasks.workunit.client.1.vm09.stdout:3/192: rmdir d5/d6 39 2026-03-09T17:30:09.914 INFO:tasks.workunit.client.1.vm09.stdout:9/194: readlink d5/de/d29/l4c 0 2026-03-09T17:30:09.916 INFO:tasks.workunit.client.1.vm09.stdout:7/275: rename da/d11/f1e to da/d11/d41/d4e/d4c/f67 0 2026-03-09T17:30:09.917 INFO:tasks.workunit.client.1.vm09.stdout:7/276: write da/d11/f25 [3794455,111904] 0 2026-03-09T17:30:09.917 INFO:tasks.workunit.client.1.vm09.stdout:7/277: dread - da/d11/d3e/f60 zero size 2026-03-09T17:30:09.925 INFO:tasks.workunit.client.1.vm09.stdout:9/195: fdatasync d5/f1e 0 2026-03-09T17:30:09.927 INFO:tasks.workunit.client.1.vm09.stdout:3/193: rmdir d5/d38 0 2026-03-09T17:30:09.934 INFO:tasks.workunit.client.1.vm09.stdout:3/194: rmdir d5/d6 39 2026-03-09T17:30:09.938 
INFO:tasks.workunit.client.1.vm09.stdout:3/195: dread d5/d6/fe [0,4194304] 0 2026-03-09T17:30:09.940 INFO:tasks.workunit.client.1.vm09.stdout:1/213: fdatasync f3 0 2026-03-09T17:30:09.941 INFO:tasks.workunit.client.1.vm09.stdout:3/196: dread d5/d6/fe [0,4194304] 0 2026-03-09T17:30:09.945 INFO:tasks.workunit.client.1.vm09.stdout:1/214: mkdir d9/dc/dd/d40/d22/d37/d3f/d42/d46 0 2026-03-09T17:30:09.945 INFO:tasks.workunit.client.1.vm09.stdout:1/215: dread - d9/dc/dd/d40/d22/d37/d3f/d42/f45 zero size 2026-03-09T17:30:09.947 INFO:tasks.workunit.client.1.vm09.stdout:1/216: write d9/dc/dd/d40/d21/f33 [308382,127945] 0 2026-03-09T17:30:09.953 INFO:tasks.workunit.client.1.vm09.stdout:1/217: dwrite d9/f34 [0,4194304] 0 2026-03-09T17:30:09.954 INFO:tasks.workunit.client.1.vm09.stdout:1/218: chown d9/d3a/l43 257945725 1 2026-03-09T17:30:09.960 INFO:tasks.workunit.client.1.vm09.stdout:1/219: creat d9/dc/f47 x:0 0 0 2026-03-09T17:30:09.961 INFO:tasks.workunit.client.1.vm09.stdout:1/220: dread d9/dc/dd/f28 [0,4194304] 0 2026-03-09T17:30:09.962 INFO:tasks.workunit.client.1.vm09.stdout:1/221: write d9/dc/dd/d40/d1d/f1e [2227198,23496] 0 2026-03-09T17:30:09.966 INFO:tasks.workunit.client.1.vm09.stdout:1/222: mknod d9/d3a/c48 0 2026-03-09T17:30:09.974 INFO:tasks.workunit.client.0.vm06.stdout:5/946: dwrite d4/d22/d46/f78 [0,4194304] 0 2026-03-09T17:30:09.982 INFO:tasks.workunit.client.0.vm06.stdout:5/947: creat d4/d50/d35/d40/f154 x:0 0 0 2026-03-09T17:30:09.982 INFO:tasks.workunit.client.0.vm06.stdout:5/948: readlink d4/d50/d35/d40/d6f/lf6 0 2026-03-09T17:30:09.987 INFO:tasks.workunit.client.0.vm06.stdout:5/949: unlink d4/d50/d18/f140 0 2026-03-09T17:30:09.993 INFO:tasks.workunit.client.0.vm06.stdout:5/950: creat d4/d50/d35/d40/d95/db8/dda/f155 x:0 0 0 2026-03-09T17:30:09.996 INFO:tasks.workunit.client.1.vm09.stdout:2/167: dread d13/f14 [0,4194304] 0 2026-03-09T17:30:09.997 INFO:tasks.workunit.client.1.vm09.stdout:0/207: getdents d6/d1d 0 2026-03-09T17:30:09.999 
INFO:tasks.workunit.client.0.vm06.stdout:2/852: dread d3/f91 [0,4194304] 0 2026-03-09T17:30:10.004 INFO:tasks.workunit.client.0.vm06.stdout:8/954: write d15/d39/d67/d77/d97/dac/dcb/f116 [1023118,39098] 0 2026-03-09T17:30:10.004 INFO:tasks.workunit.client.0.vm06.stdout:8/955: stat d15/d16/d1e/d30/fdb 0 2026-03-09T17:30:10.006 INFO:tasks.workunit.client.0.vm06.stdout:8/956: read d15/d39/d3c/d6c/fbf [556247,3768] 0 2026-03-09T17:30:10.012 INFO:tasks.workunit.client.0.vm06.stdout:1/943: dwrite d11/d14/d1d/d1e/d2a/fba [0,4194304] 0 2026-03-09T17:30:10.022 INFO:tasks.workunit.client.1.vm09.stdout:5/200: write d0/d2/f31 [5044887,31808] 0 2026-03-09T17:30:10.025 INFO:tasks.workunit.client.0.vm06.stdout:5/951: mknod d4/d52/d55/d13e/d127/c156 0 2026-03-09T17:30:10.026 INFO:tasks.workunit.client.1.vm09.stdout:2/168: dread - d13/f26 zero size 2026-03-09T17:30:10.029 INFO:tasks.workunit.client.1.vm09.stdout:6/199: rmdir d3/d7 39 2026-03-09T17:30:10.031 INFO:tasks.workunit.client.1.vm09.stdout:2/169: mkdir d13/d15/d36 0 2026-03-09T17:30:10.033 INFO:tasks.workunit.client.1.vm09.stdout:0/208: symlink d6/l3b 0 2026-03-09T17:30:10.034 INFO:tasks.workunit.client.0.vm06.stdout:8/957: creat d15/d39/d67/d77/f137 x:0 0 0 2026-03-09T17:30:10.037 INFO:tasks.workunit.client.0.vm06.stdout:1/944: truncate d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/fef 2151209 0 2026-03-09T17:30:10.038 INFO:tasks.workunit.client.0.vm06.stdout:1/945: chown d11/d14/d1d/d4a/fa7 22 1 2026-03-09T17:30:10.038 INFO:tasks.workunit.client.1.vm09.stdout:6/200: dwrite d3/d7/f18 [0,4194304] 0 2026-03-09T17:30:10.039 INFO:tasks.workunit.client.1.vm09.stdout:6/201: write d3/d1e/d30/f39 [826103,51083] 0 2026-03-09T17:30:10.041 INFO:tasks.workunit.client.1.vm09.stdout:2/170: fdatasync fd 0 2026-03-09T17:30:10.043 INFO:tasks.workunit.client.0.vm06.stdout:5/952: symlink d4/d22/dbe/dfb/l157 0 2026-03-09T17:30:10.044 INFO:tasks.workunit.client.1.vm09.stdout:5/201: fsync d0/d2/f31 0 2026-03-09T17:30:10.047 
INFO:tasks.workunit.client.0.vm06.stdout:5/953: dwrite d4/dca/fff [0,4194304] 0 2026-03-09T17:30:10.053 INFO:tasks.workunit.client.1.vm09.stdout:0/209: creat d6/d1d/f3c x:0 0 0 2026-03-09T17:30:10.056 INFO:tasks.workunit.client.0.vm06.stdout:6/884: dwrite d6/d47/d96/d40/f67 [4194304,4194304] 0 2026-03-09T17:30:10.057 INFO:tasks.workunit.client.0.vm06.stdout:2/853: truncate d3/d4/d12/f2e 483788 0 2026-03-09T17:30:10.060 INFO:tasks.workunit.client.0.vm06.stdout:2/854: read d3/d4/d22/d72/d8f/f95 [2465938,124674] 0 2026-03-09T17:30:10.063 INFO:tasks.workunit.client.1.vm09.stdout:4/259: rmdir d11/d1e/d29 39 2026-03-09T17:30:10.073 INFO:tasks.workunit.client.1.vm09.stdout:6/202: mknod d3/d1e/c3d 0 2026-03-09T17:30:10.073 INFO:tasks.workunit.client.1.vm09.stdout:8/224: write d1/d14/f2f [1314649,67134] 0 2026-03-09T17:30:10.073 INFO:tasks.workunit.client.1.vm09.stdout:0/210: symlink d6/d1d/d39/l3d 0 2026-03-09T17:30:10.073 INFO:tasks.workunit.client.1.vm09.stdout:4/260: truncate d11/d1e/d29/f50 636625 0 2026-03-09T17:30:10.073 INFO:tasks.workunit.client.1.vm09.stdout:7/278: truncate da/d11/d41/f30 39677 0 2026-03-09T17:30:10.074 INFO:tasks.workunit.client.1.vm09.stdout:7/279: chown da/d11/d2d 218674 1 2026-03-09T17:30:10.075 INFO:tasks.workunit.client.1.vm09.stdout:3/197: getdents d5 0 2026-03-09T17:30:10.077 INFO:tasks.workunit.client.0.vm06.stdout:2/855: rename d3/d4/d12/f35 to d3/d4/d12/d71/daa/d10b/f114 0 2026-03-09T17:30:10.078 INFO:tasks.workunit.client.0.vm06.stdout:2/856: write d3/d4/d46/da5/f104 [74437,95774] 0 2026-03-09T17:30:10.080 INFO:tasks.workunit.client.1.vm09.stdout:9/196: truncate d5/f47 2928252 0 2026-03-09T17:30:10.081 INFO:tasks.workunit.client.1.vm09.stdout:5/202: mkdir d0/d9/d16/d3c/d42 0 2026-03-09T17:30:10.082 INFO:tasks.workunit.client.1.vm09.stdout:1/223: rmdir d9/dc 39 2026-03-09T17:30:10.085 INFO:tasks.workunit.client.1.vm09.stdout:8/225: mknod d1/d14/d2a/c4e 0 2026-03-09T17:30:10.086 INFO:tasks.workunit.client.1.vm09.stdout:8/226: dread d1/f3 
[0,4194304] 0 2026-03-09T17:30:10.087 INFO:tasks.workunit.client.1.vm09.stdout:0/211: mknod d6/d1d/c3e 0 2026-03-09T17:30:10.090 INFO:tasks.workunit.client.0.vm06.stdout:6/885: mknod d6/d47/c112 0 2026-03-09T17:30:10.090 INFO:tasks.workunit.client.0.vm06.stdout:6/886: write d6/d47/d96/da1/fb7 [1062438,53511] 0 2026-03-09T17:30:10.095 INFO:tasks.workunit.client.1.vm09.stdout:4/261: rename d11/d1e/d29/d36/l44 to d11/d1e/d29/d36/l59 0 2026-03-09T17:30:10.096 INFO:tasks.workunit.client.1.vm09.stdout:8/227: dread d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:10.106 INFO:tasks.workunit.client.0.vm06.stdout:8/958: write d15/d39/f40 [1825476,19921] 0 2026-03-09T17:30:10.113 INFO:tasks.workunit.client.0.vm06.stdout:1/946: dwrite d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/dce/f11d [0,4194304] 0 2026-03-09T17:30:10.116 INFO:tasks.workunit.client.1.vm09.stdout:2/171: write fb [1108655,72292] 0 2026-03-09T17:30:10.117 INFO:tasks.workunit.client.1.vm09.stdout:2/172: dread - d13/d15/f2a zero size 2026-03-09T17:30:10.117 INFO:tasks.workunit.client.0.vm06.stdout:2/857: symlink d3/d4/d12/da7/dfc/l115 0 2026-03-09T17:30:10.120 INFO:tasks.workunit.client.1.vm09.stdout:3/198: symlink d5/d16/d25/l3a 0 2026-03-09T17:30:10.120 INFO:tasks.workunit.client.1.vm09.stdout:3/199: readlink d5/d6/l10 0 2026-03-09T17:30:10.121 INFO:tasks.workunit.client.1.vm09.stdout:9/197: fsync d5/d21/f2b 0 2026-03-09T17:30:10.121 INFO:tasks.workunit.client.0.vm06.stdout:6/887: truncate d6/d4f/d3e/d52/f89 4785923 0 2026-03-09T17:30:10.122 INFO:tasks.workunit.client.1.vm09.stdout:9/198: dread - d5/de/d29/d33/f42 zero size 2026-03-09T17:30:10.122 INFO:tasks.workunit.client.1.vm09.stdout:9/199: chown d5/d21/f2f 664 1 2026-03-09T17:30:10.122 INFO:tasks.workunit.client.0.vm06.stdout:6/888: truncate d6/d12/d17/f32 9707240 0 2026-03-09T17:30:10.124 INFO:tasks.workunit.client.0.vm06.stdout:8/959: unlink d15/l8a 0 2026-03-09T17:30:10.129 INFO:tasks.workunit.client.0.vm06.stdout:1/947: mknod d11/de0/c13b 0 2026-03-09T17:30:10.131 
INFO:tasks.workunit.client.1.vm09.stdout:0/212: mknod d6/d1d/d39/c3f 0 2026-03-09T17:30:10.132 INFO:tasks.workunit.client.0.vm06.stdout:2/858: mkdir d3/d4/d12/d71/daa/d77/d102/d109/d116 0 2026-03-09T17:30:10.132 INFO:tasks.workunit.client.0.vm06.stdout:2/859: chown d3/df1 9066 1 2026-03-09T17:30:10.136 INFO:tasks.workunit.client.0.vm06.stdout:6/889: mkdir d6/d12/d17/d65/d113 0 2026-03-09T17:30:10.139 INFO:tasks.workunit.client.1.vm09.stdout:8/228: symlink d1/da/d13/l4f 0 2026-03-09T17:30:10.139 INFO:tasks.workunit.client.1.vm09.stdout:8/229: readlink d1/da/dd/l24 0 2026-03-09T17:30:10.144 INFO:tasks.workunit.client.0.vm06.stdout:2/860: symlink d3/d4/d12/d71/daa/d77/d102/l117 0 2026-03-09T17:30:10.144 INFO:tasks.workunit.client.0.vm06.stdout:2/861: chown d3/l5 1261 1 2026-03-09T17:30:10.147 INFO:tasks.workunit.client.1.vm09.stdout:2/173: mkdir d13/d15/d34/d37 0 2026-03-09T17:30:10.149 INFO:tasks.workunit.client.1.vm09.stdout:2/174: dread d13/f14 [0,4194304] 0 2026-03-09T17:30:10.149 INFO:tasks.workunit.client.1.vm09.stdout:3/200: readlink d5/l14 0 2026-03-09T17:30:10.154 INFO:tasks.workunit.client.1.vm09.stdout:7/280: dwrite da/d11/d41/f30 [0,4194304] 0 2026-03-09T17:30:10.159 INFO:tasks.workunit.client.0.vm06.stdout:2/862: rename d3/d4/d12/d2b/d36/d37/f90 to d3/d4/d12/d2b/db0/df3/f118 0 2026-03-09T17:30:10.170 INFO:tasks.workunit.client.0.vm06.stdout:1/948: creat d11/f13c x:0 0 0 2026-03-09T17:30:10.172 INFO:tasks.workunit.client.1.vm09.stdout:0/213: symlink d6/d1d/l40 0 2026-03-09T17:30:10.176 INFO:tasks.workunit.client.1.vm09.stdout:8/230: creat d1/da/dd/d3f/d32/f50 x:0 0 0 2026-03-09T17:30:10.179 INFO:tasks.workunit.client.1.vm09.stdout:6/203: getdents d3/d21/d25 0 2026-03-09T17:30:10.188 INFO:tasks.workunit.client.0.vm06.stdout:1/949: readlink d11/d14/d1d/d4a/l11a 0 2026-03-09T17:30:10.192 INFO:tasks.workunit.client.0.vm06.stdout:5/954: dread d4/d22/d46/dec/f116 [0,4194304] 0 2026-03-09T17:30:10.194 INFO:tasks.workunit.client.0.vm06.stdout:5/955: dread 
d4/d50/d35/d40/d95/fa1 [0,4194304] 0 2026-03-09T17:30:10.198 INFO:tasks.workunit.client.1.vm09.stdout:4/262: write d11/d1e/f22 [1797015,122930] 0 2026-03-09T17:30:10.199 INFO:tasks.workunit.client.1.vm09.stdout:4/263: read f3 [3384178,15473] 0 2026-03-09T17:30:10.201 INFO:tasks.workunit.client.0.vm06.stdout:8/960: dwrite d15/d31/dc5/df1/d3d/d5f/d83/ff5 [0,4194304] 0 2026-03-09T17:30:10.208 INFO:tasks.workunit.client.1.vm09.stdout:1/224: write d9/dc/dd/d40/f1a [2075968,79754] 0 2026-03-09T17:30:10.222 INFO:tasks.workunit.client.0.vm06.stdout:2/863: dwrite d3/d4/d12/da7/fbb [0,4194304] 0 2026-03-09T17:30:10.228 INFO:tasks.workunit.client.0.vm06.stdout:6/890: link d6/d4f/f25 d6/d4f/d3e/d52/d95/f114 0 2026-03-09T17:30:10.234 INFO:tasks.workunit.client.1.vm09.stdout:7/281: truncate da/d11/d41/d4e/f42 731703 0 2026-03-09T17:30:10.235 INFO:tasks.workunit.client.0.vm06.stdout:1/950: mkdir d11/d14/d1d/d1e/dc2/d103/d13d 0 2026-03-09T17:30:10.237 INFO:tasks.workunit.client.1.vm09.stdout:5/203: link d0/d9/d16/c19 d0/d2/d15/c43 0 2026-03-09T17:30:10.237 INFO:tasks.workunit.client.1.vm09.stdout:5/204: write d0/dc/d21/f29 [1959229,111154] 0 2026-03-09T17:30:10.238 INFO:tasks.workunit.client.1.vm09.stdout:0/214: rename d6/d2a/f3a to d6/d1d/f41 0 2026-03-09T17:30:10.247 INFO:tasks.workunit.client.1.vm09.stdout:8/231: symlink d1/da/dd/d47/l51 0 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 -- 192.168.123.106:0/3033263298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 msgr2=0x7f0e8c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 --2- 192.168.123.106:0/3033263298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c10be90 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f0e88009b00 tx=0x7f0e88009e10 comp rx=0 tx=0).stop 
2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 -- 192.168.123.106:0/3033263298 shutdown_connections 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 --2- 192.168.123.106:0/3033263298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c10be90 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 --2- 192.168.123.106:0/3033263298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e8c071a60 0x7f0e8c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.239+0000 7f0e93255700 1 -- 192.168.123.106:0/3033263298 >> 192.168.123.106:0/3033263298 conn(0x7f0e8c06d1a0 msgr2=0x7f0e8c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 -- 192.168.123.106:0/3033263298 shutdown_connections 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 -- 192.168.123.106:0/3033263298 wait complete. 
2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 Processor -- start 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 -- start start 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e8c071a60 0x7f0e8c1169e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e8c117540 con 0x7f0e8c071a60 2026-03-09T17:30:10.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.240+0000 7f0e93255700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e8c1b2760 con 0x7f0e8c072440 2026-03-09T17:30:10.247 INFO:tasks.workunit.client.0.vm06.stdout:8/961: mknod d15/d16/d1e/d30/db8/c138 0 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:54440/0 (socket says 192.168.123.106:54440) 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 -- 192.168.123.106:0/1213838941 learned_addr learned my addr 192.168.123.106:0/1213838941 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 -- 192.168.123.106:0/1213838941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e8c071a60 msgr2=0x7f0e8c1169e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e8c071a60 0x7f0e8c1169e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 -- 192.168.123.106:0/1213838941 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e880097e0 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e91a52700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f0e88004a30 tx=0x7f0e88004b10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.241+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e8801d070 con 0x7f0e8c072440 
2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.242+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e8c1b2900 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.242+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e8c1b2da0 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.242+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0e8800bcd0 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.242+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e8c110c20 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.243+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e8800f970 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.243+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 23) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f0e8800fb90 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.243+0000 7f0e837fe700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 0x7f0e78040100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.248 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.243+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f0e88056380 con 0x7f0e8c072440 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.244+0000 7f0e92253700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 0x7f0e78040100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.244+0000 7f0e92253700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 0x7f0e78040100 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f0e8c1ae6d0 tx=0x7f0e84009250 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:10.248 INFO:tasks.workunit.client.1.vm09.stdout:6/204: creat d3/d21/d25/d26/d34/f3e x:0 0 0 2026-03-09T17:30:10.250 INFO:tasks.workunit.client.0.vm06.stdout:6/891: symlink d6/d47/d4d/da0/l115 0 2026-03-09T17:30:10.251 INFO:tasks.workunit.client.1.vm09.stdout:4/264: creat d11/d1e/d31/f5a x:0 0 0 2026-03-09T17:30:10.252 INFO:tasks.workunit.client.1.vm09.stdout:6/205: dwrite d3/d21/f28 [0,4194304] 0 2026-03-09T17:30:10.252 INFO:tasks.workunit.client.1.vm09.stdout:4/265: write d11/f33 [583624,76391] 0 2026-03-09T17:30:10.253 INFO:tasks.workunit.client.1.vm09.stdout:4/266: chown d11/d1e/d29/f32 364993 1 2026-03-09T17:30:10.254 INFO:tasks.workunit.client.1.vm09.stdout:9/200: link d5/de/c39 d5/de/d29/c4d 0 2026-03-09T17:30:10.255 INFO:tasks.workunit.client.0.vm06.stdout:8/962: creat d15/d39/d67/d77/d97/dac/f139 x:0 0 0 2026-03-09T17:30:10.255 INFO:tasks.workunit.client.0.vm06.stdout:1/951: symlink 
d11/d14/d1d/l13e 0 2026-03-09T17:30:10.255 INFO:tasks.workunit.client.1.vm09.stdout:5/205: creat d0/dc/f44 x:0 0 0 2026-03-09T17:30:10.257 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.254+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f0e88017c80 con 0x7f0e8c072440 2026-03-09T17:30:10.257 INFO:tasks.workunit.client.1.vm09.stdout:0/215: rename d6/d1d/f30 to d6/d1d/d24/d32/f42 0 2026-03-09T17:30:10.276 INFO:tasks.workunit.client.0.vm06.stdout:5/956: write d4/d50/fad [3516281,102533] 0 2026-03-09T17:30:10.278 INFO:tasks.workunit.client.0.vm06.stdout:2/864: link d3/f29 d3/d4/d22/d72/d8f/dda/f119 0 2026-03-09T17:30:10.278 INFO:tasks.workunit.client.1.vm09.stdout:3/201: dwrite d5/f22 [0,4194304] 0 2026-03-09T17:30:10.280 INFO:tasks.workunit.client.1.vm09.stdout:3/202: write d5/d6/d12/f19 [184922,50131] 0 2026-03-09T17:30:10.281 INFO:tasks.workunit.client.0.vm06.stdout:6/892: creat d6/d47/dd7/df8/f116 x:0 0 0 2026-03-09T17:30:10.283 INFO:tasks.workunit.client.0.vm06.stdout:8/963: mknod d15/d130/c13a 0 2026-03-09T17:30:10.284 INFO:tasks.workunit.client.0.vm06.stdout:8/964: write d15/d16/d1e/d30/d55/def/df3/f12c [134328,16445] 0 2026-03-09T17:30:10.315 INFO:tasks.workunit.client.1.vm09.stdout:4/267: symlink d11/d1e/d29/l5b 0 2026-03-09T17:30:10.315 INFO:tasks.workunit.client.1.vm09.stdout:6/206: mkdir d3/d1e/d30/d3f 0 2026-03-09T17:30:10.317 INFO:tasks.workunit.client.1.vm09.stdout:6/207: write d3/d21/d25/d26/d34/f3e [559318,97059] 0 2026-03-09T17:30:10.326 INFO:tasks.workunit.client.1.vm09.stdout:5/206: creat d0/d2/d15/d20/f45 x:0 0 0 2026-03-09T17:30:10.326 INFO:tasks.workunit.client.1.vm09.stdout:5/207: dread - d0/dc/d21/d26/f39 zero size 2026-03-09T17:30:10.327 INFO:tasks.workunit.client.1.vm09.stdout:5/208: fdatasync d0/f22 0 2026-03-09T17:30:10.328 INFO:tasks.workunit.client.1.vm09.stdout:5/209: truncate 
d0/d9/f3e 189882 0 2026-03-09T17:30:10.328 INFO:tasks.workunit.client.1.vm09.stdout:5/210: dread - d0/dc/d21/d26/f28 zero size 2026-03-09T17:30:10.345 INFO:tasks.workunit.client.1.vm09.stdout:0/216: unlink d6/d1d/d39/l19 0 2026-03-09T17:30:10.352 INFO:tasks.workunit.client.1.vm09.stdout:1/225: link d9/dc/dd/d40/c44 d9/d3a/c49 0 2026-03-09T17:30:10.357 INFO:tasks.workunit.client.1.vm09.stdout:2/175: link c12 d13/d15/c38 0 2026-03-09T17:30:10.361 INFO:tasks.workunit.client.1.vm09.stdout:4/268: mknod d11/d1e/d31/c5c 0 2026-03-09T17:30:10.362 INFO:tasks.workunit.client.1.vm09.stdout:4/269: write d11/d1e/d31/f3a [5607,86727] 0 2026-03-09T17:30:10.375 INFO:tasks.workunit.client.1.vm09.stdout:1/226: dread d9/f29 [0,4194304] 0 2026-03-09T17:30:10.375 INFO:tasks.workunit.client.1.vm09.stdout:1/227: chown d9/dc/dd/d40 111590 1 2026-03-09T17:30:10.399 INFO:tasks.workunit.client.1.vm09.stdout:9/201: write d5/f1d [855729,53722] 0 2026-03-09T17:30:10.410 INFO:tasks.workunit.client.0.vm06.stdout:5/957: write d4/d50/d35/d40/d6f/fd8 [3755576,59375] 0 2026-03-09T17:30:10.412 INFO:tasks.workunit.client.0.vm06.stdout:5/958: dread d4/d50/f24 [0,4194304] 0 2026-03-09T17:30:10.413 INFO:tasks.workunit.client.1.vm09.stdout:5/211: rmdir d0/d9 39 2026-03-09T17:30:10.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.413+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 --> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0e8c061190 con 0x7f0e7803dc50 2026-03-09T17:30:10.416 INFO:tasks.workunit.client.0.vm06.stdout:2/865: symlink d3/d4/d12/da7/dfc/l11a 0 2026-03-09T17:30:10.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.414+0000 7f0e837fe700 1 -- 192.168.123.106:0/1213838941 <== mgr.24477 v2:192.168.123.109:6828/111652423 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7f0e8c061190 con 0x7f0e7803dc50 2026-03-09T17:30:10.419 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 msgr2=0x7f0e78040100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 0x7f0e78040100 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f0e8c1ae6d0 tx=0x7f0e84009250 comp rx=0 tx=0).stop 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 msgr2=0x7f0e8c116f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f0e88004a30 tx=0x7f0e88004b10 comp rx=0 tx=0).stop 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 shutdown_connections 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f0e7803dc50 0x7f0e78040100 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f0e8c071a60 0x7f0e8c1169e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 --2- 192.168.123.106:0/1213838941 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0e8c072440 0x7f0e8c116f20 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 >> 192.168.123.106:0/1213838941 conn(0x7f0e8c06d1a0 msgr2=0x7f0e8c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 shutdown_connections 2026-03-09T17:30:10.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.417+0000 7f0e93255700 1 -- 192.168.123.106:0/1213838941 wait complete. 
2026-03-09T17:30:10.421 INFO:tasks.workunit.client.0.vm06.stdout:6/893: unlink d6/d12/d53/d8f/la5 0 2026-03-09T17:30:10.426 INFO:tasks.workunit.client.1.vm09.stdout:0/217: unlink d6/d1d/d39/f31 0 2026-03-09T17:30:10.437 INFO:tasks.workunit.client.0.vm06.stdout:8/965: dread d15/d16/d1e/ffa [0,4194304] 0 2026-03-09T17:30:10.439 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:30:10.455 INFO:tasks.workunit.client.1.vm09.stdout:3/203: truncate d5/d9/fc 1125972 0 2026-03-09T17:30:10.459 INFO:tasks.workunit.client.0.vm06.stdout:1/952: dwrite d11/d14/d1d/d42/d46/d92/dc0/ffb [0,4194304] 0 2026-03-09T17:30:10.474 INFO:tasks.workunit.client.0.vm06.stdout:5/959: mkdir d4/d22/dbe/dfb/d158 0 2026-03-09T17:30:10.481 INFO:tasks.workunit.client.1.vm09.stdout:7/282: truncate da/d11/d41/d4e/f44 1723564 0 2026-03-09T17:30:10.482 INFO:tasks.workunit.client.1.vm09.stdout:7/283: write f3 [921825,88020] 0 2026-03-09T17:30:10.482 INFO:tasks.workunit.client.0.vm06.stdout:6/894: stat d6/c24 0 2026-03-09T17:30:10.486 INFO:tasks.workunit.client.1.vm09.stdout:1/228: creat d9/dc/dd/d40/d22/f4a x:0 0 0 2026-03-09T17:30:10.498 INFO:tasks.workunit.client.0.vm06.stdout:1/953: readlink d11/d14/l77 0 2026-03-09T17:30:10.501 INFO:tasks.workunit.client.1.vm09.stdout:8/232: rename d1/c17 to d1/c52 0 2026-03-09T17:30:10.502 INFO:tasks.workunit.client.0.vm06.stdout:5/960: rename d4/d50/d35/d40/d6f/lf6 to d4/d50/d35/d40/d109/l159 0 2026-03-09T17:30:10.506 INFO:tasks.workunit.client.0.vm06.stdout:2/866: mkdir d3/d4/d11b 0 2026-03-09T17:30:10.511 INFO:tasks.workunit.client.1.vm09.stdout:0/218: unlink d6/d1d/d24/l38 0 2026-03-09T17:30:10.513 INFO:tasks.workunit.client.1.vm09.stdout:3/204: symlink d5/d16/d25/l3b 0 2026-03-09T17:30:10.518 INFO:tasks.workunit.client.1.vm09.stdout:4/270: symlink d11/d1e/d30/d58/l5d 0 2026-03-09T17:30:10.518 INFO:tasks.workunit.client.0.vm06.stdout:1/954: rmdir d11/d14/d1d/d1e/dd6 39 2026-03-09T17:30:10.520 INFO:tasks.workunit.client.0.vm06.stdout:8/966: rename 
d15/d31/dc5/df1/d3d/dc7 to d15/d31/d58/dc9/d13b 0 2026-03-09T17:30:10.520 INFO:tasks.workunit.client.1.vm09.stdout:7/284: write da/d11/d2d/f45 [932156,59135] 0 2026-03-09T17:30:10.525 INFO:tasks.workunit.client.0.vm06.stdout:1/955: dwrite d11/d14/d1d/d1e/d2a/d99/ff3 [0,4194304] 0 2026-03-09T17:30:10.533 INFO:tasks.workunit.client.0.vm06.stdout:1/956: dread d11/d14/d1d/d1e/d2a/d34/d64/fec [0,4194304] 0 2026-03-09T17:30:10.541 INFO:tasks.workunit.client.0.vm06.stdout:2/867: creat d3/d4/d12/dfa/f11c x:0 0 0 2026-03-09T17:30:10.544 INFO:tasks.workunit.client.1.vm09.stdout:0/219: sync 2026-03-09T17:30:10.548 INFO:tasks.workunit.client.1.vm09.stdout:5/212: dread d0/d2/f2a [4194304,4194304] 0 2026-03-09T17:30:10.551 INFO:tasks.workunit.client.1.vm09.stdout:3/205: symlink d5/d16/d25/l3c 0 2026-03-09T17:30:10.553 INFO:tasks.workunit.client.1.vm09.stdout:2/176: creat d13/f39 x:0 0 0 2026-03-09T17:30:10.553 INFO:tasks.workunit.client.1.vm09.stdout:4/271: symlink d11/d1e/d31/l5e 0 2026-03-09T17:30:10.554 INFO:tasks.workunit.client.1.vm09.stdout:7/285: mkdir da/d11/d2d/d56/d68 0 2026-03-09T17:30:10.555 INFO:tasks.workunit.client.1.vm09.stdout:4/272: chown d11/d1e/d30/f38 76500 1 2026-03-09T17:30:10.555 INFO:tasks.workunit.client.1.vm09.stdout:6/208: getdents d3 0 2026-03-09T17:30:10.555 INFO:tasks.workunit.client.1.vm09.stdout:7/286: stat da/d11/d41/f57 0 2026-03-09T17:30:10.555 INFO:tasks.workunit.client.0.vm06.stdout:1/957: unlink d11/d14/d1d/d42/d46/f55 0 2026-03-09T17:30:10.555 INFO:tasks.workunit.client.1.vm09.stdout:6/209: readlink d3/l17 0 2026-03-09T17:30:10.557 INFO:tasks.workunit.client.0.vm06.stdout:8/967: read d15/d39/d3c/dd5/fde [777294,38771] 0 2026-03-09T17:30:10.561 INFO:tasks.workunit.client.1.vm09.stdout:4/273: dwrite d11/d1e/d29/f50 [0,4194304] 0 2026-03-09T17:30:10.562 INFO:tasks.workunit.client.1.vm09.stdout:0/220: symlink d6/d1d/d24/d32/l43 0 2026-03-09T17:30:10.564 INFO:tasks.workunit.client.0.vm06.stdout:1/958: rename d11/d14/d1d/d1e/d2a/f40 to 
d11/d14/d1d/d42/f13f 0 2026-03-09T17:30:10.566 INFO:tasks.workunit.client.1.vm09.stdout:6/210: read d3/d7/f10 [3117064,113576] 0 2026-03-09T17:30:10.566 INFO:tasks.workunit.client.1.vm09.stdout:6/211: fdatasync d3/d7/f18 0 2026-03-09T17:30:10.568 INFO:tasks.workunit.client.1.vm09.stdout:6/212: dread d3/d7/f24 [0,4194304] 0 2026-03-09T17:30:10.571 INFO:tasks.workunit.client.1.vm09.stdout:5/213: unlink d0/d2/d15/d20/f45 0 2026-03-09T17:30:10.571 INFO:tasks.workunit.client.1.vm09.stdout:6/213: write d3/fc [2432907,44286] 0 2026-03-09T17:30:10.572 INFO:tasks.workunit.client.1.vm09.stdout:6/214: chown d3/d7/d37 387 1 2026-03-09T17:30:10.576 INFO:tasks.workunit.client.0.vm06.stdout:2/868: getdents d3/d4/d12/d71/daa/d77/d81/d64 0 2026-03-09T17:30:10.596 INFO:tasks.workunit.client.0.vm06.stdout:1/959: dread d11/d14/d1c/d3a/fbf [0,4194304] 0 2026-03-09T17:30:10.601 INFO:tasks.workunit.client.1.vm09.stdout:4/274: creat d11/d1e/d30/f5f x:0 0 0 2026-03-09T17:30:10.601 INFO:tasks.workunit.client.1.vm09.stdout:0/221: readlink d6/d2a/l36 0 2026-03-09T17:30:10.604 INFO:tasks.workunit.client.1.vm09.stdout:4/275: dread d11/d1e/d29/f2e [0,4194304] 0 2026-03-09T17:30:10.605 INFO:tasks.workunit.client.0.vm06.stdout:2/869: fdatasync d3/d4/d46/fd8 0 2026-03-09T17:30:10.607 INFO:tasks.workunit.client.0.vm06.stdout:1/960: rename d11/d14/d1d/d42/d46/d92/dc0/cdc to d11/d14/d1d/d42/d46/d92/dc0/d57/c140 0 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/368755789 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac071980 msgr2=0x7fd3ac071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 --2- 192.168.123.106:0/368755789 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac071980 0x7fd3ac071d90 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fd3a800bc70 
tx=0x7fd3a800bf80 comp rx=0 tx=0).stop 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/368755789 shutdown_connections 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 --2- 192.168.123.106:0/368755789 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac072360 0x7fd3ac0770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 --2- 192.168.123.106:0/368755789 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac071980 0x7fd3ac071d90 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.616+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/368755789 >> 192.168.123.106:0/368755789 conn(0x7fd3ac06d1a0 msgr2=0x7fd3ac06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/368755789 shutdown_connections 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/368755789 wait complete. 
2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 Processor -- start 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 -- start start 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 0x7fd3ac0824e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 0x7fd3ac082e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3ac083e40 con 0x7fd3ac072360 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b3cfe700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3ac12dd80 con 0x7fd3ac082a20 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b1299700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 0x7fd3ac082e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b1299700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 0x7fd3ac082e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:54466/0 (socket says 192.168.123.106:54466) 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b1299700 1 -- 192.168.123.106:0/2105826930 learned_addr learned my addr 192.168.123.106:0/2105826930 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.617+0000 7fd3b1a9a700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 0x7fd3ac0824e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.618+0000 7fd3b1a9a700 1 -- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 msgr2=0x7fd3ac082e90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.618+0000 7fd3b1a9a700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 0x7fd3ac082e90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.618+0000 7fd3b1a9a700 1 -- 192.168.123.106:0/2105826930 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd3a800b920 con 0x7fd3ac072360 2026-03-09T17:30:10.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.618+0000 7fd3b1a9a700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 0x7fd3ac0824e0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fd3a8003a40 tx=0x7fd3a8003b00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:30:10.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.619+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3a8010040 con 0x7fd3ac072360 2026-03-09T17:30:10.621 INFO:tasks.workunit.client.0.vm06.stdout:5/961: dread d4/d50/d18/d3d/f81 [0,4194304] 0 2026-03-09T17:30:10.622 INFO:tasks.workunit.client.1.vm09.stdout:4/276: dread d11/f33 [0,4194304] 0 2026-03-09T17:30:10.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.619+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/2105826930 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd3ac12dfb0 con 0x7fd3ac072360 2026-03-09T17:30:10.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.619+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/2105826930 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd3ac12e4a0 con 0x7fd3ac072360 2026-03-09T17:30:10.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.620+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd3a80044a0 con 0x7fd3ac072360 2026-03-09T17:30:10.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.620+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3a801d800 con 0x7fd3ac072360 2026-03-09T17:30:10.624 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:10 vm06.local ceph-mon[57307]: pgmap v5: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail 2026-03-09T17:30:10.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.621+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7fd3a801d960 con 
0x7fd3ac072360 2026-03-09T17:30:10.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.621+0000 7fd3a2ffd700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 0x7fd398040420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.621+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fd3a80555f0 con 0x7fd3ac072360 2026-03-09T17:30:10.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.621+0000 7fd3b1299700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 0x7fd398040420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.622+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/2105826930 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd390005320 con 0x7fd3ac072360 2026-03-09T17:30:10.624 INFO:tasks.workunit.client.1.vm09.stdout:4/277: dread d11/f25 [0,4194304] 0 2026-03-09T17:30:10.626 INFO:tasks.workunit.client.1.vm09.stdout:4/278: dread d11/f25 [0,4194304] 0 2026-03-09T17:30:10.628 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.625+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fd3a8021070 con 0x7fd3ac072360 2026-03-09T17:30:10.628 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.625+0000 7fd3b1299700 1 --2- 192.168.123.106:0/2105826930 >> 
[v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 0x7fd398040420 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd3a4004150 tx=0x7fd3a4008be0 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:10.634 INFO:tasks.workunit.client.1.vm09.stdout:7/287: rename da/d11/d41/d4e/f44 to da/d11/d2d/f69 0 2026-03-09T17:30:10.635 INFO:tasks.workunit.client.1.vm09.stdout:7/288: read da/d11/d41/d4e/f33 [3208902,232] 0 2026-03-09T17:30:10.683 INFO:tasks.workunit.client.0.vm06.stdout:2/870: mknod d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/c11d 0 2026-03-09T17:30:10.683 INFO:tasks.workunit.client.0.vm06.stdout:2/871: chown d3/d4/d46/da5/fa8 3905487 1 2026-03-09T17:30:10.686 INFO:tasks.workunit.client.1.vm09.stdout:8/233: read d1/da/f12 [1522238,119427] 0 2026-03-09T17:30:10.693 INFO:tasks.workunit.client.0.vm06.stdout:6/895: dwrite d6/d12/d53/f64 [0,4194304] 0 2026-03-09T17:30:10.704 INFO:tasks.workunit.client.1.vm09.stdout:9/202: write d5/f1b [346206,83210] 0 2026-03-09T17:30:10.705 INFO:tasks.workunit.client.1.vm09.stdout:9/203: truncate d5/d21/f30 636177 0 2026-03-09T17:30:10.716 INFO:tasks.workunit.client.1.vm09.stdout:9/204: dwrite d5/de/d29/f35 [0,4194304] 0 2026-03-09T17:30:10.746 INFO:tasks.workunit.client.0.vm06.stdout:1/961: creat d11/d14/d1d/d1e/d2a/d99/de9/f141 x:0 0 0 2026-03-09T17:30:10.748 INFO:tasks.workunit.client.1.vm09.stdout:5/214: mkdir d0/d46 0 2026-03-09T17:30:10.770 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:10 vm09.local ceph-mon[62061]: pgmap v5: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail 2026-03-09T17:30:10.770 INFO:tasks.workunit.client.1.vm09.stdout:1/229: truncate d9/dc/dd/d40/d21/f33 718 0 2026-03-09T17:30:10.771 INFO:tasks.workunit.client.1.vm09.stdout:2/177: truncate d13/d15/d21/f35 932802 0 2026-03-09T17:30:10.773 INFO:tasks.workunit.client.1.vm09.stdout:2/178: write d13/d15/f2a [454900,26811] 0 
2026-03-09T17:30:10.775 INFO:tasks.workunit.client.0.vm06.stdout:2/872: dread - d3/d4/d12/d2b/fdc zero size 2026-03-09T17:30:10.779 INFO:tasks.workunit.client.1.vm09.stdout:1/230: truncate d9/dc/dd/d40/d22/f4a 383159 0 2026-03-09T17:30:10.780 INFO:tasks.workunit.client.1.vm09.stdout:2/179: dread d13/d15/d21/f35 [0,4194304] 0 2026-03-09T17:30:10.780 INFO:tasks.workunit.client.1.vm09.stdout:2/180: chown d13/d15/f2a 2 1 2026-03-09T17:30:10.781 INFO:tasks.workunit.client.0.vm06.stdout:8/968: dwrite d15/d16/d6d/f89 [4194304,4194304] 0 2026-03-09T17:30:10.784 INFO:tasks.workunit.client.1.vm09.stdout:4/279: truncate f3 2219466 0 2026-03-09T17:30:10.801 INFO:tasks.workunit.client.1.vm09.stdout:0/222: rename d6/d1d/d24/d32/f42 to d6/d1d/d39/f44 0 2026-03-09T17:30:10.802 INFO:tasks.workunit.client.1.vm09.stdout:0/223: stat d6/d1d/l40 0 2026-03-09T17:30:10.802 INFO:tasks.workunit.client.1.vm09.stdout:0/224: readlink d6/d1d/d39/l2f 0 2026-03-09T17:30:10.810 INFO:tasks.workunit.client.0.vm06.stdout:5/962: write d4/d22/d46/f10b [738464,66947] 0 2026-03-09T17:30:10.811 INFO:tasks.workunit.client.0.vm06.stdout:5/963: stat d4/da4/dcf/f123 0 2026-03-09T17:30:10.821 INFO:tasks.workunit.client.0.vm06.stdout:8/969: fdatasync d15/d39/d3c/dd5/fde 0 2026-03-09T17:30:10.821 INFO:tasks.workunit.client.0.vm06.stdout:8/970: chown d15/d16/d1e/d30/db8/ca3 4 1 2026-03-09T17:30:10.825 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.821+0000 7fd3b3cfe700 1 -- 192.168.123.106:0/2105826930 --> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd390000bf0 con 0x7fd39803df70 2026-03-09T17:30:10.829 INFO:tasks.workunit.client.0.vm06.stdout:6/896: truncate d6/d12/d17/f29 174672 0 2026-03-09T17:30:10.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.827+0000 7fd3a2ffd700 1 -- 192.168.123.106:0/2105826930 <== mgr.24477 v2:192.168.123.109:6828/111652423 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fd390000bf0 con 0x7fd39803df70 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 msgr2=0x7fd398040420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 0x7fd398040420 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd3a4004150 tx=0x7fd3a4008be0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 msgr2=0x7fd3ac0824e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 0x7fd3ac0824e0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7fd3a8003a40 tx=0x7fd3a8003b00 comp rx=0 tx=0).stop 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 shutdown_connections 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fd39803df70 0x7fd398040420 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 
7fd3a0ff9700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd3ac072360 0x7fd3ac0824e0 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 --2- 192.168.123.106:0/2105826930 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd3ac082a20 0x7fd3ac082e90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 >> 192.168.123.106:0/2105826930 conn(0x7fd3ac06d1a0 msgr2=0x7fd3ac06e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 shutdown_connections 2026-03-09T17:30:10.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.830+0000 7fd3a0ff9700 1 -- 192.168.123.106:0/2105826930 wait complete. 
2026-03-09T17:30:10.848 INFO:tasks.workunit.client.0.vm06.stdout:1/962: link d11/d14/d1d/d4a/df7/d106/d112/d114/l132 d11/d14/d1d/d42/dff/l142 0 2026-03-09T17:30:10.848 INFO:tasks.workunit.client.0.vm06.stdout:8/971: creat d15/d31/dc5/df1/d3d/d5f/d83/dc1/f13c x:0 0 0 2026-03-09T17:30:10.850 INFO:tasks.workunit.client.1.vm09.stdout:6/215: creat d3/d7/f40 x:0 0 0 2026-03-09T17:30:10.850 INFO:tasks.workunit.client.1.vm09.stdout:5/215: readlink d0/dc/ld 0 2026-03-09T17:30:10.852 INFO:tasks.workunit.client.0.vm06.stdout:2/873: creat d3/d4/d12/d71/daa/f11e x:0 0 0 2026-03-09T17:30:10.854 INFO:tasks.workunit.client.1.vm09.stdout:1/231: rmdir d9/dc 39 2026-03-09T17:30:10.855 INFO:tasks.workunit.client.1.vm09.stdout:4/280: rmdir d11/d1e 39 2026-03-09T17:30:10.858 INFO:tasks.workunit.client.0.vm06.stdout:6/897: rename d6/d12/d53/d8f to d6/d4f/d3e/d52/d8c/d117 0 2026-03-09T17:30:10.863 INFO:tasks.workunit.client.1.vm09.stdout:4/281: sync 2026-03-09T17:30:10.863 INFO:tasks.workunit.client.1.vm09.stdout:0/225: rename d6/f7 to d6/d1d/d24/d32/f45 0 2026-03-09T17:30:10.872 INFO:tasks.workunit.client.1.vm09.stdout:4/282: dread d11/f23 [0,4194304] 0 2026-03-09T17:30:10.882 INFO:tasks.workunit.client.0.vm06.stdout:1/963: symlink d11/d14/d1c/l143 0 2026-03-09T17:30:10.883 INFO:tasks.workunit.client.1.vm09.stdout:3/206: dwrite d5/d9/fc [0,4194304] 0 2026-03-09T17:30:10.889 INFO:tasks.workunit.client.0.vm06.stdout:5/964: dwrite d4/d50/d35/d40/d95/f117 [0,4194304] 0 2026-03-09T17:30:10.917 INFO:tasks.workunit.client.0.vm06.stdout:8/972: dread - d15/d39/d67/d77/d97/dac/fe1 zero size 2026-03-09T17:30:10.921 INFO:tasks.workunit.client.0.vm06.stdout:6/898: truncate d6/d12/f1c 934050 0 2026-03-09T17:30:10.921 INFO:tasks.workunit.client.0.vm06.stdout:2/874: chown d3/d4/d12/d2b/d2d/c63 20942 1 2026-03-09T17:30:10.928 INFO:tasks.workunit.client.0.vm06.stdout:5/965: mkdir d4/d22/dbe/dfb/d15a 0 2026-03-09T17:30:10.931 INFO:tasks.workunit.client.1.vm09.stdout:6/216: creat d3/f41 x:0 0 0 
2026-03-09T17:30:10.940 INFO:tasks.workunit.client.0.vm06.stdout:5/966: creat d4/d50/d35/d40/d95/db8/dda/f15b x:0 0 0 2026-03-09T17:30:10.941 INFO:tasks.workunit.client.0.vm06.stdout:6/899: dwrite d6/d12/d17/d65/f72 [4194304,4194304] 0 2026-03-09T17:30:10.942 INFO:tasks.workunit.client.0.vm06.stdout:6/900: chown d6 27769 1 2026-03-09T17:30:10.948 INFO:tasks.workunit.client.1.vm09.stdout:2/181: dwrite d13/f14 [4194304,4194304] 0 2026-03-09T17:30:10.964 INFO:tasks.workunit.client.1.vm09.stdout:8/234: rename d1/da/dd/ce to d1/d14/d2a/c53 0 2026-03-09T17:30:10.994 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.965+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/2622139297 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 msgr2=0x7f5d4810beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:10.994 INFO:tasks.workunit.client.0.vm06.stdout:6/901: symlink d6/d12/d2d/l118 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:8/235: truncate d1/da/dd/f45 121926 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:7/289: creat da/d11/f6a x:0 0 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:5/216: symlink d0/d46/l47 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:5/217: chown d0/d2/d15/f1c 371 1 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:1/232: mknod d9/dc/dd/d40/d1d/c4b 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:1/233: stat d9/dc/dd/d40/l23 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:3/207: rename d5/d6 to d5/d16/d31/d3d 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:2/182: dwrite d13/d15/d21/f30 [0,4194304] 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:9/205: getdents d5/d21 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:9/206: dread - d5/de/f2d zero size 2026-03-09T17:30:10.995 
INFO:tasks.workunit.client.1.vm09.stdout:5/218: symlink d0/dc/l48 0 2026-03-09T17:30:10.995 INFO:tasks.workunit.client.1.vm09.stdout:9/207: read d5/d21/f2b [45871,34169] 0 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.965+0000 7f5d4e4b6700 1 --2- 192.168.123.106:0/2622139297 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 0x7f5d4810beb0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009b00 tx=0x7f5d38009e10 comp rx=0 tx=0).stop 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.966+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/2622139297 shutdown_connections 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.966+0000 7f5d4e4b6700 1 --2- 192.168.123.106:0/2622139297 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 0x7f5d4810beb0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.966+0000 7f5d4e4b6700 1 --2- 192.168.123.106:0/2622139297 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48071a90 0x7f5d48071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.966+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/2622139297 >> 192.168.123.106:0/2622139297 conn(0x7f5d4806d1a0 msgr2=0x7f5d4806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/2622139297 shutdown_connections 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/2622139297 wait complete. 
2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 Processor -- start 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 -- start start 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 0x7f5d48116ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d48117510 con 0x7f5d48117010 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d4e4b6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d48117650 con 0x7f5d48072470 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:34838/0 (socket says 192.168.123.106:34838) 2026-03-09T17:30:10.995 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 -- 192.168.123.106:0/130364438 learned_addr learned my addr 192.168.123.106:0/130364438 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 -- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 msgr2=0x7f5d48116ad0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 0x7f5d48116ad0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.967+0000 7f5d477fe700 1 -- 192.168.123.106:0/130364438 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d380097e0 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.968+0000 7f5d477fe700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009fd0 tx=0x7f5d38004a20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.968+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d3801d070 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.968+0000 7f5d4e4b6700 1 -- 
192.168.123.106:0/130364438 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d481a1a00 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.968+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/130364438 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d481a1ef0 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.968+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5d3800bc50 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.969+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/130364438 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d34005320 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.969+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d3800f460 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.969+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7f5d3800f5c0 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.969+0000 7f5d457fa700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 0x7f5d30040130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.970+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== 
mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f5d3801f2d0 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.972+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f5d38021800 con 0x7f5d48117010 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.974+0000 7f5d47fff700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 0x7f5d30040130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:10.996 INFO:tasks.workunit.client.0.vm06.stdout:6/902: unlink d6/d4f/d73/lcd 0 2026-03-09T17:30:10.996 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:10.978+0000 7f5d47fff700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 0x7f5d30040130 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5d48117b10 tx=0x7f5d3c006cb0 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:11.005 INFO:tasks.workunit.client.1.vm09.stdout:8/236: rename d1/f3 to d1/d14/d2a/f54 0 2026-03-09T17:30:11.007 INFO:tasks.workunit.client.1.vm09.stdout:2/183: link f9 d13/d15/d34/f3a 0 2026-03-09T17:30:11.011 INFO:tasks.workunit.client.1.vm09.stdout:9/208: mkdir d5/de/d4e 0 2026-03-09T17:30:11.012 INFO:tasks.workunit.client.1.vm09.stdout:2/184: chown d13/d15/d21/f32 24379 1 2026-03-09T17:30:11.023 INFO:tasks.workunit.client.1.vm09.stdout:7/290: getdents da/d11/d41 0 2026-03-09T17:30:11.023 INFO:tasks.workunit.client.1.vm09.stdout:7/291: dread - da/f36 zero size 2026-03-09T17:30:11.027 
INFO:tasks.workunit.client.0.vm06.stdout:1/964: sync 2026-03-09T17:30:11.028 INFO:tasks.workunit.client.1.vm09.stdout:2/185: dwrite d13/f23 [0,4194304] 0 2026-03-09T17:30:11.031 INFO:tasks.workunit.client.1.vm09.stdout:7/292: mknod da/d11/d41/d4e/c6b 0 2026-03-09T17:30:11.031 INFO:tasks.workunit.client.1.vm09.stdout:7/293: dread - da/d11/d2d/d56/f53 zero size 2026-03-09T17:30:11.039 INFO:tasks.workunit.client.1.vm09.stdout:2/186: read d13/d15/d21/f28 [1951877,119320] 0 2026-03-09T17:30:11.039 INFO:tasks.workunit.client.1.vm09.stdout:2/187: chown d13/d15/d2c 53483489 1 2026-03-09T17:30:11.040 INFO:tasks.workunit.client.1.vm09.stdout:2/188: chown d13/d15/d21/f32 3573 1 2026-03-09T17:30:11.040 INFO:tasks.workunit.client.1.vm09.stdout:2/189: readlink l10 0 2026-03-09T17:30:11.040 INFO:tasks.workunit.client.1.vm09.stdout:1/234: sync 2026-03-09T17:30:11.041 INFO:tasks.workunit.client.1.vm09.stdout:2/190: write d13/d15/f2a [291020,78940] 0 2026-03-09T17:30:11.041 INFO:tasks.workunit.client.1.vm09.stdout:9/209: sync 2026-03-09T17:30:11.043 INFO:tasks.workunit.client.1.vm09.stdout:1/235: sync 2026-03-09T17:30:11.043 INFO:tasks.workunit.client.1.vm09.stdout:9/210: sync 2026-03-09T17:30:11.043 INFO:tasks.workunit.client.1.vm09.stdout:1/236: fsync d9/dc/f3d 0 2026-03-09T17:30:11.044 INFO:tasks.workunit.client.1.vm09.stdout:1/237: dread d9/dc/dd/f28 [0,4194304] 0 2026-03-09T17:30:11.045 INFO:tasks.workunit.client.1.vm09.stdout:1/238: readlink d9/dc/l36 0 2026-03-09T17:30:11.082 INFO:tasks.workunit.client.1.vm09.stdout:1/239: dread f6 [0,4194304] 0 2026-03-09T17:30:11.082 INFO:tasks.workunit.client.1.vm09.stdout:1/240: chown d9/d38 77410 1 2026-03-09T17:30:11.168 INFO:tasks.workunit.client.0.vm06.stdout:1/965: mkdir d11/d14/d1d/d1e/d2a/d34/d64/dfa/d144 0 2026-03-09T17:30:11.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.167+0000 7f5d4e4b6700 1 -- 192.168.123.106:0/130364438 --> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] -- 
mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f5d34000bf0 con 0x7f5d3003dc80 2026-03-09T17:30:11.185 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:30:11.185 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (4m) 2m ago 5m 24.7M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:30:11.185 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (5m) 2m ago 5m 8078k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:30:11.185 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (4m) 12s ago 4m 8556k - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:30:11.185 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (5m) 2m ago 5m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (4m) 12s ago 4m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (4m) 2m ago 4m 84.7M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (2m) 2m ago 2m 10.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (3m) 2m ago 3m 19.5M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (2m) 12s ago 2m 121M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (2m) 12s ago 2m 15.6M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:9283,8765,8443 running (5m) 2m ago 5m 498M - 18.2.0 dc2bc1663786 2765e8d99a9c 
2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (15s) 12s ago 4m 48.3M - 19.2.3-678-ge911bdeb 654f31e6858e 6994beea5467 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (5m) 2m ago 5m 52.0M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (4m) 12s ago 4m 39.5M 2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (5m) 2m ago 5m 13.9M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (4m) 12s ago 4m 15.2M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (4m) 2m ago 4m 46.8M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (4m) 2m ago 4m 46.0M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (3m) 2m ago 3m 48.3M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (3m) 12s ago 3m 346M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (3m) 12s ago 3m 320M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (3m) 12s ago 3m 304M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (4m) 2m ago 4m 41.3M - 2.43.0 a07b618ecd1d 9f52c04d903c 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.181+0000 7f5d457fa700 1 -- 192.168.123.106:0/130364438 <== mgr.24477 
v2:192.168.123.109:6828/111652423 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f5d34000bf0 con 0x7f5d3003dc80 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 msgr2=0x7f5d30040130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 0x7f5d30040130 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5d48117b10 tx=0x7f5d3c006cb0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 msgr2=0x7f5d481a14c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009fd0 tx=0x7f5d38004a20 comp rx=0 tx=0).stop 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 shutdown_connections 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7f5d3003dc80 0x7f5d30040130 secure :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5d48117b10 tx=0x7f5d3c006cb0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.186 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5d48072470 0x7f5d48116ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 --2- 192.168.123.106:0/130364438 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d48117010 0x7f5d481a14c0 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.186 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 >> 192.168.123.106:0/130364438 conn(0x7f5d4806d1a0 msgr2=0x7f5d4810b4d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:11.189 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.183+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 shutdown_connections 2026-03-09T17:30:11.189 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.186+0000 7f5d2effd700 1 -- 192.168.123.106:0/130364438 wait complete. 
2026-03-09T17:30:11.202 INFO:tasks.workunit.client.0.vm06.stdout:2/875: write d3/d4/d12/d2b/d2d/f100 [4841650,64584] 0 2026-03-09T17:30:11.207 INFO:tasks.workunit.client.0.vm06.stdout:5/967: write d4/d22/d46/f82 [193708,8651] 0 2026-03-09T17:30:11.208 INFO:tasks.workunit.client.0.vm06.stdout:8/973: dwrite d15/d31/dc5/df1/f61 [4194304,4194304] 0 2026-03-09T17:30:11.223 INFO:tasks.workunit.client.1.vm09.stdout:4/283: dwrite fe [0,4194304] 0 2026-03-09T17:30:11.255 INFO:tasks.workunit.client.0.vm06.stdout:6/903: write d6/d12/f76 [143250,58939] 0 2026-03-09T17:30:11.275 INFO:tasks.workunit.client.1.vm09.stdout:7/294: mkdir da/d11/d47/d5b/d6c 0 2026-03-09T17:30:11.275 INFO:tasks.workunit.client.0.vm06.stdout:1/966: truncate d11/d14/d1d/d1e/d2a/d34/d64/df6/ffc 1262894 0 2026-03-09T17:30:11.308 INFO:tasks.workunit.client.1.vm09.stdout:0/226: truncate d6/d1d/d39/f44 573369 0 2026-03-09T17:30:11.308 INFO:tasks.workunit.client.1.vm09.stdout:2/191: unlink f9 0 2026-03-09T17:30:11.312 INFO:tasks.workunit.client.0.vm06.stdout:5/968: creat d4/d52/db4/f15c x:0 0 0 2026-03-09T17:30:11.312 INFO:tasks.workunit.client.0.vm06.stdout:5/969: chown d4/d22/f77 72268 1 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- 192.168.123.106:0/957616016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4071980 msgr2=0x7fa3b4071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/957616016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4071980 0x7fa3b4071d90 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7fa3a40099c0 tx=0x7fa3a4009cd0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- 192.168.123.106:0/957616016 shutdown_connections 2026-03-09T17:30:11.315 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/957616016 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4072360 0x7fa3b40770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/957616016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4071980 0x7fa3b4071d90 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- 192.168.123.106:0/957616016 >> 192.168.123.106:0/957616016 conn(0x7fa3b406d1a0 msgr2=0x7fa3b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- 192.168.123.106:0/957616016 shutdown_connections 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- 192.168.123.106:0/957616016 wait complete. 
2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 Processor -- start 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- start start 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 0x7fa3b4082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4082a70 0x7fa3b4082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3b41b2a90 con 0x7fa3b4072360 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.312+0000 7fa3b8a98700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3b41b2bd0 con 0x7fa3b4082a70 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.313+0000 7fa3b259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 0x7fa3b4082530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.313+0000 7fa3b259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 0x7fa3b4082530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:34860/0 (socket says 192.168.123.106:34860) 2026-03-09T17:30:11.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.313+0000 7fa3b259c700 1 -- 192.168.123.106:0/1746334570 learned_addr learned my addr 192.168.123.106:0/1746334570 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:11.317 INFO:tasks.workunit.client.0.vm06.stdout:8/974: mknod d15/d31/dc5/df1/d71/c13d 0 2026-03-09T17:30:11.317 INFO:tasks.workunit.client.0.vm06.stdout:8/975: chown d15/d39/d67/d77/d97/dac 59582 1 2026-03-09T17:30:11.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b1d9b700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4082a70 0x7fa3b4082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b1d9b700 1 -- 192.168.123.106:0/1746334570 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 msgr2=0x7fa3b4082530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b1d9b700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 0x7fa3b4082530 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b1d9b700 1 -- 192.168.123.106:0/1746334570 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3a40096b0 con 0x7fa3b4082a70 2026-03-09T17:30:11.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b1d9b700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fa3b4082a70 0x7fa3b4082ee0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa3ac00bee0 tx=0x7fa3ac00c320 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:11.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3ac00ce50 con 0x7fa3b4082a70 2026-03-09T17:30:11.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.317+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3b41b2d10 con 0x7fa3b4082a70 2026-03-09T17:30:11.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.318+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3b41b3230 con 0x7fa3b4082a70 2026-03-09T17:30:11.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.318+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa3ac014920 con 0x7fa3b4082a70 2026-03-09T17:30:11.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.318+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3ac012ae0 con 0x7fa3b4082a70 2026-03-09T17:30:11.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.319+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7fa3ac012d40 con 0x7fa3b4082a70 2026-03-09T17:30:11.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.319+0000 7fa3a37fe700 1 --2- 192.168.123.106:0/1746334570 >> 
[v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 0x7fa39c0425f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.319+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fa3ac053e20 con 0x7fa3b4082a70 2026-03-09T17:30:11.322 INFO:tasks.workunit.client.1.vm09.stdout:5/219: truncate d0/ff 1574823 0 2026-03-09T17:30:11.323 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.321+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa394005320 con 0x7fa3b4082a70 2026-03-09T17:30:11.323 INFO:tasks.workunit.client.1.vm09.stdout:5/220: write d0/dc/d21/d26/f28 [243063,40470] 0 2026-03-09T17:30:11.323 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.321+0000 7fa3b259c700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 0x7fa39c0425f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.325 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.323+0000 7fa3b259c700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 0x7fa39c0425f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fa3a400c6d0 tx=0x7fa3a400c2d0 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:11.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.325+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fa3ac011720 con 0x7fa3b4082a70 2026-03-09T17:30:11.331 INFO:tasks.workunit.client.1.vm09.stdout:1/241: dwrite d9/dc/dd/d40/d22/f2b [0,4194304] 0 2026-03-09T17:30:11.333 INFO:tasks.workunit.client.1.vm09.stdout:1/242: chown d9/dc/dd/d40/d22/d37/d3f 341868 1 2026-03-09T17:30:11.388 INFO:tasks.workunit.client.1.vm09.stdout:7/295: rename da/d11/d3e/f61 to da/d11/d47/d5b/f6d 0 2026-03-09T17:30:11.389 INFO:tasks.workunit.client.1.vm09.stdout:7/296: truncate da/f36 1019277 0 2026-03-09T17:30:11.400 INFO:tasks.workunit.client.1.vm09.stdout:0/227: rmdir d6 39 2026-03-09T17:30:11.405 INFO:tasks.workunit.client.1.vm09.stdout:2/192: mkdir d13/d15/d3b 0 2026-03-09T17:30:11.448 INFO:tasks.workunit.client.1.vm09.stdout:4/284: rename d11/d1e/d30/d35 to d11/d1e/d45/d60 0 2026-03-09T17:30:11.448 INFO:tasks.workunit.client.1.vm09.stdout:4/285: write d11/f13 [2928080,128890] 0 2026-03-09T17:30:11.452 INFO:tasks.workunit.client.1.vm09.stdout:7/297: fdatasync da/d11/d41/d4e/d4c/f67 0 2026-03-09T17:30:11.452 INFO:tasks.workunit.client.1.vm09.stdout:7/298: fdatasync da/d11/d41/d4e/d4c/f66 0 2026-03-09T17:30:11.452 INFO:tasks.workunit.client.1.vm09.stdout:7/299: chown da/d11/d47/d5b 15974665 1 2026-03-09T17:30:11.481 INFO:tasks.workunit.client.1.vm09.stdout:5/221: creat d0/d9/d16/d3c/f49 x:0 0 0 2026-03-09T17:30:11.482 INFO:tasks.workunit.client.1.vm09.stdout:6/217: fsync d3/d1e/f20 0 2026-03-09T17:30:11.482 INFO:tasks.workunit.client.1.vm09.stdout:1/243: mknod d9/c4c 0 2026-03-09T17:30:11.505 INFO:tasks.workunit.client.1.vm09.stdout:4/286: chown d11/f1c 38609 1 2026-03-09T17:30:11.506 INFO:tasks.workunit.client.1.vm09.stdout:4/287: fsync d11/f16 0 2026-03-09T17:30:11.508 INFO:tasks.workunit.client.1.vm09.stdout:4/288: chown d11/c20 2331892 1 2026-03-09T17:30:11.509 INFO:tasks.workunit.client.1.vm09.stdout:7/300: creat da/d11/d47/f6e x:0 0 0 2026-03-09T17:30:11.514 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: from='client.24507 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: mgrmap e24: vm09.lqzvkh(active, since 4s) 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:10] ENGINE Bus STARTING 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: from='client.14690 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:10] ENGINE Serving on https://192.168.123.109:7150 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:10] ENGINE Client ('192.168.123.109', 58260) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:10] ENGINE Serving on http://192.168.123.109:8765 2026-03-09T17:30:11.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:11 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:10] ENGINE Bus STARTED 2026-03-09T17:30:11.521 INFO:tasks.workunit.client.1.vm09.stdout:0/228: rename d6/d2a to d6/d1d/d46 0 2026-03-09T17:30:11.529 INFO:tasks.workunit.client.1.vm09.stdout:7/301: sync 2026-03-09T17:30:11.529 INFO:tasks.workunit.client.1.vm09.stdout:7/302: write da/d11/d2d/f59 [61679,44126] 0 2026-03-09T17:30:11.543 INFO:tasks.workunit.client.0.vm06.stdout:1/967: rename d11/d14/d1d/d4a/fa7 to d11/d14/d1d/dd1/d10b/f145 0 2026-03-09T17:30:11.556 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.553+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa394005cc0 con 0x7fa3b4082a70 2026-03-09T17:30:11.558 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.556+0000 7fa3a37fe700 1 -- 192.168.123.106:0/1746334570 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fa3ac007b40 con 0x7fa3b4082a70 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 
(5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:30:11.563 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:30:11.566 INFO:tasks.workunit.client.0.vm06.stdout:8/976: mkdir d15/d39/d67/d86/ddd/d13e 0 2026-03-09T17:30:11.566 INFO:tasks.workunit.client.0.vm06.stdout:5/970: creat d4/d52/d55/dee/f15d x:0 0 0 2026-03-09T17:30:11.566 INFO:tasks.workunit.client.0.vm06.stdout:8/977: chown d15/d31/c53 466 1 2026-03-09T17:30:11.570 INFO:tasks.workunit.client.1.vm09.stdout:3/208: link d5/d16/d31/d3d/d12/f15 d5/d16/d31/d3d/d12/f3e 0 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.565+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 msgr2=0x7fa39c0425f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 0x7fa39c0425f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fa3a400c6d0 tx=0x7fa3a400c2d0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4082a70 msgr2=0x7fa3b4082ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4082a70 0x7fa3b4082ee0 
secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa3ac00bee0 tx=0x7fa3ac00c320 comp rx=0 tx=0).stop 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 shutdown_connections 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa39c040140 0x7fa39c0425f0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.567+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3b4072360 0x7fa3b4082530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.568+0000 7fa3b8a98700 1 --2- 192.168.123.106:0/1746334570 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa3b4082a70 0x7fa3b4082ee0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.568+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 >> 192.168.123.106:0/1746334570 conn(0x7fa3b406d1a0 msgr2=0x7fa3b4076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.568+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 shutdown_connections 2026-03-09T17:30:11.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.568+0000 7fa3b8a98700 1 -- 192.168.123.106:0/1746334570 wait complete. 
2026-03-09T17:30:11.571 INFO:tasks.workunit.client.1.vm09.stdout:3/209: truncate d5/d16/d31/d3d/d12/f39 42185 0 2026-03-09T17:30:11.589 INFO:tasks.workunit.client.0.vm06.stdout:1/968: symlink d11/d14/d1d/d8c/l146 0 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: from='client.24507 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: mgrmap e24: vm09.lqzvkh(active, since 4s) 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:10] ENGINE Bus STARTING 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: from='client.14690 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:10] ENGINE Serving on https://192.168.123.109:7150 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:10] ENGINE Client ('192.168.123.109', 58260) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:10] ENGINE Serving on http://192.168.123.109:8765 2026-03-09T17:30:11.601 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:11 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:10] ENGINE Bus STARTED 2026-03-09T17:30:11.602 INFO:tasks.workunit.client.0.vm06.stdout:5/971: rmdir d4/d52/db4/dc2 39 2026-03-09T17:30:11.610 INFO:tasks.workunit.client.1.vm09.stdout:0/229: symlink d6/d1d/l47 0 
2026-03-09T17:30:11.611 INFO:tasks.workunit.client.1.vm09.stdout:4/289: dread d11/f12 [0,4194304] 0 2026-03-09T17:30:11.612 INFO:tasks.workunit.client.0.vm06.stdout:6/904: link d6/d47/d96/d40/fdd d6/d12/d17/d65/d113/f119 0 2026-03-09T17:30:11.612 INFO:tasks.workunit.client.0.vm06.stdout:2/876: getdents d3/d4/d12/d2b/db0/dc1 0 2026-03-09T17:30:11.613 INFO:tasks.workunit.client.0.vm06.stdout:2/877: chown d3/d4/dcf 9476681 1 2026-03-09T17:30:11.617 INFO:tasks.workunit.client.1.vm09.stdout:8/237: rmdir d1/d14/d31 39 2026-03-09T17:30:11.619 INFO:tasks.workunit.client.0.vm06.stdout:5/972: dread - d4/d50/db2/d125/f12c zero size 2026-03-09T17:30:11.619 INFO:tasks.workunit.client.0.vm06.stdout:8/978: mkdir d15/d13f 0 2026-03-09T17:30:11.622 INFO:tasks.workunit.client.0.vm06.stdout:1/969: sync 2026-03-09T17:30:11.630 INFO:tasks.workunit.client.1.vm09.stdout:3/210: write d5/f2f [4955368,49730] 0 2026-03-09T17:30:11.637 INFO:tasks.workunit.client.0.vm06.stdout:2/878: symlink d3/d4/d12/d2b/db0/l11f 0 2026-03-09T17:30:11.651 INFO:tasks.workunit.client.1.vm09.stdout:0/230: rename d6/d1d/c33 to d6/d1d/d24/d32/c48 0 2026-03-09T17:30:11.680 INFO:tasks.workunit.client.0.vm06.stdout:5/973: dread d4/f17 [0,4194304] 0 2026-03-09T17:30:11.691 INFO:tasks.workunit.client.0.vm06.stdout:5/974: sync 2026-03-09T17:30:11.697 INFO:tasks.workunit.client.1.vm09.stdout:8/238: chown d1/c11 449873 1 2026-03-09T17:30:11.698 INFO:tasks.workunit.client.1.vm09.stdout:0/231: creat d6/d1d/d24/d32/f49 x:0 0 0 2026-03-09T17:30:11.699 INFO:tasks.workunit.client.1.vm09.stdout:5/222: getdents d0 0 2026-03-09T17:30:11.700 INFO:tasks.workunit.client.1.vm09.stdout:8/239: truncate d1/da/f35 938700 0 2026-03-09T17:30:11.706 INFO:tasks.workunit.client.1.vm09.stdout:1/244: getdents d9/dc/dd/d40 0 2026-03-09T17:30:11.707 INFO:tasks.workunit.client.1.vm09.stdout:1/245: dread - d9/dc/f47 zero size 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 -- 
192.168.123.106:0/901154483 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c072330 msgr2=0x7fa07c0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 --2- 192.168.123.106:0/901154483 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c072330 0x7fa07c0770b0 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7fa06c007780 tx=0x7fa06c007a90 comp rx=0 tx=0).stop 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 -- 192.168.123.106:0/901154483 shutdown_connections 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 --2- 192.168.123.106:0/901154483 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c072330 0x7fa07c0770b0 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 --2- 192.168.123.106:0/901154483 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 0x7fa07c071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 -- 192.168.123.106:0/901154483 >> 192.168.123.106:0/901154483 conn(0x7fa07c06d1a0 msgr2=0x7fa07c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 -- 192.168.123.106:0/901154483 shutdown_connections 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.702+0000 7fa080b1e700 1 -- 192.168.123.106:0/901154483 wait complete. 
2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 Processor -- start 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 -- start start 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 0x7fa07c0823a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa07c083d50 con 0x7fa07c0828e0 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa080b1e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa07c12dd80 con 0x7fa07c071950 2026-03-09T17:30:11.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:34880/0 (socket says 192.168.123.106:34880) 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 -- 192.168.123.106:0/1972808134 learned_addr learned my addr 192.168.123.106:0/1972808134 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07b7fe700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 0x7fa07c0823a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 -- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 msgr2=0x7fa07c0823a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 0x7fa07c0823a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.703+0000 7fa07affd700 1 -- 192.168.123.106:0/1972808134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa06c007430 con 0x7fa07c0828e0 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.704+0000 7fa07affd700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa06c00a9e0 tx=0x7fa06c00aa10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.704+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa06c004210 con 0x7fa07c0828e0 2026-03-09T17:30:11.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.704+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa07c12e000 con 0x7fa07c0828e0 2026-03-09T17:30:11.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.705+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa07c12e4f0 con 0x7fa07c0828e0 2026-03-09T17:30:11.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.707+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa06c004370 con 0x7fa07c0828e0 2026-03-09T17:30:11.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.707+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa06c01b670 con 0x7fa07c0828e0 2026-03-09T17:30:11.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.707+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7fa06c022020 con 0x7fa07c0828e0 2026-03-09T17:30:11.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.709+0000 7fa078ff9700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 0x7fa064040470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:11.712 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.709+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fa06c055310 con 0x7fa07c0828e0 2026-03-09T17:30:11.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.709+0000 7fa07b7fe700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 0x7fa064040470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:11.712 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.710+0000 7fa0627fc700 1 -- 192.168.123.106:0/1972808134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa068005320 con 0x7fa07c0828e0 2026-03-09T17:30:11.713 INFO:tasks.workunit.client.1.vm09.stdout:3/211: unlink d5/d9/fc 0 2026-03-09T17:30:11.715 INFO:tasks.workunit.client.1.vm09.stdout:0/232: mknod d6/d1d/c4a 0 2026-03-09T17:30:11.716 INFO:tasks.workunit.client.1.vm09.stdout:5/223: write d0/dc/d21/d26/f39 [858736,92680] 0 2026-03-09T17:30:11.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.713+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fa06c00a030 con 0x7fa07c0828e0 2026-03-09T17:30:11.719 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.714+0000 7fa07b7fe700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 0x7fa064040470 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa074003eb0 tx=0x7fa07400b040 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:30:11.730 INFO:tasks.workunit.client.1.vm09.stdout:0/233: sync 2026-03-09T17:30:11.730 INFO:tasks.workunit.client.1.vm09.stdout:5/224: sync 2026-03-09T17:30:11.731 INFO:tasks.workunit.client.1.vm09.stdout:0/234: chown d6/d1d/c3e 210304 1 2026-03-09T17:30:11.732 INFO:tasks.workunit.client.1.vm09.stdout:0/235: read d6/d1d/f1e [60478,63339] 0 2026-03-09T17:30:11.735 INFO:tasks.workunit.client.1.vm09.stdout:0/236: read d6/d1d/f37 [3607890,59417] 0 2026-03-09T17:30:11.736 INFO:tasks.workunit.client.1.vm09.stdout:3/212: dread d5/d16/f17 [0,4194304] 0 2026-03-09T17:30:11.739 INFO:tasks.workunit.client.1.vm09.stdout:4/290: creat d11/d1e/f61 x:0 0 0 2026-03-09T17:30:11.739 INFO:tasks.workunit.client.1.vm09.stdout:5/225: mkdir d0/dc/d21/d33/d4a 0 2026-03-09T17:30:11.740 INFO:tasks.workunit.client.1.vm09.stdout:8/240: symlink d1/d14/d2a/l55 0 2026-03-09T17:30:11.741 INFO:tasks.workunit.client.1.vm09.stdout:0/237: symlink d6/d1d/l4b 0 2026-03-09T17:30:11.743 INFO:tasks.workunit.client.1.vm09.stdout:4/291: dwrite d11/f4d [0,4194304] 0 2026-03-09T17:30:11.745 INFO:tasks.workunit.client.1.vm09.stdout:4/292: readlink d11/d1e/d30/l4b 0 2026-03-09T17:30:11.747 INFO:tasks.workunit.client.1.vm09.stdout:4/293: chown d11/d1e/d30/f5f 1092006 1 2026-03-09T17:30:11.748 INFO:tasks.workunit.client.1.vm09.stdout:4/294: write d11/f18 [4524151,46143] 0 2026-03-09T17:30:11.770 INFO:tasks.workunit.client.1.vm09.stdout:4/295: dwrite d11/d1e/d30/f38 [0,4194304] 0 2026-03-09T17:30:11.776 INFO:tasks.workunit.client.1.vm09.stdout:4/296: write fe [5234796,117928] 0 2026-03-09T17:30:11.794 INFO:tasks.workunit.client.1.vm09.stdout:1/246: rmdir d9/dc/dd/d40/d22/d37/d3f/d42/d46 0 2026-03-09T17:30:11.801 INFO:tasks.workunit.client.1.vm09.stdout:5/226: mkdir d0/d46/d4b 0 2026-03-09T17:30:11.805 INFO:tasks.workunit.client.1.vm09.stdout:5/227: readlink d0/dc/ld 0 2026-03-09T17:30:11.806 INFO:tasks.workunit.client.1.vm09.stdout:3/213: getdents d5/d16/d31/d37 0 2026-03-09T17:30:11.807 
INFO:tasks.workunit.client.1.vm09.stdout:0/238: mkdir d6/d1d/d39/d4c 0 2026-03-09T17:30:11.811 INFO:tasks.workunit.client.1.vm09.stdout:8/241: creat d1/da/dd/d3f/d32/f56 x:0 0 0 2026-03-09T17:30:11.839 INFO:tasks.workunit.client.1.vm09.stdout:1/247: creat d9/dc/dd/d40/d1d/f4d x:0 0 0 2026-03-09T17:30:11.839 INFO:tasks.workunit.client.0.vm06.stdout:1/970: rename d11/d14/d1d/d1e/dc2/d103/d13d to d11/d14/d1d/d42/d147 0 2026-03-09T17:30:11.845 INFO:tasks.workunit.client.1.vm09.stdout:1/248: dwrite d9/dc/dd/d40/d1d/f17 [0,4194304] 0 2026-03-09T17:30:11.850 INFO:tasks.workunit.client.1.vm09.stdout:1/249: dread d9/dc/dd/d40/d22/f2b [0,4194304] 0 2026-03-09T17:30:11.850 INFO:tasks.workunit.client.1.vm09.stdout:1/250: write f3 [7396604,83982] 0 2026-03-09T17:30:11.864 INFO:tasks.workunit.client.1.vm09.stdout:6/218: truncate d3/d1e/f20 982710 0 2026-03-09T17:30:11.866 INFO:tasks.workunit.client.1.vm09.stdout:6/219: fsync d3/d1e/d30/f39 0 2026-03-09T17:30:11.873 INFO:tasks.workunit.client.1.vm09.stdout:5/228: rename d0/d2/d15/d20/f32 to d0/d46/f4c 0 2026-03-09T17:30:11.874 INFO:tasks.workunit.client.1.vm09.stdout:3/214: creat d5/d16/f3f x:0 0 0 2026-03-09T17:30:11.879 INFO:tasks.workunit.client.0.vm06.stdout:5/975: rmdir d4/d50/d35/d40/d96 39 2026-03-09T17:30:11.879 INFO:tasks.workunit.client.1.vm09.stdout:0/239: creat d6/d1d/d46/f4d x:0 0 0 2026-03-09T17:30:11.883 INFO:tasks.workunit.client.1.vm09.stdout:8/242: chown d1/d14/d31/l30 27 1 2026-03-09T17:30:11.886 INFO:tasks.workunit.client.1.vm09.stdout:1/251: rmdir d9/dc/dd/d40/d21/d35 39 2026-03-09T17:30:11.887 INFO:tasks.workunit.client.1.vm09.stdout:4/297: dread d11/f1c [0,4194304] 0 2026-03-09T17:30:11.887 INFO:tasks.workunit.client.1.vm09.stdout:4/298: read d11/f25 [2165845,130693] 0 2026-03-09T17:30:11.887 INFO:tasks.workunit.client.1.vm09.stdout:4/299: chown d11/d1e 91921817 1 2026-03-09T17:30:11.887 INFO:tasks.workunit.client.1.vm09.stdout:5/229: unlink d0/d9/l2d 0 2026-03-09T17:30:11.895 
INFO:tasks.workunit.client.1.vm09.stdout:3/215: sync 2026-03-09T17:30:11.896 INFO:tasks.workunit.client.1.vm09.stdout:8/243: sync 2026-03-09T17:30:11.899 INFO:tasks.workunit.client.0.vm06.stdout:5/976: truncate d4/d22/d64/f138 783347 0 2026-03-09T17:30:11.900 INFO:tasks.workunit.client.1.vm09.stdout:6/220: creat d3/d1e/d30/d3f/f42 x:0 0 0 2026-03-09T17:30:11.901 INFO:tasks.workunit.client.1.vm09.stdout:5/230: fsync d0/dc/d21/d26/f3d 0 2026-03-09T17:30:11.904 INFO:tasks.workunit.client.1.vm09.stdout:0/240: truncate d6/d1d/f1e 638528 0 2026-03-09T17:30:11.904 INFO:tasks.workunit.client.0.vm06.stdout:5/977: chown d4/d50/d35/d40/d95/ld1 40192453 1 2026-03-09T17:30:11.904 INFO:tasks.workunit.client.1.vm09.stdout:1/252: dwrite d9/dc/dd/d40/f1a [0,4194304] 0 2026-03-09T17:30:11.909 INFO:tasks.workunit.client.1.vm09.stdout:3/216: mknod d5/d9/c40 0 2026-03-09T17:30:11.911 INFO:tasks.workunit.client.1.vm09.stdout:4/300: dwrite d11/d1e/d29/f50 [0,4194304] 0 2026-03-09T17:30:11.912 INFO:tasks.workunit.client.1.vm09.stdout:3/217: write d5/d16/d25/f28 [2715369,1605] 0 2026-03-09T17:30:11.918 INFO:tasks.workunit.client.0.vm06.stdout:5/978: dwrite d4/d50/d35/d40/d109/f14d [0,4194304] 0 2026-03-09T17:30:11.921 INFO:tasks.workunit.client.1.vm09.stdout:4/301: write d11/f15 [1204596,43995] 0 2026-03-09T17:30:11.925 INFO:tasks.workunit.client.1.vm09.stdout:5/231: dwrite d0/dc/d21/d26/f3d [0,4194304] 0 2026-03-09T17:30:11.928 INFO:tasks.workunit.client.1.vm09.stdout:8/244: dwrite d1/d14/f3d [0,4194304] 0 2026-03-09T17:30:11.965 INFO:tasks.workunit.client.1.vm09.stdout:8/245: chown d1/c52 0 1 2026-03-09T17:30:11.966 INFO:tasks.workunit.client.1.vm09.stdout:8/246: fsync d1/da/f35 0 2026-03-09T17:30:11.970 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.967+0000 7fa0627fc700 1 -- 192.168.123.106:0/1972808134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fa068005cc0 con 0x7fa07c0828e0 2026-03-09T17:30:11.971 
INFO:tasks.workunit.client.1.vm09.stdout:8/247: fsync d1/da/f35 0 2026-03-09T17:30:11.971 INFO:tasks.workunit.client.1.vm09.stdout:8/248: fsync d1/da/d13/f1d 0 2026-03-09T17:30:11.971 INFO:tasks.workunit.client.0.vm06.stdout:5/979: truncate d4/d50/d18/fa2 704003 0 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.972+0000 7fa078ff9700 1 -- 192.168.123.106:0/1972808134 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7fa06c00a020 con 0x7fa07c0828e0 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:root 0 
2026-03-09T17:30:11.974 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr 
[v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:11.975 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 msgr2=0x7fa064040470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 0x7fa064040470 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa074003eb0 tx=0x7fa07400b040 comp rx=0 tx=0).stop 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fa07c0828e0 msgr2=0x7fa07c082d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa06c00a9e0 tx=0x7fa06c00aa10 comp rx=0 tx=0).stop 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 shutdown_connections 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fa06403dfc0 0x7fa064040470 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa07c071950 0x7fa07c0823a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 --2- 192.168.123.106:0/1972808134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa07c0828e0 0x7fa07c082d50 secure :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa06c00a9e0 tx=0x7fa06c00aa10 comp rx=0 tx=0).stop 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.974+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 >> 192.168.123.106:0/1972808134 conn(0x7fa07c06d1a0 msgr2=0x7fa07c0764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:11.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.975+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 
shutdown_connections 2026-03-09T17:30:11.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:11.975+0000 7fa080b1e700 1 -- 192.168.123.106:0/1972808134 wait complete. 2026-03-09T17:30:11.980 INFO:tasks.workunit.client.1.vm09.stdout:5/232: rename d0/d9/d16/l2c to d0/d46/l4d 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.0.vm06.stdout:5/980: fsync d4/d50/fd7 0 2026-03-09T17:30:11.988 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:6/221: rmdir d3/d7/d37 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:4/302: mknod d11/d1e/d29/d36/d57/c62 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:3/218: creat d5/d9/d30/f41 x:0 0 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:8/249: unlink d1/da/d13/l4f 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:5/233: dread d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:11.988 INFO:tasks.workunit.client.1.vm09.stdout:8/250: dread d1/da/dd/f27 [0,4194304] 0 2026-03-09T17:30:11.997 INFO:tasks.workunit.client.1.vm09.stdout:3/219: dread d5/d9/f1e [0,4194304] 0 2026-03-09T17:30:12.007 INFO:tasks.workunit.client.1.vm09.stdout:3/220: mknod d5/c42 0 2026-03-09T17:30:12.013 INFO:tasks.workunit.client.1.vm09.stdout:3/221: write d5/d16/d25/f28 [606870,9958] 0 2026-03-09T17:30:12.015 INFO:tasks.workunit.client.0.vm06.stdout:5/981: rename d4/d22/fde to d4/d22/d46/dec/f15e 0 2026-03-09T17:30:12.026 INFO:tasks.workunit.client.1.vm09.stdout:3/222: dwrite d5/d16/d31/f34 [0,4194304] 0 2026-03-09T17:30:12.032 INFO:tasks.workunit.client.1.vm09.stdout:3/223: dread - d5/d16/f3f zero size 2026-03-09T17:30:12.045 INFO:tasks.workunit.client.1.vm09.stdout:3/224: sync 2026-03-09T17:30:12.053 INFO:tasks.workunit.client.0.vm06.stdout:5/982: read d4/d50/d35/d40/d6f/f9e [4298,75397] 0 2026-03-09T17:30:12.055 INFO:tasks.workunit.client.0.vm06.stdout:5/983: dread - d4/d50/d35/d40/d6f/fed zero size 
2026-03-09T17:30:12.077 INFO:tasks.workunit.client.1.vm09.stdout:3/225: dread d5/d16/d31/d3d/fb [0,4194304] 0 2026-03-09T17:30:12.089 INFO:tasks.workunit.client.1.vm09.stdout:3/226: rename d5/d16/f3f to d5/d16/d31/d3d/d12/f43 0 2026-03-09T17:30:12.089 INFO:tasks.workunit.client.1.vm09.stdout:3/227: truncate d5/d16/d25/f28 5846177 0 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 -- 192.168.123.106:0/2368174398 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08072360 msgr2=0x7fcb080770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 --2- 192.168.123.106:0/2368174398 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08072360 0x7fcb080770e0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fcb0000a390 tx=0x7fcb0000a6a0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 -- 192.168.123.106:0/2368174398 shutdown_connections 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 --2- 192.168.123.106:0/2368174398 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08072360 0x7fcb080770e0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 --2- 192.168.123.106:0/2368174398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08071980 0x7fcb08071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 -- 192.168.123.106:0/2368174398 >> 192.168.123.106:0/2368174398 conn(0x7fcb0806d1a0 msgr2=0x7fcb0806f5f0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 -- 192.168.123.106:0/2368174398 shutdown_connections 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.089+0000 7fcb0f47d700 1 -- 192.168.123.106:0/2368174398 wait complete. 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 Processor -- start 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 -- start start 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08071980 0x7fcb08082580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb081b2a90 con 0x7fcb08082ac0 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0f47d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb081b2bd0 con 0x7fcb08071980 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:34894/0 (socket says 192.168.123.106:34894) 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 -- 192.168.123.106:0/937042351 learned_addr learned my addr 192.168.123.106:0/937042351 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 -- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08071980 msgr2=0x7fcb08082580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08071980 0x7fcb08082580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 -- 192.168.123.106:0/937042351 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb0000a040 con 0x7fcb08082ac0 2026-03-09T17:30:12.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.090+0000 7fcb0ca18700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7fcb0000b280 tx=0x7fcb0000b360 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:12.093 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.091+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb0000a710 con 0x7fcb08082ac0 2026-03-09T17:30:12.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.091+0000 7fcb0f47d700 1 -- 192.168.123.106:0/937042351 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb081b2d70 con 0x7fcb08082ac0 2026-03-09T17:30:12.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.091+0000 7fcb0f47d700 1 -- 192.168.123.106:0/937042351 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb081b3290 con 0x7fcb08082ac0 2026-03-09T17:30:12.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.092+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcb00018070 con 0x7fcb08082ac0 2026-03-09T17:30:12.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.092+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb00014600 con 0x7fcb08082ac0 2026-03-09T17:30:12.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.092+0000 7fcb0f47d700 1 -- 192.168.123.106:0/937042351 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcaec005320 con 0x7fcb08082ac0 2026-03-09T17:30:12.096 INFO:tasks.workunit.client.1.vm09.stdout:3/228: read d5/f35 [1137630,8435] 0 2026-03-09T17:30:12.099 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.096+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7fcb0001a030 con 0x7fcb08082ac0 2026-03-09T17:30:12.099 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.096+0000 7fcafe7fc700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 0x7fcaf40403d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.099 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.096+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fcb00010c90 con 0x7fcb08082ac0 2026-03-09T17:30:12.099 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.096+0000 7fcb0d219700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 0x7fcaf40403d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:12.099 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.097+0000 7fcb0d219700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 0x7fcaf40403d0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fcb0400bef0 tx=0x7fcb0400d040 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:12.099 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.097+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fcb0001b800 con 0x7fcb08082ac0 2026-03-09T17:30:12.125 INFO:tasks.workunit.client.1.vm09.stdout:2/193: truncate d13/d15/d21/f31 5125050 0 2026-03-09T17:30:12.127 INFO:tasks.workunit.client.1.vm09.stdout:7/303: write da/d11/d41/d4e/f33 [3471040,14020] 0 2026-03-09T17:30:12.128 
INFO:tasks.workunit.client.1.vm09.stdout:7/304: write da/d11/d47/f6e [246497,96467] 0 2026-03-09T17:30:12.132 INFO:tasks.workunit.client.1.vm09.stdout:2/194: sync 2026-03-09T17:30:12.133 INFO:tasks.workunit.client.1.vm09.stdout:2/195: fsync d13/d15/f2a 0 2026-03-09T17:30:12.135 INFO:tasks.workunit.client.1.vm09.stdout:7/305: mknod da/d11/d2d/c6f 0 2026-03-09T17:30:12.142 INFO:tasks.workunit.client.1.vm09.stdout:2/196: mknod d13/d15/d36/c3c 0 2026-03-09T17:30:12.146 INFO:tasks.workunit.client.1.vm09.stdout:7/306: getdents da/d11/d41/d4e 0 2026-03-09T17:30:12.149 INFO:tasks.workunit.client.1.vm09.stdout:7/307: creat da/d11/d2d/f70 x:0 0 0 2026-03-09T17:30:12.153 INFO:tasks.workunit.client.1.vm09.stdout:7/308: dwrite da/f36 [0,4194304] 0 2026-03-09T17:30:12.155 INFO:tasks.workunit.client.1.vm09.stdout:7/309: rmdir da/d11 39 2026-03-09T17:30:12.162 INFO:tasks.workunit.client.1.vm09.stdout:7/310: readlink da/d11/d47/d5b/l5c 0 2026-03-09T17:30:12.165 INFO:tasks.workunit.client.1.vm09.stdout:7/311: read da/f27 [252599,76247] 0 2026-03-09T17:30:12.170 INFO:tasks.workunit.client.0.vm06.stdout:6/905: dwrite d6/d12/d53/d91/dbf/fc4 [0,4194304] 0 2026-03-09T17:30:12.173 INFO:tasks.workunit.client.1.vm09.stdout:7/312: link f3 da/d11/d2d/f71 0 2026-03-09T17:30:12.174 INFO:tasks.workunit.client.1.vm09.stdout:7/313: write da/d11/d2d/f45 [224511,106870] 0 2026-03-09T17:30:12.182 INFO:tasks.workunit.client.1.vm09.stdout:9/211: creat d5/f4f x:0 0 0 2026-03-09T17:30:12.185 INFO:tasks.workunit.client.1.vm09.stdout:9/212: fsync d5/de/f3c 0 2026-03-09T17:30:12.192 INFO:tasks.workunit.client.0.vm06.stdout:6/906: creat d6/d47/dd7/f11a x:0 0 0 2026-03-09T17:30:12.206 INFO:tasks.workunit.client.0.vm06.stdout:6/907: creat d6/d47/d4d/d9a/da2/f11b x:0 0 0 2026-03-09T17:30:12.209 INFO:tasks.workunit.client.1.vm09.stdout:9/213: unlink d5/de/c39 0 2026-03-09T17:30:12.210 INFO:tasks.workunit.client.1.vm09.stdout:7/314: dread da/d11/f3f [0,4194304] 0 2026-03-09T17:30:12.215 
INFO:tasks.workunit.client.1.vm09.stdout:9/214: sync 2026-03-09T17:30:12.215 INFO:tasks.workunit.client.1.vm09.stdout:9/215: fdatasync d5/de/d29/d33/f42 0 2026-03-09T17:30:12.216 INFO:tasks.workunit.client.1.vm09.stdout:9/216: write d5/f13 [2362406,3578] 0 2026-03-09T17:30:12.216 INFO:tasks.workunit.client.1.vm09.stdout:9/217: stat d5/c1f 0 2026-03-09T17:30:12.238 INFO:tasks.workunit.client.0.vm06.stdout:8/979: write d15/d39/d67/de3/fe5 [53754,92395] 0 2026-03-09T17:30:12.267 INFO:tasks.workunit.client.0.vm06.stdout:2/879: write d3/d4/d12/f85 [2144937,130351] 0 2026-03-09T17:30:12.276 INFO:tasks.workunit.client.1.vm09.stdout:9/218: read d5/d21/f2a [239944,75934] 0 2026-03-09T17:30:12.284 INFO:tasks.workunit.client.1.vm09.stdout:7/315: symlink da/d11/d2d/d56/d68/l72 0 2026-03-09T17:30:12.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.284+0000 7fcb0f47d700 1 -- 192.168.123.106:0/937042351 --> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcaec000bf0 con 0x7fcaf403df20 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.285+0000 7fcafe7fc700 1 -- 192.168.123.106:0/937042351 <== mgr.24477 v2:192.168.123.109:6828/111652423 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fcaec000bf0 con 0x7fcaf403df20 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:30:12.288 
INFO:teuthology.orchestra.run.vm06.stdout: "progress": "1/23 daemons upgraded", 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "message": "", 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:30:12.288 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 msgr2=0x7fcaf40403d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 0x7fcaf40403d0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fcb0400bef0 tx=0x7fcb0400d040 comp rx=0 tx=0).stop 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 msgr2=0x7fcb08082f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7fcb0000b280 tx=0x7fcb0000b360 comp rx=0 tx=0).stop 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 shutdown_connections 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 --2- 192.168.123.106:0/937042351 >> 
[v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fcaf403df20 0x7fcaf40403d0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb08071980 0x7fcb08082580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 --2- 192.168.123.106:0/937042351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb08082ac0 0x7fcb08082f30 unknown :-1 s=CLOSED pgs=325 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 >> 192.168.123.106:0/937042351 conn(0x7fcb0806d1a0 msgr2=0x7fcb080764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.288+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 shutdown_connections 2026-03-09T17:30:12.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.289+0000 7fcaf3fff700 1 -- 192.168.123.106:0/937042351 wait complete. 
2026-03-09T17:30:12.323 INFO:tasks.workunit.client.1.vm09.stdout:2/197: mknod d13/d15/d34/c3d 0 2026-03-09T17:30:12.328 INFO:tasks.workunit.client.0.vm06.stdout:1/971: dwrite d11/d14/d1d/dd1/de2/f126 [0,4194304] 0 2026-03-09T17:30:12.328 INFO:tasks.workunit.client.1.vm09.stdout:1/253: dwrite f6 [0,4194304] 0 2026-03-09T17:30:12.336 INFO:tasks.workunit.client.1.vm09.stdout:1/254: write d9/dc/dd/d40/d22/d37/f41 [663583,109231] 0 2026-03-09T17:30:12.365 INFO:tasks.workunit.client.1.vm09.stdout:7/316: creat da/d11/d47/d5b/d6c/f73 x:0 0 0 2026-03-09T17:30:12.366 INFO:tasks.workunit.client.1.vm09.stdout:7/317: chown da/f16 82 1 2026-03-09T17:30:12.393 INFO:tasks.workunit.client.1.vm09.stdout:1/255: fsync d9/f29 0 2026-03-09T17:30:12.401 INFO:tasks.workunit.client.1.vm09.stdout:7/318: creat da/d11/d41/d4e/f74 x:0 0 0 2026-03-09T17:30:12.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 -- 192.168.123.106:0/3123383232 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724072360 msgr2=0x7fc7240770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 --2- 192.168.123.106:0/3123383232 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724072360 0x7fc7240770e0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fc71c00cd40 tx=0x7fc71c00a320 comp rx=0 tx=0).stop 2026-03-09T17:30:12.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 -- 192.168.123.106:0/3123383232 shutdown_connections 2026-03-09T17:30:12.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 --2- 192.168.123.106:0/3123383232 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724072360 0x7fc7240770e0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.405 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 --2- 192.168.123.106:0/3123383232 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc724071980 0x7fc724071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 -- 192.168.123.106:0/3123383232 >> 192.168.123.106:0/3123383232 conn(0x7fc72406d1a0 msgr2=0x7fc72406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 -- 192.168.123.106:0/3123383232 shutdown_connections 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.402+0000 7fc72c066700 1 -- 192.168.123.106:0/3123383232 wait complete. 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 Processor -- start 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 -- start start 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724071980 0x7fc7241b6030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7241b6a70 con 0x7fc7241b6570 2026-03-09T17:30:12.406 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.403+0000 7fc72c066700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7241b6be0 con 0x7fc724071980 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.404+0000 7fc729601700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.404+0000 7fc729601700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:34912/0 (socket says 192.168.123.106:34912) 2026-03-09T17:30:12.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.404+0000 7fc729601700 1 -- 192.168.123.106:0/1761564276 learned_addr learned my addr 192.168.123.106:0/1761564276 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:12.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.405+0000 7fc729601700 1 -- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724071980 msgr2=0x7fc7241b6030 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.405+0000 7fc729601700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724071980 0x7fc7241b6030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.405+0000 7fc729601700 1 -- 192.168.123.106:0/1761564276 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc71c00c9f0 con 0x7fc7241b6570 2026-03-09T17:30:12.409 INFO:tasks.workunit.client.1.vm09.stdout:7/319: dwrite da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:12.410 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.408+0000 7fc729601700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7fc71c00bb40 tx=0x7fc71c00bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:12.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.409+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc71c00dea0 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.409+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc72407f950 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.409+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc72407fe70 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.410+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc71c009d70 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.410+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc71c00c340 con 0x7fc7241b6570 
2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.410+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 24) v1 ==== 50383+0+0 (secure 0 0 0) 0x7fc71c00c4a0 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.411+0000 7fc71affd700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 0x7fc7100403d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.411+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fc71c0555f0 con 0x7fc7241b6570 2026-03-09T17:30:12.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.411+0000 7fc729e02700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 0x7fc7100403d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:12.414 INFO:tasks.workunit.client.1.vm09.stdout:7/320: write da/d11/d2d/f59 [36183,11954] 0 2026-03-09T17:30:12.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.412+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc7241b0210 con 0x7fc7241b6570 2026-03-09T17:30:12.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.413+0000 7fc729e02700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 0x7fc7100403d0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fc720005950 
tx=0x7fc7200058e0 comp rx=0 tx=0).ready entity=mgr.24477 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:12.420 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.417+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fc71c024020 con 0x7fc7241b6570 2026-03-09T17:30:12.423 INFO:tasks.workunit.client.1.vm09.stdout:4/303: getdents d11/d1e/d29/d36/d57 0 2026-03-09T17:30:12.424 INFO:tasks.workunit.client.0.vm06.stdout:5/984: write d4/d22/d46/dec/f105 [1044262,46948] 0 2026-03-09T17:30:12.424 INFO:tasks.workunit.client.1.vm09.stdout:5/234: write d0/d9/f34 [1061552,120937] 0 2026-03-09T17:30:12.424 INFO:tasks.workunit.client.1.vm09.stdout:8/251: write d1/f33 [944983,108570] 0 2026-03-09T17:30:12.424 INFO:tasks.workunit.client.1.vm09.stdout:4/304: stat d11/d1e/d29/d36/f40 0 2026-03-09T17:30:12.425 INFO:tasks.workunit.client.1.vm09.stdout:4/305: write fe [2150544,125772] 0 2026-03-09T17:30:12.429 INFO:tasks.workunit.client.1.vm09.stdout:6/222: dwrite d3/d7/fe [4194304,4194304] 0 2026-03-09T17:30:12.431 INFO:tasks.workunit.client.1.vm09.stdout:3/229: write d5/d16/d31/d3d/fe [372372,21892] 0 2026-03-09T17:30:12.432 INFO:tasks.workunit.client.1.vm09.stdout:3/230: chown d5/d16/d31/d3d/d12/f19 421100 1 2026-03-09T17:30:12.432 INFO:tasks.workunit.client.1.vm09.stdout:9/219: getdents d5/de/d29/d33 0 2026-03-09T17:30:12.434 INFO:tasks.workunit.client.1.vm09.stdout:2/198: creat d13/d15/d21/f3e x:0 0 0 2026-03-09T17:30:12.436 INFO:tasks.workunit.client.1.vm09.stdout:9/220: truncate d5/de/d29/f35 4831370 0 2026-03-09T17:30:12.440 INFO:tasks.workunit.client.1.vm09.stdout:5/235: sync 2026-03-09T17:30:12.441 INFO:tasks.workunit.client.1.vm09.stdout:9/221: sync 2026-03-09T17:30:12.441 INFO:tasks.workunit.client.1.vm09.stdout:5/236: chown d0/c10 10 1 2026-03-09T17:30:12.441 
INFO:tasks.workunit.client.1.vm09.stdout:1/256: mkdir d9/dc/dd/d40/d1d/d4e 0 2026-03-09T17:30:12.441 INFO:tasks.workunit.client.1.vm09.stdout:9/222: sync 2026-03-09T17:30:12.442 INFO:tasks.workunit.client.1.vm09.stdout:5/237: dread - d0/d9/d16/d3c/f49 zero size 2026-03-09T17:30:12.442 INFO:tasks.workunit.client.1.vm09.stdout:1/257: sync 2026-03-09T17:30:12.442 INFO:tasks.workunit.client.1.vm09.stdout:9/223: dread - d5/d21/f38 zero size 2026-03-09T17:30:12.445 INFO:tasks.workunit.client.1.vm09.stdout:1/258: dread f2 [0,4194304] 0 2026-03-09T17:30:12.454 INFO:tasks.workunit.client.1.vm09.stdout:1/259: dread d9/dc/dd/d40/f1a [0,4194304] 0 2026-03-09T17:30:12.464 INFO:tasks.workunit.client.1.vm09.stdout:7/321: write da/d11/f3f [1866266,98832] 0 2026-03-09T17:30:12.465 INFO:tasks.workunit.client.1.vm09.stdout:7/322: chown da/d11/l28 3159 1 2026-03-09T17:30:12.482 INFO:tasks.workunit.client.0.vm06.stdout:2/880: creat d3/d4/d12/d71/daa/d77/d81/d64/f120 x:0 0 0 2026-03-09T17:30:12.494 INFO:tasks.workunit.client.0.vm06.stdout:5/985: chown d4/d22/ce6 20309 1 2026-03-09T17:30:12.506 INFO:tasks.workunit.client.0.vm06.stdout:6/908: link d6/d12/d17/l2a d6/d12/d17/dce/l11c 0 2026-03-09T17:30:12.512 INFO:tasks.workunit.client.1.vm09.stdout:3/231: creat d5/d16/d31/f44 x:0 0 0 2026-03-09T17:30:12.512 INFO:tasks.workunit.client.1.vm09.stdout:3/232: chown d5/d16/f17 7361965 1 2026-03-09T17:30:12.532 INFO:tasks.workunit.client.0.vm06.stdout:1/972: mkdir d11/d14/d1d/d42/d46/d92/dc0/d118/d148 0 2026-03-09T17:30:12.532 INFO:tasks.workunit.client.0.vm06.stdout:8/980: write d15/d16/d6d/fd8 [937930,87098] 0 2026-03-09T17:30:12.534 INFO:tasks.workunit.client.1.vm09.stdout:0/241: write d6/d1d/f1e [493946,43219] 0 2026-03-09T17:30:12.540 INFO:tasks.workunit.client.0.vm06.stdout:6/909: rmdir d6/d12/d2d/db3 39 2026-03-09T17:30:12.540 INFO:tasks.workunit.client.1.vm09.stdout:5/238: rename d0/dc/l48 to d0/dc/l4e 0 2026-03-09T17:30:12.542 INFO:tasks.workunit.client.1.vm09.stdout:1/260: creat 
d9/dc/dd/f4f x:0 0 0 2026-03-09T17:30:12.544 INFO:tasks.workunit.client.0.vm06.stdout:1/973: chown d11/d14/c2b 132375949 1 2026-03-09T17:30:12.547 INFO:tasks.workunit.client.1.vm09.stdout:8/252: fdatasync d1/d14/d2a/f54 0 2026-03-09T17:30:12.551 INFO:tasks.workunit.client.1.vm09.stdout:6/223: symlink d3/d7/l43 0 2026-03-09T17:30:12.551 INFO:tasks.workunit.client.0.vm06.stdout:5/986: symlink d4/d50/d35/d40/d95/l15f 0 2026-03-09T17:30:12.551 INFO:tasks.workunit.client.1.vm09.stdout:8/253: write d1/f33 [528273,86660] 0 2026-03-09T17:30:12.551 INFO:tasks.workunit.client.0.vm06.stdout:8/981: rmdir d15/d16/d1e/d30/db8/da4 39 2026-03-09T17:30:12.554 INFO:tasks.workunit.client.1.vm09.stdout:4/306: dwrite d11/f12 [0,4194304] 0 2026-03-09T17:30:12.556 INFO:tasks.workunit.client.1.vm09.stdout:4/307: write fe [5112226,123685] 0 2026-03-09T17:30:12.575 INFO:tasks.workunit.client.0.vm06.stdout:1/974: symlink d11/d14/d1d/d42/d46/d92/l149 0 2026-03-09T17:30:12.579 INFO:tasks.workunit.client.0.vm06.stdout:5/987: symlink d4/d50/db2/d125/l160 0 2026-03-09T17:30:12.585 INFO:tasks.workunit.client.0.vm06.stdout:2/881: getdents d3/d4/d12/d71/daa 0 2026-03-09T17:30:12.585 INFO:tasks.workunit.client.0.vm06.stdout:8/982: dread d15/d31/dc5/df1/d71/f82 [0,4194304] 0 2026-03-09T17:30:12.589 INFO:tasks.workunit.client.0.vm06.stdout:2/882: chown d3/d4/d46/fd8 47 1 2026-03-09T17:30:12.596 INFO:tasks.workunit.client.1.vm09.stdout:2/199: rename d13/f1a to d13/d15/d3b/f3f 0 2026-03-09T17:30:12.598 INFO:tasks.workunit.client.0.vm06.stdout:5/988: read d4/d50/d18/de1/d141/fe9 [65270,9177] 0 2026-03-09T17:30:12.603 INFO:tasks.workunit.client.0.vm06.stdout:6/910: creat d6/d12/d53/f11d x:0 0 0 2026-03-09T17:30:12.603 INFO:tasks.workunit.client.0.vm06.stdout:6/911: stat d6/f97 0 2026-03-09T17:30:12.606 INFO:tasks.workunit.client.0.vm06.stdout:2/883: mkdir d3/d4/d12/d2b/db0/dc1/d121 0 2026-03-09T17:30:12.606 INFO:tasks.workunit.client.0.vm06.stdout:1/975: dwrite d11/de0/f10c [0,4194304] 0 
2026-03-09T17:30:12.621 INFO:tasks.workunit.client.1.vm09.stdout:9/224: mknod d5/de/c50 0 2026-03-09T17:30:12.625 INFO:tasks.workunit.client.1.vm09.stdout:9/225: fsync d5/d21/f2f 0 2026-03-09T17:30:12.626 INFO:tasks.workunit.client.1.vm09.stdout:1/261: creat d9/dc/dd/d40/d22/f50 x:0 0 0 2026-03-09T17:30:12.641 INFO:tasks.workunit.client.0.vm06.stdout:8/983: symlink d15/d39/d67/d86/ddd/d13e/l140 0 2026-03-09T17:30:12.642 INFO:tasks.workunit.client.0.vm06.stdout:1/976: creat d11/d14/d1d/d42/d46/f14a x:0 0 0 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='client.14692 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/1746334570' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: pgmap v6: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/1972808134' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: Standby manager daemon vm06.pbgzei started 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.? 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/crt"}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/key"}]: dispatch 2026-03-09T17:30:12.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:12 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:12.649 INFO:tasks.workunit.client.1.vm09.stdout:8/254: mknod d1/d14/d2a/d42/d43/c57 0 2026-03-09T17:30:12.649 INFO:tasks.workunit.client.1.vm09.stdout:8/255: write d1/da/f4b [1622446,55013] 0 2026-03-09T17:30:12.657 INFO:tasks.workunit.client.0.vm06.stdout:8/984: rmdir d15/d31/dc5/df1/d2b/d85 39 2026-03-09T17:30:12.674 INFO:tasks.workunit.client.1.vm09.stdout:4/308: dread d11/d1e/f3c [0,4194304] 0 2026-03-09T17:30:12.677 INFO:tasks.workunit.client.0.vm06.stdout:1/977: mknod d11/d14/d1d/d4a/df7/d106/d112/d114/c14b 0 2026-03-09T17:30:12.679 INFO:tasks.workunit.client.1.vm09.stdout:3/233: link d5/d16/d31/f34 d5/d16/f45 0 2026-03-09T17:30:12.681 INFO:tasks.workunit.client.0.vm06.stdout:1/978: readlink d11/d14/d1d/d42/d46/d92/dc0/lb6 0 2026-03-09T17:30:12.681 INFO:tasks.workunit.client.0.vm06.stdout:2/884: dread d3/d4/d12/f42 [0,4194304] 0 2026-03-09T17:30:12.685 INFO:tasks.workunit.client.0.vm06.stdout:8/985: rmdir d15/d31/dc5/df1/d71 39 2026-03-09T17:30:12.685 INFO:tasks.workunit.client.0.vm06.stdout:5/989: dwrite d4/d52/db4/dc2/f150 [0,4194304] 0 2026-03-09T17:30:12.686 
INFO:tasks.workunit.client.0.vm06.stdout:2/885: write d3/d4/d12/d71/daa/d77/d81/d64/f120 [186725,119723] 0 2026-03-09T17:30:12.692 INFO:tasks.workunit.client.1.vm09.stdout:0/242: getdents d6/d1d/d39/d4c 0 2026-03-09T17:30:12.693 INFO:tasks.workunit.client.1.vm09.stdout:5/239: creat d0/d46/d4b/f4f x:0 0 0 2026-03-09T17:30:12.693 INFO:tasks.workunit.client.0.vm06.stdout:6/912: getdents d6/d4f/d3e/d52/d8c/d117 0 2026-03-09T17:30:12.693 INFO:tasks.workunit.client.1.vm09.stdout:0/243: stat d6/d1d/d24/c2c 0 2026-03-09T17:30:12.698 INFO:tasks.workunit.client.0.vm06.stdout:6/913: fsync d6/d12/d53/dd0/f106 0 2026-03-09T17:30:12.703 INFO:tasks.workunit.client.1.vm09.stdout:0/244: truncate d6/d1d/d46/f4d 201415 0 2026-03-09T17:30:12.705 INFO:tasks.workunit.client.1.vm09.stdout:2/200: dwrite d13/f26 [0,4194304] 0 2026-03-09T17:30:12.709 INFO:tasks.workunit.client.1.vm09.stdout:3/234: dread d5/d16/d31/d3d/d12/f3e [0,4194304] 0 2026-03-09T17:30:12.720 INFO:tasks.workunit.client.0.vm06.stdout:5/990: creat d4/d50/d18/de1/d141/f161 x:0 0 0 2026-03-09T17:30:12.722 INFO:tasks.workunit.client.0.vm06.stdout:8/986: dread d15/d39/d67/de3/f12d [4194304,4194304] 0 2026-03-09T17:30:12.725 INFO:tasks.workunit.client.1.vm09.stdout:0/245: dwrite d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:12.727 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:30:12.727 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.720+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fc72404ea50 con 0x7fc7241b6570 2026-03-09T17:30:12.727 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.722+0000 7fc71affd700 1 -- 192.168.123.106:0/1761564276 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fc71c044090 con 0x7fc7241b6570 2026-03-09T17:30:12.727 
INFO:tasks.workunit.client.0.vm06.stdout:2/886: mkdir d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/d122 0 2026-03-09T17:30:12.728 INFO:tasks.workunit.client.0.vm06.stdout:2/887: readlink d3/d4/d12/d2b/db0/dc1/lfe 0 2026-03-09T17:30:12.730 INFO:tasks.workunit.client.1.vm09.stdout:7/323: link da/c43 da/d11/d41/d4e/d5f/c75 0 2026-03-09T17:30:12.730 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.727+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 msgr2=0x7fc7100403d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.730 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.727+0000 7fc72c066700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 0x7fc7100403d0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fc720005950 tx=0x7fc7200058e0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.730 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.727+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 msgr2=0x7fc72407f410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:12.730 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.727+0000 7fc72c066700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7fc71c00bb40 tx=0x7fc71c00bc20 comp rx=0 tx=0).stop 2026-03-09T17:30:12.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.730+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 shutdown_connections 2026-03-09T17:30:12.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.730+0000 7fc72c066700 1 --2- 192.168.123.106:0/1761564276 >> 
[v2:192.168.123.109:6828/111652423,v1:192.168.123.109:6829/111652423] conn(0x7fc71003df20 0x7fc7100403d0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.731+0000 7fc72c066700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc724071980 0x7fc7241b6030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.731+0000 7fc72c066700 1 --2- 192.168.123.106:0/1761564276 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7241b6570 0x7fc72407f410 unknown :-1 s=CLOSED pgs=326 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:12.733 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.731+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 >> 192.168.123.106:0/1761564276 conn(0x7fc72406d1a0 msgr2=0x7fc7240763d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:12.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.733+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 shutdown_connections 2026-03-09T17:30:12.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:12.733+0000 7fc72c066700 1 -- 192.168.123.106:0/1761564276 wait complete. 
2026-03-09T17:30:12.739 INFO:tasks.workunit.client.1.vm09.stdout:6/224: mknod d3/d21/d25/c44 0 2026-03-09T17:30:12.742 INFO:tasks.workunit.client.0.vm06.stdout:5/991: mkdir d4/da4/dcf/d162 0 2026-03-09T17:30:12.742 INFO:tasks.workunit.client.0.vm06.stdout:1/979: rename d11/d14/d1d/d1e/d2a/c98 to d11/d14/d1d/d1e/d2a/c14c 0 2026-03-09T17:30:12.749 INFO:tasks.workunit.client.1.vm09.stdout:8/256: dwrite d1/da/d13/f36 [0,4194304] 0 2026-03-09T17:30:12.753 INFO:tasks.workunit.client.0.vm06.stdout:6/914: getdents d6/d4f/d3e/d52/d8c/d117/d10c 0 2026-03-09T17:30:12.761 INFO:tasks.workunit.client.1.vm09.stdout:4/309: rename d11/l17 to d11/d1e/d45/l63 0 2026-03-09T17:30:12.773 INFO:tasks.workunit.client.1.vm09.stdout:5/240: creat d0/de/f50 x:0 0 0 2026-03-09T17:30:12.785 INFO:tasks.workunit.client.0.vm06.stdout:8/987: link d15/d16/d1e/d30/c4a d15/d39/d67/d86/c141 0 2026-03-09T17:30:12.792 INFO:tasks.workunit.client.0.vm06.stdout:5/992: creat d4/d50/d18/f163 x:0 0 0 2026-03-09T17:30:12.793 INFO:tasks.workunit.client.0.vm06.stdout:5/993: stat d4/d22/d46/fd9 0 2026-03-09T17:30:12.797 INFO:tasks.workunit.client.1.vm09.stdout:1/262: symlink d9/dc/dd/d40/d1d/d4e/l51 0 2026-03-09T17:30:12.799 INFO:tasks.workunit.client.1.vm09.stdout:1/263: write d9/dc/dd/d40/d22/d37/d3f/d42/f45 [923962,24727] 0 2026-03-09T17:30:12.803 INFO:tasks.workunit.client.0.vm06.stdout:6/915: write d6/d4f/d3e/d52/f84 [1176189,73075] 0 2026-03-09T17:30:12.806 INFO:tasks.workunit.client.0.vm06.stdout:2/888: dwrite d3/d4/d22/d72/f54 [0,4194304] 0 2026-03-09T17:30:12.808 INFO:tasks.workunit.client.0.vm06.stdout:2/889: chown d3/d4/d46/fc6 5677 1 2026-03-09T17:30:12.811 INFO:tasks.workunit.client.1.vm09.stdout:6/225: symlink d3/d21/l45 0 2026-03-09T17:30:12.812 INFO:tasks.workunit.client.1.vm09.stdout:6/226: write d3/d7/f23 [2152326,74963] 0 2026-03-09T17:30:12.813 INFO:tasks.workunit.client.0.vm06.stdout:8/988: dread d15/d16/d1e/f64 [0,4194304] 0 2026-03-09T17:30:12.817 
INFO:tasks.workunit.client.0.vm06.stdout:6/916: rmdir d6/d12/d17/d65/d113 39 2026-03-09T17:30:12.818 INFO:tasks.workunit.client.1.vm09.stdout:6/227: read - d3/f41 zero size 2026-03-09T17:30:12.820 INFO:tasks.workunit.client.1.vm09.stdout:6/228: dread - d3/d1e/d30/d3f/f42 zero size 2026-03-09T17:30:12.822 INFO:tasks.workunit.client.1.vm09.stdout:3/235: dwrite f3 [0,4194304] 0 2026-03-09T17:30:12.823 INFO:tasks.workunit.client.1.vm09.stdout:4/310: fdatasync d11/f25 0 2026-03-09T17:30:12.826 INFO:tasks.workunit.client.1.vm09.stdout:4/311: truncate d11/d1e/f61 101382 0 2026-03-09T17:30:12.826 INFO:tasks.workunit.client.0.vm06.stdout:5/994: mkdir d4/d50/d35/d40/d96/dfe/d164 0 2026-03-09T17:30:12.829 INFO:tasks.workunit.client.1.vm09.stdout:5/241: mknod d0/dc/d21/d33/c51 0 2026-03-09T17:30:12.830 INFO:tasks.workunit.client.1.vm09.stdout:5/242: write d0/dc/d21/f29 [2057693,3051] 0 2026-03-09T17:30:12.832 INFO:tasks.workunit.client.1.vm09.stdout:4/312: dwrite d11/d1e/d30/f5f [0,4194304] 0 2026-03-09T17:30:12.848 INFO:tasks.workunit.client.0.vm06.stdout:8/989: creat d15/d31/dc5/f142 x:0 0 0 2026-03-09T17:30:12.848 INFO:tasks.workunit.client.0.vm06.stdout:8/990: stat d15/d16/d1a/d7c 0 2026-03-09T17:30:12.852 INFO:tasks.workunit.client.0.vm06.stdout:1/980: dwrite d11/d14/d1d/d42/d46/d92/dc0/d57/d7b/fef [0,4194304] 0 2026-03-09T17:30:12.860 INFO:tasks.workunit.client.0.vm06.stdout:2/890: dwrite d3/d4/d12/d71/daa/d77/fd9 [0,4194304] 0 2026-03-09T17:30:12.860 INFO:tasks.workunit.client.0.vm06.stdout:6/917: mkdir d6/d47/d4d/d9a/d11e 0 2026-03-09T17:30:12.861 INFO:tasks.workunit.client.1.vm09.stdout:9/226: mkdir d5/de/d29/d51 0 2026-03-09T17:30:12.863 INFO:tasks.workunit.client.1.vm09.stdout:7/324: mkdir da/d76 0 2026-03-09T17:30:12.869 INFO:tasks.workunit.client.0.vm06.stdout:1/981: read d11/d14/d1d/d1e/d2a/d34/d64/df6/ffc [502162,82768] 0 2026-03-09T17:30:12.872 INFO:tasks.workunit.client.0.vm06.stdout:2/891: dread - d3/d4/d12/da7/db3/f103 zero size 2026-03-09T17:30:12.882 
INFO:tasks.workunit.client.1.vm09.stdout:6/229: write d3/f2e [3722568,62381] 0 2026-03-09T17:30:12.882 INFO:tasks.workunit.client.0.vm06.stdout:6/918: creat d6/d12/d53/d91/dcb/f11f x:0 0 0 2026-03-09T17:30:12.882 INFO:tasks.workunit.client.0.vm06.stdout:5/995: mkdir d4/da4/dcf/d121/d165 0 2026-03-09T17:30:12.883 INFO:tasks.workunit.client.0.vm06.stdout:8/991: unlink d15/d16/d1a/d47/f7e 0 2026-03-09T17:30:12.886 INFO:tasks.workunit.client.1.vm09.stdout:7/325: sync 2026-03-09T17:30:12.887 INFO:tasks.workunit.client.1.vm09.stdout:7/326: write da/d11/d47/f6e [27190,64711] 0 2026-03-09T17:30:12.890 INFO:tasks.workunit.client.1.vm09.stdout:5/243: rename d0/d2/d15 to d0/d52 0 2026-03-09T17:30:12.893 INFO:tasks.workunit.client.0.vm06.stdout:1/982: read d11/d14/d1d/d42/d46/faa [624158,50093] 0 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='client.14692 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/1746334570' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: pgmap v6: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/1972808134' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: Standby manager daemon vm06.pbgzei started 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/crt"}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/key"}]: dispatch 2026-03-09T17:30:12.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:12 vm09.local ceph-mon[62061]: from='mgr.? 
192.168.123.106:0/2' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:12.896 INFO:tasks.workunit.client.0.vm06.stdout:1/983: stat d11/d14/d1d/d4a/df7/f129 0 2026-03-09T17:30:12.896 INFO:tasks.workunit.client.1.vm09.stdout:4/313: creat d11/d1e/d45/d60/f64 x:0 0 0 2026-03-09T17:30:12.896 INFO:tasks.workunit.client.1.vm09.stdout:4/314: write d11/d1e/f61 [1137636,74417] 0 2026-03-09T17:30:12.898 INFO:tasks.workunit.client.1.vm09.stdout:2/201: creat d13/f40 x:0 0 0 2026-03-09T17:30:12.899 INFO:tasks.workunit.client.0.vm06.stdout:5/996: mkdir d4/d50/d35/d166 0 2026-03-09T17:30:12.899 INFO:tasks.workunit.client.0.vm06.stdout:5/997: write d4/d50/d35/d40/d6f/fd8 [3053448,93840] 0 2026-03-09T17:30:12.900 INFO:tasks.workunit.client.0.vm06.stdout:2/892: dread d3/d4/d12/d71/daa/d77/d81/d64/d6a/f96 [0,4194304] 0 2026-03-09T17:30:12.903 INFO:tasks.workunit.client.0.vm06.stdout:2/893: dread d3/d4/d12/d2b/d36/fb9 [4194304,4194304] 0 2026-03-09T17:30:12.908 INFO:tasks.workunit.client.0.vm06.stdout:8/992: write d15/d31/dc5/df1/d71/f133 [879428,78600] 0 2026-03-09T17:30:12.918 INFO:tasks.workunit.client.1.vm09.stdout:3/236: truncate d5/d16/d31/f34 690256 0 2026-03-09T17:30:12.918 INFO:tasks.workunit.client.1.vm09.stdout:3/237: write d5/d16/d31/d3d/d32/f33 [3511698,84711] 0 2026-03-09T17:30:12.925 INFO:tasks.workunit.client.1.vm09.stdout:3/238: dread d5/d16/d25/f2b [0,4194304] 0 2026-03-09T17:30:12.931 INFO:tasks.workunit.client.1.vm09.stdout:8/257: write d1/f16 [680420,66964] 0 2026-03-09T17:30:12.932 INFO:tasks.workunit.client.1.vm09.stdout:6/230: rename d3 to d3/d7/d46 22 2026-03-09T17:30:12.932 INFO:tasks.workunit.client.0.vm06.stdout:2/894: stat d3/d4/d12/d2b/d9f/la2 0 2026-03-09T17:30:12.936 INFO:tasks.workunit.client.0.vm06.stdout:1/984: fdatasync d11/d14/d1d/d4a/f12f 0 2026-03-09T17:30:12.943 INFO:tasks.workunit.client.0.vm06.stdout:5/998: write d4/d52/f5b [859631,32705] 0 2026-03-09T17:30:12.944 
INFO:tasks.workunit.client.0.vm06.stdout:5/999: chown d4/d50/d35/d40/d6f 181 1 2026-03-09T17:30:12.953 INFO:tasks.workunit.client.1.vm09.stdout:4/315: dwrite d11/d1e/f3c [0,4194304] 0 2026-03-09T17:30:12.955 INFO:tasks.workunit.client.1.vm09.stdout:0/246: getdents d6/d1d 0 2026-03-09T17:30:12.955 INFO:tasks.workunit.client.1.vm09.stdout:1/264: link d9/c4c d9/dc/dd/d40/d22/d37/c52 0 2026-03-09T17:30:12.956 INFO:tasks.workunit.client.1.vm09.stdout:1/265: readlink d9/dc/dd/l2c 0 2026-03-09T17:30:12.958 INFO:tasks.workunit.client.1.vm09.stdout:4/316: dread d11/f25 [0,4194304] 0 2026-03-09T17:30:12.960 INFO:tasks.workunit.client.0.vm06.stdout:2/895: creat d3/d4/d12/d2b/d36/d37/f123 x:0 0 0 2026-03-09T17:30:12.960 INFO:tasks.workunit.client.0.vm06.stdout:1/985: truncate d11/d14/d1d/d42/f70 125559 0 2026-03-09T17:30:12.962 INFO:tasks.workunit.client.0.vm06.stdout:2/896: truncate d3/d4/d46/da5/f10f 198222 0 2026-03-09T17:30:12.962 INFO:tasks.workunit.client.1.vm09.stdout:4/317: write d11/d1e/d45/d60/f64 [1001027,46414] 0 2026-03-09T17:30:12.964 INFO:tasks.workunit.client.1.vm09.stdout:1/266: dread d9/f29 [0,4194304] 0 2026-03-09T17:30:12.964 INFO:tasks.workunit.client.0.vm06.stdout:1/986: dwrite d11/d14/d1d/d42/d46/d92/dc0/d57/de4/f139 [0,4194304] 0 2026-03-09T17:30:12.971 INFO:tasks.workunit.client.1.vm09.stdout:4/318: dwrite d11/f15 [0,4194304] 0 2026-03-09T17:30:12.978 INFO:tasks.workunit.client.1.vm09.stdout:7/327: mkdir da/d11/d77 0 2026-03-09T17:30:12.983 INFO:tasks.workunit.client.1.vm09.stdout:8/258: unlink d1/da/dd/l24 0 2026-03-09T17:30:12.984 INFO:tasks.workunit.client.0.vm06.stdout:6/919: link d6/d47/d96/d40/l94 d6/d12/d2d/l120 0 2026-03-09T17:30:12.993 INFO:tasks.workunit.client.1.vm09.stdout:7/328: read da/d11/d41/d4e/f2b [3466528,14771] 0 2026-03-09T17:30:12.994 INFO:tasks.workunit.client.1.vm09.stdout:6/231: readlink d3/l15 0 2026-03-09T17:30:13.004 INFO:tasks.workunit.client.0.vm06.stdout:1/987: mkdir d11/d14/d1d/dd1/de2/d14d 0 2026-03-09T17:30:13.005 
INFO:tasks.workunit.client.0.vm06.stdout:8/993: link d15/d39/d67/d77/d99/c10f d15/d39/d67/d77/d99/d11e/c143 0 2026-03-09T17:30:13.006 INFO:tasks.workunit.client.1.vm09.stdout:0/247: fdatasync d6/f27 0 2026-03-09T17:30:13.008 INFO:tasks.workunit.client.0.vm06.stdout:6/920: mknod d6/d12/d17/c121 0 2026-03-09T17:30:13.008 INFO:tasks.workunit.client.1.vm09.stdout:5/244: dread d0/d2/f2a [4194304,4194304] 0 2026-03-09T17:30:13.008 INFO:tasks.workunit.client.0.vm06.stdout:6/921: chown d6/d4f/d3e/d52/d8c/db0/lfd 40253 1 2026-03-09T17:30:13.009 INFO:tasks.workunit.client.0.vm06.stdout:6/922: stat d6/d47/d96/l9d 0 2026-03-09T17:30:13.009 INFO:tasks.workunit.client.0.vm06.stdout:8/994: write d15/d31/dc5/df1/d71/ffe [5499019,37170] 0 2026-03-09T17:30:13.010 INFO:tasks.workunit.client.1.vm09.stdout:4/319: creat d11/d1e/d31/f65 x:0 0 0 2026-03-09T17:30:13.011 INFO:tasks.workunit.client.0.vm06.stdout:6/923: fsync d6/d12/d17/d85/f9c 0 2026-03-09T17:30:13.012 INFO:tasks.workunit.client.0.vm06.stdout:8/995: mknod d15/d31/de2/c144 0 2026-03-09T17:30:13.014 INFO:tasks.workunit.client.0.vm06.stdout:8/996: dread d15/d39/d67/de3/fe5 [0,4194304] 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.0.vm06.stdout:8/997: fsync d15/d39/d67/d77/d99/f125 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/329: dread da/d11/d41/d4e/f42 [0,4194304] 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/330: read da/d11/d2d/d56/f50 [1158792,84186] 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:6/232: symlink d3/d1e/l47 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/331: read da/d11/d41/f30 [805188,12764] 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/332: chown da/d11/d41/d4e/f63 2038750 1 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:0/248: creat d6/d1d/d24/f4e x:0 0 0 2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/333: chown da/d11/d41/f30 1898282 1 
2026-03-09T17:30:13.020 INFO:tasks.workunit.client.1.vm09.stdout:7/334: truncate da/fb 1894790 0 2026-03-09T17:30:13.021 INFO:tasks.workunit.client.1.vm09.stdout:5/245: write d0/d9/f3e [489799,109548] 0 2026-03-09T17:30:13.022 INFO:tasks.workunit.client.1.vm09.stdout:9/227: creat d5/de/d29/f52 x:0 0 0 2026-03-09T17:30:13.025 INFO:tasks.workunit.client.1.vm09.stdout:2/202: mknod d13/d15/d34/d37/c41 0 2026-03-09T17:30:13.025 INFO:tasks.workunit.client.1.vm09.stdout:4/320: sync 2026-03-09T17:30:13.025 INFO:tasks.workunit.client.1.vm09.stdout:5/246: sync 2026-03-09T17:30:13.026 INFO:tasks.workunit.client.1.vm09.stdout:4/321: chown d11/d1e/d30/f34 1533293 1 2026-03-09T17:30:13.026 INFO:tasks.workunit.client.1.vm09.stdout:5/247: chown d0/d9/d16/d3c/d42 5 1 2026-03-09T17:30:13.026 INFO:tasks.workunit.client.1.vm09.stdout:7/335: mkdir da/d11/d47/d5b/d78 0 2026-03-09T17:30:13.027 INFO:tasks.workunit.client.1.vm09.stdout:7/336: chown da/d11/d2d/l4d 600 1 2026-03-09T17:30:13.029 INFO:tasks.workunit.client.1.vm09.stdout:7/337: chown da/d11/d2d/f69 7661 1 2026-03-09T17:30:13.029 INFO:tasks.workunit.client.1.vm09.stdout:0/249: dread d6/d1d/d39/f44 [0,4194304] 0 2026-03-09T17:30:13.030 INFO:tasks.workunit.client.1.vm09.stdout:8/259: truncate d1/da/dd/f22 3077003 0 2026-03-09T17:30:13.030 INFO:tasks.workunit.client.1.vm09.stdout:6/233: mkdir d3/d48 0 2026-03-09T17:30:13.030 INFO:tasks.workunit.client.1.vm09.stdout:8/260: write d1/da/f4b [2259547,55069] 0 2026-03-09T17:30:13.032 INFO:tasks.workunit.client.1.vm09.stdout:4/322: mknod d11/d1e/d45/d60/c66 0 2026-03-09T17:30:13.032 INFO:tasks.workunit.client.1.vm09.stdout:1/267: link d9/dc/dd/d40/d21/d35/l3e d9/d38/l53 0 2026-03-09T17:30:13.033 INFO:tasks.workunit.client.1.vm09.stdout:5/248: write d0/dc/d21/d33/f35 [2343221,25332] 0 2026-03-09T17:30:13.037 INFO:tasks.workunit.client.1.vm09.stdout:0/250: dread d6/f21 [4194304,4194304] 0 2026-03-09T17:30:13.039 INFO:tasks.workunit.client.1.vm09.stdout:4/323: dread d11/d1e/d30/f5f 
[0,4194304] 0 2026-03-09T17:30:13.040 INFO:tasks.workunit.client.1.vm09.stdout:1/268: symlink d9/dc/dd/d40/d21/d35/l54 0 2026-03-09T17:30:13.040 INFO:tasks.workunit.client.1.vm09.stdout:8/261: creat d1/d14/d2a/d42/d43/f58 x:0 0 0 2026-03-09T17:30:13.042 INFO:tasks.workunit.client.1.vm09.stdout:7/338: creat da/d11/d77/f79 x:0 0 0 2026-03-09T17:30:13.043 INFO:tasks.workunit.client.1.vm09.stdout:2/203: getdents d13 0 2026-03-09T17:30:13.044 INFO:tasks.workunit.client.1.vm09.stdout:0/251: rmdir d6/d1d/d39/d4c 0 2026-03-09T17:30:13.045 INFO:tasks.workunit.client.1.vm09.stdout:2/204: read fb [106975,75221] 0 2026-03-09T17:30:13.046 INFO:tasks.workunit.client.1.vm09.stdout:5/249: truncate d0/f22 693379 0 2026-03-09T17:30:13.047 INFO:tasks.workunit.client.1.vm09.stdout:4/324: link d11/f12 d11/d1e/d29/d36/d57/f67 0 2026-03-09T17:30:13.048 INFO:tasks.workunit.client.1.vm09.stdout:1/269: dwrite f6 [0,4194304] 0 2026-03-09T17:30:13.048 INFO:tasks.workunit.client.1.vm09.stdout:2/205: fsync d13/d15/f18 0 2026-03-09T17:30:13.049 INFO:tasks.workunit.client.1.vm09.stdout:9/228: dread d5/f34 [0,4194304] 0 2026-03-09T17:30:13.050 INFO:tasks.workunit.client.1.vm09.stdout:8/262: getdents d1/d14/d2a/d49 0 2026-03-09T17:30:13.051 INFO:tasks.workunit.client.1.vm09.stdout:8/263: chown d1/da/dd/d3f/d32/f50 111711 1 2026-03-09T17:30:13.052 INFO:tasks.workunit.client.1.vm09.stdout:9/229: write d5/de/d29/f52 [192200,20538] 0 2026-03-09T17:30:13.054 INFO:tasks.workunit.client.1.vm09.stdout:4/325: stat f3 0 2026-03-09T17:30:13.055 INFO:tasks.workunit.client.1.vm09.stdout:2/206: mknod d13/d15/d36/c42 0 2026-03-09T17:30:13.056 INFO:tasks.workunit.client.1.vm09.stdout:1/270: rmdir d9/dc/dd/d40/d1d 39 2026-03-09T17:30:13.056 INFO:tasks.workunit.client.1.vm09.stdout:2/207: read - d13/d15/f2f zero size 2026-03-09T17:30:13.057 INFO:tasks.workunit.client.1.vm09.stdout:1/271: stat d9/dc/dd/d40/d22/f50 0 2026-03-09T17:30:13.058 INFO:tasks.workunit.client.1.vm09.stdout:5/250: creat d0/f53 x:0 0 0 
2026-03-09T17:30:13.059 INFO:tasks.workunit.client.1.vm09.stdout:4/326: mknod d11/d1e/d29/c68 0 2026-03-09T17:30:13.060 INFO:tasks.workunit.client.1.vm09.stdout:2/208: mkdir d13/d15/d3b/d43 0 2026-03-09T17:30:13.060 INFO:tasks.workunit.client.1.vm09.stdout:8/264: truncate d1/f7 5277746 0 2026-03-09T17:30:13.061 INFO:tasks.workunit.client.1.vm09.stdout:2/209: dread - d13/f40 zero size 2026-03-09T17:30:13.061 INFO:tasks.workunit.client.1.vm09.stdout:8/265: dread - d1/da/dd/d3f/d32/f50 zero size 2026-03-09T17:30:13.062 INFO:tasks.workunit.client.1.vm09.stdout:1/272: chown c4 706011 1 2026-03-09T17:30:13.068 INFO:tasks.workunit.client.1.vm09.stdout:9/230: dwrite d5/de/d29/f37 [0,4194304] 0 2026-03-09T17:30:13.068 INFO:tasks.workunit.client.1.vm09.stdout:0/252: truncate d6/f27 1229178 0 2026-03-09T17:30:13.071 INFO:tasks.workunit.client.1.vm09.stdout:9/231: chown d5/d2e/c3a 40 1 2026-03-09T17:30:13.076 INFO:tasks.workunit.client.1.vm09.stdout:1/273: write d9/dc/dd/d40/d1d/f1e [496562,78262] 0 2026-03-09T17:30:13.078 INFO:tasks.workunit.client.1.vm09.stdout:5/251: symlink d0/dc/d21/d26/l54 0 2026-03-09T17:30:13.079 INFO:tasks.workunit.client.1.vm09.stdout:8/266: mknod d1/da/c59 0 2026-03-09T17:30:13.079 INFO:tasks.workunit.client.1.vm09.stdout:1/274: read f3 [5087824,124575] 0 2026-03-09T17:30:13.080 INFO:tasks.workunit.client.0.vm06.stdout:8/998: sync 2026-03-09T17:30:13.082 INFO:tasks.workunit.client.1.vm09.stdout:9/232: dwrite f2 [0,4194304] 0 2026-03-09T17:30:13.083 INFO:tasks.workunit.client.1.vm09.stdout:0/253: dwrite d6/d1d/f41 [0,4194304] 0 2026-03-09T17:30:13.087 INFO:tasks.workunit.client.1.vm09.stdout:9/233: write d5/de/f3c [3817307,120505] 0 2026-03-09T17:30:13.088 INFO:tasks.workunit.client.1.vm09.stdout:9/234: truncate d5/d21/f46 761496 0 2026-03-09T17:30:13.088 INFO:tasks.workunit.client.1.vm09.stdout:9/235: chown d5/d21/f2b 95256853 1 2026-03-09T17:30:13.094 INFO:tasks.workunit.client.1.vm09.stdout:4/327: dread d11/d1e/f22 [0,4194304] 0 
2026-03-09T17:30:13.102 INFO:tasks.workunit.client.0.vm06.stdout:2/897: dwrite d3/d4/d12/d71/daa/f5f [0,4194304] 0 2026-03-09T17:30:13.102 INFO:tasks.workunit.client.1.vm09.stdout:4/328: write fe [4801951,123409] 0 2026-03-09T17:30:13.102 INFO:tasks.workunit.client.1.vm09.stdout:4/329: read d11/d1e/f61 [495432,8171] 0 2026-03-09T17:30:13.105 INFO:tasks.workunit.client.1.vm09.stdout:0/254: dread d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:13.107 INFO:tasks.workunit.client.1.vm09.stdout:0/255: fdatasync d6/d1d/f1e 0 2026-03-09T17:30:13.112 INFO:tasks.workunit.client.0.vm06.stdout:8/999: creat d15/d13f/f145 x:0 0 0 2026-03-09T17:30:13.114 INFO:tasks.workunit.client.0.vm06.stdout:1/988: write d11/d14/d1d/d42/dff/f107 [1741380,73227] 0 2026-03-09T17:30:13.118 INFO:tasks.workunit.client.1.vm09.stdout:3/239: truncate d5/d16/d31/d3d/d12/f15 5406696 0 2026-03-09T17:30:13.118 INFO:tasks.workunit.client.0.vm06.stdout:2/898: fsync d3/d4/d22/f28 0 2026-03-09T17:30:13.119 INFO:tasks.workunit.client.1.vm09.stdout:3/240: write d5/d16/d25/f2c [171698,126373] 0 2026-03-09T17:30:13.126 INFO:tasks.workunit.client.1.vm09.stdout:8/267: mknod d1/d14/d31/c5a 0 2026-03-09T17:30:13.133 INFO:tasks.workunit.client.1.vm09.stdout:1/275: rename d9/dc/dd/d40/d1d/d4e to d9/dc/dd/d40/d22/d37/d3f/d42/d55 0 2026-03-09T17:30:13.138 INFO:tasks.workunit.client.0.vm06.stdout:1/989: creat d11/d14/d1d/d1e/dd6/f14e x:0 0 0 2026-03-09T17:30:13.138 INFO:tasks.workunit.client.1.vm09.stdout:1/276: dwrite d9/f34 [4194304,4194304] 0 2026-03-09T17:30:13.140 INFO:tasks.workunit.client.0.vm06.stdout:2/899: symlink d3/d4/d22/d72/l124 0 2026-03-09T17:30:13.142 INFO:tasks.workunit.client.0.vm06.stdout:2/900: truncate d3/d4/d12/d71/daa/f11e 301844 0 2026-03-09T17:30:13.144 INFO:tasks.workunit.client.0.vm06.stdout:2/901: chown d3/d4/d12/d2b/la4 80 1 2026-03-09T17:30:13.146 INFO:tasks.workunit.client.0.vm06.stdout:6/924: write d6/d12/d53/fcc [5171890,35450] 0 2026-03-09T17:30:13.146 
INFO:tasks.workunit.client.0.vm06.stdout:6/925: chown d6/d12/c69 84108 1 2026-03-09T17:30:13.147 INFO:tasks.workunit.client.0.vm06.stdout:1/990: dread d11/d14/d1d/d4a/df7/d106/d112/d114/f11c [0,4194304] 0 2026-03-09T17:30:13.147 INFO:tasks.workunit.client.0.vm06.stdout:6/926: readlink d6/d4f/l90 0 2026-03-09T17:30:13.154 INFO:tasks.workunit.client.0.vm06.stdout:1/991: dwrite d11/f105 [4194304,4194304] 0 2026-03-09T17:30:13.177 INFO:tasks.workunit.client.1.vm09.stdout:2/210: creat d13/d15/d34/f44 x:0 0 0 2026-03-09T17:30:13.193 INFO:tasks.workunit.client.0.vm06.stdout:1/992: truncate d11/d14/d1d/d1e/d2a/f115 572096 0 2026-03-09T17:30:13.201 INFO:tasks.workunit.client.1.vm09.stdout:3/241: unlink d5/f35 0 2026-03-09T17:30:13.202 INFO:tasks.workunit.client.1.vm09.stdout:4/330: rename d11/d1e/d30 to d11/d1e/d45/d60/d69 0 2026-03-09T17:30:13.205 INFO:tasks.workunit.client.1.vm09.stdout:6/234: write d3/d7/f10 [1775104,90514] 0 2026-03-09T17:30:13.212 INFO:tasks.workunit.client.0.vm06.stdout:2/902: rename d3/d4/d12/d2b/d36/dd4/fd7 to d3/d4/d22/f125 0 2026-03-09T17:30:13.212 INFO:tasks.workunit.client.1.vm09.stdout:7/339: write da/d11/d2d/f69 [693042,119800] 0 2026-03-09T17:30:13.214 INFO:tasks.workunit.client.1.vm09.stdout:8/268: dwrite d1/d14/d2a/f2e [4194304,4194304] 0 2026-03-09T17:30:13.214 INFO:tasks.workunit.client.0.vm06.stdout:1/993: dread d11/d14/d1c/d3a/fbf [0,4194304] 0 2026-03-09T17:30:13.218 INFO:tasks.workunit.client.0.vm06.stdout:2/903: dread - d3/d4/d46/f107 zero size 2026-03-09T17:30:13.219 INFO:tasks.workunit.client.1.vm09.stdout:4/331: dwrite d11/f16 [0,4194304] 0 2026-03-09T17:30:13.220 INFO:tasks.workunit.client.1.vm09.stdout:1/277: dread d9/dc/dd/d40/d22/d37/f2e [0,4194304] 0 2026-03-09T17:30:13.228 INFO:tasks.workunit.client.0.vm06.stdout:1/994: rmdir d11/d14/d1d/d1e/d2a 39 2026-03-09T17:30:13.228 INFO:tasks.workunit.client.0.vm06.stdout:2/904: mknod d3/d4/d12/d71/daa/d77/d81/c126 0 2026-03-09T17:30:13.234 
INFO:tasks.workunit.client.0.vm06.stdout:1/995: unlink d11/d14/d1d/f90 0 2026-03-09T17:30:13.238 INFO:tasks.workunit.client.1.vm09.stdout:9/236: rename d5/de/d29/d33/f42 to d5/d2e/f53 0 2026-03-09T17:30:13.240 INFO:tasks.workunit.client.1.vm09.stdout:8/269: symlink d1/da/d23/l5b 0 2026-03-09T17:30:13.242 INFO:tasks.workunit.client.0.vm06.stdout:2/905: fsync d3/d4/d12/da7/db3/fc2 0 2026-03-09T17:30:13.254 INFO:tasks.workunit.client.0.vm06.stdout:6/927: dread d6/d12/fbb [0,4194304] 0 2026-03-09T17:30:13.257 INFO:tasks.workunit.client.1.vm09.stdout:2/211: mkdir d13/d15/d34/d45 0 2026-03-09T17:30:13.260 INFO:tasks.workunit.client.0.vm06.stdout:6/928: chown d6/d12/d17/f78 58 1 2026-03-09T17:30:13.270 INFO:tasks.workunit.client.1.vm09.stdout:7/340: dread da/d11/f25 [0,4194304] 0 2026-03-09T17:30:13.275 INFO:tasks.workunit.client.0.vm06.stdout:2/906: creat d3/d4/d12/d71/daa/d77/d81/f127 x:0 0 0 2026-03-09T17:30:13.291 INFO:tasks.workunit.client.1.vm09.stdout:3/242: dread d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:13.292 INFO:tasks.workunit.client.1.vm09.stdout:0/256: write d6/d1d/d39/f44 [596472,8490] 0 2026-03-09T17:30:13.294 INFO:tasks.workunit.client.1.vm09.stdout:5/252: truncate d0/d52/d20/f25 710721 0 2026-03-09T17:30:13.298 INFO:tasks.workunit.client.1.vm09.stdout:6/235: write d3/d7/f24 [361497,13892] 0 2026-03-09T17:30:13.309 INFO:tasks.workunit.client.1.vm09.stdout:1/278: write d9/dc/dd/f28 [323083,56689] 0 2026-03-09T17:30:13.311 INFO:tasks.workunit.client.0.vm06.stdout:1/996: write d11/d14/d1d/d42/d46/d92/dc0/d57/fac [729485,54350] 0 2026-03-09T17:30:13.312 INFO:tasks.workunit.client.0.vm06.stdout:1/997: chown d11/d14/d1c/d3a/db7 1 1 2026-03-09T17:30:13.313 INFO:tasks.workunit.client.0.vm06.stdout:1/998: dread - d11/d14/d1d/d1e/dd6/f14e zero size 2026-03-09T17:30:13.323 INFO:tasks.workunit.client.1.vm09.stdout:8/270: rmdir d1/da/dd 39 2026-03-09T17:30:13.323 INFO:tasks.workunit.client.1.vm09.stdout:8/271: chown l0 1354 1 2026-03-09T17:30:13.329 
INFO:tasks.workunit.client.1.vm09.stdout:9/237: write d5/f14 [130017,125540] 0 2026-03-09T17:30:13.334 INFO:tasks.workunit.client.1.vm09.stdout:1/279: dread d9/f11 [0,4194304] 0 2026-03-09T17:30:13.337 INFO:tasks.workunit.client.1.vm09.stdout:0/257: mknod d6/d1d/d24/c4f 0 2026-03-09T17:30:13.337 INFO:tasks.workunit.client.1.vm09.stdout:0/258: write d6/f9 [1890164,47613] 0 2026-03-09T17:30:13.344 INFO:tasks.workunit.client.1.vm09.stdout:1/280: dread d9/dc/dd/d40/f1a [0,4194304] 0 2026-03-09T17:30:13.345 INFO:tasks.workunit.client.1.vm09.stdout:3/243: dwrite d5/f2f [0,4194304] 0 2026-03-09T17:30:13.347 INFO:tasks.workunit.client.1.vm09.stdout:4/332: link d11/f23 d11/d1e/d29/d36/f6a 0 2026-03-09T17:30:13.351 INFO:tasks.workunit.client.1.vm09.stdout:2/212: dread d13/d15/d21/f31 [0,4194304] 0 2026-03-09T17:30:13.356 INFO:tasks.workunit.client.0.vm06.stdout:2/907: creat d3/d4/d12/da7/db3/f128 x:0 0 0 2026-03-09T17:30:13.356 INFO:tasks.workunit.client.0.vm06.stdout:6/929: mkdir d6/d4f/d122 0 2026-03-09T17:30:13.357 INFO:tasks.workunit.client.0.vm06.stdout:1/999: creat d11/d14/d1d/d1e/dd6/f14f x:0 0 0 2026-03-09T17:30:13.357 INFO:tasks.workunit.client.0.vm06.stdout:6/930: chown d6/d4f/d3e/d52/d8c/d117/d10c 3308 1 2026-03-09T17:30:13.358 INFO:tasks.workunit.client.1.vm09.stdout:1/281: sync 2026-03-09T17:30:13.359 INFO:tasks.workunit.client.1.vm09.stdout:1/282: write d9/dc/f47 [544117,18022] 0 2026-03-09T17:30:13.362 INFO:tasks.workunit.client.1.vm09.stdout:5/253: rmdir d0/d46 39 2026-03-09T17:30:13.363 INFO:tasks.workunit.client.1.vm09.stdout:3/244: rmdir d5/d16/d25 39 2026-03-09T17:30:13.364 INFO:tasks.workunit.client.1.vm09.stdout:8/272: creat d1/d14/d2a/d42/d43/d44/f5c x:0 0 0 2026-03-09T17:30:13.365 INFO:tasks.workunit.client.1.vm09.stdout:2/213: fdatasync d13/d15/d3b/f3f 0 2026-03-09T17:30:13.366 INFO:tasks.workunit.client.0.vm06.stdout:2/908: read d3/f3b [2373933,71576] 0 2026-03-09T17:30:13.368 INFO:tasks.workunit.client.1.vm09.stdout:2/214: dread d13/f26 [0,4194304] 
0 2026-03-09T17:30:13.371 INFO:tasks.workunit.client.1.vm09.stdout:2/215: dwrite d13/d15/f20 [0,4194304] 0 2026-03-09T17:30:13.371 INFO:tasks.workunit.client.0.vm06.stdout:6/931: dwrite d6/d47/dd7/df8/f116 [0,4194304] 0 2026-03-09T17:30:13.373 INFO:tasks.workunit.client.1.vm09.stdout:2/216: stat d13/d15/f2a 0 2026-03-09T17:30:13.395 INFO:tasks.workunit.client.0.vm06.stdout:2/909: creat d3/d4/d22/d72/d8f/dda/f129 x:0 0 0 2026-03-09T17:30:13.401 INFO:tasks.workunit.client.1.vm09.stdout:9/238: mkdir d5/de/d4e/d54 0 2026-03-09T17:30:13.401 INFO:tasks.workunit.client.1.vm09.stdout:5/254: stat d0/l3 0 2026-03-09T17:30:13.402 INFO:tasks.workunit.client.0.vm06.stdout:2/910: symlink d3/d4/d12/da7/l12a 0 2026-03-09T17:30:13.409 INFO:tasks.workunit.client.1.vm09.stdout:8/273: mkdir d1/d14/d2a/d42/d5d 0 2026-03-09T17:30:13.417 INFO:tasks.workunit.client.1.vm09.stdout:8/274: chown d1/d14/d2a/l55 33018422 1 2026-03-09T17:30:13.417 INFO:tasks.workunit.client.1.vm09.stdout:7/341: getdents da/d11/d47 0 2026-03-09T17:30:13.417 INFO:tasks.workunit.client.1.vm09.stdout:7/342: chown da/d11/l17 103485 1 2026-03-09T17:30:13.418 INFO:tasks.workunit.client.1.vm09.stdout:7/343: stat da/d11/d2d/d56/d68 0 2026-03-09T17:30:13.418 INFO:tasks.workunit.client.1.vm09.stdout:3/245: fsync d5/d16/d31/f34 0 2026-03-09T17:30:13.420 INFO:tasks.workunit.client.1.vm09.stdout:8/275: mknod d1/da/d23/c5e 0 2026-03-09T17:30:13.421 INFO:tasks.workunit.client.1.vm09.stdout:8/276: readlink d1/da/d23/l5b 0 2026-03-09T17:30:13.422 INFO:tasks.workunit.client.1.vm09.stdout:7/344: write da/f27 [71842,80578] 0 2026-03-09T17:30:13.424 INFO:tasks.workunit.client.1.vm09.stdout:5/255: rmdir d0/dc/d21/d33/d4a 0 2026-03-09T17:30:13.426 INFO:tasks.workunit.client.0.vm06.stdout:6/932: sync 2026-03-09T17:30:13.427 INFO:tasks.workunit.client.1.vm09.stdout:5/256: read d0/dc/f37 [1226476,10084] 0 2026-03-09T17:30:13.431 INFO:tasks.workunit.client.1.vm09.stdout:8/277: sync 2026-03-09T17:30:13.432 
INFO:tasks.workunit.client.1.vm09.stdout:3/246: mkdir d5/d16/d46 0 2026-03-09T17:30:13.433 INFO:tasks.workunit.client.1.vm09.stdout:3/247: chown d5/d16/d31/d3d/d32/f33 103491 1 2026-03-09T17:30:13.433 INFO:tasks.workunit.client.0.vm06.stdout:6/933: chown d6/d47/c77 3010334 1 2026-03-09T17:30:13.434 INFO:tasks.workunit.client.0.vm06.stdout:6/934: fdatasync d6/d12/d17/f32 0 2026-03-09T17:30:13.438 INFO:tasks.workunit.client.1.vm09.stdout:7/345: rename da/d11/d2d/d56/f51 to da/d11/d41/f7a 0 2026-03-09T17:30:13.439 INFO:tasks.workunit.client.1.vm09.stdout:6/236: getdents d3/d21/d25/d26/d34 0 2026-03-09T17:30:13.440 INFO:tasks.workunit.client.0.vm06.stdout:6/935: symlink d6/d47/dd7/df8/l123 0 2026-03-09T17:30:13.444 INFO:tasks.workunit.client.1.vm09.stdout:6/237: sync 2026-03-09T17:30:13.444 INFO:tasks.workunit.client.1.vm09.stdout:9/239: link d5/de/d29/l4c d5/de/l55 0 2026-03-09T17:30:13.450 INFO:tasks.workunit.client.1.vm09.stdout:9/240: sync 2026-03-09T17:30:13.450 INFO:tasks.workunit.client.1.vm09.stdout:0/259: write d6/d1d/d39/f2e [1093580,8963] 0 2026-03-09T17:30:13.451 INFO:tasks.workunit.client.1.vm09.stdout:4/333: truncate d11/f4d 439036 0 2026-03-09T17:30:13.453 INFO:tasks.workunit.client.0.vm06.stdout:6/936: stat d6/d4f/d3e/d52/d95/f114 0 2026-03-09T17:30:13.459 INFO:tasks.workunit.client.1.vm09.stdout:1/283: dwrite d9/dc/dd/d40/d21/f33 [0,4194304] 0 2026-03-09T17:30:13.461 INFO:tasks.workunit.client.1.vm09.stdout:4/334: dread f10 [0,4194304] 0 2026-03-09T17:30:13.463 INFO:tasks.workunit.client.1.vm09.stdout:1/284: write d9/dc/dd/f28 [1488359,105368] 0 2026-03-09T17:30:13.469 INFO:tasks.workunit.client.0.vm06.stdout:2/911: dwrite d3/d4/d12/d2b/d36/dd4/fd5 [0,4194304] 0 2026-03-09T17:30:13.475 INFO:tasks.workunit.client.1.vm09.stdout:7/346: creat da/d11/d47/d5b/d6c/f7b x:0 0 0 2026-03-09T17:30:13.475 INFO:tasks.workunit.client.1.vm09.stdout:2/217: dwrite d13/d15/f20 [4194304,4194304] 0 2026-03-09T17:30:13.476 INFO:tasks.workunit.client.1.vm09.stdout:5/257: 
mkdir d0/d55 0 2026-03-09T17:30:13.476 INFO:tasks.workunit.client.0.vm06.stdout:6/937: rmdir d6/d12/d17/d65 39 2026-03-09T17:30:13.476 INFO:tasks.workunit.client.1.vm09.stdout:6/238: symlink d3/d21/d25/d26/d34/l49 0 2026-03-09T17:30:13.477 INFO:tasks.workunit.client.0.vm06.stdout:6/938: stat d6/d12/fbb 0 2026-03-09T17:30:13.480 INFO:tasks.workunit.client.0.vm06.stdout:6/939: chown d6/d12/d53/d91/dcb/f11f 483687 1 2026-03-09T17:30:13.485 INFO:tasks.workunit.client.1.vm09.stdout:2/218: chown d13/d15/d36/c3c 1500375250 1 2026-03-09T17:30:13.487 INFO:tasks.workunit.client.1.vm09.stdout:6/239: dwrite d3/d7/fe [0,4194304] 0 2026-03-09T17:30:13.487 INFO:tasks.workunit.client.0.vm06.stdout:2/912: rename d3/d4/d46/fd8 to d3/d4/d22/d72/d8f/f12b 0 2026-03-09T17:30:13.488 INFO:tasks.workunit.client.0.vm06.stdout:2/913: chown d3/d4/d12/d2b/d9f/fb8 89 1 2026-03-09T17:30:13.493 INFO:tasks.workunit.client.1.vm09.stdout:2/219: write d13/f23 [128959,62062] 0 2026-03-09T17:30:13.494 INFO:tasks.workunit.client.1.vm09.stdout:6/240: dwrite d3/d7/f24 [0,4194304] 0 2026-03-09T17:30:13.505 INFO:tasks.workunit.client.1.vm09.stdout:3/248: unlink d5/d16/d25/l2d 0 2026-03-09T17:30:13.507 INFO:tasks.workunit.client.1.vm09.stdout:1/285: creat d9/dc/dd/d40/d22/d37/d3f/f56 x:0 0 0 2026-03-09T17:30:13.507 INFO:tasks.workunit.client.1.vm09.stdout:7/347: dwrite da/d11/d41/d4e/f74 [0,4194304] 0 2026-03-09T17:30:13.516 INFO:tasks.workunit.client.1.vm09.stdout:3/249: fdatasync d5/d16/d31/d3d/d32/f33 0 2026-03-09T17:30:13.516 INFO:tasks.workunit.client.1.vm09.stdout:7/348: fdatasync da/d11/d41/d4e/f33 0 2026-03-09T17:30:13.519 INFO:tasks.workunit.client.1.vm09.stdout:5/258: dwrite d0/d9/f3e [0,4194304] 0 2026-03-09T17:30:13.524 INFO:tasks.workunit.client.1.vm09.stdout:5/259: stat d0/d9/c40 0 2026-03-09T17:30:13.526 INFO:tasks.workunit.client.1.vm09.stdout:6/241: mknod d3/d21/d25/d26/d34/c4a 0 2026-03-09T17:30:13.527 INFO:tasks.workunit.client.1.vm09.stdout:2/220: write fb [1792259,46350] 0 
2026-03-09T17:30:13.527 INFO:tasks.workunit.client.1.vm09.stdout:2/221: stat d13/d15/d3b 0 2026-03-09T17:30:13.530 INFO:tasks.workunit.client.1.vm09.stdout:2/222: chown d13/d15/c29 9541 1 2026-03-09T17:30:13.539 INFO:tasks.workunit.client.1.vm09.stdout:7/349: dwrite da/d11/d41/d4e/f42 [0,4194304] 0 2026-03-09T17:30:13.546 INFO:tasks.workunit.client.0.vm06.stdout:2/914: dread d3/d4/f52 [0,4194304] 0 2026-03-09T17:30:13.566 INFO:tasks.workunit.client.0.vm06.stdout:2/915: creat d3/d4/d12/da7/dfc/f12c x:0 0 0 2026-03-09T17:30:13.584 INFO:tasks.workunit.client.1.vm09.stdout:7/350: dread da/d11/f1a [0,4194304] 0 2026-03-09T17:30:13.589 INFO:tasks.workunit.client.1.vm09.stdout:1/286: unlink d9/dc/dd/d40/d22/c26 0 2026-03-09T17:30:13.596 INFO:tasks.workunit.client.0.vm06.stdout:2/916: creat d3/d4/d12/d71/daa/d77/d102/d109/f12d x:0 0 0 2026-03-09T17:30:13.596 INFO:tasks.workunit.client.1.vm09.stdout:1/287: write d9/dc/dd/d40/d22/d37/d3f/d42/f45 [58316,25709] 0 2026-03-09T17:30:13.599 INFO:tasks.workunit.client.1.vm09.stdout:3/250: write d5/d16/d31/d3d/fe [3371000,115584] 0 2026-03-09T17:30:13.602 INFO:tasks.workunit.client.1.vm09.stdout:7/351: rename da/f3a to da/d11/d41/d4e/f7c 0 2026-03-09T17:30:13.603 INFO:tasks.workunit.client.1.vm09.stdout:8/278: getdents d1/da/dd/d3f 0 2026-03-09T17:30:13.603 INFO:tasks.workunit.client.1.vm09.stdout:6/242: chown d3/c14 2 1 2026-03-09T17:30:13.603 INFO:tasks.workunit.client.1.vm09.stdout:9/241: getdents d5 0 2026-03-09T17:30:13.605 INFO:tasks.workunit.client.1.vm09.stdout:6/243: readlink d3/d7/ld 0 2026-03-09T17:30:13.609 INFO:tasks.workunit.client.1.vm09.stdout:3/251: rmdir d5/d16/d31 39 2026-03-09T17:30:13.611 INFO:tasks.workunit.client.1.vm09.stdout:7/352: dwrite da/d11/d41/f35 [0,4194304] 0 2026-03-09T17:30:13.615 INFO:tasks.workunit.client.1.vm09.stdout:5/260: creat d0/d46/f56 x:0 0 0 2026-03-09T17:30:13.615 INFO:tasks.workunit.client.1.vm09.stdout:5/261: readlink d0/d46/l47 0 2026-03-09T17:30:13.616 
INFO:tasks.workunit.client.0.vm06.stdout:2/917: dread d3/d4/d22/d72/d8f/fbf [0,4194304] 0 2026-03-09T17:30:13.637 INFO:tasks.workunit.client.1.vm09.stdout:4/335: dread fd [0,4194304] 0 2026-03-09T17:30:13.642 INFO:tasks.workunit.client.1.vm09.stdout:3/252: fdatasync d5/d16/d31/f44 0 2026-03-09T17:30:13.643 INFO:tasks.workunit.client.1.vm09.stdout:0/260: truncate d6/d1d/d24/d32/f49 3562341 0 2026-03-09T17:30:13.644 INFO:tasks.workunit.client.1.vm09.stdout:8/279: mknod d1/d14/d2a/d42/d5d/c5f 0 2026-03-09T17:30:13.645 INFO:tasks.workunit.client.1.vm09.stdout:5/262: chown d0/d2/l3a 559 1 2026-03-09T17:30:13.646 INFO:tasks.workunit.client.1.vm09.stdout:9/242: rename d5/de/d29/f41 to d5/de/d4e/f56 0 2026-03-09T17:30:13.648 INFO:tasks.workunit.client.1.vm09.stdout:4/336: unlink d11/f33 0 2026-03-09T17:30:13.649 INFO:tasks.workunit.client.0.vm06.stdout:6/940: dwrite d6/d4f/f33 [0,4194304] 0 2026-03-09T17:30:13.649 INFO:tasks.workunit.client.1.vm09.stdout:3/253: creat d5/d16/d46/f47 x:0 0 0 2026-03-09T17:30:13.655 INFO:tasks.workunit.client.1.vm09.stdout:6/244: dread d3/d1e/d30/f39 [0,4194304] 0 2026-03-09T17:30:13.660 INFO:tasks.workunit.client.1.vm09.stdout:3/254: mknod d5/d16/d31/d3d/d12/c48 0 2026-03-09T17:30:13.660 INFO:tasks.workunit.client.1.vm09.stdout:4/337: rename d11/c14 to d11/d1e/d45/c6b 0 2026-03-09T17:30:13.665 INFO:tasks.workunit.client.1.vm09.stdout:0/261: getdents d6/d1d 0 2026-03-09T17:30:13.666 INFO:tasks.workunit.client.1.vm09.stdout:7/353: read da/d11/d41/d4e/f7c [2386903,80346] 0 2026-03-09T17:30:13.667 INFO:tasks.workunit.client.1.vm09.stdout:7/354: readlink da/d11/l3c 0 2026-03-09T17:30:13.669 INFO:tasks.workunit.client.1.vm09.stdout:6/245: dwrite d3/d21/d25/f2f [0,4194304] 0 2026-03-09T17:30:13.680 INFO:tasks.workunit.client.0.vm06.stdout:6/941: sync 2026-03-09T17:30:13.682 INFO:tasks.workunit.client.0.vm06.stdout:6/942: unlink d6/d12/d53/f11d 0 2026-03-09T17:30:13.682 INFO:tasks.workunit.client.1.vm09.stdout:8/280: dwrite d1/da/d13/f36 [0,4194304] 
0 2026-03-09T17:30:13.682 INFO:tasks.workunit.client.1.vm09.stdout:5/263: dwrite d0/dc/f44 [0,4194304] 0 2026-03-09T17:30:13.683 INFO:tasks.workunit.client.1.vm09.stdout:3/255: rename d5/d16/d25/l3c to d5/d16/d31/d3d/d12/l49 0 2026-03-09T17:30:13.683 INFO:tasks.workunit.client.0.vm06.stdout:6/943: unlink d6/la9 0 2026-03-09T17:30:13.689 INFO:tasks.workunit.client.0.vm06.stdout:6/944: rename d6/d47/d96/f7e to d6/d12/d53/d91/f124 0 2026-03-09T17:30:13.690 INFO:tasks.workunit.client.0.vm06.stdout:6/945: fdatasync d6/d4f/d3e/d52/f84 0 2026-03-09T17:30:13.691 INFO:tasks.workunit.client.1.vm09.stdout:7/355: dread da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:13.693 INFO:tasks.workunit.client.1.vm09.stdout:1/288: dwrite d9/dc/dd/d40/d22/f2b [0,4194304] 0 2026-03-09T17:30:13.693 INFO:tasks.workunit.client.0.vm06.stdout:6/946: fdatasync d6/d4f/fa3 0 2026-03-09T17:30:13.699 INFO:tasks.workunit.client.1.vm09.stdout:5/264: chown d0/d46/d4b/f4f 3 1 2026-03-09T17:30:13.699 INFO:tasks.workunit.client.1.vm09.stdout:8/281: symlink d1/da/d3a/l60 0 2026-03-09T17:30:13.709 INFO:tasks.workunit.client.1.vm09.stdout:3/256: truncate d5/d16/d31/f34 638313 0 2026-03-09T17:30:13.709 INFO:tasks.workunit.client.1.vm09.stdout:6/246: mknod d3/d21/d25/d26/d34/c4b 0 2026-03-09T17:30:13.709 INFO:tasks.workunit.client.1.vm09.stdout:5/265: chown d0/d52/c1d 30553 1 2026-03-09T17:30:13.712 INFO:tasks.workunit.client.1.vm09.stdout:8/282: stat d1/d14/d2a/d49 0 2026-03-09T17:30:13.715 INFO:tasks.workunit.client.1.vm09.stdout:7/356: fsync da/d11/d2d/f70 0 2026-03-09T17:30:13.719 INFO:tasks.workunit.client.1.vm09.stdout:2/223: dwrite d13/d15/d21/f30 [4194304,4194304] 0 2026-03-09T17:30:13.721 INFO:tasks.workunit.client.1.vm09.stdout:3/257: write d5/d16/d46/f47 [715968,83842] 0 2026-03-09T17:30:13.725 INFO:tasks.workunit.client.1.vm09.stdout:2/224: chown d13/d15/d34/d45 0 1 2026-03-09T17:30:13.729 INFO:tasks.workunit.client.0.vm06.stdout:6/947: dread d6/d12/d17/f7a [0,4194304] 0 2026-03-09T17:30:13.735 
INFO:tasks.workunit.client.1.vm09.stdout:8/283: dwrite d1/da/d13/f1d [0,4194304] 0 2026-03-09T17:30:13.737 INFO:tasks.workunit.client.1.vm09.stdout:8/284: truncate d1/d14/f3d 4410114 0 2026-03-09T17:30:13.748 INFO:tasks.workunit.client.1.vm09.stdout:5/266: mkdir d0/d57 0 2026-03-09T17:30:13.748 INFO:tasks.workunit.client.1.vm09.stdout:3/258: mknod d5/d16/d31/d3d/d32/c4a 0 2026-03-09T17:30:13.748 INFO:tasks.workunit.client.1.vm09.stdout:5/267: chown d0/d52/c43 1364996 1 2026-03-09T17:30:13.748 INFO:tasks.workunit.client.1.vm09.stdout:3/259: readlink d5/d16/d31/d3d/d12/l26 0 2026-03-09T17:30:13.755 INFO:tasks.workunit.client.1.vm09.stdout:6/247: creat d3/d7/f4c x:0 0 0 2026-03-09T17:30:13.755 INFO:tasks.workunit.client.1.vm09.stdout:2/225: creat d13/d15/d3b/d43/f46 x:0 0 0 2026-03-09T17:30:13.757 INFO:tasks.workunit.client.1.vm09.stdout:7/357: dwrite da/d11/d41/f38 [0,4194304] 0 2026-03-09T17:30:13.763 INFO:tasks.workunit.client.1.vm09.stdout:3/260: creat d5/d16/d31/d3d/d32/f4b x:0 0 0 2026-03-09T17:30:13.764 INFO:tasks.workunit.client.0.vm06.stdout:6/948: rename d6/d47/d96/da1/fb7 to d6/d12/d17/f125 0 2026-03-09T17:30:13.764 INFO:tasks.workunit.client.1.vm09.stdout:6/248: creat d3/d21/f4d x:0 0 0 2026-03-09T17:30:13.765 INFO:tasks.workunit.client.1.vm09.stdout:6/249: dread - d3/d7/f4c zero size 2026-03-09T17:30:13.765 INFO:tasks.workunit.client.1.vm09.stdout:7/358: rename da/d11/d47/d5b/f6d to da/d11/d41/d4e/f7d 0 2026-03-09T17:30:13.769 INFO:tasks.workunit.client.1.vm09.stdout:5/268: mknod d0/d55/c58 0 2026-03-09T17:30:13.769 INFO:tasks.workunit.client.1.vm09.stdout:2/226: symlink d13/d15/d3b/d43/l47 0 2026-03-09T17:30:13.777 INFO:tasks.workunit.client.1.vm09.stdout:3/261: dread d5/d16/f17 [0,4194304] 0 2026-03-09T17:30:13.784 INFO:tasks.workunit.client.1.vm09.stdout:3/262: truncate d5/d16/d31/d3d/d32/f4b 974540 0 2026-03-09T17:30:13.784 INFO:tasks.workunit.client.1.vm09.stdout:6/250: mknod d3/d1e/d30/d3f/c4e 0 2026-03-09T17:30:13.784 
INFO:tasks.workunit.client.1.vm09.stdout:5/269: write d0/d2/f2a [1956599,32963] 0 2026-03-09T17:30:13.784 INFO:tasks.workunit.client.1.vm09.stdout:5/270: dread - d0/d46/d4b/f4f zero size 2026-03-09T17:30:13.784 INFO:tasks.workunit.client.1.vm09.stdout:3/263: fsync d5/d16/d25/f2b 0 2026-03-09T17:30:13.785 INFO:tasks.workunit.client.1.vm09.stdout:6/251: chown d3/c13 986 1 2026-03-09T17:30:13.787 INFO:tasks.workunit.client.1.vm09.stdout:5/271: mknod d0/d55/c59 0 2026-03-09T17:30:13.787 INFO:tasks.workunit.client.1.vm09.stdout:5/272: stat d0/dc/d21 0 2026-03-09T17:30:13.788 INFO:tasks.workunit.client.0.vm06.stdout:6/949: creat d6/d4f/d3e/d52/f126 x:0 0 0 2026-03-09T17:30:13.797 INFO:tasks.workunit.client.1.vm09.stdout:2/227: rename d13/d15/f18 to d13/d15/d34/f48 0 2026-03-09T17:30:13.797 INFO:tasks.workunit.client.1.vm09.stdout:5/273: symlink d0/d52/d20/l5a 0 2026-03-09T17:30:13.802 INFO:tasks.workunit.client.1.vm09.stdout:5/274: dread d0/d52/d20/f25 [0,4194304] 0 2026-03-09T17:30:13.809 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: from='client.14704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:13.814 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/1761564276' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:30:13.814 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:13.814 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:13.814 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: mgrmap e25: vm09.lqzvkh(active, since 7s), standbys: vm06.pbgzei 2026-03-09T17:30:13.814 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:13 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:13.838 INFO:tasks.workunit.client.1.vm09.stdout:3/264: sync 2026-03-09T17:30:13.838 INFO:tasks.workunit.client.1.vm09.stdout:5/275: sync 2026-03-09T17:30:13.839 INFO:tasks.workunit.client.1.vm09.stdout:3/265: write d5/d16/d25/f28 [3187247,97362] 0 2026-03-09T17:30:13.840 INFO:tasks.workunit.client.1.vm09.stdout:5/276: readlink d0/dc/l4e 0 2026-03-09T17:30:13.846 INFO:tasks.workunit.client.1.vm09.stdout:5/277: symlink d0/l5b 0 2026-03-09T17:30:13.849 INFO:tasks.workunit.client.1.vm09.stdout:3/266: dwrite d5/d9/d30/f41 [0,4194304] 0 2026-03-09T17:30:13.855 INFO:tasks.workunit.client.1.vm09.stdout:3/267: fdatasync d5/d16/d31/d3d/d12/f1d 0 2026-03-09T17:30:13.862 INFO:tasks.workunit.client.1.vm09.stdout:3/268: symlink d5/d16/d31/d3d/l4c 0 2026-03-09T17:30:13.864 INFO:tasks.workunit.client.1.vm09.stdout:5/278: dwrite d0/de/f50 [0,4194304] 0 2026-03-09T17:30:13.870 INFO:tasks.workunit.client.1.vm09.stdout:3/269: chown d5/d16/d31/d3d/d12/l2a 17 1 2026-03-09T17:30:13.871 INFO:tasks.workunit.client.1.vm09.stdout:3/270: read d5/d16/d25/f2b [131951,90061] 0 2026-03-09T17:30:13.871 
INFO:tasks.workunit.client.1.vm09.stdout:3/271: write d5/d16/d31/d3d/fe [4433862,77844] 0 2026-03-09T17:30:13.872 INFO:tasks.workunit.client.1.vm09.stdout:3/272: write f3 [501764,43632] 0 2026-03-09T17:30:13.876 INFO:tasks.workunit.client.1.vm09.stdout:5/279: mkdir d0/d9/d16/d5c 0 2026-03-09T17:30:13.885 INFO:tasks.workunit.client.1.vm09.stdout:3/273: mknod d5/d16/d25/c4d 0 2026-03-09T17:30:13.885 INFO:tasks.workunit.client.1.vm09.stdout:5/280: unlink d0/d2/l3a 0 2026-03-09T17:30:13.885 INFO:tasks.workunit.client.1.vm09.stdout:3/274: link d5/f2f d5/d9/f4e 0 2026-03-09T17:30:13.885 INFO:tasks.workunit.client.1.vm09.stdout:5/281: dwrite d0/d2/f2a [4194304,4194304] 0 2026-03-09T17:30:13.889 INFO:tasks.workunit.client.1.vm09.stdout:3/275: dread d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: from='client.14704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/1761564276' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:30:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:13.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:13.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: mgrmap e25: vm09.lqzvkh(active, since 7s), standbys: vm06.pbgzei 2026-03-09T17:30:13.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:13 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:13.895 INFO:tasks.workunit.client.1.vm09.stdout:3/276: dread d5/d16/d31/d3d/d12/f3e [4194304,4194304] 0 2026-03-09T17:30:13.902 INFO:tasks.workunit.client.1.vm09.stdout:3/277: fdatasync d5/d16/d31/d3d/d12/f15 0 2026-03-09T17:30:13.912 INFO:tasks.workunit.client.1.vm09.stdout:3/278: dread d5/d16/d25/f28 [4194304,4194304] 0 2026-03-09T17:30:13.920 INFO:tasks.workunit.client.1.vm09.stdout:3/279: rename d5/d16/d31/d3d/d32/f4b to d5/d16/d31/d3d/d12/f4f 0 2026-03-09T17:30:13.922 INFO:tasks.workunit.client.1.vm09.stdout:3/280: truncate d5/d16/d31/d3d/d12/f43 444336 0 2026-03-09T17:30:13.922 INFO:tasks.workunit.client.1.vm09.stdout:3/281: fdatasync d5/d16/d25/f2c 0 2026-03-09T17:30:13.932 INFO:tasks.workunit.client.1.vm09.stdout:3/282: dread d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:13.933 INFO:tasks.workunit.client.1.vm09.stdout:3/283: truncate d5/d16/d31/d3d/d12/f43 824892 0 2026-03-09T17:30:13.938 INFO:tasks.workunit.client.1.vm09.stdout:8/285: dread d1/da/dd/f1e [0,4194304] 0 2026-03-09T17:30:13.947 INFO:tasks.workunit.client.1.vm09.stdout:3/284: sync 2026-03-09T17:30:13.947 
INFO:tasks.workunit.client.1.vm09.stdout:0/262: rmdir d6/d1d/d24/d32 39 2026-03-09T17:30:13.951 INFO:tasks.workunit.client.1.vm09.stdout:3/285: mknod d5/d9/c50 0 2026-03-09T17:30:13.951 INFO:tasks.workunit.client.1.vm09.stdout:0/263: write d6/f9 [2866571,46589] 0 2026-03-09T17:30:13.951 INFO:tasks.workunit.client.0.vm06.stdout:2/918: dwrite d3/d4/d12/d2b/d9f/fb8 [0,4194304] 0 2026-03-09T17:30:13.957 INFO:tasks.workunit.client.1.vm09.stdout:3/286: chown d5/d16/d25/f28 0 1 2026-03-09T17:30:13.958 INFO:tasks.workunit.client.1.vm09.stdout:8/286: dwrite d1/da/dd/d3f/f1c [4194304,4194304] 0 2026-03-09T17:30:13.964 INFO:tasks.workunit.client.1.vm09.stdout:9/243: dwrite d5/d21/f2b [0,4194304] 0 2026-03-09T17:30:13.975 INFO:tasks.workunit.client.0.vm06.stdout:2/919: dread d3/d4/f70 [0,4194304] 0 2026-03-09T17:30:13.980 INFO:tasks.workunit.client.1.vm09.stdout:3/287: mknod d5/d16/d31/c51 0 2026-03-09T17:30:13.991 INFO:tasks.workunit.client.1.vm09.stdout:8/287: mknod d1/da/dd/c61 0 2026-03-09T17:30:13.991 INFO:tasks.workunit.client.1.vm09.stdout:4/338: truncate d11/d1e/d45/d60/d69/f38 47387 0 2026-03-09T17:30:13.992 INFO:tasks.workunit.client.1.vm09.stdout:8/288: write d1/d14/f3d [662511,76468] 0 2026-03-09T17:30:13.997 INFO:tasks.workunit.client.1.vm09.stdout:1/289: dwrite d9/dc/dd/d40/d22/d37/f2e [0,4194304] 0 2026-03-09T17:30:14.007 INFO:tasks.workunit.client.1.vm09.stdout:6/252: getdents d3/d7 0 2026-03-09T17:30:14.012 INFO:tasks.workunit.client.1.vm09.stdout:7/359: write da/d11/d41/d4e/f7c [2123916,128586] 0 2026-03-09T17:30:14.021 INFO:tasks.workunit.client.0.vm06.stdout:6/950: write d6/d12/d17/d85/fa7 [1019883,108069] 0 2026-03-09T17:30:14.026 INFO:tasks.workunit.client.1.vm09.stdout:4/339: creat d11/d1e/d45/d60/f6c x:0 0 0 2026-03-09T17:30:14.027 INFO:tasks.workunit.client.1.vm09.stdout:2/228: dwrite d13/d15/d34/f3a [0,4194304] 0 2026-03-09T17:30:14.029 INFO:tasks.workunit.client.1.vm09.stdout:4/340: chown d11/d1e/d45/d60/d69/d58/l5d 27 1 2026-03-09T17:30:14.029 
INFO:tasks.workunit.client.0.vm06.stdout:2/920: mknod d3/d4/d46/c12e 0 2026-03-09T17:30:14.039 INFO:tasks.workunit.client.1.vm09.stdout:3/288: dwrite d5/d16/d31/d3d/d12/f1d [0,4194304] 0 2026-03-09T17:30:14.039 INFO:tasks.workunit.client.1.vm09.stdout:7/360: symlink da/d11/d41/d4e/l7e 0 2026-03-09T17:30:14.040 INFO:tasks.workunit.client.1.vm09.stdout:0/264: creat d6/d1d/d24/f50 x:0 0 0 2026-03-09T17:30:14.043 INFO:tasks.workunit.client.1.vm09.stdout:2/229: dwrite d13/d15/f2b [0,4194304] 0 2026-03-09T17:30:14.046 INFO:tasks.workunit.client.0.vm06.stdout:6/951: mkdir d6/d12/d2d/d127 0 2026-03-09T17:30:14.056 INFO:tasks.workunit.client.1.vm09.stdout:8/289: dwrite d1/d14/f2f [0,4194304] 0 2026-03-09T17:30:14.062 INFO:tasks.workunit.client.1.vm09.stdout:8/290: dwrite d1/d14/d2a/f2e [4194304,4194304] 0 2026-03-09T17:30:14.066 INFO:tasks.workunit.client.1.vm09.stdout:8/291: write d1/d14/d2a/f2e [617651,78452] 0 2026-03-09T17:30:14.066 INFO:tasks.workunit.client.1.vm09.stdout:4/341: creat d11/d1e/d29/f6d x:0 0 0 2026-03-09T17:30:14.073 INFO:tasks.workunit.client.0.vm06.stdout:2/921: symlink d3/d4/d12/d71/daa/d77/d81/d64/d6a/d10d/l12f 0 2026-03-09T17:30:14.074 INFO:tasks.workunit.client.0.vm06.stdout:2/922: read d3/d4/d22/d72/d8f/fbf [793681,129039] 0 2026-03-09T17:30:14.101 INFO:tasks.workunit.client.1.vm09.stdout:6/253: rename d3/d21/d25/d26/d34/f3e to d3/f4f 0 2026-03-09T17:30:14.108 INFO:tasks.workunit.client.1.vm09.stdout:5/282: write d0/dc/d21/d26/f36 [1759722,113179] 0 2026-03-09T17:30:14.112 INFO:tasks.workunit.client.1.vm09.stdout:6/254: dread d3/d7/ff [0,4194304] 0 2026-03-09T17:30:14.115 INFO:tasks.workunit.client.1.vm09.stdout:9/244: write d5/f1e [256824,29882] 0 2026-03-09T17:30:14.116 INFO:tasks.workunit.client.1.vm09.stdout:9/245: chown d5/de/c44 2234658 1 2026-03-09T17:30:14.127 INFO:tasks.workunit.client.1.vm09.stdout:0/265: fdatasync d6/d1d/f1e 0 2026-03-09T17:30:14.132 INFO:tasks.workunit.client.1.vm09.stdout:8/292: creat d1/d14/d2a/f62 x:0 0 0 
2026-03-09T17:30:14.133 INFO:tasks.workunit.client.1.vm09.stdout:7/361: dwrite da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:14.138 INFO:tasks.workunit.client.1.vm09.stdout:5/283: creat d0/d2/f5d x:0 0 0 2026-03-09T17:30:14.139 INFO:tasks.workunit.client.1.vm09.stdout:6/255: creat d3/d21/d25/d26/f50 x:0 0 0 2026-03-09T17:30:14.156 INFO:tasks.workunit.client.1.vm09.stdout:2/230: creat d13/d15/d34/d45/f49 x:0 0 0 2026-03-09T17:30:14.157 INFO:tasks.workunit.client.1.vm09.stdout:8/293: chown d1/c1a 305 1 2026-03-09T17:30:14.158 INFO:tasks.workunit.client.1.vm09.stdout:8/294: write d1/da/dd/f45 [214213,67412] 0 2026-03-09T17:30:14.159 INFO:tasks.workunit.client.1.vm09.stdout:7/362: truncate da/f21 2425246 0 2026-03-09T17:30:14.161 INFO:tasks.workunit.client.1.vm09.stdout:7/363: truncate da/d11/d47/d5b/d6c/f7b 741313 0 2026-03-09T17:30:14.164 INFO:tasks.workunit.client.1.vm09.stdout:5/284: mkdir d0/dc/d21/d26/d5e 0 2026-03-09T17:30:14.171 INFO:tasks.workunit.client.1.vm09.stdout:2/231: rename d13/d15/c38 to d13/d15/d3b/d43/c4a 0 2026-03-09T17:30:14.171 INFO:tasks.workunit.client.1.vm09.stdout:6/256: write d3/f19 [2722433,35875] 0 2026-03-09T17:30:14.177 INFO:tasks.workunit.client.1.vm09.stdout:4/342: creat d11/f6e x:0 0 0 2026-03-09T17:30:14.180 INFO:tasks.workunit.client.1.vm09.stdout:7/364: creat da/d11/d2d/d49/f7f x:0 0 0 2026-03-09T17:30:14.186 INFO:tasks.workunit.client.1.vm09.stdout:8/295: dread d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:14.187 INFO:tasks.workunit.client.1.vm09.stdout:8/296: dread - d1/d14/d2a/d42/f46 zero size 2026-03-09T17:30:14.192 INFO:tasks.workunit.client.1.vm09.stdout:9/246: dread d5/d21/f46 [0,4194304] 0 2026-03-09T17:30:14.193 INFO:tasks.workunit.client.1.vm09.stdout:2/232: creat d13/d15/d36/f4b x:0 0 0 2026-03-09T17:30:14.195 INFO:tasks.workunit.client.1.vm09.stdout:1/290: truncate d9/f34 5001250 0 2026-03-09T17:30:14.207 INFO:tasks.workunit.client.1.vm09.stdout:1/291: dwrite d9/dc/dd/d40/d22/d37/d3f/d42/f45 [0,4194304] 0 
2026-03-09T17:30:14.212 INFO:tasks.workunit.client.0.vm06.stdout:2/923: write d3/d4/d12/f42 [2471527,75915] 0 2026-03-09T17:30:14.229 INFO:tasks.workunit.client.1.vm09.stdout:2/233: unlink d13/d15/d36/f4b 0 2026-03-09T17:30:14.229 INFO:tasks.workunit.client.0.vm06.stdout:6/952: write d6/d47/d96/f37 [3947936,28527] 0 2026-03-09T17:30:14.237 INFO:tasks.workunit.client.0.vm06.stdout:2/924: rename d3/d4/d22/d72 to d3/d4/d12/d71/daa/d77/d81/d130 0 2026-03-09T17:30:14.239 INFO:tasks.workunit.client.0.vm06.stdout:2/925: write d3/d4/d12/d2b/d36/dd4/fd5 [5198029,127468] 0 2026-03-09T17:30:14.244 INFO:tasks.workunit.client.0.vm06.stdout:6/953: creat d6/d12/d53/d91/dcb/f128 x:0 0 0 2026-03-09T17:30:14.245 INFO:tasks.workunit.client.0.vm06.stdout:6/954: read - d6/d12/d53/d91/dcb/f11f zero size 2026-03-09T17:30:14.255 INFO:tasks.workunit.client.1.vm09.stdout:7/365: creat da/d11/d47/d5b/d78/f80 x:0 0 0 2026-03-09T17:30:14.256 INFO:tasks.workunit.client.1.vm09.stdout:7/366: stat da/d11/d41/d4e/d4c/f67 0 2026-03-09T17:30:14.265 INFO:tasks.workunit.client.1.vm09.stdout:4/343: rename d11/d1e/c2b to d11/d1e/d29/d36/d57/c6f 0 2026-03-09T17:30:14.266 INFO:tasks.workunit.client.1.vm09.stdout:0/266: getdents d6/d1d 0 2026-03-09T17:30:14.269 INFO:tasks.workunit.client.0.vm06.stdout:6/955: mkdir d6/d4f/d129 0 2026-03-09T17:30:14.272 INFO:tasks.workunit.client.1.vm09.stdout:1/292: symlink d9/dc/dd/d40/d22/d37/d3f/l57 0 2026-03-09T17:30:14.290 INFO:tasks.workunit.client.1.vm09.stdout:7/367: creat da/d11/d2d/d56/d68/f81 x:0 0 0 2026-03-09T17:30:14.299 INFO:tasks.workunit.client.1.vm09.stdout:5/285: write d0/f22 [571267,13963] 0 2026-03-09T17:30:14.301 INFO:tasks.workunit.client.1.vm09.stdout:4/344: creat d11/d1e/d45/f70 x:0 0 0 2026-03-09T17:30:14.305 INFO:tasks.workunit.client.1.vm09.stdout:8/297: dwrite d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:14.312 INFO:tasks.workunit.client.1.vm09.stdout:0/267: symlink d6/d1d/d46/l51 0 2026-03-09T17:30:14.314 
INFO:tasks.workunit.client.1.vm09.stdout:1/293: unlink d9/dc/dd/d40/d22/l3b 0 2026-03-09T17:30:14.314 INFO:tasks.workunit.client.0.vm06.stdout:2/926: write d3/d4/d12/d2b/fdc [165598,93506] 0 2026-03-09T17:30:14.316 INFO:tasks.workunit.client.1.vm09.stdout:6/257: getdents d3/d21/d25/d26/d34 0 2026-03-09T17:30:14.317 INFO:tasks.workunit.client.1.vm09.stdout:6/258: fdatasync d3/d21/f28 0 2026-03-09T17:30:14.318 INFO:tasks.workunit.client.1.vm09.stdout:7/368: creat da/d11/d47/d5b/f82 x:0 0 0 2026-03-09T17:30:14.318 INFO:tasks.workunit.client.1.vm09.stdout:7/369: stat da/d11/d2d/c65 0 2026-03-09T17:30:14.319 INFO:tasks.workunit.client.1.vm09.stdout:7/370: chown da/c3b 100667 1 2026-03-09T17:30:14.326 INFO:tasks.workunit.client.1.vm09.stdout:5/286: chown d0/d46/l4d 7 1 2026-03-09T17:30:14.327 INFO:tasks.workunit.client.1.vm09.stdout:5/287: readlink d0/l3 0 2026-03-09T17:30:14.329 INFO:tasks.workunit.client.1.vm09.stdout:3/289: dwrite d5/d16/d31/f34 [0,4194304] 0 2026-03-09T17:30:14.330 INFO:tasks.workunit.client.0.vm06.stdout:6/956: fsync d6/d4f/f44 0 2026-03-09T17:30:14.332 INFO:tasks.workunit.client.1.vm09.stdout:4/345: unlink d11/d1e/d45/d60/d69/f34 0 2026-03-09T17:30:14.334 INFO:tasks.workunit.client.1.vm09.stdout:9/247: rename d5/d21/f30 to d5/de/d29/f57 0 2026-03-09T17:30:14.334 INFO:tasks.workunit.client.1.vm09.stdout:3/290: fdatasync d5/d16/d46/f47 0 2026-03-09T17:30:14.334 INFO:tasks.workunit.client.1.vm09.stdout:6/259: fsync d3/f4f 0 2026-03-09T17:30:14.340 INFO:tasks.workunit.client.1.vm09.stdout:3/291: write d5/d16/d31/d3d/d12/f1d [853868,120449] 0 2026-03-09T17:30:14.341 INFO:tasks.workunit.client.1.vm09.stdout:0/268: dread d6/d1d/f37 [0,4194304] 0 2026-03-09T17:30:14.343 INFO:tasks.workunit.client.1.vm09.stdout:7/371: dread da/d11/f1a [0,4194304] 0 2026-03-09T17:30:14.350 INFO:tasks.workunit.client.0.vm06.stdout:6/957: creat d6/d4f/d3e/d52/d8c/d117/f12a x:0 0 0 2026-03-09T17:30:14.352 INFO:tasks.workunit.client.0.vm06.stdout:6/958: stat d6/d12/d17/d85/faf 0 
2026-03-09T17:30:14.356 INFO:tasks.workunit.client.1.vm09.stdout:0/269: dread d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:14.359 INFO:tasks.workunit.client.1.vm09.stdout:0/270: dread d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:14.360 INFO:tasks.workunit.client.1.vm09.stdout:0/271: chown d6/f9 4700444 1 2026-03-09T17:30:14.366 INFO:tasks.workunit.client.1.vm09.stdout:6/260: dread d3/d7/f18 [0,4194304] 0 2026-03-09T17:30:14.373 INFO:tasks.workunit.client.0.vm06.stdout:6/959: dread d6/d4f/d3e/d52/d8c/db0/fc7 [0,4194304] 0 2026-03-09T17:30:14.384 INFO:tasks.workunit.client.1.vm09.stdout:9/248: dwrite d5/de/d29/f35 [0,4194304] 0 2026-03-09T17:30:14.405 INFO:tasks.workunit.client.1.vm09.stdout:2/234: rename d13/d15/d21/f35 to d13/f4c 0 2026-03-09T17:30:14.406 INFO:tasks.workunit.client.1.vm09.stdout:2/235: fsync fd 0 2026-03-09T17:30:14.418 INFO:tasks.workunit.client.1.vm09.stdout:7/372: unlink f3 0 2026-03-09T17:30:14.432 INFO:tasks.workunit.client.0.vm06.stdout:2/927: write d3/d4/f3c [3492387,96697] 0 2026-03-09T17:30:14.440 INFO:tasks.workunit.client.0.vm06.stdout:2/928: creat d3/d4/d12/d71/daa/d77/d81/d64/de5/f131 x:0 0 0 2026-03-09T17:30:14.441 INFO:tasks.workunit.client.1.vm09.stdout:4/346: dwrite f3 [0,4194304] 0 2026-03-09T17:30:14.450 INFO:tasks.workunit.client.0.vm06.stdout:6/960: truncate d6/d47/d96/d40/f67 2596688 0 2026-03-09T17:30:14.452 INFO:tasks.workunit.client.1.vm09.stdout:9/249: chown d5/lc 63646 1 2026-03-09T17:30:14.453 INFO:tasks.workunit.client.0.vm06.stdout:2/929: chown d3/d4/d12/d71/daa/d77/d81/d64/d6a/c86 0 1 2026-03-09T17:30:14.458 INFO:tasks.workunit.client.1.vm09.stdout:8/298: rename d1/da/d13 to d1/da/dd/d63 0 2026-03-09T17:30:14.458 INFO:tasks.workunit.client.1.vm09.stdout:8/299: readlink d1/da/dd/d47/l51 0 2026-03-09T17:30:14.460 INFO:tasks.workunit.client.0.vm06.stdout:6/961: unlink d6/d12/d2d/c107 0 2026-03-09T17:30:14.470 INFO:tasks.workunit.client.1.vm09.stdout:5/288: truncate d0/d2/f2a 4591485 0 2026-03-09T17:30:14.472 
INFO:tasks.workunit.client.0.vm06.stdout:6/962: rmdir d6/d4f 39 2026-03-09T17:30:14.486 INFO:tasks.workunit.client.1.vm09.stdout:7/373: dwrite da/f16 [4194304,4194304] 0 2026-03-09T17:30:14.486 INFO:tasks.workunit.client.1.vm09.stdout:1/294: link d9/dc/dd/d40/l23 d9/dc/dd/d40/l58 0 2026-03-09T17:30:14.487 INFO:tasks.workunit.client.0.vm06.stdout:2/930: creat d3/d4/d46/f132 x:0 0 0 2026-03-09T17:30:14.487 INFO:tasks.workunit.client.0.vm06.stdout:6/963: creat d6/d47/dd7/df8/f12b x:0 0 0 2026-03-09T17:30:14.488 INFO:tasks.workunit.client.1.vm09.stdout:1/295: write d9/dc/dd/d40/d22/f4a [1264977,5818] 0 2026-03-09T17:30:14.488 INFO:tasks.workunit.client.0.vm06.stdout:2/931: dread - d3/d4/d12/d71/ffb zero size 2026-03-09T17:30:14.489 INFO:tasks.workunit.client.1.vm09.stdout:7/374: write da/d11/d3e/f60 [624912,21045] 0 2026-03-09T17:30:14.490 INFO:tasks.workunit.client.0.vm06.stdout:2/932: chown d3/d4/d12/d2b/d36/dd4/f113 11 1 2026-03-09T17:30:14.493 INFO:tasks.workunit.client.1.vm09.stdout:4/347: mkdir d11/d1e/d45/d60/d71 0 2026-03-09T17:30:14.507 INFO:tasks.workunit.client.0.vm06.stdout:6/964: creat d6/d47/d4d/d9a/f12c x:0 0 0 2026-03-09T17:30:14.508 INFO:tasks.workunit.client.1.vm09.stdout:6/261: truncate d3/d7/f23 3251640 0 2026-03-09T17:30:14.509 INFO:tasks.workunit.client.1.vm09.stdout:0/272: dread d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:14.509 INFO:tasks.workunit.client.1.vm09.stdout:6/262: truncate d3/d21/f4d 50356 0 2026-03-09T17:30:14.521 INFO:tasks.workunit.client.1.vm09.stdout:8/300: dread d1/f8 [0,4194304] 0 2026-03-09T17:30:14.525 INFO:tasks.workunit.client.1.vm09.stdout:5/289: creat d0/d9/d16/d3c/f5f x:0 0 0 2026-03-09T17:30:14.529 INFO:tasks.workunit.client.0.vm06.stdout:2/933: dwrite d3/d4/d12/d2b/fb6 [0,4194304] 0 2026-03-09T17:30:14.534 INFO:tasks.workunit.client.0.vm06.stdout:6/965: dread d6/d4f/d3e/d52/d95/f114 [0,4194304] 0 2026-03-09T17:30:14.536 INFO:tasks.workunit.client.0.vm06.stdout:2/934: dwrite d3/d4/d12/d71/daa/d77/d81/d130/f54 
[0,4194304] 0 2026-03-09T17:30:14.562 INFO:tasks.workunit.client.1.vm09.stdout:7/375: rename da/d11/d47/d5b/l5c to da/d11/d77/l83 0 2026-03-09T17:30:14.562 INFO:tasks.workunit.client.1.vm09.stdout:4/348: creat d11/d1e/d45/d60/d69/d58/f72 x:0 0 0 2026-03-09T17:30:14.562 INFO:tasks.workunit.client.1.vm09.stdout:9/250: symlink d5/de/d29/l58 0 2026-03-09T17:30:14.563 INFO:tasks.workunit.client.1.vm09.stdout:9/251: write d5/f13 [3131936,45934] 0 2026-03-09T17:30:14.563 INFO:tasks.workunit.client.0.vm06.stdout:2/935: fsync d3/f29 0 2026-03-09T17:30:14.567 INFO:tasks.workunit.client.0.vm06.stdout:2/936: dwrite d3/d4/d12/d71/daa/f11e [0,4194304] 0 2026-03-09T17:30:14.571 INFO:tasks.workunit.client.0.vm06.stdout:6/966: truncate d6/d4f/d3e/d52/d8c/db0/ff6 1406737 0 2026-03-09T17:30:14.580 INFO:tasks.workunit.client.1.vm09.stdout:0/273: read d6/d1d/d24/d32/f45 [1755750,114466] 0 2026-03-09T17:30:14.584 INFO:tasks.workunit.client.1.vm09.stdout:6/263: creat d3/d1e/d30/d3f/f51 x:0 0 0 2026-03-09T17:30:14.584 INFO:tasks.workunit.client.1.vm09.stdout:2/236: mkdir d13/d4d 0 2026-03-09T17:30:14.584 INFO:tasks.workunit.client.1.vm09.stdout:2/237: dwrite fb [0,4194304] 0 2026-03-09T17:30:14.594 INFO:tasks.workunit.client.1.vm09.stdout:3/292: link d5/d9/l1b d5/d16/l52 0 2026-03-09T17:30:14.599 INFO:tasks.workunit.client.1.vm09.stdout:9/252: mknod d5/d2e/c59 0 2026-03-09T17:30:14.602 INFO:tasks.workunit.client.1.vm09.stdout:0/274: mknod d6/d1d/d46/c52 0 2026-03-09T17:30:14.602 INFO:tasks.workunit.client.1.vm09.stdout:9/253: dwrite d5/d21/f2f [4194304,4194304] 0 2026-03-09T17:30:14.608 INFO:tasks.workunit.client.1.vm09.stdout:7/376: dread da/d11/d41/d4e/f2b [0,4194304] 0 2026-03-09T17:30:14.615 INFO:tasks.workunit.client.1.vm09.stdout:6/264: readlink d3/l17 0 2026-03-09T17:30:14.615 INFO:tasks.workunit.client.1.vm09.stdout:1/296: creat d9/f59 x:0 0 0 2026-03-09T17:30:14.615 INFO:tasks.workunit.client.1.vm09.stdout:9/254: dwrite d5/de/d29/f52 [0,4194304] 0 2026-03-09T17:30:14.615 
INFO:tasks.workunit.client.1.vm09.stdout:1/297: read - d9/dc/f3d zero size 2026-03-09T17:30:14.616 INFO:tasks.workunit.client.1.vm09.stdout:0/275: write d6/d1d/d46/f4d [309939,107830] 0 2026-03-09T17:30:14.619 INFO:tasks.workunit.client.1.vm09.stdout:6/265: rmdir d3/d1e/d30/d32 39 2026-03-09T17:30:14.622 INFO:tasks.workunit.client.1.vm09.stdout:9/255: dwrite d5/f1b [0,4194304] 0 2026-03-09T17:30:14.631 INFO:tasks.workunit.client.1.vm09.stdout:7/377: dread - da/d11/d2d/f70 zero size 2026-03-09T17:30:14.637 INFO:tasks.workunit.client.1.vm09.stdout:7/378: dread - da/d11/f6a zero size 2026-03-09T17:30:14.637 INFO:tasks.workunit.client.1.vm09.stdout:7/379: truncate da/d11/d47/d5b/d6c/f73 875802 0 2026-03-09T17:30:14.637 INFO:tasks.workunit.client.1.vm09.stdout:7/380: write da/d11/d3e/f60 [3415319,53478] 0 2026-03-09T17:30:14.637 INFO:tasks.workunit.client.1.vm09.stdout:3/293: creat d5/f53 x:0 0 0 2026-03-09T17:30:14.642 INFO:tasks.workunit.client.0.vm06.stdout:6/967: dwrite d6/d12/d53/d91/dcb/ffc [0,4194304] 0 2026-03-09T17:30:14.644 INFO:tasks.workunit.client.1.vm09.stdout:9/256: rename d5/f1d to d5/d2e/f5a 0 2026-03-09T17:30:14.644 INFO:tasks.workunit.client.1.vm09.stdout:4/349: getdents d11/d1e/d45/d60/d69/d58 0 2026-03-09T17:30:14.645 INFO:tasks.workunit.client.0.vm06.stdout:2/937: dwrite d3/d4/d22/f4b [0,4194304] 0 2026-03-09T17:30:14.651 INFO:tasks.workunit.client.1.vm09.stdout:1/298: mkdir d9/d5a 0 2026-03-09T17:30:14.655 INFO:tasks.workunit.client.1.vm09.stdout:6/266: symlink d3/d1e/d30/d32/l52 0 2026-03-09T17:30:14.664 INFO:tasks.workunit.client.0.vm06.stdout:2/938: unlink d3/d4/d12/d71/daa/d77/d81/d64/d6a/c86 0 2026-03-09T17:30:14.664 INFO:tasks.workunit.client.0.vm06.stdout:2/939: write d3/d4/d12/dfa/f11c [1047516,108626] 0 2026-03-09T17:30:14.664 INFO:tasks.workunit.client.1.vm09.stdout:9/257: chown d5/de/d29/c4d 6317 1 2026-03-09T17:30:14.664 INFO:tasks.workunit.client.1.vm09.stdout:9/258: truncate d5/f14 852528 0 2026-03-09T17:30:14.664 
INFO:tasks.workunit.client.1.vm09.stdout:6/267: rename d3/d1e/d30/f39 to d3/d1e/d30/d32/f53 0 2026-03-09T17:30:14.665 INFO:tasks.workunit.client.1.vm09.stdout:7/381: mkdir da/d11/d64/d84 0 2026-03-09T17:30:14.665 INFO:tasks.workunit.client.1.vm09.stdout:4/350: fdatasync d11/f4d 0 2026-03-09T17:30:14.665 INFO:tasks.workunit.client.1.vm09.stdout:1/299: mkdir d9/d5b 0 2026-03-09T17:30:14.668 INFO:tasks.workunit.client.1.vm09.stdout:9/259: dwrite d5/de/d4e/f56 [0,4194304] 0 2026-03-09T17:30:14.668 INFO:tasks.workunit.client.1.vm09.stdout:4/351: rmdir d11/d1e/d45/d60/d69 39 2026-03-09T17:30:14.669 INFO:tasks.workunit.client.1.vm09.stdout:9/260: read d5/f34 [5261958,72282] 0 2026-03-09T17:30:14.670 INFO:tasks.workunit.client.0.vm06.stdout:2/940: mknod d3/d4/d12/d2b/d2d/c133 0 2026-03-09T17:30:14.672 INFO:tasks.workunit.client.0.vm06.stdout:2/941: symlink d3/d4/d12/da7/dfc/l134 0 2026-03-09T17:30:14.677 INFO:tasks.workunit.client.1.vm09.stdout:7/382: creat da/d11/d2d/d56/f85 x:0 0 0 2026-03-09T17:30:14.681 INFO:tasks.workunit.client.1.vm09.stdout:9/261: readlink d5/de/d29/l3f 0 2026-03-09T17:30:14.681 INFO:tasks.workunit.client.1.vm09.stdout:6/268: creat d3/d21/d25/f54 x:0 0 0 2026-03-09T17:30:14.682 INFO:tasks.workunit.client.0.vm06.stdout:6/968: dread d6/d47/f49 [0,4194304] 0 2026-03-09T17:30:14.682 INFO:tasks.workunit.client.1.vm09.stdout:7/383: symlink da/d11/d41/d4e/d4c/l86 0 2026-03-09T17:30:14.685 INFO:tasks.workunit.client.1.vm09.stdout:9/262: symlink d5/de/d4e/l5b 0 2026-03-09T17:30:14.687 INFO:tasks.workunit.client.0.vm06.stdout:2/942: rename d3/d4/d12/d2b/d9f to d3/d4/d11b/d135 0 2026-03-09T17:30:14.688 INFO:tasks.workunit.client.1.vm09.stdout:1/300: dread d9/dc/dd/d40/d22/f4a [0,4194304] 0 2026-03-09T17:30:14.689 INFO:tasks.workunit.client.1.vm09.stdout:6/269: symlink d3/l55 0 2026-03-09T17:30:14.690 INFO:tasks.workunit.client.1.vm09.stdout:9/263: symlink d5/l5c 0 2026-03-09T17:30:14.691 INFO:tasks.workunit.client.1.vm09.stdout:1/301: symlink d9/d38/l5c 0 
2026-03-09T17:30:14.691 INFO:tasks.workunit.client.1.vm09.stdout:3/294: sync 2026-03-09T17:30:14.695 INFO:tasks.workunit.client.1.vm09.stdout:6/270: creat d3/d21/d25/d26/d34/f56 x:0 0 0 2026-03-09T17:30:14.695 INFO:tasks.workunit.client.1.vm09.stdout:9/264: dwrite d5/f14 [0,4194304] 0 2026-03-09T17:30:14.698 INFO:tasks.workunit.client.1.vm09.stdout:3/295: dread d5/d9/f4e [0,4194304] 0 2026-03-09T17:30:14.701 INFO:tasks.workunit.client.1.vm09.stdout:3/296: chown d5/d16/d31/d3d/d12/c48 124177737 1 2026-03-09T17:30:14.702 INFO:tasks.workunit.client.1.vm09.stdout:3/297: truncate d5/d16/d31/d3d/d12/f43 1089119 0 2026-03-09T17:30:14.705 INFO:tasks.workunit.client.0.vm06.stdout:6/969: symlink d6/d4f/d3e/l12d 0 2026-03-09T17:30:14.709 INFO:tasks.workunit.client.1.vm09.stdout:8/301: write d1/da/dd/f1e [4545765,83186] 0 2026-03-09T17:30:14.710 INFO:tasks.workunit.client.1.vm09.stdout:0/276: rmdir d6/d1d/d46 39 2026-03-09T17:30:14.710 INFO:tasks.workunit.client.1.vm09.stdout:5/290: truncate d0/dc/d21/f29 3130146 0 2026-03-09T17:30:14.713 INFO:tasks.workunit.client.1.vm09.stdout:1/302: symlink d9/dc/dd/d40/d22/d37/d3f/l5d 0 2026-03-09T17:30:14.714 INFO:tasks.workunit.client.1.vm09.stdout:9/265: fdatasync d5/d2e/f5a 0 2026-03-09T17:30:14.716 INFO:tasks.workunit.client.1.vm09.stdout:6/271: mknod d3/d1e/c57 0 2026-03-09T17:30:14.723 INFO:tasks.workunit.client.1.vm09.stdout:8/302: creat d1/da/dd/d47/f64 x:0 0 0 2026-03-09T17:30:14.728 INFO:tasks.workunit.client.1.vm09.stdout:6/272: dwrite d3/d21/d25/f2f [0,4194304] 0 2026-03-09T17:30:14.734 INFO:tasks.workunit.client.1.vm09.stdout:6/273: readlink d3/d21/d25/d26/d34/l3a 0 2026-03-09T17:30:14.734 INFO:tasks.workunit.client.1.vm09.stdout:0/277: creat d6/d1d/d39/f53 x:0 0 0 2026-03-09T17:30:14.734 INFO:tasks.workunit.client.1.vm09.stdout:0/278: readlink d6/d1d/l1f 0 2026-03-09T17:30:14.734 INFO:tasks.workunit.client.1.vm09.stdout:0/279: dwrite d6/d1d/d39/f2e [0,4194304] 0 2026-03-09T17:30:14.735 
INFO:tasks.workunit.client.1.vm09.stdout:6/274: truncate d3/d7/f40 69563 0 2026-03-09T17:30:14.738 INFO:tasks.workunit.client.0.vm06.stdout:6/970: dread d6/f97 [0,4194304] 0 2026-03-09T17:30:14.738 INFO:tasks.workunit.client.1.vm09.stdout:1/303: mknod d9/dc/dd/d40/d21/d35/c5e 0 2026-03-09T17:30:14.739 INFO:tasks.workunit.client.1.vm09.stdout:2/238: mknod d13/d15/d34/d45/c4e 0 2026-03-09T17:30:14.739 INFO:tasks.workunit.client.1.vm09.stdout:9/266: creat d5/f5d x:0 0 0 2026-03-09T17:30:14.742 INFO:tasks.workunit.client.1.vm09.stdout:9/267: dwrite d5/f14 [0,4194304] 0 2026-03-09T17:30:14.743 INFO:tasks.workunit.client.1.vm09.stdout:8/303: chown d1/da/dd/f22 418030 1 2026-03-09T17:30:14.747 INFO:tasks.workunit.client.1.vm09.stdout:6/275: creat d3/d7/f58 x:0 0 0 2026-03-09T17:30:14.750 INFO:tasks.workunit.client.1.vm09.stdout:6/276: chown d3/d21/d25/d26/d34/l3a 26 1 2026-03-09T17:30:14.754 INFO:tasks.workunit.client.1.vm09.stdout:3/298: creat d5/d16/f54 x:0 0 0 2026-03-09T17:30:14.754 INFO:tasks.workunit.client.1.vm09.stdout:5/291: creat d0/f60 x:0 0 0 2026-03-09T17:30:14.754 INFO:tasks.workunit.client.1.vm09.stdout:6/277: write d3/d1e/d30/d3f/f51 [475409,113831] 0 2026-03-09T17:30:14.757 INFO:tasks.workunit.client.1.vm09.stdout:2/239: dwrite d13/d15/d3b/f3f [0,4194304] 0 2026-03-09T17:30:14.758 INFO:tasks.workunit.client.1.vm09.stdout:8/304: dwrite d1/da/dd/d47/f64 [0,4194304] 0 2026-03-09T17:30:14.758 INFO:tasks.workunit.client.1.vm09.stdout:6/278: chown d3/d7/l16 25681 1 2026-03-09T17:30:14.763 INFO:tasks.workunit.client.1.vm09.stdout:1/304: sync 2026-03-09T17:30:14.764 INFO:tasks.workunit.client.0.vm06.stdout:2/943: dwrite d3/d4/d12/d2b/d2d/f1b [0,4194304] 0 2026-03-09T17:30:14.766 INFO:tasks.workunit.client.1.vm09.stdout:2/240: dread fb [0,4194304] 0 2026-03-09T17:30:14.766 INFO:tasks.workunit.client.0.vm06.stdout:2/944: fsync d3/d4/d46/da5/f104 0 2026-03-09T17:30:14.768 INFO:tasks.workunit.client.0.vm06.stdout:2/945: readlink d3/d4/d12/d71/daa/d77/d81/d64/l7a 0 
2026-03-09T17:30:14.773 INFO:tasks.workunit.client.1.vm09.stdout:7/384: dwrite da/d11/d41/f30 [4194304,4194304] 0 2026-03-09T17:30:14.782 INFO:tasks.workunit.client.1.vm09.stdout:7/385: dread da/d11/d41/f35 [0,4194304] 0 2026-03-09T17:30:14.788 INFO:tasks.workunit.client.1.vm09.stdout:8/305: mknod d1/d14/d2a/d42/d43/c65 0 2026-03-09T17:30:14.792 INFO:tasks.workunit.client.0.vm06.stdout:2/946: rename d3/d4/d12/d2b/d36/lf9 to d3/d4/d12/d71/daa/d77/d81/d130/dfd/l136 0 2026-03-09T17:30:14.793 INFO:tasks.workunit.client.1.vm09.stdout:6/279: truncate d3/d21/f22 239064 0 2026-03-09T17:30:14.793 INFO:tasks.workunit.client.1.vm09.stdout:2/241: mkdir d13/d15/d36/d4f 0 2026-03-09T17:30:14.799 INFO:tasks.workunit.client.1.vm09.stdout:7/386: symlink da/d11/d47/d5b/l87 0 2026-03-09T17:30:14.800 INFO:tasks.workunit.client.1.vm09.stdout:0/280: rename d6/d1d/d46/l51 to d6/l54 0 2026-03-09T17:30:14.802 INFO:tasks.workunit.client.1.vm09.stdout:8/306: unlink d1/da/dd/d3f/d32/c3b 0 2026-03-09T17:30:14.804 INFO:tasks.workunit.client.1.vm09.stdout:5/292: rename d0/dc/d21/d26/l54 to d0/d46/l61 0 2026-03-09T17:30:14.805 INFO:tasks.workunit.client.1.vm09.stdout:3/299: getdents d5/d16/d25 0 2026-03-09T17:30:14.806 INFO:tasks.workunit.client.1.vm09.stdout:7/387: rename da/d11/d41/d4e/f39 to da/d11/d3e/f88 0 2026-03-09T17:30:14.807 INFO:tasks.workunit.client.1.vm09.stdout:0/281: mknod d6/c55 0 2026-03-09T17:30:14.809 INFO:tasks.workunit.client.1.vm09.stdout:3/300: creat d5/d16/d46/f55 x:0 0 0 2026-03-09T17:30:14.810 INFO:tasks.workunit.client.1.vm09.stdout:2/242: dwrite d13/f39 [0,4194304] 0 2026-03-09T17:30:14.814 INFO:tasks.workunit.client.1.vm09.stdout:5/293: creat d0/dc/d21/f62 x:0 0 0 2026-03-09T17:30:14.814 INFO:tasks.workunit.client.1.vm09.stdout:8/307: dwrite d1/d14/f3d [0,4194304] 0 2026-03-09T17:30:14.816 INFO:tasks.workunit.client.1.vm09.stdout:0/282: symlink d6/d1d/d24/d32/l56 0 2026-03-09T17:30:14.816 INFO:tasks.workunit.client.1.vm09.stdout:5/294: chown d0/d52/c2b 756 1 
2026-03-09T17:30:14.817 INFO:tasks.workunit.client.1.vm09.stdout:1/305: getdents d9/dc/dd/d40/d22/d37 0
2026-03-09T17:30:14.817 INFO:tasks.workunit.client.1.vm09.stdout:2/243: read d13/d15/d3b/f3f [1742891,21061] 0
2026-03-09T17:30:14.819 INFO:tasks.workunit.client.1.vm09.stdout:3/301: truncate d5/d16/d31/d3d/d12/f39 569664 0
2026-03-09T17:30:14.820 INFO:tasks.workunit.client.1.vm09.stdout:2/244: chown d13/d15/f2b 7513 1
2026-03-09T17:30:14.831 INFO:tasks.workunit.client.0.vm06.stdout:6/971: dwrite d6/d4f/d3e/d52/d8c/db0/fc7 [4194304,4194304] 0
2026-03-09T17:30:14.834 INFO:tasks.workunit.client.1.vm09.stdout:7/388: fsync da/d11/d2d/d56/f50 0
2026-03-09T17:30:14.842 INFO:tasks.workunit.client.1.vm09.stdout:5/295: rename d0/d9/d16/d3c/f49 to d0/d52/d20/f63 0
2026-03-09T17:30:14.847 INFO:tasks.workunit.client.1.vm09.stdout:7/389: mkdir da/d11/d47/d89 0
2026-03-09T17:30:14.847 INFO:tasks.workunit.client.1.vm09.stdout:7/390: stat da/d11/d41/d4e/d4c/f67 0
2026-03-09T17:30:14.847 INFO:tasks.workunit.client.1.vm09.stdout:3/302: truncate d5/f22 6274623 0
2026-03-09T17:30:14.851 INFO:tasks.workunit.client.1.vm09.stdout:4/352: dread fe [4194304,4194304] 0
2026-03-09T17:30:14.851 INFO:tasks.workunit.client.1.vm09.stdout:4/353: chown d11/d1e/f22 5 1
2026-03-09T17:30:14.851 INFO:tasks.workunit.client.1.vm09.stdout:5/296: dwrite d0/d52/d20/f63 [0,4194304] 0
2026-03-09T17:30:14.851 INFO:tasks.workunit.client.1.vm09.stdout:2/245: mkdir d13/d50 0
2026-03-09T17:30:14.851 INFO:tasks.workunit.client.1.vm09.stdout:7/391: mknod da/d11/d47/c8a 0
2026-03-09T17:30:14.853 INFO:tasks.workunit.client.1.vm09.stdout:3/303: chown d5/c7 1 1
2026-03-09T17:30:14.854 INFO:tasks.workunit.client.1.vm09.stdout:7/392: dread - da/d11/f6a zero size
2026-03-09T17:30:14.854 INFO:tasks.workunit.client.1.vm09.stdout:3/304: chown d5/d16/d31/d3d/d12/f19 13234 1
2026-03-09T17:30:14.854 INFO:tasks.workunit.client.1.vm09.stdout:3/305: chown d5/d16/d31/d3d 837445 1
2026-03-09T17:30:14.854 INFO:tasks.workunit.client.1.vm09.stdout:5/297: creat d0/de/f64 x:0 0 0
2026-03-09T17:30:14.861 INFO:tasks.workunit.client.1.vm09.stdout:7/393: mknod da/d11/d41/d4e/d4c/c8b 0
2026-03-09T17:30:14.861 INFO:tasks.workunit.client.1.vm09.stdout:0/283: creat d6/d1d/f57 x:0 0 0
2026-03-09T17:30:14.861 INFO:tasks.workunit.client.1.vm09.stdout:3/306: rmdir d5/d16/d31 39
2026-03-09T17:30:14.862 INFO:tasks.workunit.client.1.vm09.stdout:5/298: creat d0/dc/d21/d33/f65 x:0 0 0
2026-03-09T17:30:14.862 INFO:tasks.workunit.client.1.vm09.stdout:5/299: chown d0/d2/f31 39 1
2026-03-09T17:30:14.862 INFO:tasks.workunit.client.1.vm09.stdout:2/246: symlink d13/d15/d36/l51 0
2026-03-09T17:30:14.865 INFO:tasks.workunit.client.1.vm09.stdout:2/247: chown d13/d15/d36/c42 166 1
2026-03-09T17:30:14.873 INFO:tasks.workunit.client.1.vm09.stdout:7/394: symlink da/d11/d47/d5b/d6c/l8c 0
2026-03-09T17:30:14.874 INFO:tasks.workunit.client.1.vm09.stdout:0/284: symlink d6/d1d/l58 0
2026-03-09T17:30:14.875 INFO:tasks.workunit.client.1.vm09.stdout:3/307: write d5/d16/d31/d3d/d12/f19 [4945702,116175] 0
2026-03-09T17:30:14.880 INFO:tasks.workunit.client.1.vm09.stdout:0/285: dwrite d6/d1d/d24/f50 [0,4194304] 0
2026-03-09T17:30:14.892 INFO:tasks.workunit.client.1.vm09.stdout:4/354: link d11/d1e/d29/f2e d11/d1e/f73 0
2026-03-09T17:30:14.892 INFO:tasks.workunit.client.1.vm09.stdout:2/248: fdatasync d13/f40 0
2026-03-09T17:30:14.892 INFO:tasks.workunit.client.1.vm09.stdout:7/395: creat da/d11/d47/f8d x:0 0 0
2026-03-09T17:30:14.892 INFO:tasks.workunit.client.1.vm09.stdout:7/396: write da/f27 [680065,14317] 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:3/308: creat d5/d16/d31/f56 x:0 0 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:3/309: truncate d5/d16/d31/d3d/d12/f4f 1836360 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:2/249: truncate d13/d15/d3b/f3f 4824749 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:0/286: mkdir d6/d1d/d24/d32/d59 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:4/355: readlink d11/d1e/d45/d60/d69/l4b 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:0/287: dread - d6/d1d/f57 zero size
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:3/310: rename d5/d16/d31/d3d/fb to d5/d16/d31/f57 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:5/300: link d0/d9/d16/d3c/f5f d0/d9/f66 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:5/301: write d0/d9/f3e [1746283,18343] 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:2/250: getdents d13/d15/d36/d4f 0
2026-03-09T17:30:14.893 INFO:tasks.workunit.client.1.vm09.stdout:5/302: write d0/f22 [1681873,62434] 0
2026-03-09T17:30:14.896 INFO:tasks.workunit.client.1.vm09.stdout:5/303: chown d0/dc/d21/d33/f35 60963 1
2026-03-09T17:30:14.904 INFO:tasks.workunit.client.1.vm09.stdout:9/268: write d5/de/d29/f57 [144117,96662] 0
2026-03-09T17:30:14.906 INFO:tasks.workunit.client.0.vm06.stdout:2/947: write d3/d4/d12/d2b/d36/d37/f3a [2692622,120835] 0
2026-03-09T17:30:14.910 INFO:tasks.workunit.client.1.vm09.stdout:8/308: rmdir d1/d14/d2a/d42 39
2026-03-09T17:30:14.910 INFO:tasks.workunit.client.1.vm09.stdout:8/309: stat d1/da/dd/f22 0
2026-03-09T17:30:14.910 INFO:tasks.workunit.client.1.vm09.stdout:6/280: unlink d3/d21/f22 0
2026-03-09T17:30:14.915 INFO:tasks.workunit.client.1.vm09.stdout:4/356: dread d11/d1e/d29/f32 [0,4194304] 0
2026-03-09T17:30:14.918 INFO:tasks.workunit.client.1.vm09.stdout:9/269: creat d5/d2e/f5e x:0 0 0
2026-03-09T17:30:14.918 INFO:tasks.workunit.client.1.vm09.stdout:4/357: creat d11/d1e/d31/f74 x:0 0 0
2026-03-09T17:30:14.919 INFO:tasks.workunit.client.1.vm09.stdout:2/251: fdatasync d13/f4c 0
2026-03-09T17:30:14.922 INFO:tasks.workunit.client.1.vm09.stdout:6/281: mkdir d3/d7/d59 0
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:6/282: stat d3/d7/f58 0
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:2/252: dread - d13/d15/d34/f44 zero size
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:6/283: mkdir d3/d7/d59/d5a 0
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:6/284: chown d3/d21/f28 2818183 1
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:8/310: dwrite d1/da/dd/d63/f1d [0,4194304] 0
2026-03-09T17:30:14.931 INFO:tasks.workunit.client.1.vm09.stdout:2/253: symlink d13/d15/d36/l52 0
2026-03-09T17:30:14.935 INFO:tasks.workunit.client.1.vm09.stdout:5/304: sync
2026-03-09T17:30:14.935 INFO:tasks.workunit.client.1.vm09.stdout:4/358: read d11/d1e/d29/f3b [1547718,16363] 0
2026-03-09T17:30:14.935 INFO:tasks.workunit.client.1.vm09.stdout:5/305: stat d0/d57 0
2026-03-09T17:30:14.935 INFO:tasks.workunit.client.1.vm09.stdout:4/359: chown d11/f4d 913 1
2026-03-09T17:30:14.935 INFO:tasks.workunit.client.1.vm09.stdout:4/360: sync
2026-03-09T17:30:14.936 INFO:tasks.workunit.client.1.vm09.stdout:4/361: write f3 [3129253,103950] 0
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:5/306: rmdir d0/d52 39
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:5/307: chown d0/de/f50 4425309 1
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:4/362: rename d11/f4d to d11/d1e/d45/d60/d69/d58/f75 0
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:5/308: symlink d0/d52/l67 0
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:5/309: read - d0/d46/f4c zero size
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:6/285: mknod d3/d7/c5b 0
2026-03-09T17:30:14.942 INFO:tasks.workunit.client.1.vm09.stdout:8/311: creat d1/da/dd/d47/f66 x:0 0 0
2026-03-09T17:30:14.945 INFO:tasks.workunit.client.1.vm09.stdout:4/363: creat d11/d1e/d45/d60/d71/f76 x:0 0 0
2026-03-09T17:30:14.945 INFO:tasks.workunit.client.1.vm09.stdout:2/254: dwrite d13/d15/d21/f3e [0,4194304] 0
2026-03-09T17:30:14.949 INFO:tasks.workunit.client.1.vm09.stdout:7/397: dread da/d11/d41/f30 [0,4194304] 0
2026-03-09T17:30:14.958 INFO:tasks.workunit.client.1.vm09.stdout:7/398: dwrite da/d11/d41/d4e/f42 [0,4194304] 0
2026-03-09T17:30:14.959 INFO:tasks.workunit.client.1.vm09.stdout:6/286: mkdir d3/d1e/d30/d5c 0
2026-03-09T17:30:14.960 INFO:tasks.workunit.client.1.vm09.stdout:5/310: mkdir d0/dc/d21/d26/d5e/d68 0
2026-03-09T17:30:14.976 INFO:tasks.workunit.client.1.vm09.stdout:5/311: dwrite d0/dc/d21/d33/f35 [0,4194304] 0
2026-03-09T17:30:14.977 INFO:tasks.workunit.client.1.vm09.stdout:5/312: write d0/dc/d21/f62 [98211,49872] 0
2026-03-09T17:30:14.979 INFO:tasks.workunit.client.1.vm09.stdout:1/306: write f8 [4917642,37244] 0
2026-03-09T17:30:14.985 INFO:tasks.workunit.client.1.vm09.stdout:5/313: creat d0/dc/d21/d33/f69 x:0 0 0
2026-03-09T17:30:14.985 INFO:tasks.workunit.client.1.vm09.stdout:2/255: rmdir d13/d15/d36/d4f 0
2026-03-09T17:30:14.986 INFO:tasks.workunit.client.1.vm09.stdout:5/314: write d0/dc/d21/d26/f28 [866629,25285] 0
2026-03-09T17:30:14.986 INFO:tasks.workunit.client.1.vm09.stdout:4/364: truncate d11/f18 1160719 0
2026-03-09T17:30:14.993 INFO:tasks.workunit.client.1.vm09.stdout:6/287: truncate d3/d7/ff 1363934 0
2026-03-09T17:30:14.997 INFO:tasks.workunit.client.1.vm09.stdout:2/256: fsync d13/d15/d34/f48 0
2026-03-09T17:30:14.998 INFO:tasks.workunit.client.1.vm09.stdout:5/315: symlink d0/d2/l6a 0
2026-03-09T17:30:14.998 INFO:tasks.workunit.client.1.vm09.stdout:6/288: write d3/d21/f4d [1051979,86396] 0
2026-03-09T17:30:14.999 INFO:tasks.workunit.client.1.vm09.stdout:2/257: symlink d13/d15/d2c/l53 0
2026-03-09T17:30:15.000 INFO:tasks.workunit.client.1.vm09.stdout:1/307: dread d9/dc/dd/d40/d22/d37/f41 [0,4194304] 0
2026-03-09T17:30:15.002 INFO:tasks.workunit.client.1.vm09.stdout:2/258: mknod d13/d15/d2c/c54 0
2026-03-09T17:30:15.003 INFO:tasks.workunit.client.1.vm09.stdout:1/308: rmdir d9/dc/dd/d40/d1d 39
2026-03-09T17:30:15.004 INFO:tasks.workunit.client.1.vm09.stdout:2/259: fsync d13/d15/d2c/f2d 0
2026-03-09T17:30:15.006 INFO:tasks.workunit.client.1.vm09.stdout:5/316: sync
2026-03-09T17:30:15.007 INFO:tasks.workunit.client.1.vm09.stdout:2/260: fsync d13/d15/d34/d45/f49 0
2026-03-09T17:30:15.009 INFO:tasks.workunit.client.1.vm09.stdout:5/317: creat d0/de/f6b x:0 0 0
2026-03-09T17:30:15.012 INFO:tasks.workunit.client.0.vm06.stdout:6/972: dwrite d6/d12/d17/d85/faf [0,4194304] 0
2026-03-09T17:30:15.023 INFO:tasks.workunit.client.1.vm09.stdout:5/318: creat d0/dc/d21/d33/f6c x:0 0 0
2026-03-09T17:30:15.026 INFO:tasks.workunit.client.1.vm09.stdout:2/261: link lf d13/d15/d34/d37/l55 0
2026-03-09T17:30:15.030 INFO:tasks.workunit.client.1.vm09.stdout:5/319: mkdir d0/dc/d21/d26/d5e/d68/d6d 0
2026-03-09T17:30:15.030 INFO:tasks.workunit.client.1.vm09.stdout:2/262: mknod d13/d15/c56 0
2026-03-09T17:30:15.030 INFO:tasks.workunit.client.1.vm09.stdout:2/263: chown d13/f23 0 1
2026-03-09T17:30:15.030 INFO:tasks.workunit.client.1.vm09.stdout:2/264: write d13/f40 [515401,121893] 0
2026-03-09T17:30:15.034 INFO:tasks.workunit.client.1.vm09.stdout:2/265: chown d13/d15/d34/d37/l55 43 1
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.1.vm09.stdout:2/266: unlink d13/f23 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.1.vm09.stdout:2/267: readlink l11 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/973: dread d6/d4f/d3e/d52/d8c/db0/fdb [0,4194304] 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/974: mknod d6/d47/d4d/d6d/c12e 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/975: creat d6/d12/d17/d85/f12f x:0 0 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/976: creat d6/d4f/d3e/d52/d8c/f130 x:0 0 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/977: creat d6/d12/d53/dd0/f131 x:0 0 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/978: mkdir d6/d47/d4d/d6d/de2/d132 0
2026-03-09T17:30:15.043 INFO:tasks.workunit.client.0.vm06.stdout:6/979: readlink d6/d12/d17/d65/l10a 0
2026-03-09T17:30:15.060 INFO:tasks.workunit.client.1.vm09.stdout:3/311: read d5/d16/d31/f57 [2282161,13682] 0
2026-03-09T17:30:15.062 INFO:tasks.workunit.client.1.vm09.stdout:3/312: write d5/d16/d25/f2c [834918,16358] 0
2026-03-09T17:30:15.063 INFO:tasks.workunit.client.1.vm09.stdout:3/313: readlink d5/d16/l52 0
2026-03-09T17:30:15.066 INFO:tasks.workunit.client.1.vm09.stdout:0/288: rmdir d6/d1d/d24 39
2026-03-09T17:30:15.068 INFO:tasks.workunit.client.1.vm09.stdout:0/289: fsync d6/f21 0
2026-03-09T17:30:15.075 INFO:tasks.workunit.client.1.vm09.stdout:0/290: dread d6/d1d/d39/f2e [0,4194304] 0
2026-03-09T17:30:15.075 INFO:tasks.workunit.client.1.vm09.stdout:3/314: dread d5/d16/d31/f57 [0,4194304] 0
2026-03-09T17:30:15.076 INFO:tasks.workunit.client.1.vm09.stdout:0/291: readlink d6/d1d/l1f 0
2026-03-09T17:30:15.076 INFO:tasks.workunit.client.1.vm09.stdout:3/315: stat d5/d16/d31/d3d/d12/f18 0
2026-03-09T17:30:15.077 INFO:tasks.workunit.client.0.vm06.stdout:2/948: write d3/f91 [2427693,121086] 0
2026-03-09T17:30:15.078 INFO:tasks.workunit.client.0.vm06.stdout:2/949: stat d3/d4/d12/d71/daa/d77/d102/l117 0
2026-03-09T17:30:15.081 INFO:tasks.workunit.client.1.vm09.stdout:0/292: fdatasync d6/f2d 0
2026-03-09T17:30:15.081 INFO:tasks.workunit.client.0.vm06.stdout:2/950: truncate d3/d4/d12/f31 1727360 0
2026-03-09T17:30:15.081 INFO:tasks.workunit.client.1.vm09.stdout:7/399: dread da/d11/d41/d4e/d4c/f67 [0,4194304] 0
2026-03-09T17:30:15.083 INFO:tasks.workunit.client.1.vm09.stdout:3/316: mkdir d5/d16/d31/d37/d58 0
2026-03-09T17:30:15.087 INFO:tasks.workunit.client.1.vm09.stdout:1/309: dread f8 [4194304,4194304] 0
2026-03-09T17:30:15.087 INFO:tasks.workunit.client.0.vm06.stdout:2/951: truncate d3/d4/d12/d2b/d2d/f9d 63255 0
2026-03-09T17:30:15.088 INFO:tasks.workunit.client.0.vm06.stdout:2/952: truncate d3/d4/d12/d71/daa/d77/d81/f127 78611 0
2026-03-09T17:30:15.089 INFO:tasks.workunit.client.0.vm06.stdout:2/953: truncate d3/d4/d12/d71/ffb 311217 0
2026-03-09T17:30:15.091 INFO:tasks.workunit.client.1.vm09.stdout:1/310: chown d9/dc/dd/d40/d21/d35 501 1
2026-03-09T17:30:15.093 INFO:tasks.workunit.client.1.vm09.stdout:9/270: getdents d5/d2e 0
2026-03-09T17:30:15.094 INFO:tasks.workunit.client.1.vm09.stdout:9/271: write d5/d21/f46 [1400330,84547] 0
2026-03-09T17:30:15.095 INFO:tasks.workunit.client.1.vm09.stdout:0/293: write d6/d1d/d24/f4e [276545,126597] 0
2026-03-09T17:30:15.101 INFO:tasks.workunit.client.1.vm09.stdout:7/400: creat da/d11/d41/d4e/f8e x:0 0 0
2026-03-09T17:30:15.101 INFO:tasks.workunit.client.1.vm09.stdout:1/311: read d9/dc/dd/fe [8203936,82558] 0
2026-03-09T17:30:15.102 INFO:tasks.workunit.client.1.vm09.stdout:7/401: dread - da/d11/d41/d4e/f7d zero size
2026-03-09T17:30:15.109 INFO:tasks.workunit.client.0.vm06.stdout:2/954: dread d3/d4/d12/d71/daa/d77/d81/d130/d8f/fb7 [0,4194304] 0
2026-03-09T17:30:15.109 INFO:tasks.workunit.client.0.vm06.stdout:2/955: dread - d3/d4/d12/d71/daa/d77/d81/d64/de5/f131 zero size
2026-03-09T17:30:15.110 INFO:tasks.workunit.client.0.vm06.stdout:2/956: write d3/d4/d12/d2b/fb6 [2838617,65261] 0
2026-03-09T17:30:15.111 INFO:tasks.workunit.client.0.vm06.stdout:2/957: chown d3/d4/d12/da7/dfc/f12c 2 1
2026-03-09T17:30:15.113 INFO:tasks.workunit.client.1.vm09.stdout:0/294: unlink d6/d1d/d24/d32/l56 0
2026-03-09T17:30:15.115 INFO:tasks.workunit.client.1.vm09.stdout:1/312: mknod d9/dc/dd/c5f 0
2026-03-09T17:30:15.115 INFO:tasks.workunit.client.1.vm09.stdout:8/312: write d1/d14/d2a/f2b [8225745,95629] 0
2026-03-09T17:30:15.116 INFO:tasks.workunit.client.1.vm09.stdout:1/313: chown d9/dc/dd/l2c 10 1
2026-03-09T17:30:15.121 INFO:tasks.workunit.client.0.vm06.stdout:2/958: fdatasync d3/d4/d12/da7/dfc/f10c 0
2026-03-09T17:30:15.122 INFO:tasks.workunit.client.1.vm09.stdout:7/402: dwrite da/d11/d41/f30 [0,4194304] 0
2026-03-09T17:30:15.123 INFO:tasks.workunit.client.0.vm06.stdout:2/959: truncate d3/d4/d12/d2b/d36/dd4/f113 741426 0
2026-03-09T17:30:15.127 INFO:tasks.workunit.client.1.vm09.stdout:4/365: truncate d11/d1e/d29/f50 2578423 0
2026-03-09T17:30:15.128 INFO:tasks.workunit.client.1.vm09.stdout:4/366: dread - d11/d1e/d29/f6d zero size
2026-03-09T17:30:15.130 INFO:tasks.workunit.client.1.vm09.stdout:6/289: write d3/d1e/f20 [295006,104850] 0
2026-03-09T17:30:15.131 INFO:tasks.workunit.client.1.vm09.stdout:7/403: chown da/d11/d41/c55 22192578 1
2026-03-09T17:30:15.136 INFO:tasks.workunit.client.1.vm09.stdout:8/313: unlink d1/f8 0
2026-03-09T17:30:15.136 INFO:tasks.workunit.client.1.vm09.stdout:1/314: symlink d9/dc/dd/d40/d21/d35/l60 0
2026-03-09T17:30:15.137 INFO:tasks.workunit.client.1.vm09.stdout:6/290: creat d3/d21/f5d x:0 0 0
2026-03-09T17:30:15.138 INFO:tasks.workunit.client.1.vm09.stdout:4/367: mknod d11/d1e/d29/d36/d57/c77 0
2026-03-09T17:30:15.138 INFO:tasks.workunit.client.1.vm09.stdout:4/368: chown d11 21673 1
2026-03-09T17:30:15.138 INFO:tasks.workunit.client.1.vm09.stdout:7/404: mkdir da/d11/d47/d8f 0
2026-03-09T17:30:15.139 INFO:tasks.workunit.client.1.vm09.stdout:1/315: mkdir d9/d38/d61 0
2026-03-09T17:30:15.140 INFO:tasks.workunit.client.1.vm09.stdout:1/316: read - d9/f59 zero size
2026-03-09T17:30:15.140 INFO:tasks.workunit.client.1.vm09.stdout:6/291: rename d3/d7/c5b to d3/d1e/d30/d32/c5e 0
2026-03-09T17:30:15.141 INFO:tasks.workunit.client.1.vm09.stdout:9/272: link d5/f47 d5/de/d29/f5f 0
2026-03-09T17:30:15.141 INFO:tasks.workunit.client.1.vm09.stdout:1/317: stat d9/dc/dd/d40/l27 0
2026-03-09T17:30:15.141 INFO:tasks.workunit.client.1.vm09.stdout:6/292: fdatasync d3/fc 0
2026-03-09T17:30:15.143 INFO:tasks.workunit.client.1.vm09.stdout:1/318: creat d9/dc/dd/d40/d22/d37/d3f/f62 x:0 0 0
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:1/319: dread - d9/dc/dd/f4f zero size
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:1/320: stat d9/dc/dd/d40/l2a 0
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:1/321: fdatasync d9/f11 0
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:6/293: creat d3/d21/d25/f5f x:0 0 0
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:9/273: truncate d5/de/d29/f5f 3897707 0
2026-03-09T17:30:15.149 INFO:tasks.workunit.client.1.vm09.stdout:9/274: write d5/de/f3c [3541390,20212] 0
2026-03-09T17:30:15.151 INFO:tasks.workunit.client.1.vm09.stdout:9/275: rename d5/de/l27 to d5/d21/l60 0
2026-03-09T17:30:15.153 INFO:tasks.workunit.client.1.vm09.stdout:6/294: dwrite f2 [0,4194304] 0
2026-03-09T17:30:15.153 INFO:tasks.workunit.client.1.vm09.stdout:9/276: mknod d5/de/c61 0
2026-03-09T17:30:15.154 INFO:tasks.workunit.client.1.vm09.stdout:1/322: dwrite d9/dc/dd/d40/d22/d37/d3f/f62 [0,4194304] 0
2026-03-09T17:30:15.154 INFO:tasks.workunit.client.1.vm09.stdout:9/277: symlink d5/d2e/l62 0
2026-03-09T17:30:15.155 INFO:tasks.workunit.client.1.vm09.stdout:4/369: sync
2026-03-09T17:30:15.171 INFO:tasks.workunit.client.1.vm09.stdout:1/323: write d9/dc/dd/fe [7382542,31302] 0
2026-03-09T17:30:15.192 INFO:tasks.workunit.client.1.vm09.stdout:1/324: sync
2026-03-09T17:30:15.193 INFO:tasks.workunit.client.1.vm09.stdout:1/325: rmdir d9/d3a 39
2026-03-09T17:30:15.195 INFO:tasks.workunit.client.1.vm09.stdout:1/326: rename d9/d5b to d9/dc/d63 0
2026-03-09T17:30:15.199 INFO:tasks.workunit.client.1.vm09.stdout:1/327: unlink d9/dc/dd/f28 0
2026-03-09T17:30:15.200 INFO:tasks.workunit.client.1.vm09.stdout:1/328: link d9/dc/dd/c5f d9/dc/dd/d40/d22/d37/d3f/d42/d55/c64 0
2026-03-09T17:30:15.214 INFO:tasks.workunit.client.1.vm09.stdout:5/320: truncate d0/d9/f3e 1685293 0
2026-03-09T17:30:15.215 INFO:tasks.workunit.client.1.vm09.stdout:2/268: rmdir d13 39
2026-03-09T17:30:15.215 INFO:tasks.workunit.client.0.vm06.stdout:6/980: write d6/d47/d4d/fae [1919853,89916] 0
2026-03-09T17:30:15.217 INFO:tasks.workunit.client.0.vm06.stdout:6/981: rmdir d6/d12/d2d 39
2026-03-09T17:30:15.218 INFO:tasks.workunit.client.0.vm06.stdout:6/982: mknod d6/c133 0
2026-03-09T17:30:15.219 INFO:tasks.workunit.client.1.vm09.stdout:1/329: link d9/dc/dd/d40/l58 d9/dc/d63/l65 0
2026-03-09T17:30:15.220 INFO:tasks.workunit.client.0.vm06.stdout:6/983: mkdir d6/d47/d4d/d9a/d134 0
2026-03-09T17:30:15.221 INFO:tasks.workunit.client.1.vm09.stdout:5/321: symlink d0/d9/d16/d3c/d42/l6e 0
2026-03-09T17:30:15.223 INFO:tasks.workunit.client.1.vm09.stdout:1/330: fsync d9/f11 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.1.vm09.stdout:5/322: dwrite d0/d46/f56 [0,4194304] 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.1.vm09.stdout:5/323: chown d0/dc/d21/d26/l3b 178 1
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/984: link d6/d47/d4d/d6d/cef d6/d47/d96/d40/c135 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/985: fdatasync d6/d12/d53/f64 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/986: mknod d6/d4f/d3e/d52/d80/c136 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/987: fsync d6/d4f/fee 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/988: rmdir d6/d47/d8a 39
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/989: write d6/d12/d17/d85/f12f [671199,11674] 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/990: truncate d6/d4f/f44 723152 0
2026-03-09T17:30:15.231 INFO:tasks.workunit.client.0.vm06.stdout:6/991: fsync d6/d4f/f26 0
2026-03-09T17:30:15.233 INFO:tasks.workunit.client.0.vm06.stdout:6/992: rename d6/d47/d4d/d6d/c12e to d6/d47/d8a/c137 0
2026-03-09T17:30:15.234 INFO:tasks.workunit.client.1.vm09.stdout:5/324: unlink d0/f53 0
2026-03-09T17:30:15.234 INFO:tasks.workunit.client.0.vm06.stdout:6/993: fdatasync d6/d47/d96/d40/fbd 0
2026-03-09T17:30:15.234 INFO:tasks.workunit.client.1.vm09.stdout:5/325: dread - d0/dc/d21/d33/f69 zero size
2026-03-09T17:30:15.235 INFO:tasks.workunit.client.1.vm09.stdout:5/326: stat d0/d52/d20/f25 0
2026-03-09T17:30:15.235 INFO:tasks.workunit.client.1.vm09.stdout:5/327: write d0/dc/d21/f62 [1142165,24845] 0
2026-03-09T17:30:15.235 INFO:tasks.workunit.client.0.vm06.stdout:6/994: getdents d6/d4f/d129 0
2026-03-09T17:30:15.238 INFO:tasks.workunit.client.1.vm09.stdout:1/331: dwrite d9/dc/dd/d40/d1d/f17 [0,4194304] 0
2026-03-09T17:30:15.244 INFO:tasks.workunit.client.1.vm09.stdout:1/332: symlink d9/dc/dd/d40/d22/d37/d3f/l66 0
2026-03-09T17:30:15.248 INFO:tasks.workunit.client.1.vm09.stdout:0/295: dwrite d6/f21 [0,4194304] 0
2026-03-09T17:30:15.255 INFO:tasks.workunit.client.1.vm09.stdout:0/296: symlink d6/d1d/d24/d32/l5a 0
2026-03-09T17:30:15.272 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: pgmap v7: 65 pgs: 65 active+clean; 2.8 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 25 MiB/s rd, 91 MiB/s wr, 194 op/s
2026-03-09T17:30:15.272 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.272 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.272 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.273 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.273 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:30:15.273 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:14 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:30:15.322 INFO:tasks.workunit.client.1.vm09.stdout:3/317: truncate d5/d9/d30/f41 2873385 0
2026-03-09T17:30:15.330 INFO:tasks.workunit.client.0.vm06.stdout:2/960: write d3/d4/d12/da7/f106 [35536,1569] 0
2026-03-09T17:30:15.333 INFO:tasks.workunit.client.0.vm06.stdout:2/961: write d3/d4/d12/d71/ffb [945500,107891] 0
2026-03-09T17:30:15.337 INFO:tasks.workunit.client.1.vm09.stdout:3/318: mkdir d5/d16/d31/d3d/d12/d59 0
2026-03-09T17:30:15.339 INFO:tasks.workunit.client.0.vm06.stdout:2/962: dread d3/fc7 [0,4194304] 0
2026-03-09T17:30:15.341 INFO:tasks.workunit.client.0.vm06.stdout:2/963: write d3/d4/d12/d2b/d2d/f100 [3889692,90904] 0
2026-03-09T17:30:15.347 INFO:tasks.workunit.client.1.vm09.stdout:8/314: write d1/da/dd/f27 [76298,86259] 0
2026-03-09T17:30:15.350 INFO:tasks.workunit.client.1.vm09.stdout:8/315: write d1/da/dd/d3f/f1c [4676771,90198] 0
2026-03-09T17:30:15.359 INFO:tasks.workunit.client.0.vm06.stdout:2/964: rename d3/d4/d12/d71/daa/d77/d81/d130/d8f/dda/lff to d3/d4/d12/d2b/d36/d37/l137 0
2026-03-09T17:30:15.360 INFO:tasks.workunit.client.1.vm09.stdout:8/316: creat d1/da/dd/d47/d4c/f67 x:0 0 0
2026-03-09T17:30:15.365 INFO:tasks.workunit.client.1.vm09.stdout:3/319: dread d5/d16/d31/d3d/d32/f33 [0,4194304] 0
2026-03-09T17:30:15.367 INFO:tasks.workunit.client.1.vm09.stdout:7/405: truncate da/d11/d41/f30 3058182 0
2026-03-09T17:30:15.374 INFO:tasks.workunit.client.1.vm09.stdout:9/278: rmdir d5/de 39
2026-03-09T17:30:15.374 INFO:tasks.workunit.client.1.vm09.stdout:8/317: read d1/d14/f3c [3538646,32280] 0
2026-03-09T17:30:15.375 INFO:tasks.workunit.client.1.vm09.stdout:7/406: dread da/d11/d41/d4e/f7c [0,4194304] 0
2026-03-09T17:30:15.376 INFO:tasks.workunit.client.1.vm09.stdout:7/407: fdatasync da/d11/d41/d4e/f8e 0
2026-03-09T17:30:15.377 INFO:tasks.workunit.client.1.vm09.stdout:7/408: fsync da/d11/f6a 0
2026-03-09T17:30:15.378 INFO:tasks.workunit.client.1.vm09.stdout:7/409: write da/d11/d41/d4e/f63 [590109,35121] 0
2026-03-09T17:30:15.382 INFO:tasks.workunit.client.1.vm09.stdout:8/318: mknod d1/da/dd/d3f/d32/c68 0
2026-03-09T17:30:15.383 INFO:tasks.workunit.client.1.vm09.stdout:4/370: write d11/d1e/d29/d36/f6a [25457,109584] 0
2026-03-09T17:30:15.383 INFO:tasks.workunit.client.1.vm09.stdout:8/319: chown d1/da/l39 14820682 1
2026-03-09T17:30:15.388 INFO:tasks.workunit.client.1.vm09.stdout:6/295: truncate d3/d21/f4d 795434 0
2026-03-09T17:30:15.389 INFO:tasks.workunit.client.1.vm09.stdout:6/296: chown d3/d21/d25/d26/d34/l3a 80833011 1
2026-03-09T17:30:15.389 INFO:tasks.workunit.client.1.vm09.stdout:9/279: fdatasync d5/de/f20 0
2026-03-09T17:30:15.390 INFO:tasks.workunit.client.1.vm09.stdout:9/280: chown d5/d2e 1846732508 1
2026-03-09T17:30:15.390 INFO:tasks.workunit.client.1.vm09.stdout:7/410: write da/fb [151951,8285] 0
2026-03-09T17:30:15.391 INFO:tasks.workunit.client.1.vm09.stdout:9/281: truncate d5/d21/f38 473791 0
2026-03-09T17:30:15.392 INFO:tasks.workunit.client.1.vm09.stdout:4/371: mkdir d11/d1e/d29/d36/d57/d78 0
2026-03-09T17:30:15.394 INFO:tasks.workunit.client.1.vm09.stdout:6/297: creat d3/d1e/d30/d32/f60 x:0 0 0
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: pgmap v7: 65 pgs: 65 active+clean; 2.8 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 25 MiB/s rd, 91 MiB/s wr, 194 op/s
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:30:15.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:14 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T17:30:15.420 INFO:tasks.workunit.client.1.vm09.stdout:6/298: chown d3/f2e 26444 1
2026-03-09T17:30:15.426 INFO:tasks.workunit.client.1.vm09.stdout:5/328: rename d0/d9/d16/d3c to d0/dc/d21/d6f 0
2026-03-09T17:30:15.433 INFO:tasks.workunit.client.1.vm09.stdout:4/372: creat d11/d1e/d29/d36/d57/f79 x:0 0 0
2026-03-09T17:30:15.439 INFO:tasks.workunit.client.0.vm06.stdout:6/995: write d6/d4f/fee [915298,23527] 0
2026-03-09T17:30:15.443 INFO:tasks.workunit.client.1.vm09.stdout:6/299: chown d3/d7/f4c 61388771 1
2026-03-09T17:30:15.447 INFO:tasks.workunit.client.1.vm09.stdout:7/411: link da/d11/d2d/d56/l48 da/d11/d41/d4e/d4c/l90 0
2026-03-09T17:30:15.448 INFO:tasks.workunit.client.1.vm09.stdout:7/412: write da/f27 [1582282,106190] 0
2026-03-09T17:30:15.448 INFO:tasks.workunit.client.0.vm06.stdout:6/996: rename d6/d4f/fee to d6/d12/d53/dd0/f138 0
2026-03-09T17:30:15.452 INFO:tasks.workunit.client.1.vm09.stdout:2/269: creat d13/d15/d34/d45/f57 x:0 0 0
2026-03-09T17:30:15.460 INFO:tasks.workunit.client.1.vm09.stdout:5/329: creat d0/d9/d16/d5c/f70 x:0 0 0
2026-03-09T17:30:15.461 INFO:tasks.workunit.client.0.vm06.stdout:6/997: dwrite d6/d47/d4d/d9a/f12c [0,4194304] 0
2026-03-09T17:30:15.461 INFO:tasks.workunit.client.1.vm09.stdout:5/330: write d0/f60 [512157,99665] 0
2026-03-09T17:30:15.463 INFO:tasks.workunit.client.1.vm09.stdout:5/331: read - d0/dc/d21/d33/f65 zero size
2026-03-09T17:30:15.474 INFO:tasks.workunit.client.0.vm06.stdout:6/998: creat d6/d12/d53/dd0/f139 x:0 0 0
2026-03-09T17:30:15.478 INFO:tasks.workunit.client.1.vm09.stdout:7/413: mknod da/d11/d41/c91 0
2026-03-09T17:30:15.485 INFO:tasks.workunit.client.1.vm09.stdout:5/332: symlink d0/d55/l71 0
2026-03-09T17:30:15.491 INFO:tasks.workunit.client.0.vm06.stdout:6/999: rename d6/d12/c69 to d6/c13a 0
2026-03-09T17:30:15.497 INFO:tasks.workunit.client.1.vm09.stdout:5/333: read d0/d2/f2a [2679844,113888] 0
2026-03-09T17:30:15.497 INFO:tasks.workunit.client.1.vm09.stdout:2/270: unlink d13/d15/d36/l51 0
2026-03-09T17:30:15.505 INFO:tasks.workunit.client.1.vm09.stdout:5/334: symlink d0/d9/l72 0
2026-03-09T17:30:15.507 INFO:tasks.workunit.client.1.vm09.stdout:2/271: unlink d13/d15/c17 0
2026-03-09T17:30:15.507 INFO:tasks.workunit.client.1.vm09.stdout:2/272: chown l10 396169 1
2026-03-09T17:30:15.518 INFO:tasks.workunit.client.1.vm09.stdout:1/333: truncate d9/dc/dd/d40/d1d/f17 1363635 0
2026-03-09T17:30:15.522 INFO:tasks.workunit.client.1.vm09.stdout:0/297: dwrite d6/d1d/f37 [4194304,4194304] 0
2026-03-09T17:30:15.524 INFO:tasks.workunit.client.1.vm09.stdout:2/273: readlink d13/l1c 0
2026-03-09T17:30:15.528 INFO:tasks.workunit.client.1.vm09.stdout:1/334: creat d9/dc/d63/f67 x:0 0 0
2026-03-09T17:30:15.538 INFO:tasks.workunit.client.1.vm09.stdout:0/298: fsync d6/d1d/f3c 0
2026-03-09T17:30:15.544 INFO:tasks.workunit.client.1.vm09.stdout:1/335: creat d9/dc/dd/d40/d22/d37/d3f/f68 x:0 0 0
2026-03-09T17:30:15.546 INFO:tasks.workunit.client.1.vm09.stdout:0/299: mkdir d6/d1d/d24/d32/d59/d5b 0
2026-03-09T17:30:15.546 INFO:tasks.workunit.client.1.vm09.stdout:2/274: symlink d13/d15/d3b/l58 0
2026-03-09T17:30:15.547 INFO:tasks.workunit.client.1.vm09.stdout:1/336: stat d9/d3a 0
2026-03-09T17:30:15.551 INFO:tasks.workunit.client.1.vm09.stdout:0/300: creat d6/d1d/d24/d32/d59/f5c x:0 0 0
2026-03-09T17:30:15.551 INFO:tasks.workunit.client.1.vm09.stdout:1/337: creat d9/dc/dd/d40/d22/d37/d3f/d42/d55/f69 x:0 0 0
2026-03-09T17:30:15.554 INFO:tasks.workunit.client.1.vm09.stdout:1/338: write d9/dc/dd/fe [6464682,102837] 0
2026-03-09T17:30:15.562 INFO:tasks.workunit.client.1.vm09.stdout:0/301: fdatasync d6/f2d 0
2026-03-09T17:30:15.568 INFO:tasks.workunit.client.1.vm09.stdout:1/339: symlink d9/d3a/l6a 0
2026-03-09T17:30:15.568 INFO:tasks.workunit.client.1.vm09.stdout:1/340: fdatasync f2 0
2026-03-09T17:30:15.569 INFO:tasks.workunit.client.1.vm09.stdout:1/341: fdatasync d9/dc/dd/d40/d22/d37/f2e 0
2026-03-09T17:30:15.569 INFO:tasks.workunit.client.1.vm09.stdout:1/342: fdatasync d9/dc/f47 0
2026-03-09T17:30:15.575 INFO:tasks.workunit.client.1.vm09.stdout:1/343: dwrite d9/dc/dd/d40/d22/d37/f2e [0,4194304] 0
2026-03-09T17:30:15.593 INFO:tasks.workunit.client.1.vm09.stdout:1/344: link d9/dc/dd/d40/l23 d9/dc/d63/l6b 0
2026-03-09T17:30:15.596 INFO:tasks.workunit.client.0.vm06.stdout:2/965: write d3/d4/d12/d2b/f7e [3370689,56166] 0
2026-03-09T17:30:15.603 INFO:tasks.workunit.client.0.vm06.stdout:2/966: rename d3/d4/d12/da7/dfc/l11a to d3/d4/d46/da5/l138 0
2026-03-09T17:30:15.607 INFO:tasks.workunit.client.1.vm09.stdout:1/345: chown d9/dc/d63/l65 1 1
2026-03-09T17:30:15.610 INFO:tasks.workunit.client.0.vm06.stdout:2/967: mknod d3/d4/d12/d71/daa/d77/d81/d64/d6a/dba/c139 0
2026-03-09T17:30:15.615 INFO:tasks.workunit.client.1.vm09.stdout:3/320: truncate d5/d16/d31/d3d/d12/f43 1065430 0
2026-03-09T17:30:15.617 INFO:tasks.workunit.client.0.vm06.stdout:2/968: truncate d3/d4/d12/d2b/d36/dd4/f113 993112 0
2026-03-09T17:30:15.617 INFO:tasks.workunit.client.0.vm06.stdout:2/969: dread - d3/d4/d46/f132 zero size
2026-03-09T17:30:15.623 INFO:tasks.workunit.client.1.vm09.stdout:1/346: dread d9/dc/dd/d40/d21/f33 [0,4194304] 0
2026-03-09T17:30:15.624 INFO:tasks.workunit.client.1.vm09.stdout:1/347: stat d9/dc/dd/d40/d21/f33 0
2026-03-09T17:30:15.629 INFO:tasks.workunit.client.1.vm09.stdout:6/300: rename d3/d1e/d30/d32 to d3/d1e/d30/d5c/d61 0
2026-03-09T17:30:15.632 INFO:tasks.workunit.client.0.vm06.stdout:2/970: creat d3/d4/d11b/f13a x:0 0 0
2026-03-09T17:30:15.635 INFO:tasks.workunit.client.1.vm09.stdout:9/282: dwrite d5/de/d29/d33/f4a [0,4194304] 0
2026-03-09T17:30:15.648 INFO:tasks.workunit.client.1.vm09.stdout:6/301: dread d3/f2e [0,4194304] 0
2026-03-09T17:30:15.651 INFO:tasks.workunit.client.1.vm09.stdout:5/335: rename d0/dc/f44 to d0/d9/d16/d5c/f73 0
2026-03-09T17:30:15.655 INFO:tasks.workunit.client.1.vm09.stdout:6/302: dwrite d3/d1e/d30/d3f/f42 [0,4194304] 0
2026-03-09T17:30:15.659 INFO:tasks.workunit.client.1.vm09.stdout:9/283: symlink d5/de/d4e/l63 0
2026-03-09T17:30:15.661 INFO:tasks.workunit.client.1.vm09.stdout:5/336: dwrite d0/d52/d20/f63 [4194304,4194304] 0
2026-03-09T17:30:15.665 INFO:tasks.workunit.client.1.vm09.stdout:9/284: fdatasync d5/de/d29/f52 0
2026-03-09T17:30:15.665 INFO:tasks.workunit.client.1.vm09.stdout:9/285: chown d5 19 1
2026-03-09T17:30:15.665 INFO:tasks.workunit.client.1.vm09.stdout:9/286: chown d5/de/d29/d33/c3e 115109 1
2026-03-09T17:30:15.672 INFO:tasks.workunit.client.1.vm09.stdout:4/373: write d11/d1e/d45/d60/d69/d58/f75 [1005151,7301] 0
2026-03-09T17:30:15.675 INFO:tasks.workunit.client.1.vm09.stdout:7/414: write da/d11/f1f [8165667,126420] 0
2026-03-09T17:30:15.680 INFO:tasks.workunit.client.1.vm09.stdout:5/337: truncate d0/d2/f2a 3211977 0
2026-03-09T17:30:15.682 INFO:tasks.workunit.client.1.vm09.stdout:9/287: rmdir d5/d21 39
2026-03-09T17:30:15.684 INFO:tasks.workunit.client.1.vm09.stdout:9/288: stat d5/de/f3c 0
2026-03-09T17:30:15.685 INFO:tasks.workunit.client.1.vm09.stdout:4/374: fdatasync d11/d1e/d29/d36/d57/f67 0
2026-03-09T17:30:15.687 INFO:tasks.workunit.client.1.vm09.stdout:6/303: dwrite d3/d21/d25/d26/d34/f56 [0,4194304] 0
2026-03-09T17:30:15.688 INFO:tasks.workunit.client.1.vm09.stdout:5/338: read d0/dc/d21/d26/f3d [2903095,12067] 0
2026-03-09T17:30:15.688 INFO:tasks.workunit.client.1.vm09.stdout:0/302: rename d6/f2d to d6/d1d/d24/f5d 0
2026-03-09T17:30:15.693 INFO:tasks.workunit.client.1.vm09.stdout:5/339: read d0/de/f50 [3642525,113905] 0
2026-03-09T17:30:15.705 INFO:tasks.workunit.client.1.vm09.stdout:7/415: fdatasync da/f21 0
2026-03-09T17:30:15.711 INFO:tasks.workunit.client.1.vm09.stdout:2/275: write d13/d15/d21/f27 [604632,123049] 0
2026-03-09T17:30:15.713 INFO:tasks.workunit.client.1.vm09.stdout:3/321: rename d5/d16/l52 to d5/d16/d31/d37/d58/l5a 0
2026-03-09T17:30:15.717 INFO:tasks.workunit.client.1.vm09.stdout:6/304: dread d3/f4f [0,4194304] 0
2026-03-09T17:30:15.726 INFO:tasks.workunit.client.1.vm09.stdout:6/305: truncate d3/d21/d25/f54 278539 0
2026-03-09T17:30:15.726 INFO:tasks.workunit.client.1.vm09.stdout:3/322: dwrite d5/d16/f45 [0,4194304] 0
2026-03-09T17:30:15.729 INFO:tasks.workunit.client.1.vm09.stdout:5/340: mkdir d0/d9/d74 0
2026-03-09T17:30:15.729 INFO:tasks.workunit.client.1.vm09.stdout:2/276: chown d13/f14 147342 1
2026-03-09T17:30:15.735 INFO:tasks.workunit.client.1.vm09.stdout:9/289: rmdir d5/de/d4e/d54 0
2026-03-09T17:30:15.738 INFO:tasks.workunit.client.1.vm09.stdout:6/306: dwrite d3/d7/fe [0,4194304] 0
2026-03-09T17:30:15.738 INFO:tasks.workunit.client.1.vm09.stdout:9/290: write d5/de/d29/d33/f4a [1000925,42776] 0
2026-03-09T17:30:15.739 INFO:tasks.workunit.client.1.vm09.stdout:3/323: creat d5/d16/d31/d37/f5b x:0 0 0
2026-03-09T17:30:15.740 INFO:tasks.workunit.client.1.vm09.stdout:3/324: write d5/d16/d31/f34 [2208601,50091] 0
2026-03-09T17:30:15.747 INFO:tasks.workunit.client.1.vm09.stdout:9/291: stat d5/de/d29/l4c 0
2026-03-09T17:30:15.747 INFO:tasks.workunit.client.1.vm09.stdout:9/292: truncate d5/f1b 4484603 0
2026-03-09T17:30:15.771 INFO:tasks.workunit.client.1.vm09.stdout:6/307: mknod d3/d21/d25/d26/c62 0
2026-03-09T17:30:15.772 INFO:tasks.workunit.client.1.vm09.stdout:5/341: mkdir d0/d9/d74/d75 0
2026-03-09T17:30:15.772 INFO:tasks.workunit.client.1.vm09.stdout:7/416: getdents da 0
2026-03-09T17:30:15.776 INFO:tasks.workunit.client.1.vm09.stdout:8/320: dwrite d1/d14/d2a/f2b [0,4194304] 0
2026-03-09T17:30:15.794 INFO:tasks.workunit.client.1.vm09.stdout:2/277: fdatasync d13/f26 0
2026-03-09T17:30:15.802 INFO:tasks.workunit.client.1.vm09.stdout:5/342: mkdir d0/d2/d76 0 2026-03-09T17:30:15.804 INFO:tasks.workunit.client.1.vm09.stdout:5/343: dread d0/d46/f56 [0,4194304] 0 2026-03-09T17:30:15.805 INFO:tasks.workunit.client.1.vm09.stdout:5/344: write d0/dc/d21/d26/f28 [884939,75166] 0 2026-03-09T17:30:15.809 INFO:tasks.workunit.client.1.vm09.stdout:8/321: symlink d1/da/dd/d47/d4c/l69 0 2026-03-09T17:30:15.813 INFO:tasks.workunit.client.0.vm06.stdout:2/971: write d3/d4/d12/d71/daa/d77/d81/d64/d6a/fab [5645121,112376] 0 2026-03-09T17:30:15.819 INFO:tasks.workunit.client.0.vm06.stdout:2/972: fdatasync d3/d4/d12/f92 0 2026-03-09T17:30:15.820 INFO:tasks.workunit.client.1.vm09.stdout:9/293: symlink d5/d21/l64 0 2026-03-09T17:30:15.820 INFO:tasks.workunit.client.1.vm09.stdout:2/278: read - d13/d15/f2f zero size 2026-03-09T17:30:15.823 INFO:tasks.workunit.client.1.vm09.stdout:5/345: unlink d0/dc/d21/d33/f6c 0 2026-03-09T17:30:15.833 INFO:tasks.workunit.client.1.vm09.stdout:6/308: rename d3/d21/d25/d26/c62 to d3/d7/c63 0 2026-03-09T17:30:15.834 INFO:tasks.workunit.client.1.vm09.stdout:6/309: fdatasync d3/f4f 0 2026-03-09T17:30:15.835 INFO:tasks.workunit.client.1.vm09.stdout:5/346: fsync d0/de/f50 0 2026-03-09T17:30:15.837 INFO:tasks.workunit.client.1.vm09.stdout:1/348: write d9/f34 [3795865,3225] 0 2026-03-09T17:30:15.840 INFO:tasks.workunit.client.1.vm09.stdout:7/417: rename da/d11/d41/f7a to da/d11/d41/d4e/f92 0 2026-03-09T17:30:15.844 INFO:tasks.workunit.client.1.vm09.stdout:8/322: link d1/d14/d2a/f2e d1/da/dd/d3f/f6a 0 2026-03-09T17:30:15.844 INFO:tasks.workunit.client.1.vm09.stdout:5/347: dwrite d0/de/f64 [0,4194304] 0 2026-03-09T17:30:15.847 INFO:tasks.workunit.client.1.vm09.stdout:7/418: rmdir da/d11 39 2026-03-09T17:30:15.848 INFO:tasks.workunit.client.1.vm09.stdout:6/310: creat d3/d7/d59/d5a/f64 x:0 0 0 2026-03-09T17:30:15.848 INFO:tasks.workunit.client.1.vm09.stdout:2/279: getdents d13/d15/d21 0 2026-03-09T17:30:15.850 
INFO:tasks.workunit.client.1.vm09.stdout:1/349: creat d9/f6c x:0 0 0 2026-03-09T17:30:15.852 INFO:tasks.workunit.client.1.vm09.stdout:1/350: chown l7 34 1 2026-03-09T17:30:15.852 INFO:tasks.workunit.client.1.vm09.stdout:6/311: dread - d3/d1e/d30/d5c/d61/f60 zero size 2026-03-09T17:30:15.856 INFO:tasks.workunit.client.1.vm09.stdout:8/323: symlink d1/d14/d2a/d42/d5d/l6b 0 2026-03-09T17:30:15.863 INFO:tasks.workunit.client.1.vm09.stdout:5/348: creat d0/d9/f77 x:0 0 0 2026-03-09T17:30:15.865 INFO:tasks.workunit.client.1.vm09.stdout:7/419: mknod da/d11/d47/d5b/d78/c93 0 2026-03-09T17:30:15.867 INFO:tasks.workunit.client.1.vm09.stdout:1/351: dwrite d9/dc/dd/d40/d1d/f4d [0,4194304] 0 2026-03-09T17:30:15.869 INFO:tasks.workunit.client.1.vm09.stdout:5/349: creat d0/f78 x:0 0 0 2026-03-09T17:30:15.872 INFO:tasks.workunit.client.1.vm09.stdout:1/352: chown d9/d3a/c48 1763 1 2026-03-09T17:30:15.875 INFO:tasks.workunit.client.1.vm09.stdout:2/280: dwrite d13/d15/d3b/f3f [0,4194304] 0 2026-03-09T17:30:15.875 INFO:tasks.workunit.client.1.vm09.stdout:5/350: mkdir d0/dc/d21/d26/d5e/d68/d79 0 2026-03-09T17:30:15.875 INFO:tasks.workunit.client.1.vm09.stdout:9/294: dread d5/de/d29/f5f [0,4194304] 0 2026-03-09T17:30:15.877 INFO:tasks.workunit.client.1.vm09.stdout:9/295: chown d5/c1f 13778 1 2026-03-09T17:30:15.877 INFO:tasks.workunit.client.1.vm09.stdout:6/312: sync 2026-03-09T17:30:15.885 INFO:tasks.workunit.client.1.vm09.stdout:2/281: creat d13/d15/d36/f59 x:0 0 0 2026-03-09T17:30:15.885 INFO:tasks.workunit.client.1.vm09.stdout:6/313: write f2 [517334,72254] 0 2026-03-09T17:30:15.892 INFO:tasks.workunit.client.1.vm09.stdout:5/351: rmdir d0/d57 0 2026-03-09T17:30:15.898 INFO:tasks.workunit.client.1.vm09.stdout:6/314: dread f2 [0,4194304] 0 2026-03-09T17:30:15.899 INFO:tasks.workunit.client.1.vm09.stdout:6/315: truncate d3/d7/d59/d5a/f64 272205 0 2026-03-09T17:30:15.900 INFO:tasks.workunit.client.1.vm09.stdout:6/316: fsync d3/d21/d25/d26/d34/f56 0 2026-03-09T17:30:15.908 
INFO:tasks.workunit.client.1.vm09.stdout:1/353: dread d9/dc/dd/d40/d22/f2b [0,4194304] 0 2026-03-09T17:30:15.912 INFO:tasks.workunit.client.1.vm09.stdout:1/354: mknod d9/dc/dd/d40/d22/d37/d3f/d42/d55/c6d 0 2026-03-09T17:30:15.914 INFO:tasks.workunit.client.1.vm09.stdout:2/282: dread fd [0,4194304] 0 2026-03-09T17:30:15.922 INFO:tasks.workunit.client.1.vm09.stdout:2/283: fdatasync d13/d15/d21/f28 0 2026-03-09T17:30:15.928 INFO:tasks.workunit.client.1.vm09.stdout:2/284: sync 2026-03-09T17:30:15.930 INFO:tasks.workunit.client.1.vm09.stdout:2/285: getdents d13/d50 0 2026-03-09T17:30:15.931 INFO:tasks.workunit.client.1.vm09.stdout:1/355: dread d9/f11 [0,4194304] 0 2026-03-09T17:30:15.933 INFO:tasks.workunit.client.1.vm09.stdout:2/286: mknod d13/d15/d3b/d43/c5a 0 2026-03-09T17:30:15.937 INFO:tasks.workunit.client.1.vm09.stdout:1/356: unlink d9/dc/c12 0 2026-03-09T17:30:15.943 INFO:tasks.workunit.client.1.vm09.stdout:2/287: rmdir d13/d50 0 2026-03-09T17:30:15.949 INFO:tasks.workunit.client.1.vm09.stdout:2/288: creat d13/d15/d34/f5b x:0 0 0 2026-03-09T17:30:15.957 INFO:tasks.workunit.client.1.vm09.stdout:0/303: dwrite d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:15.966 INFO:tasks.workunit.client.1.vm09.stdout:0/304: mkdir d6/d1d/d24/d5e 0 2026-03-09T17:30:15.970 INFO:tasks.workunit.client.1.vm09.stdout:4/375: write d11/f26 [3587441,94600] 0 2026-03-09T17:30:15.974 INFO:tasks.workunit.client.1.vm09.stdout:4/376: read fe [4736447,87593] 0 2026-03-09T17:30:15.974 INFO:tasks.workunit.client.1.vm09.stdout:0/305: fsync d6/f27 0 2026-03-09T17:30:15.976 INFO:tasks.workunit.client.1.vm09.stdout:0/306: chown d6/d1d/d39/f2e 598712 1 2026-03-09T17:30:15.981 INFO:tasks.workunit.client.1.vm09.stdout:0/307: creat d6/d1d/d46/f5f x:0 0 0 2026-03-09T17:30:15.983 INFO:tasks.workunit.client.1.vm09.stdout:0/308: unlink d6/d1d/d24/c4f 0 2026-03-09T17:30:15.987 INFO:tasks.workunit.client.1.vm09.stdout:0/309: getdents d6/d1d/d24/d32/d59 0 2026-03-09T17:30:15.988 
INFO:tasks.workunit.client.1.vm09.stdout:0/310: write d6/d1d/d24/f4e [89828,1986] 0 2026-03-09T17:30:15.992 INFO:tasks.workunit.client.1.vm09.stdout:0/311: mknod d6/d1d/d24/d32/d59/d5b/c60 0 2026-03-09T17:30:15.994 INFO:tasks.workunit.client.1.vm09.stdout:9/296: getdents d5/de/d4e 0 2026-03-09T17:30:15.996 INFO:tasks.workunit.client.1.vm09.stdout:3/325: rmdir d5 39 2026-03-09T17:30:15.997 INFO:tasks.workunit.client.1.vm09.stdout:9/297: truncate d5/d2e/f5e 559285 0 2026-03-09T17:30:16.012 INFO:tasks.workunit.client.1.vm09.stdout:9/298: getdents d5/d21 0 2026-03-09T17:30:16.016 INFO:tasks.workunit.client.1.vm09.stdout:9/299: readlink d5/d2e/l62 0 2026-03-09T17:30:16.016 INFO:tasks.workunit.client.1.vm09.stdout:0/312: dread d6/d1d/d24/f5d [0,4194304] 0 2026-03-09T17:30:16.018 INFO:tasks.workunit.client.1.vm09.stdout:9/300: creat d5/de/f65 x:0 0 0 2026-03-09T17:30:16.020 INFO:tasks.workunit.client.1.vm09.stdout:0/313: sync 2026-03-09T17:30:16.024 INFO:tasks.workunit.client.1.vm09.stdout:0/314: write d6/d1d/d24/f50 [265553,107270] 0 2026-03-09T17:30:16.024 INFO:tasks.workunit.client.0.vm06.stdout:2/973: dwrite d3/d4/d12/da7/fbb [0,4194304] 0 2026-03-09T17:30:16.104 INFO:tasks.workunit.client.1.vm09.stdout:8/324: rename d1/da/dd/d3f to d1/da/d23/d6c 0 2026-03-09T17:30:16.106 INFO:tasks.workunit.client.1.vm09.stdout:8/325: creat d1/da/d23/d6c/d32/f6d x:0 0 0 2026-03-09T17:30:16.108 INFO:tasks.workunit.client.1.vm09.stdout:6/317: rename d3/f1f to d3/d1e/d30/d5c/f65 0 2026-03-09T17:30:16.108 INFO:tasks.workunit.client.1.vm09.stdout:1/357: rename d9 to d9/dc/dd/d40/d22/d37/d3f/d42/d6e 22 2026-03-09T17:30:16.116 INFO:tasks.workunit.client.1.vm09.stdout:8/326: dwrite d1/da/dd/f27 [0,4194304] 0 2026-03-09T17:30:16.123 INFO:tasks.workunit.client.1.vm09.stdout:6/318: dwrite d3/d21/d25/d26/f50 [0,4194304] 0 2026-03-09T17:30:16.127 INFO:tasks.workunit.client.1.vm09.stdout:5/352: truncate d0/f22 1144766 0 2026-03-09T17:30:16.127 INFO:tasks.workunit.client.1.vm09.stdout:7/420: dwrite 
da/f26 [0,4194304] 0 2026-03-09T17:30:16.131 INFO:tasks.workunit.client.1.vm09.stdout:8/327: fsync d1/da/d23/d6c/d32/f50 0 2026-03-09T17:30:16.137 INFO:tasks.workunit.client.1.vm09.stdout:8/328: unlink d1/da/dd/d47/d4c/l69 0 2026-03-09T17:30:16.137 INFO:tasks.workunit.client.1.vm09.stdout:6/319: chown d3/d1e/d30/d5c/d61/c5e 470420047 1 2026-03-09T17:30:16.137 INFO:tasks.workunit.client.1.vm09.stdout:2/289: write d13/f14 [942548,69629] 0 2026-03-09T17:30:16.138 INFO:tasks.workunit.client.1.vm09.stdout:4/377: write d11/f1f [4525604,26727] 0 2026-03-09T17:30:16.140 INFO:tasks.workunit.client.1.vm09.stdout:8/329: dread - d1/da/dd/d47/d4c/f67 zero size 2026-03-09T17:30:16.142 INFO:tasks.workunit.client.1.vm09.stdout:1/358: getdents d9/dc/d63 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.0.vm06.stdout:2/974: write d3/d4/d46/da5/fa8 [567234,128196] 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.0.vm06.stdout:2/975: creat d3/d4/dcf/f13b x:0 0 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:7/421: link da/d11/d47/d5b/d78/c93 da/d11/d47/d5b/d6c/c94 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:4/378: mknod d11/d1e/d31/c7a 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:4/379: chown d11/d1e/f61 31 1 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:0/315: write d6/d1d/d24/d32/f45 [1009856,127060] 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:7/422: creat da/d11/d47/d8f/f95 x:0 0 0 2026-03-09T17:30:16.150 INFO:tasks.workunit.client.1.vm09.stdout:3/326: dread d5/d16/d31/d3d/d12/f43 [0,4194304] 0 2026-03-09T17:30:16.155 INFO:tasks.workunit.client.1.vm09.stdout:1/359: dwrite f2 [4194304,4194304] 0 2026-03-09T17:30:16.158 INFO:tasks.workunit.client.1.vm09.stdout:4/380: creat d11/d1e/d45/d60/f7b x:0 0 0 2026-03-09T17:30:16.162 INFO:tasks.workunit.client.1.vm09.stdout:1/360: write d9/f34 [1534409,20247] 0 2026-03-09T17:30:16.162 INFO:tasks.workunit.client.1.vm09.stdout:8/330: 
dwrite d1/d14/d2a/d42/d43/d44/f5c [0,4194304] 0 2026-03-09T17:30:16.162 INFO:tasks.workunit.client.1.vm09.stdout:4/381: fdatasync d11/d1e/d29/f2f 0 2026-03-09T17:30:16.171 INFO:tasks.workunit.client.1.vm09.stdout:4/382: creat d11/d1e/d31/f7c x:0 0 0 2026-03-09T17:30:16.174 INFO:tasks.workunit.client.1.vm09.stdout:8/331: truncate d1/da/d23/d6c/f6a 620260 0 2026-03-09T17:30:16.182 INFO:tasks.workunit.client.1.vm09.stdout:3/327: dwrite d5/d16/d31/f44 [0,4194304] 0 2026-03-09T17:30:16.186 INFO:tasks.workunit.client.0.vm06.stdout:2/976: dread f2 [0,4194304] 0 2026-03-09T17:30:16.190 INFO:tasks.workunit.client.0.vm06.stdout:2/977: dread - d3/d4/d46/f4f zero size 2026-03-09T17:30:16.197 INFO:tasks.workunit.client.1.vm09.stdout:8/332: sync 2026-03-09T17:30:16.197 INFO:tasks.workunit.client.0.vm06.stdout:2/978: rename d3/d4/d12/d2b/fb6 to d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/f13c 0 2026-03-09T17:30:16.199 INFO:tasks.workunit.client.0.vm06.stdout:2/979: mknod d3/d4/d12/da7/de3/c13d 0 2026-03-09T17:30:16.202 INFO:tasks.workunit.client.0.vm06.stdout:2/980: creat d3/d4/d12/d71/daa/d77/d102/d109/d116/f13e x:0 0 0 2026-03-09T17:30:16.205 INFO:tasks.workunit.client.1.vm09.stdout:8/333: link d1/d14/d2a/d42/f46 d1/f6e 0 2026-03-09T17:30:16.214 INFO:tasks.workunit.client.1.vm09.stdout:8/334: fsync d1/f16 0 2026-03-09T17:30:16.215 INFO:tasks.workunit.client.0.vm06.stdout:2/981: dread d3/f5a [0,4194304] 0 2026-03-09T17:30:16.226 INFO:tasks.workunit.client.1.vm09.stdout:8/335: sync 2026-03-09T17:30:16.226 INFO:tasks.workunit.client.1.vm09.stdout:0/316: fdatasync d6/d1d/d24/d32/f45 0 2026-03-09T17:30:16.229 INFO:tasks.workunit.client.1.vm09.stdout:8/336: chown d1/d14/d31/l3e 380710837 1 2026-03-09T17:30:16.309 INFO:tasks.workunit.client.1.vm09.stdout:9/301: rename d5/de/d29/f5f to d5/de/d29/d33/f66 0 2026-03-09T17:30:16.310 INFO:tasks.workunit.client.1.vm09.stdout:3/328: rename d5/d16/d31/d3d/d12/c48 to d5/d16/d31/d37/c5c 0 2026-03-09T17:30:16.312 
INFO:tasks.workunit.client.1.vm09.stdout:3/329: mknod d5/d9/c5d 0 2026-03-09T17:30:16.324 INFO:tasks.workunit.client.1.vm09.stdout:9/302: dread d5/d21/f2b [0,4194304] 0 2026-03-09T17:30:16.328 INFO:tasks.workunit.client.1.vm09.stdout:9/303: dwrite d5/de/f65 [0,4194304] 0 2026-03-09T17:30:16.332 INFO:tasks.workunit.client.1.vm09.stdout:9/304: creat d5/de/d29/f67 x:0 0 0 2026-03-09T17:30:16.338 INFO:tasks.workunit.client.1.vm09.stdout:9/305: unlink d5/d21/l45 0 2026-03-09T17:30:16.364 INFO:tasks.workunit.client.1.vm09.stdout:7/423: link da/d11/c40 da/d11/d2d/c96 0 2026-03-09T17:30:16.369 INFO:tasks.workunit.client.1.vm09.stdout:7/424: symlink da/l97 0 2026-03-09T17:30:16.378 INFO:tasks.workunit.client.1.vm09.stdout:7/425: dwrite da/d11/d47/d5b/d6c/f73 [0,4194304] 0 2026-03-09T17:30:16.386 INFO:tasks.workunit.client.1.vm09.stdout:7/426: read da/d11/d41/f35 [3646047,26786] 0 2026-03-09T17:30:16.397 INFO:tasks.workunit.client.1.vm09.stdout:7/427: creat da/d11/d2d/d49/f98 x:0 0 0 2026-03-09T17:30:16.397 INFO:tasks.workunit.client.1.vm09.stdout:7/428: stat da/f15 0 2026-03-09T17:30:16.397 INFO:tasks.workunit.client.1.vm09.stdout:7/429: mknod da/d76/c99 0 2026-03-09T17:30:16.397 INFO:tasks.workunit.client.1.vm09.stdout:7/430: chown da/fb 419822 1 2026-03-09T17:30:16.397 INFO:tasks.workunit.client.1.vm09.stdout:7/431: chown da/d11/f6a 400283505 1 2026-03-09T17:30:16.404 INFO:tasks.workunit.client.1.vm09.stdout:7/432: dwrite da/f27 [0,4194304] 0 2026-03-09T17:30:16.411 INFO:tasks.workunit.client.1.vm09.stdout:7/433: fdatasync da/d11/d77/f79 0 2026-03-09T17:30:16.419 INFO:tasks.workunit.client.1.vm09.stdout:7/434: dread da/d11/d41/f38 [0,4194304] 0 2026-03-09T17:30:16.425 INFO:tasks.workunit.client.1.vm09.stdout:7/435: symlink da/d11/l9a 0 2026-03-09T17:30:16.425 INFO:tasks.workunit.client.1.vm09.stdout:7/436: rmdir da/d11/d2d 39 2026-03-09T17:30:16.425 INFO:tasks.workunit.client.1.vm09.stdout:7/437: creat da/d11/d47/d5b/d78/f9b x:0 0 0 2026-03-09T17:30:16.454 
INFO:tasks.workunit.client.1.vm09.stdout:1/361: mkdir d9/dc/dd/d40/d21/d6f 0 2026-03-09T17:30:16.454 INFO:tasks.workunit.client.1.vm09.stdout:1/362: chown d9/dc/c32 10694765 1 2026-03-09T17:30:16.557 INFO:tasks.workunit.client.1.vm09.stdout:2/290: dwrite d13/d15/d2c/f2d [0,4194304] 0 2026-03-09T17:30:16.567 INFO:tasks.workunit.client.1.vm09.stdout:2/291: dread d13/d15/d21/f31 [0,4194304] 0 2026-03-09T17:30:16.567 INFO:tasks.workunit.client.1.vm09.stdout:2/292: fsync d13/d15/d34/d45/f49 0 2026-03-09T17:30:16.569 INFO:tasks.workunit.client.1.vm09.stdout:2/293: creat d13/d4d/f5c x:0 0 0 2026-03-09T17:30:16.572 INFO:tasks.workunit.client.1.vm09.stdout:2/294: link d13/d15/f1d d13/d15/d21/f5d 0 2026-03-09T17:30:16.572 INFO:tasks.workunit.client.1.vm09.stdout:2/295: chown fb 1983690 1 2026-03-09T17:30:16.572 INFO:tasks.workunit.client.1.vm09.stdout:2/296: readlink d13/d15/d3b/d43/l47 0 2026-03-09T17:30:16.573 INFO:tasks.workunit.client.1.vm09.stdout:2/297: read d13/d15/d3b/f3f [2797604,58547] 0 2026-03-09T17:30:16.594 INFO:tasks.workunit.client.1.vm09.stdout:2/298: sync 2026-03-09T17:30:16.599 INFO:tasks.workunit.client.1.vm09.stdout:2/299: dwrite d13/d15/f2a [0,4194304] 0 2026-03-09T17:30:16.603 INFO:tasks.workunit.client.1.vm09.stdout:2/300: chown d13/d15/d34/d37/l55 1608 1 2026-03-09T17:30:16.616 INFO:tasks.workunit.client.1.vm09.stdout:4/383: dwrite d11/d1e/f73 [0,4194304] 0 2026-03-09T17:30:16.617 INFO:tasks.workunit.client.1.vm09.stdout:4/384: stat d11/d1e/d29/f2f 0 2026-03-09T17:30:16.621 INFO:tasks.workunit.client.1.vm09.stdout:4/385: link d11/d1e/d29/d36/f3d d11/d1e/d45/d60/d69/d58/f7d 0 2026-03-09T17:30:16.626 INFO:tasks.workunit.client.1.vm09.stdout:4/386: dwrite d11/d1e/f73 [0,4194304] 0 2026-03-09T17:30:16.627 INFO:tasks.workunit.client.1.vm09.stdout:4/387: truncate d11/d1e/d29/f6d 331951 0 2026-03-09T17:30:16.632 INFO:tasks.workunit.client.0.vm06.stdout:2/982: truncate d3/d4/d12/f42 3074796 0 2026-03-09T17:30:16.639 
INFO:tasks.workunit.client.1.vm09.stdout:4/388: dread fd [0,4194304] 0 2026-03-09T17:30:16.640 INFO:tasks.workunit.client.1.vm09.stdout:4/389: stat d11/d1e/d45/d60/d69/l4b 0 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:30:16.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:16.642 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: pgmap v8: 65 pgs: 65 active+clean; 2.8 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 19 MiB/s rd, 69 MiB/s wr, 147 op/s 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:16 vm06.local ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:16.643 INFO:tasks.workunit.client.0.vm06.stdout:2/983: creat d3/d4/d12/d71/daa/f13f x:0 0 0 2026-03-09T17:30:16.643 INFO:tasks.workunit.client.1.vm09.stdout:4/390: dwrite d11/d1e/d45/d60/d71/f76 [0,4194304] 0 2026-03-09T17:30:16.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.645 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: pgmap v8: 65 pgs: 65 active+clean; 2.8 GiB data, 9.7 GiB used, 110 GiB / 
120 GiB avail; 19 MiB/s rd, 69 MiB/s wr, 147 op/s 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:16 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:16.648 INFO:tasks.workunit.client.1.vm09.stdout:4/391: dwrite d11/f1f [0,4194304] 0 2026-03-09T17:30:16.725 INFO:tasks.workunit.client.1.vm09.stdout:9/306: getdents d5/de/d29 0 2026-03-09T17:30:16.732 INFO:tasks.workunit.client.1.vm09.stdout:9/307: link d5/de/c44 d5/de/d4e/c68 0 2026-03-09T17:30:16.733 INFO:tasks.workunit.client.1.vm09.stdout:9/308: write d5/de/f65 [3776489,121078] 0 2026-03-09T17:30:16.736 INFO:tasks.workunit.client.1.vm09.stdout:9/309: symlink d5/d21/l69 0 2026-03-09T17:30:16.742 INFO:tasks.workunit.client.1.vm09.stdout:7/438: dwrite da/f15 [0,4194304] 0 2026-03-09T17:30:16.743 INFO:tasks.workunit.client.1.vm09.stdout:1/363: write d9/f11 [5090200,88715] 0 2026-03-09T17:30:16.745 INFO:tasks.workunit.client.1.vm09.stdout:7/439: write da/d11/d77/f79 [872344,59440] 0 2026-03-09T17:30:16.746 INFO:tasks.workunit.client.1.vm09.stdout:7/440: chown da/d11/d47 426792813 1 2026-03-09T17:30:16.760 INFO:tasks.workunit.client.1.vm09.stdout:1/364: rmdir d9/dc/dd/d40/d22/d37/d3f/d42 39 2026-03-09T17:30:16.768 
INFO:tasks.workunit.client.1.vm09.stdout:7/441: dread da/d11/d2d/f71 [0,4194304] 0 2026-03-09T17:30:16.772 INFO:tasks.workunit.client.1.vm09.stdout:7/442: fsync da/d11/d2d/d49/f52 0 2026-03-09T17:30:16.775 INFO:tasks.workunit.client.1.vm09.stdout:7/443: dwrite da/d11/f25 [4194304,4194304] 0 2026-03-09T17:30:16.777 INFO:tasks.workunit.client.1.vm09.stdout:7/444: mknod da/d11/d47/d5b/c9c 0 2026-03-09T17:30:16.788 INFO:tasks.workunit.client.1.vm09.stdout:7/445: dread da/d11/f1a [0,4194304] 0 2026-03-09T17:30:16.792 INFO:tasks.workunit.client.1.vm09.stdout:7/446: dwrite da/d11/d2d/f45 [0,4194304] 0 2026-03-09T17:30:16.806 INFO:tasks.workunit.client.1.vm09.stdout:7/447: mknod da/d11/d47/d5b/c9d 0 2026-03-09T17:30:16.806 INFO:tasks.workunit.client.1.vm09.stdout:7/448: stat da/d11/d41/d4e 0 2026-03-09T17:30:16.806 INFO:tasks.workunit.client.1.vm09.stdout:7/449: fdatasync da/d11/f1a 0 2026-03-09T17:30:16.806 INFO:tasks.workunit.client.1.vm09.stdout:7/450: truncate da/d11/d41/d4e/f8e 915477 0 2026-03-09T17:30:16.911 INFO:tasks.workunit.client.1.vm09.stdout:6/320: write d3/d1e/d30/d5c/f65 [823814,81127] 0 2026-03-09T17:30:16.912 INFO:tasks.workunit.client.1.vm09.stdout:6/321: readlink d3/d1e/l47 0 2026-03-09T17:30:16.914 INFO:tasks.workunit.client.0.vm06.stdout:2/984: sync 2026-03-09T17:30:16.915 INFO:tasks.workunit.client.0.vm06.stdout:2/985: dread - d3/d4/d11b/f13a zero size 2026-03-09T17:30:16.919 INFO:tasks.workunit.client.1.vm09.stdout:6/322: creat d3/d21/d25/d26/d34/f66 x:0 0 0 2026-03-09T17:30:16.923 INFO:tasks.workunit.client.1.vm09.stdout:6/323: readlink d3/d7/l2c 0 2026-03-09T17:30:16.923 INFO:tasks.workunit.client.1.vm09.stdout:6/324: truncate d3/d7/f11 143770 0 2026-03-09T17:30:16.923 INFO:tasks.workunit.client.1.vm09.stdout:6/325: dread - d3/d7/f4c zero size 2026-03-09T17:30:16.923 INFO:tasks.workunit.client.1.vm09.stdout:6/326: readlink d3/d21/d25/d26/d34/l49 0 2026-03-09T17:30:16.924 INFO:tasks.workunit.client.1.vm09.stdout:6/327: creat d3/d1e/d30/d3f/f67 x:0 
0 0 2026-03-09T17:30:16.926 INFO:tasks.workunit.client.1.vm09.stdout:6/328: truncate d3/d7/f23 2078681 0 2026-03-09T17:30:16.926 INFO:tasks.workunit.client.1.vm09.stdout:6/329: fsync d3/f4f 0 2026-03-09T17:30:16.927 INFO:tasks.workunit.client.1.vm09.stdout:6/330: write d3/d7/d59/d5a/f64 [321398,83678] 0 2026-03-09T17:30:16.945 INFO:tasks.workunit.client.1.vm09.stdout:5/353: dwrite d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:16.958 INFO:tasks.workunit.client.1.vm09.stdout:5/354: unlink d0/d2/l6a 0 2026-03-09T17:30:16.958 INFO:tasks.workunit.client.1.vm09.stdout:2/301: rmdir d13/d15/d34 39 2026-03-09T17:30:16.963 INFO:tasks.workunit.client.1.vm09.stdout:0/317: rename d6/d1d/l4b to d6/d1d/d24/d32/l61 0 2026-03-09T17:30:16.974 INFO:tasks.workunit.client.1.vm09.stdout:5/355: dread d0/dc/d21/d33/f35 [0,4194304] 0 2026-03-09T17:30:16.974 INFO:tasks.workunit.client.0.vm06.stdout:2/986: write d3/d4/d12/f42 [459057,127115] 0 2026-03-09T17:30:16.974 INFO:tasks.workunit.client.1.vm09.stdout:5/356: stat d0/d9/f34 0 2026-03-09T17:30:16.984 INFO:tasks.workunit.client.1.vm09.stdout:5/357: creat d0/dc/d21/f7a x:0 0 0 2026-03-09T17:30:16.984 INFO:tasks.workunit.client.1.vm09.stdout:2/302: creat d13/d15/d34/f5e x:0 0 0 2026-03-09T17:30:16.985 INFO:tasks.workunit.client.1.vm09.stdout:0/318: creat d6/d1d/f62 x:0 0 0 2026-03-09T17:30:16.985 INFO:tasks.workunit.client.1.vm09.stdout:5/358: read - d0/d9/f66 zero size 2026-03-09T17:30:16.988 INFO:tasks.workunit.client.0.vm06.stdout:2/987: mkdir d3/d4/d12/d71/daa/d77/d102/d140 0 2026-03-09T17:30:16.991 INFO:tasks.workunit.client.1.vm09.stdout:2/303: mknod d13/d4d/c5f 0 2026-03-09T17:30:16.992 INFO:tasks.workunit.client.1.vm09.stdout:8/337: rename d1/da/d23/d6c/d32/l38 to d1/da/d23/l6f 0 2026-03-09T17:30:16.994 INFO:tasks.workunit.client.1.vm09.stdout:5/359: mknod d0/dc/d21/d6f/d42/c7b 0 2026-03-09T17:30:16.995 INFO:tasks.workunit.client.1.vm09.stdout:5/360: read d0/d9/f3e [219788,119706] 0 2026-03-09T17:30:16.998 
INFO:tasks.workunit.client.1.vm09.stdout:3/330: rename d5/d16/d46/f55 to d5/d16/d31/d3d/d12/f5e 0 2026-03-09T17:30:16.998 INFO:tasks.workunit.client.1.vm09.stdout:3/331: readlink d5/d16/d25/l3a 0 2026-03-09T17:30:17.001 INFO:tasks.workunit.client.1.vm09.stdout:3/332: dwrite d5/d16/d46/f47 [0,4194304] 0 2026-03-09T17:30:17.002 INFO:tasks.workunit.client.1.vm09.stdout:3/333: fsync d5/d16/d31/f34 0 2026-03-09T17:30:17.011 INFO:tasks.workunit.client.1.vm09.stdout:6/331: fsync d3/d21/d25/d26/d34/f66 0 2026-03-09T17:30:17.011 INFO:tasks.workunit.client.1.vm09.stdout:8/338: creat d1/da/d23/d6c/f70 x:0 0 0 2026-03-09T17:30:17.011 INFO:tasks.workunit.client.1.vm09.stdout:0/319: creat d6/f63 x:0 0 0 2026-03-09T17:30:17.011 INFO:tasks.workunit.client.1.vm09.stdout:2/304: mkdir d13/d15/d60 0 2026-03-09T17:30:17.012 INFO:tasks.workunit.client.1.vm09.stdout:5/361: readlink d0/dc/l18 0 2026-03-09T17:30:17.013 INFO:tasks.workunit.client.1.vm09.stdout:2/305: write fd [2823885,121979] 0 2026-03-09T17:30:17.014 INFO:tasks.workunit.client.1.vm09.stdout:4/392: rename d11/c20 to d11/d1e/d29/d36/d57/d78/c7e 0 2026-03-09T17:30:17.015 INFO:tasks.workunit.client.1.vm09.stdout:8/339: truncate d1/da/d23/d6c/d32/f56 158702 0 2026-03-09T17:30:17.016 INFO:tasks.workunit.client.1.vm09.stdout:8/340: readlink d1/da/l25 0 2026-03-09T17:30:17.017 INFO:tasks.workunit.client.1.vm09.stdout:5/362: dwrite d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:17.018 INFO:tasks.workunit.client.1.vm09.stdout:0/320: dwrite d6/d1d/f62 [0,4194304] 0 2026-03-09T17:30:17.019 INFO:tasks.workunit.client.0.vm06.stdout:2/988: dread d3/d4/d12/da7/db3/fc2 [0,4194304] 0 2026-03-09T17:30:17.022 INFO:tasks.workunit.client.1.vm09.stdout:0/321: fsync d6/d1d/f57 0 2026-03-09T17:30:17.024 INFO:tasks.workunit.client.1.vm09.stdout:5/363: dwrite d0/d9/f77 [0,4194304] 0 2026-03-09T17:30:17.050 INFO:tasks.workunit.client.0.vm06.stdout:2/989: mkdir d3/d4/d12/d71/daa/d77/d81/d64/de5/df0/d122/d141 0 2026-03-09T17:30:17.056 
INFO:tasks.workunit.client.0.vm06.stdout:2/990: mkdir d3/d4/d12/d71/daa/d77/d81/d64/d6a/d142 0 2026-03-09T17:30:17.063 INFO:tasks.workunit.client.1.vm09.stdout:8/341: mkdir d1/da/d23/d71 0 2026-03-09T17:30:17.070 INFO:tasks.workunit.client.1.vm09.stdout:6/332: creat d3/d48/f68 x:0 0 0 2026-03-09T17:30:17.070 INFO:tasks.workunit.client.1.vm09.stdout:0/322: mkdir d6/d64 0 2026-03-09T17:30:17.086 INFO:tasks.workunit.client.1.vm09.stdout:9/310: rename d5/l5c to d5/de/l6a 0 2026-03-09T17:30:17.087 INFO:tasks.workunit.client.1.vm09.stdout:9/311: write d5/f14 [5199637,128426] 0 2026-03-09T17:30:17.090 INFO:tasks.workunit.client.1.vm09.stdout:8/342: mkdir d1/d14/d72 0 2026-03-09T17:30:17.091 INFO:tasks.workunit.client.1.vm09.stdout:6/333: symlink d3/d21/l69 0 2026-03-09T17:30:17.095 INFO:tasks.workunit.client.1.vm09.stdout:6/334: dwrite d3/d21/d25/d26/d34/f56 [0,4194304] 0 2026-03-09T17:30:17.098 INFO:tasks.workunit.client.1.vm09.stdout:2/306: getdents d13/d15/d21 0 2026-03-09T17:30:17.099 INFO:tasks.workunit.client.1.vm09.stdout:2/307: dread - d13/d15/d34/f5e zero size 2026-03-09T17:30:17.101 INFO:tasks.workunit.client.1.vm09.stdout:6/335: mkdir d3/d1e/d30/d5c/d61/d6a 0 2026-03-09T17:30:17.102 INFO:tasks.workunit.client.1.vm09.stdout:2/308: creat d13/d15/d34/d45/f61 x:0 0 0 2026-03-09T17:30:17.104 INFO:tasks.workunit.client.1.vm09.stdout:4/393: dread f3 [0,4194304] 0 2026-03-09T17:30:17.105 INFO:tasks.workunit.client.1.vm09.stdout:9/312: rmdir d5/de/d29/d51 0 2026-03-09T17:30:17.107 INFO:tasks.workunit.client.1.vm09.stdout:2/309: dwrite d13/d15/d34/f5b [0,4194304] 0 2026-03-09T17:30:17.108 INFO:tasks.workunit.client.1.vm09.stdout:2/310: dread - d13/d15/f2f zero size 2026-03-09T17:30:17.109 INFO:tasks.workunit.client.1.vm09.stdout:9/313: dwrite f2 [0,4194304] 0 2026-03-09T17:30:17.117 INFO:tasks.workunit.client.1.vm09.stdout:6/336: sync 2026-03-09T17:30:17.118 INFO:tasks.workunit.client.1.vm09.stdout:8/343: sync 2026-03-09T17:30:17.118 
INFO:tasks.workunit.client.1.vm09.stdout:6/337: fdatasync d3/fc 0 2026-03-09T17:30:17.118 INFO:tasks.workunit.client.1.vm09.stdout:1/365: rename d9/dc/l1b to d9/dc/dd/d40/d21/l70 0 2026-03-09T17:30:17.118 INFO:tasks.workunit.client.1.vm09.stdout:9/314: chown d5/f1b 239 1 2026-03-09T17:30:17.120 INFO:tasks.workunit.client.1.vm09.stdout:1/366: chown d9/dc/dd/d40/d21/f33 28757060 1 2026-03-09T17:30:17.122 INFO:tasks.workunit.client.1.vm09.stdout:6/338: stat d3/d1e/d30/d5c/d61/f60 0 2026-03-09T17:30:17.136 INFO:tasks.workunit.client.1.vm09.stdout:4/394: creat d11/d1e/d29/d36/f7f x:0 0 0 2026-03-09T17:30:17.152 INFO:tasks.workunit.client.1.vm09.stdout:2/311: readlink d13/d15/l19 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:2/312: dread - d13/d15/d34/d45/f57 zero size 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:9/315: mkdir d5/d2e/d6b 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:2/313: write d13/d15/d34/f3a [582079,42970] 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:8/344: dread d1/f28 [0,4194304] 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:8/345: chown d1/d14/d31/c5a 8699 1 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:7/451: rename da/d11/d41 to da/d11/d47/d5b/d6c/d9e 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:6/339: mkdir d3/d21/d25/d26/d6b 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:7/452: write da/d11/d2d/d56/f85 [1025456,17325] 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:9/316: dwrite d5/d2e/f5e [0,4194304] 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:4/395: fdatasync d11/d1e/d29/f3b 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:9/317: chown d5/f1b 20 1 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:2/314: write d13/f14 [1352154,103791] 0 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:4/396: 
chown d11/d1e/d45/d60/d69/d58/f75 0 1 2026-03-09T17:30:17.153 INFO:tasks.workunit.client.1.vm09.stdout:5/364: rename d0/de/f64 to d0/d52/d20/f7c 0 2026-03-09T17:30:17.154 INFO:tasks.workunit.client.1.vm09.stdout:6/340: write d3/d1e/d30/d5c/d61/f53 [650599,11811] 0 2026-03-09T17:30:17.163 INFO:tasks.workunit.client.1.vm09.stdout:4/397: symlink d11/d1e/d29/d36/l80 0 2026-03-09T17:30:17.173 INFO:tasks.workunit.client.1.vm09.stdout:6/341: sync 2026-03-09T17:30:17.180 INFO:tasks.workunit.client.0.vm06.stdout:2/991: write d3/d4/d12/d2b/d36/fb9 [4644484,17911] 0 2026-03-09T17:30:17.184 INFO:tasks.workunit.client.1.vm09.stdout:3/334: dwrite d5/d16/d31/d3d/d12/f18 [0,4194304] 0 2026-03-09T17:30:17.184 INFO:tasks.workunit.client.0.vm06.stdout:2/992: write d3/d4/d12/d71/daa/d77/d102/d109/f12d [799272,50002] 0 2026-03-09T17:30:17.185 INFO:tasks.workunit.client.1.vm09.stdout:7/453: link da/d11/d47/d5b/d6c/f7b da/d11/d2d/d56/f9f 0 2026-03-09T17:30:17.186 INFO:tasks.workunit.client.1.vm09.stdout:1/367: getdents d9/dc/dd/d40/d21/d35 0 2026-03-09T17:30:17.187 INFO:tasks.workunit.client.1.vm09.stdout:1/368: read d9/dc/dd/d40/d22/f4a [182348,45644] 0 2026-03-09T17:30:17.193 INFO:tasks.workunit.client.1.vm09.stdout:2/315: mkdir d13/d62 0 2026-03-09T17:30:17.194 INFO:tasks.workunit.client.1.vm09.stdout:9/318: symlink d5/de/d29/d33/l6c 0 2026-03-09T17:30:17.194 INFO:tasks.workunit.client.1.vm09.stdout:9/319: stat d5/d21/c25 0 2026-03-09T17:30:17.204 INFO:tasks.workunit.client.1.vm09.stdout:9/320: dread d5/de/d4e/f56 [0,4194304] 0 2026-03-09T17:30:17.205 INFO:tasks.workunit.client.0.vm06.stdout:2/993: dwrite d3/d4/d12/d71/daa/d77/d81/d64/d6a/de0/f13c [0,4194304] 0 2026-03-09T17:30:17.221 INFO:tasks.workunit.client.1.vm09.stdout:0/323: rename d6/d1d/d24/d32/l61 to d6/d1d/d46/l65 0 2026-03-09T17:30:17.234 INFO:tasks.workunit.client.1.vm09.stdout:6/342: creat d3/d48/f6c x:0 0 0 2026-03-09T17:30:17.239 INFO:tasks.workunit.client.1.vm09.stdout:3/335: mknod d5/d16/d31/d3d/d12/c5f 0 
2026-03-09T17:30:17.241 INFO:tasks.workunit.client.1.vm09.stdout:1/369: unlink d9/dc/dd/d40/d21/f33 0 2026-03-09T17:30:17.247 INFO:tasks.workunit.client.1.vm09.stdout:9/321: dwrite d5/de/d29/f67 [0,4194304] 0 2026-03-09T17:30:17.254 INFO:tasks.workunit.client.1.vm09.stdout:7/454: rename da/d11/d47/d5b/d6c/d9e/c55 to da/d11/d47/d5b/d6c/ca0 0 2026-03-09T17:30:17.256 INFO:tasks.workunit.client.1.vm09.stdout:4/398: dread d11/d1e/d45/d60/d69/f5f [0,4194304] 0 2026-03-09T17:30:17.257 INFO:tasks.workunit.client.1.vm09.stdout:4/399: readlink d11/d1e/d45/d60/d69/d58/l5d 0 2026-03-09T17:30:17.262 INFO:tasks.workunit.client.1.vm09.stdout:9/322: dread d5/de/d29/d33/f4a [0,4194304] 0 2026-03-09T17:30:17.269 INFO:tasks.workunit.client.1.vm09.stdout:6/343: creat d3/d1e/d30/d5c/f6d x:0 0 0 2026-03-09T17:30:17.269 INFO:tasks.workunit.client.1.vm09.stdout:3/336: creat d5/d16/d25/f60 x:0 0 0 2026-03-09T17:30:17.269 INFO:tasks.workunit.client.1.vm09.stdout:1/370: creat d9/dc/dd/d40/d22/d37/f71 x:0 0 0 2026-03-09T17:30:17.272 INFO:tasks.workunit.client.1.vm09.stdout:1/371: dwrite d9/dc/dd/f4f [0,4194304] 0 2026-03-09T17:30:17.287 INFO:tasks.workunit.client.1.vm09.stdout:7/455: unlink da/d11/d2d/d49/f5d 0 2026-03-09T17:30:17.287 INFO:tasks.workunit.client.1.vm09.stdout:4/400: creat d11/d1e/d29/f81 x:0 0 0 2026-03-09T17:30:17.288 INFO:tasks.workunit.client.1.vm09.stdout:5/365: write d0/dc/d21/d6f/f5f [499181,9980] 0 2026-03-09T17:30:17.290 INFO:tasks.workunit.client.1.vm09.stdout:5/366: truncate d0/d46/f4c 706542 0 2026-03-09T17:30:17.290 INFO:tasks.workunit.client.1.vm09.stdout:6/344: write f2 [114572,97794] 0 2026-03-09T17:30:17.293 INFO:tasks.workunit.client.1.vm09.stdout:6/345: write d3/d21/d25/d26/f2a [8510414,51675] 0 2026-03-09T17:30:17.294 INFO:tasks.workunit.client.1.vm09.stdout:6/346: write d3/d21/f28 [3600546,33561] 0 2026-03-09T17:30:17.294 INFO:tasks.workunit.client.1.vm09.stdout:5/367: dread d0/d46/f4c [0,4194304] 0 2026-03-09T17:30:17.295 
INFO:tasks.workunit.client.1.vm09.stdout:5/368: truncate d0/d46/d4b/f4f 626235 0 2026-03-09T17:30:17.298 INFO:tasks.workunit.client.1.vm09.stdout:5/369: dwrite d0/dc/d21/f7a [0,4194304] 0 2026-03-09T17:30:17.301 INFO:tasks.workunit.client.0.vm06.stdout:2/994: dwrite d3/d4/f1f [0,4194304] 0 2026-03-09T17:30:17.303 INFO:tasks.workunit.client.1.vm09.stdout:8/346: truncate d1/d14/d2a/f54 3380520 0 2026-03-09T17:30:17.306 INFO:tasks.workunit.client.1.vm09.stdout:8/347: dwrite d1/da/d23/d6c/f70 [0,4194304] 0 2026-03-09T17:30:17.306 INFO:tasks.workunit.client.1.vm09.stdout:3/337: creat d5/d9/d30/f61 x:0 0 0 2026-03-09T17:30:17.317 INFO:tasks.workunit.client.1.vm09.stdout:2/316: rename d13/c1b to d13/d15/d2c/c63 0 2026-03-09T17:30:17.324 INFO:tasks.workunit.client.1.vm09.stdout:9/323: symlink d5/de/l6d 0 2026-03-09T17:30:17.344 INFO:tasks.workunit.client.1.vm09.stdout:0/324: dwrite d6/d1d/d24/f5d [4194304,4194304] 0 2026-03-09T17:30:17.350 INFO:tasks.workunit.client.1.vm09.stdout:1/372: mknod d9/dc/dd/d40/d22/d37/d3f/d42/c72 0 2026-03-09T17:30:17.351 INFO:tasks.workunit.client.1.vm09.stdout:4/401: unlink d11/d1e/d29/d36/d57/c6f 0 2026-03-09T17:30:17.351 INFO:tasks.workunit.client.1.vm09.stdout:2/317: write d13/d15/f20 [965677,33085] 0 2026-03-09T17:30:17.353 INFO:tasks.workunit.client.1.vm09.stdout:2/318: write d13/f14 [986204,54481] 0 2026-03-09T17:30:17.353 INFO:tasks.workunit.client.1.vm09.stdout:2/319: chown d13/d15/f2f 98 1 2026-03-09T17:30:17.355 INFO:tasks.workunit.client.1.vm09.stdout:9/324: mkdir d5/de/d4e/d6e 0 2026-03-09T17:30:17.357 INFO:tasks.workunit.client.1.vm09.stdout:0/325: unlink d6/c55 0 2026-03-09T17:30:17.357 INFO:tasks.workunit.client.1.vm09.stdout:1/373: dread - d9/dc/dd/d40/d22/d37/d3f/d42/d55/f69 zero size 2026-03-09T17:30:17.366 INFO:tasks.workunit.client.1.vm09.stdout:2/320: mknod d13/d15/d34/d37/c64 0 2026-03-09T17:30:17.366 INFO:tasks.workunit.client.1.vm09.stdout:4/402: creat d11/d1e/d29/d36/f82 x:0 0 0 2026-03-09T17:30:17.369 
INFO:tasks.workunit.client.1.vm09.stdout:9/325: link d5/d21/f2f d5/d2e/f6f 0 2026-03-09T17:30:17.369 INFO:tasks.workunit.client.1.vm09.stdout:6/347: getdents d3/d7 0 2026-03-09T17:30:17.376 INFO:tasks.workunit.client.1.vm09.stdout:0/326: rename d6/d1d/d39/l14 to d6/d1d/d46/l66 0 2026-03-09T17:30:17.380 INFO:tasks.workunit.client.1.vm09.stdout:0/327: chown d6/f27 232737560 1 2026-03-09T17:30:17.381 INFO:tasks.workunit.client.1.vm09.stdout:2/321: chown d13/d15/d21/f5d 53476786 1 2026-03-09T17:30:17.381 INFO:tasks.workunit.client.1.vm09.stdout:0/328: dread - d6/d1d/d46/f5f zero size 2026-03-09T17:30:17.381 INFO:tasks.workunit.client.1.vm09.stdout:9/326: mkdir d5/d2e/d70 0 2026-03-09T17:30:17.381 INFO:tasks.workunit.client.1.vm09.stdout:0/329: readlink d6/l54 0 2026-03-09T17:30:17.381 INFO:tasks.workunit.client.1.vm09.stdout:0/330: write d6/d1d/f3c [401263,7360] 0 2026-03-09T17:30:17.383 INFO:tasks.workunit.client.1.vm09.stdout:1/374: sync 2026-03-09T17:30:17.388 INFO:tasks.workunit.client.1.vm09.stdout:1/375: dwrite d9/f34 [0,4194304] 0 2026-03-09T17:30:17.396 INFO:tasks.workunit.client.1.vm09.stdout:6/348: mknod d3/c6e 0 2026-03-09T17:30:17.397 INFO:tasks.workunit.client.1.vm09.stdout:9/327: rmdir d5/de/d29/d33 39 2026-03-09T17:30:17.399 INFO:tasks.workunit.client.1.vm09.stdout:2/322: dread d13/d15/d21/f24 [0,4194304] 0 2026-03-09T17:30:17.405 INFO:tasks.workunit.client.1.vm09.stdout:2/323: rename d13/d15/d36 to d13/d15/d36/d65 22 2026-03-09T17:30:17.405 INFO:tasks.workunit.client.1.vm09.stdout:9/328: fdatasync d5/f11 0 2026-03-09T17:30:17.407 INFO:tasks.workunit.client.1.vm09.stdout:0/331: creat d6/d1d/d24/d5e/f67 x:0 0 0 2026-03-09T17:30:17.408 INFO:tasks.workunit.client.1.vm09.stdout:0/332: fdatasync d6/d1d/d39/f44 0 2026-03-09T17:30:17.408 INFO:tasks.workunit.client.1.vm09.stdout:2/324: dwrite d13/d15/d34/f3a [0,4194304] 0 2026-03-09T17:30:17.413 INFO:tasks.workunit.client.1.vm09.stdout:6/349: dread d3/d7/fe [4194304,4194304] 0 2026-03-09T17:30:17.415 
INFO:tasks.workunit.client.1.vm09.stdout:6/350: read d3/d7/ff [1160314,113424] 0 2026-03-09T17:30:17.416 INFO:tasks.workunit.client.1.vm09.stdout:0/333: dread d6/d1d/f62 [0,4194304] 0 2026-03-09T17:30:17.418 INFO:tasks.workunit.client.1.vm09.stdout:0/334: readlink d6/d1d/l58 0 2026-03-09T17:30:17.419 INFO:tasks.workunit.client.1.vm09.stdout:0/335: stat d6/d1d/d24/f4e 0 2026-03-09T17:30:17.421 INFO:tasks.workunit.client.1.vm09.stdout:2/325: dwrite d13/d15/d36/f59 [0,4194304] 0 2026-03-09T17:30:17.428 INFO:tasks.workunit.client.1.vm09.stdout:0/336: dwrite d6/d1d/d24/d5e/f67 [0,4194304] 0 2026-03-09T17:30:17.435 INFO:tasks.workunit.client.1.vm09.stdout:1/376: creat d9/dc/dd/d40/f73 x:0 0 0 2026-03-09T17:30:17.438 INFO:tasks.workunit.client.1.vm09.stdout:6/351: mknod d3/d7/d59/d5a/c6f 0 2026-03-09T17:30:17.440 INFO:tasks.workunit.client.1.vm09.stdout:6/352: readlink d3/d21/d25/d26/l2b 0 2026-03-09T17:30:17.443 INFO:tasks.workunit.client.1.vm09.stdout:0/337: creat d6/d1d/d24/d32/f68 x:0 0 0 2026-03-09T17:30:17.447 INFO:tasks.workunit.client.1.vm09.stdout:2/326: dread d13/d15/d3b/f3f [0,4194304] 0 2026-03-09T17:30:17.449 INFO:tasks.workunit.client.0.vm06.stdout:2/995: dwrite d3/d4/d12/d71/daa/d77/d81/d130/d8f/fb7 [0,4194304] 0 2026-03-09T17:30:17.452 INFO:tasks.workunit.client.1.vm09.stdout:2/327: dwrite d13/d15/d34/f48 [0,4194304] 0 2026-03-09T17:30:17.454 INFO:tasks.workunit.client.1.vm09.stdout:2/328: readlink d13/l1c 0 2026-03-09T17:30:17.458 INFO:tasks.workunit.client.1.vm09.stdout:7/456: dwrite da/d11/d47/d5b/d6c/d9e/f35 [0,4194304] 0 2026-03-09T17:30:17.466 INFO:tasks.workunit.client.0.vm06.stdout:2/996: mkdir d3/d4/d12/d71/daa/d77/d81/d143 0 2026-03-09T17:30:17.469 INFO:tasks.workunit.client.1.vm09.stdout:5/370: write d0/dc/d21/d26/f39 [4468963,116186] 0 2026-03-09T17:30:17.470 INFO:tasks.workunit.client.1.vm09.stdout:5/371: write d0/dc/d21/d6f/f5f [248198,3715] 0 2026-03-09T17:30:17.474 INFO:tasks.workunit.client.0.vm06.stdout:2/997: write d3/d4/d22/f2f 
[2878499,79950] 0 2026-03-09T17:30:17.474 INFO:tasks.workunit.client.1.vm09.stdout:3/338: write d5/d16/f17 [174523,86852] 0 2026-03-09T17:30:17.476 INFO:tasks.workunit.client.1.vm09.stdout:2/329: mkdir d13/d15/d34/d37/d66 0 2026-03-09T17:30:17.477 INFO:tasks.workunit.client.1.vm09.stdout:7/457: mkdir da/d11/d2d/d56/da1 0 2026-03-09T17:30:17.477 INFO:tasks.workunit.client.1.vm09.stdout:7/458: fsync da/d11/d2d/d49/f98 0 2026-03-09T17:30:17.480 INFO:tasks.workunit.client.1.vm09.stdout:7/459: dread da/d11/d47/d5b/d6c/d9e/f38 [0,4194304] 0 2026-03-09T17:30:17.485 INFO:tasks.workunit.client.0.vm06.stdout:2/998: unlink d3/d4/d12/d71/daa/d77/d81/d130/d8f/c98 0 2026-03-09T17:30:17.485 INFO:tasks.workunit.client.1.vm09.stdout:9/329: getdents d5/de 0 2026-03-09T17:30:17.485 INFO:tasks.workunit.client.1.vm09.stdout:0/338: symlink d6/d1d/d24/l69 0 2026-03-09T17:30:17.485 INFO:tasks.workunit.client.1.vm09.stdout:0/339: dwrite d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:17.489 INFO:tasks.workunit.client.1.vm09.stdout:0/340: fdatasync d6/f63 0 2026-03-09T17:30:17.494 INFO:tasks.workunit.client.1.vm09.stdout:8/348: dwrite d1/f7 [4194304,4194304] 0 2026-03-09T17:30:17.498 INFO:tasks.workunit.client.1.vm09.stdout:4/403: getdents d11/d1e/d29/d36 0 2026-03-09T17:30:17.501 INFO:tasks.workunit.client.1.vm09.stdout:5/372: symlink d0/d52/d20/l7d 0 2026-03-09T17:30:17.511 INFO:tasks.workunit.client.1.vm09.stdout:6/353: rename d3/f2e to d3/d1e/d30/f70 0 2026-03-09T17:30:17.512 INFO:tasks.workunit.client.1.vm09.stdout:5/373: dwrite d0/dc/d21/f7a [0,4194304] 0 2026-03-09T17:30:17.515 INFO:tasks.workunit.client.1.vm09.stdout:1/377: getdents d9/dc 0 2026-03-09T17:30:17.515 INFO:tasks.workunit.client.1.vm09.stdout:7/460: rmdir da/d11/d47/d5b 39 2026-03-09T17:30:17.518 INFO:tasks.workunit.client.1.vm09.stdout:5/374: mkdir d0/dc/d21/d7e 0 2026-03-09T17:30:17.518 INFO:tasks.workunit.client.1.vm09.stdout:0/341: rename d6/d1d/d24/d32/d59/d5b/c60 to d6/d1d/d46/c6a 0 2026-03-09T17:30:17.520 
INFO:tasks.workunit.client.1.vm09.stdout:1/378: write d9/dc/dd/d40/d22/d37/d3f/f68 [250201,69637] 0 2026-03-09T17:30:17.521 INFO:tasks.workunit.client.1.vm09.stdout:4/404: dread d11/d1e/d29/f3b [0,4194304] 0 2026-03-09T17:30:17.521 INFO:tasks.workunit.client.1.vm09.stdout:2/330: mknod d13/d15/d60/c67 0 2026-03-09T17:30:17.523 INFO:tasks.workunit.client.1.vm09.stdout:2/331: write d13/d15/d34/f5b [4932828,22150] 0 2026-03-09T17:30:17.524 INFO:tasks.workunit.client.1.vm09.stdout:3/339: creat d5/f62 x:0 0 0 2026-03-09T17:30:17.524 INFO:tasks.workunit.client.1.vm09.stdout:2/332: fdatasync d13/d15/f2a 0 2026-03-09T17:30:17.527 INFO:tasks.workunit.client.1.vm09.stdout:2/333: truncate d13/d15/d34/f3a 4958537 0 2026-03-09T17:30:17.529 INFO:tasks.workunit.client.1.vm09.stdout:5/375: dwrite d0/f78 [0,4194304] 0 2026-03-09T17:30:17.529 INFO:tasks.workunit.client.1.vm09.stdout:0/342: symlink d6/d1d/d24/d32/l6b 0 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 
2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: Reconfiguring prometheus.vm06 (dependencies changed)... 2026-03-09T17:30:17.529 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:17 vm06.local ceph-mon[57307]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:30:17.535 INFO:tasks.workunit.client.1.vm09.stdout:1/379: unlink d9/dc/l36 0 2026-03-09T17:30:17.535 INFO:tasks.workunit.client.1.vm09.stdout:6/354: rename d3/d21/d25/d26/d34/c4b to d3/d7/d59/d5a/c71 0 2026-03-09T17:30:17.540 INFO:tasks.workunit.client.1.vm09.stdout:0/343: mkdir d6/d1d/d24/d5e/d6c 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:2/334: symlink d13/d15/d34/d37/l68 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:6/355: mkdir d3/d7/d59/d72 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:2/335: readlink d13/d15/l19 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:4/405: mkdir d11/d1e/d83 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:7/461: dread da/d11/d2d/d56/f50 [0,4194304] 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:0/344: truncate d6/d1d/d39/f53 69590 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:3/340: link d5/d16/d31/d3d/d12/f5e d5/d16/d46/f63 0 2026-03-09T17:30:17.547 INFO:tasks.workunit.client.1.vm09.stdout:6/356: mkdir d3/d7/d59/d73 0 2026-03-09T17:30:17.551 INFO:tasks.workunit.client.1.vm09.stdout:6/357: dwrite d3/d21/d25/d26/d34/f66 [0,4194304] 0 2026-03-09T17:30:17.551 INFO:tasks.workunit.client.1.vm09.stdout:6/358: readlink d3/d21/d25/d26/d34/l49 0 2026-03-09T17:30:17.553 INFO:tasks.workunit.client.1.vm09.stdout:3/341: unlink d5/d16/f17 0 2026-03-09T17:30:17.562 INFO:tasks.workunit.client.1.vm09.stdout:2/336: dwrite d13/d15/d34/f48 [0,4194304] 0 2026-03-09T17:30:17.562 INFO:tasks.workunit.client.1.vm09.stdout:5/376: link d0/d9/f77 d0/d9/f7f 0 2026-03-09T17:30:17.566 
INFO:tasks.workunit.client.1.vm09.stdout:2/337: dwrite d13/d4d/f5c [0,4194304] 0 2026-03-09T17:30:17.582 INFO:tasks.workunit.client.1.vm09.stdout:3/342: mkdir d5/d16/d31/d37/d58/d64 0 2026-03-09T17:30:17.584 INFO:tasks.workunit.client.1.vm09.stdout:3/343: stat d5/d16/d31/f34 0 2026-03-09T17:30:17.585 INFO:tasks.workunit.client.1.vm09.stdout:5/377: creat d0/dc/d21/d6f/f80 x:0 0 0 2026-03-09T17:30:17.586 INFO:tasks.workunit.client.1.vm09.stdout:5/378: fdatasync d0/dc/d21/f62 0 2026-03-09T17:30:17.587 INFO:tasks.workunit.client.1.vm09.stdout:5/379: chown d0/de/f6b 48792 1 2026-03-09T17:30:17.588 INFO:tasks.workunit.client.1.vm09.stdout:2/338: dread d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:17.589 INFO:tasks.workunit.client.1.vm09.stdout:0/345: creat d6/f6d x:0 0 0 2026-03-09T17:30:17.591 INFO:tasks.workunit.client.1.vm09.stdout:6/359: creat d3/d1e/d30/d5c/d61/d6a/f74 x:0 0 0 2026-03-09T17:30:17.591 INFO:tasks.workunit.client.1.vm09.stdout:3/344: rmdir d5/d16/d31/d3d/d32 39 2026-03-09T17:30:17.591 INFO:tasks.workunit.client.1.vm09.stdout:6/360: chown d3/d7/c63 31504714 1 2026-03-09T17:30:17.594 INFO:tasks.workunit.client.1.vm09.stdout:0/346: readlink d6/lc 0 2026-03-09T17:30:17.611 INFO:tasks.workunit.client.1.vm09.stdout:0/347: fsync d6/d1d/d46/f4d 0 2026-03-09T17:30:17.611 INFO:tasks.workunit.client.1.vm09.stdout:3/345: dwrite d5/d16/d25/f60 [0,4194304] 0 2026-03-09T17:30:17.611 INFO:tasks.workunit.client.1.vm09.stdout:2/339: mkdir d13/d15/d34/d69 0 2026-03-09T17:30:17.612 INFO:tasks.workunit.client.1.vm09.stdout:6/361: dwrite d3/d21/f28 [0,4194304] 0 2026-03-09T17:30:17.612 INFO:tasks.workunit.client.1.vm09.stdout:2/340: creat d13/d15/d34/d45/f6a x:0 0 0 2026-03-09T17:30:17.612 INFO:tasks.workunit.client.1.vm09.stdout:0/348: symlink d6/d1d/d24/l6e 0 2026-03-09T17:30:17.614 INFO:tasks.workunit.client.1.vm09.stdout:0/349: dread d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:17.616 INFO:tasks.workunit.client.1.vm09.stdout:2/341: rmdir d13/d4d 39 
2026-03-09T17:30:17.622 INFO:tasks.workunit.client.1.vm09.stdout:0/350: symlink d6/d1d/d24/d32/d59/l6f 0 2026-03-09T17:30:17.623 INFO:tasks.workunit.client.1.vm09.stdout:2/342: write d13/d15/f2b [1335904,100035] 0 2026-03-09T17:30:17.626 INFO:tasks.workunit.client.1.vm09.stdout:0/351: unlink d6/d1d/f62 0 2026-03-09T17:30:17.626 INFO:tasks.workunit.client.1.vm09.stdout:0/352: read - d6/f6d zero size 2026-03-09T17:30:17.627 INFO:tasks.workunit.client.1.vm09.stdout:0/353: write d6/d1d/f3c [257327,47630] 0 2026-03-09T17:30:17.633 INFO:tasks.workunit.client.1.vm09.stdout:0/354: unlink d6/d1d/c4a 0 2026-03-09T17:30:17.633 INFO:tasks.workunit.client.1.vm09.stdout:0/355: write d6/f63 [352562,90063] 0 2026-03-09T17:30:17.634 INFO:tasks.workunit.client.0.vm06.stdout:2/999: write d3/d4/d12/d71/daa/d77/d81/d130/fee [652495,41983] 0 2026-03-09T17:30:17.637 INFO:tasks.workunit.client.0.vm06.stderr:+ rm -rf -- ./tmp.lxoFKUeY46 2026-03-09T17:30:17.638 INFO:tasks.workunit.client.1.vm09.stdout:5/380: sync 2026-03-09T17:30:17.638 INFO:tasks.workunit.client.1.vm09.stdout:6/362: sync 2026-03-09T17:30:17.641 INFO:tasks.workunit.client.1.vm09.stdout:2/343: dwrite d13/f14 [0,4194304] 0 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: Reconfiguring prometheus.vm06 (dependencies changed)... 2026-03-09T17:30:17.647 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:17 vm09.local ceph-mon[62061]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:30:17.647 INFO:tasks.workunit.client.1.vm09.stdout:9/330: dwrite d5/de/f2d [0,4194304] 0 2026-03-09T17:30:17.648 INFO:tasks.workunit.client.1.vm09.stdout:5/381: truncate d0/d9/d16/d5c/f73 5232846 0 2026-03-09T17:30:17.651 INFO:tasks.workunit.client.1.vm09.stdout:8/349: dwrite d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:17.656 INFO:tasks.workunit.client.1.vm09.stdout:5/382: fsync d0/dc/f37 0 2026-03-09T17:30:17.657 INFO:tasks.workunit.client.1.vm09.stdout:5/383: readlink d0/d52/d20/l5a 0 2026-03-09T17:30:17.657 INFO:tasks.workunit.client.1.vm09.stdout:5/384: dread - d0/d2/f5d zero size 2026-03-09T17:30:17.662 INFO:tasks.workunit.client.1.vm09.stdout:6/363: truncate d3/d21/f3c 823990 0 2026-03-09T17:30:17.665 INFO:tasks.workunit.client.1.vm09.stdout:8/350: symlink d1/da/d23/d6c/d32/l73 0 2026-03-09T17:30:17.671 INFO:tasks.workunit.client.1.vm09.stdout:6/364: truncate d3/d7/ff 1799140 0 2026-03-09T17:30:17.675 INFO:tasks.workunit.client.1.vm09.stdout:1/380: rmdir d9/dc/dd/d40/d22/d37/d3f 39 2026-03-09T17:30:17.683 INFO:tasks.workunit.client.1.vm09.stdout:1/381: write d9/dc/dd/d40/d22/f4a [1519452,1395] 0 2026-03-09T17:30:17.691 INFO:tasks.workunit.client.1.vm09.stdout:1/382: stat d9/dc/d63/l65 0 2026-03-09T17:30:17.695 INFO:tasks.workunit.client.1.vm09.stdout:1/383: symlink d9/dc/dd/d40/d22/d37/d3f/d42/l74 0 2026-03-09T17:30:17.702 
INFO:tasks.workunit.client.1.vm09.stdout:4/406: truncate f3 414689 0 2026-03-09T17:30:17.705 INFO:tasks.workunit.client.1.vm09.stdout:4/407: mknod d11/d1e/d29/d36/c84 0 2026-03-09T17:30:17.706 INFO:tasks.workunit.client.1.vm09.stdout:4/408: readlink d11/d1e/d29/d36/l80 0 2026-03-09T17:30:17.707 INFO:tasks.workunit.client.1.vm09.stdout:7/462: dwrite da/d11/d3e/f88 [0,4194304] 0 2026-03-09T17:30:17.712 INFO:tasks.workunit.client.1.vm09.stdout:1/384: getdents d9/dc/dd/d40/d1d 0 2026-03-09T17:30:17.714 INFO:tasks.workunit.client.1.vm09.stdout:1/385: dread - d9/f6c zero size 2026-03-09T17:30:17.716 INFO:tasks.workunit.client.1.vm09.stdout:4/409: mknod d11/d1e/d45/c85 0 2026-03-09T17:30:17.717 INFO:tasks.workunit.client.1.vm09.stdout:4/410: dread - d11/d1e/d45/d60/d69/d58/f72 zero size 2026-03-09T17:30:17.719 INFO:tasks.workunit.client.1.vm09.stdout:1/386: creat d9/dc/d63/f75 x:0 0 0 2026-03-09T17:30:17.730 INFO:tasks.workunit.client.1.vm09.stdout:7/463: dread da/d11/d47/d5b/d6c/d9e/d4e/f8e [0,4194304] 0 2026-03-09T17:30:17.730 INFO:tasks.workunit.client.1.vm09.stdout:7/464: readlink da/l97 0 2026-03-09T17:30:17.731 INFO:tasks.workunit.client.1.vm09.stdout:3/346: dwrite d5/f2f [4194304,4194304] 0 2026-03-09T17:30:17.744 INFO:tasks.workunit.client.1.vm09.stdout:3/347: dwrite d5/f2f [0,4194304] 0 2026-03-09T17:30:17.750 INFO:tasks.workunit.client.1.vm09.stdout:7/465: mkdir da/d11/d3e/da2 0 2026-03-09T17:30:17.753 INFO:tasks.workunit.client.1.vm09.stdout:7/466: stat da/d11/d47/d5b/d6c/d9e/d4e/f74 0 2026-03-09T17:30:17.761 INFO:tasks.workunit.client.1.vm09.stdout:3/348: fsync d5/d16/d46/f63 0 2026-03-09T17:30:17.762 INFO:tasks.workunit.client.1.vm09.stdout:9/331: write d5/de/d29/f36 [818503,3073] 0 2026-03-09T17:30:17.771 INFO:tasks.workunit.client.1.vm09.stdout:3/349: dwrite d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:17.777 INFO:tasks.workunit.client.1.vm09.stdout:9/332: unlink d5/lc 0 2026-03-09T17:30:17.777 INFO:tasks.workunit.client.1.vm09.stdout:9/333: readlink 
d5/l32 0 2026-03-09T17:30:17.777 INFO:tasks.workunit.client.1.vm09.stdout:7/467: symlink da/d11/d47/d89/la3 0 2026-03-09T17:30:17.779 INFO:tasks.workunit.client.1.vm09.stdout:7/468: chown da/d11/d47/d5b/d6c/d9e/d4e/f8e 2087 1 2026-03-09T17:30:17.779 INFO:tasks.workunit.client.1.vm09.stdout:1/387: getdents d9 0 2026-03-09T17:30:17.782 INFO:tasks.workunit.client.1.vm09.stdout:5/385: truncate d0/d46/f4c 194848 0 2026-03-09T17:30:17.783 INFO:tasks.workunit.client.1.vm09.stdout:5/386: dread - d0/dc/d21/d33/f65 zero size 2026-03-09T17:30:17.784 INFO:tasks.workunit.client.1.vm09.stdout:5/387: write d0/de/f50 [1591642,14860] 0 2026-03-09T17:30:17.786 INFO:tasks.workunit.client.1.vm09.stdout:9/334: getdents d5/d2e/d6b 0 2026-03-09T17:30:17.790 INFO:tasks.workunit.client.1.vm09.stdout:9/335: dwrite d5/f5d [0,4194304] 0 2026-03-09T17:30:17.791 INFO:tasks.workunit.client.1.vm09.stdout:9/336: stat d5 0 2026-03-09T17:30:17.792 INFO:tasks.workunit.client.1.vm09.stdout:8/351: write d1/d14/d2a/f2b [539400,122028] 0 2026-03-09T17:30:17.792 INFO:tasks.workunit.client.1.vm09.stdout:8/352: chown d1/da/f12 38 1 2026-03-09T17:30:17.793 INFO:tasks.workunit.client.1.vm09.stdout:8/353: stat d1/da/dd/d47/f64 0 2026-03-09T17:30:17.794 INFO:tasks.workunit.client.1.vm09.stdout:5/388: creat d0/d52/f81 x:0 0 0 2026-03-09T17:30:17.796 INFO:tasks.workunit.client.1.vm09.stdout:9/337: dread d5/f5d [0,4194304] 0 2026-03-09T17:30:17.805 INFO:tasks.workunit.client.1.vm09.stdout:5/389: dwrite d0/dc/d21/d33/f69 [0,4194304] 0 2026-03-09T17:30:17.807 INFO:tasks.workunit.client.1.vm09.stdout:9/338: dwrite d5/f14 [4194304,4194304] 0 2026-03-09T17:30:17.811 INFO:tasks.workunit.client.1.vm09.stdout:5/390: fsync d0/d2/f5d 0 2026-03-09T17:30:17.817 INFO:tasks.workunit.client.1.vm09.stdout:4/411: write f10 [1617805,30332] 0 2026-03-09T17:30:17.818 INFO:tasks.workunit.client.1.vm09.stdout:4/412: readlink d11/d1e/d29/l5b 0 2026-03-09T17:30:17.830 INFO:tasks.workunit.client.1.vm09.stdout:1/388: creat d9/dc/f76 x:0 0 
0 2026-03-09T17:30:17.834 INFO:tasks.workunit.client.1.vm09.stdout:4/413: creat d11/d1e/d29/d36/f86 x:0 0 0 2026-03-09T17:30:17.837 INFO:tasks.workunit.client.1.vm09.stdout:1/389: chown d9/dc/dd/d40/l2a 153239976 1 2026-03-09T17:30:17.843 INFO:tasks.workunit.client.1.vm09.stdout:4/414: rename d11/d1e/d29/d36/c48 to d11/d1e/d45/d60/d69/d58/c87 0 2026-03-09T17:30:17.849 INFO:tasks.workunit.client.1.vm09.stdout:7/469: write da/f1c [1718143,74767] 0 2026-03-09T17:30:17.862 INFO:tasks.workunit.client.1.vm09.stdout:9/339: link d5/de/c28 d5/d21/c71 0 2026-03-09T17:30:17.866 INFO:tasks.workunit.client.1.vm09.stdout:4/415: unlink d11/d1e/d29/f32 0 2026-03-09T17:30:17.870 INFO:tasks.workunit.client.1.vm09.stdout:4/416: dwrite d11/d1e/d45/d60/d69/d58/f75 [0,4194304] 0 2026-03-09T17:30:17.872 INFO:tasks.workunit.client.1.vm09.stdout:4/417: dread - d11/d1e/d31/f74 zero size 2026-03-09T17:30:17.873 INFO:tasks.workunit.client.1.vm09.stdout:4/418: chown d11/d1e/d45/d60/d69/d58/l5d 613666133 1 2026-03-09T17:30:17.874 INFO:tasks.workunit.client.1.vm09.stdout:4/419: chown d11/d1e/d45/d60/d69/f53 24753 1 2026-03-09T17:30:17.878 INFO:tasks.workunit.client.1.vm09.stdout:5/391: fsync d0/dc/d21/d33/f69 0 2026-03-09T17:30:17.881 INFO:tasks.workunit.client.1.vm09.stdout:9/340: creat d5/d2e/f72 x:0 0 0 2026-03-09T17:30:17.888 INFO:tasks.workunit.client.1.vm09.stdout:4/420: mknod d11/d1e/d45/c88 0 2026-03-09T17:30:17.889 INFO:tasks.workunit.client.1.vm09.stdout:5/392: unlink d0/dc/d21/d33/f1f 0 2026-03-09T17:30:17.898 INFO:tasks.workunit.client.1.vm09.stdout:6/365: read d3/d1e/d30/d3f/f42 [816233,55304] 0 2026-03-09T17:30:17.902 INFO:tasks.workunit.client.1.vm09.stdout:8/354: dread d1/d14/d2a/f2b [0,4194304] 0 2026-03-09T17:30:17.905 INFO:tasks.workunit.client.1.vm09.stdout:4/421: mkdir d11/d1e/d83/d89 0 2026-03-09T17:30:17.906 INFO:tasks.workunit.client.1.vm09.stdout:6/366: dread d3/d21/f3c [0,4194304] 0 2026-03-09T17:30:17.911 INFO:tasks.workunit.client.1.vm09.stdout:6/367: creat 
d3/d7/d59/d73/f75 x:0 0 0 2026-03-09T17:30:17.912 INFO:tasks.workunit.client.1.vm09.stdout:8/355: dread d1/d14/f2f [0,4194304] 0 2026-03-09T17:30:17.913 INFO:tasks.workunit.client.1.vm09.stdout:2/344: dread d13/f40 [0,4194304] 0 2026-03-09T17:30:17.915 INFO:tasks.workunit.client.1.vm09.stdout:2/345: dwrite d13/d15/f2a [0,4194304] 0 2026-03-09T17:30:17.924 INFO:tasks.workunit.client.1.vm09.stdout:8/356: dread d1/da/f4b [0,4194304] 0 2026-03-09T17:30:17.924 INFO:tasks.workunit.client.1.vm09.stdout:1/390: truncate f6 93101 0 2026-03-09T17:30:17.926 INFO:tasks.workunit.client.1.vm09.stdout:8/357: dread - d1/da/d23/d6c/d32/f6d zero size 2026-03-09T17:30:17.927 INFO:tasks.workunit.client.1.vm09.stdout:6/368: rename d3/d1e/d30 to d3/d21/d76 0 2026-03-09T17:30:17.928 INFO:tasks.workunit.client.1.vm09.stdout:7/470: dwrite da/d11/d47/d5b/d6c/d9e/d4e/f2b [0,4194304] 0 2026-03-09T17:30:17.929 INFO:tasks.workunit.client.1.vm09.stdout:6/369: chown d3/d21/d76/d5c/d61/f60 0 1 2026-03-09T17:30:17.930 INFO:tasks.workunit.client.1.vm09.stdout:6/370: dread - d3/f41 zero size 2026-03-09T17:30:17.943 INFO:tasks.workunit.client.1.vm09.stdout:1/391: readlink d9/d38/l53 0 2026-03-09T17:30:17.943 INFO:tasks.workunit.client.1.vm09.stdout:8/358: rmdir d1/da 39 2026-03-09T17:30:17.943 INFO:tasks.workunit.client.1.vm09.stdout:1/392: dread - d9/dc/f76 zero size 2026-03-09T17:30:17.944 INFO:tasks.workunit.client.1.vm09.stdout:7/471: stat da/d11/d2d/f69 0 2026-03-09T17:30:17.945 INFO:tasks.workunit.client.1.vm09.stdout:9/341: dwrite d5/f4b [0,4194304] 0 2026-03-09T17:30:17.946 INFO:tasks.workunit.client.1.vm09.stdout:6/371: rename d3/d21/d25/d26/d34/f56 to d3/d7/f77 0 2026-03-09T17:30:17.951 INFO:tasks.workunit.client.1.vm09.stdout:8/359: fdatasync d1/da/d23/d6c/f70 0 2026-03-09T17:30:17.954 INFO:tasks.workunit.client.1.vm09.stdout:1/393: creat d9/dc/dd/d40/d1d/f77 x:0 0 0 2026-03-09T17:30:17.956 INFO:tasks.workunit.client.1.vm09.stdout:6/372: creat d3/d21/d76/d5c/f78 x:0 0 0 
2026-03-09T17:30:17.961 INFO:tasks.workunit.client.1.vm09.stdout:8/360: unlink d1/da/c59 0 2026-03-09T17:30:17.963 INFO:tasks.workunit.client.1.vm09.stdout:1/394: rename d9/dc/dd/d40/d21/d35/c5e to d9/dc/dd/d40/d21/d35/c78 0 2026-03-09T17:30:17.967 INFO:tasks.workunit.client.1.vm09.stdout:9/342: getdents d5/de/d4e/d6e 0 2026-03-09T17:30:17.967 INFO:tasks.workunit.client.1.vm09.stdout:7/472: mknod da/d11/d47/d5b/d6c/d9e/d4e/d5f/ca4 0 2026-03-09T17:30:17.967 INFO:tasks.workunit.client.1.vm09.stdout:2/346: link d13/d15/d34/c3d d13/d15/c6b 0 2026-03-09T17:30:17.967 INFO:tasks.workunit.client.1.vm09.stdout:2/347: write d13/d15/d34/d45/f6a [436966,12112] 0 2026-03-09T17:30:17.972 INFO:tasks.workunit.client.1.vm09.stdout:0/356: dread d6/d1d/d24/f50 [0,4194304] 0 2026-03-09T17:30:17.973 INFO:tasks.workunit.client.1.vm09.stdout:0/357: write d6/d1d/d39/f53 [266084,62597] 0 2026-03-09T17:30:17.976 INFO:tasks.workunit.client.1.vm09.stdout:3/350: dread d5/d16/f45 [0,4194304] 0 2026-03-09T17:30:17.977 INFO:tasks.workunit.client.1.vm09.stdout:8/361: chown d1/d14/d2a/c4e 0 1 2026-03-09T17:30:17.978 INFO:tasks.workunit.client.1.vm09.stdout:7/473: rename da/d11/d47/d5b/d6c/d9e/d4e/d4c/c8b to da/d11/d2d/d56/ca5 0 2026-03-09T17:30:17.980 INFO:tasks.workunit.client.1.vm09.stdout:2/348: mknod d13/d15/d34/d69/c6c 0 2026-03-09T17:30:17.981 INFO:tasks.workunit.client.1.vm09.stdout:7/474: write da/d11/d47/d5b/d6c/d9e/d4e/f63 [702005,81767] 0 2026-03-09T17:30:17.983 INFO:tasks.workunit.client.1.vm09.stdout:5/393: dwrite d0/d9/f7f [4194304,4194304] 0 2026-03-09T17:30:17.983 INFO:tasks.workunit.client.1.vm09.stdout:7/475: chown da/d11/c58 1060 1 2026-03-09T17:30:17.984 INFO:tasks.workunit.client.1.vm09.stdout:1/395: dwrite f8 [4194304,4194304] 0 2026-03-09T17:30:17.991 INFO:tasks.workunit.client.1.vm09.stdout:9/343: creat d5/de/d29/f73 x:0 0 0 2026-03-09T17:30:17.991 INFO:tasks.workunit.client.1.vm09.stdout:8/362: fdatasync d1/d14/d2a/d42/f46 0 2026-03-09T17:30:17.992 
INFO:tasks.workunit.client.1.vm09.stdout:8/363: fdatasync d1/da/d23/d6c/f70 0 2026-03-09T17:30:17.992 INFO:tasks.workunit.client.1.vm09.stdout:8/364: readlink d1/d14/d2a/d42/d5d/l6b 0 2026-03-09T17:30:17.999 INFO:tasks.workunit.client.1.vm09.stdout:5/394: rename d0/d9/f66 to d0/dc/d21/d6f/d42/f82 0 2026-03-09T17:30:18.001 INFO:tasks.workunit.client.1.vm09.stdout:8/365: dread d1/d14/f3d [0,4194304] 0 2026-03-09T17:30:18.001 INFO:tasks.workunit.client.1.vm09.stdout:8/366: readlink d1/da/dd/l15 0 2026-03-09T17:30:18.003 INFO:tasks.workunit.client.1.vm09.stdout:7/476: creat da/d76/fa6 x:0 0 0 2026-03-09T17:30:18.004 INFO:tasks.workunit.client.1.vm09.stdout:7/477: readlink da/d11/d47/d5b/d6c/d9e/d4e/l46 0 2026-03-09T17:30:18.004 INFO:tasks.workunit.client.1.vm09.stdout:8/367: dread d1/da/dd/f45 [0,4194304] 0 2026-03-09T17:30:18.005 INFO:tasks.workunit.client.1.vm09.stdout:0/358: creat d6/d1d/f70 x:0 0 0 2026-03-09T17:30:18.005 INFO:tasks.workunit.client.1.vm09.stdout:7/478: chown da/d11/d47/d5b/c9c 120833438 1 2026-03-09T17:30:18.008 INFO:tasks.workunit.client.1.vm09.stdout:0/359: read d6/d1d/f41 [2066055,86929] 0 2026-03-09T17:30:18.009 INFO:tasks.workunit.client.1.vm09.stdout:0/360: write d6/d1d/d24/f5d [8174525,31273] 0 2026-03-09T17:30:18.010 INFO:tasks.workunit.client.1.vm09.stdout:4/422: write d11/d1e/d29/f2e [4133349,34930] 0 2026-03-09T17:30:18.015 INFO:tasks.workunit.client.1.vm09.stdout:0/361: dwrite d6/f9 [4194304,4194304] 0 2026-03-09T17:30:18.020 INFO:tasks.workunit.client.1.vm09.stdout:2/349: creat d13/d4d/f6d x:0 0 0 2026-03-09T17:30:18.041 INFO:tasks.workunit.client.1.vm09.stdout:7/479: mkdir da/d11/d64/da7 0 2026-03-09T17:30:18.042 INFO:tasks.workunit.client.1.vm09.stdout:9/344: creat d5/d2e/d6b/f74 x:0 0 0 2026-03-09T17:30:18.044 INFO:tasks.workunit.client.1.vm09.stdout:4/423: creat d11/d1e/d29/f8a x:0 0 0 2026-03-09T17:30:18.046 INFO:tasks.workunit.client.1.vm09.stdout:0/362: symlink d6/d1d/d24/d32/d59/l71 0 2026-03-09T17:30:18.046 
INFO:tasks.workunit.client.1.vm09.stdout:2/350: symlink d13/d15/d34/d45/l6e 0 2026-03-09T17:30:18.047 INFO:tasks.workunit.client.1.vm09.stdout:0/363: fdatasync d6/d1d/f37 0 2026-03-09T17:30:18.047 INFO:tasks.workunit.client.1.vm09.stdout:4/424: unlink d11/f24 0 2026-03-09T17:30:18.048 INFO:tasks.workunit.client.1.vm09.stdout:1/396: rename d9/dc/dd/d40/d22/d37/d3f/d42/d55/c6d to d9/dc/dd/d40/d22/d37/d3f/c79 0 2026-03-09T17:30:18.048 INFO:tasks.workunit.client.1.vm09.stdout:9/345: creat d5/d2e/d70/f75 x:0 0 0 2026-03-09T17:30:18.049 INFO:tasks.workunit.client.1.vm09.stdout:8/368: sync 2026-03-09T17:30:18.056 INFO:tasks.workunit.client.1.vm09.stdout:2/351: sync 2026-03-09T17:30:18.056 INFO:tasks.workunit.client.1.vm09.stdout:2/352: stat d13/d15/d21/f24 0 2026-03-09T17:30:18.063 INFO:tasks.workunit.client.1.vm09.stdout:7/480: rename da/d11/f1f to da/d11/d64/da7/fa8 0 2026-03-09T17:30:18.063 INFO:tasks.workunit.client.1.vm09.stdout:2/353: fsync d13/d15/d3b/f3f 0 2026-03-09T17:30:18.064 INFO:tasks.workunit.client.1.vm09.stdout:7/481: chown da/d11/l24 225807378 1 2026-03-09T17:30:18.065 INFO:tasks.workunit.client.1.vm09.stdout:2/354: truncate d13/d15/d34/d45/f49 620370 0 2026-03-09T17:30:18.068 INFO:tasks.workunit.client.1.vm09.stdout:1/397: mknod d9/d38/d61/c7a 0 2026-03-09T17:30:18.081 INFO:tasks.workunit.client.1.vm09.stdout:7/482: creat da/d11/d64/fa9 x:0 0 0 2026-03-09T17:30:18.081 INFO:tasks.workunit.client.1.vm09.stdout:2/355: mkdir d13/d15/d34/d37/d6f 0 2026-03-09T17:30:18.083 INFO:tasks.workunit.client.1.vm09.stdout:2/356: read d13/d15/f20 [2328423,73083] 0 2026-03-09T17:30:18.084 INFO:tasks.workunit.client.1.vm09.stdout:8/369: creat d1/f74 x:0 0 0 2026-03-09T17:30:18.085 INFO:tasks.workunit.client.1.vm09.stdout:0/364: getdents d6/d1d/d24/d32 0 2026-03-09T17:30:18.085 INFO:tasks.workunit.client.1.vm09.stdout:6/373: write d3/d21/d76/f70 [4938925,57063] 0 2026-03-09T17:30:18.088 INFO:tasks.workunit.client.1.vm09.stdout:0/365: dread d6/f63 [0,4194304] 0 
2026-03-09T17:30:18.090 INFO:tasks.workunit.client.1.vm09.stdout:7/483: dread - da/d11/d47/d5b/d6c/d9e/f57 zero size 2026-03-09T17:30:18.091 INFO:tasks.workunit.client.1.vm09.stdout:6/374: dwrite d3/f19 [4194304,4194304] 0 2026-03-09T17:30:18.092 INFO:tasks.workunit.client.1.vm09.stdout:6/375: dread - d3/d21/f5d zero size 2026-03-09T17:30:18.092 INFO:tasks.workunit.client.1.vm09.stdout:3/351: truncate f3 238099 0 2026-03-09T17:30:18.092 INFO:tasks.workunit.client.1.vm09.stdout:8/370: rename d1/da/f12 to d1/d14/d31/f75 0 2026-03-09T17:30:18.099 INFO:tasks.workunit.client.1.vm09.stdout:6/376: chown d3/d21/d25/f2f 1935317788 1 2026-03-09T17:30:18.100 INFO:tasks.workunit.client.1.vm09.stdout:0/366: dwrite d6/d1d/d24/d32/f68 [0,4194304] 0 2026-03-09T17:30:18.101 INFO:tasks.workunit.client.1.vm09.stdout:1/398: link d9/dc/f47 d9/dc/dd/f7b 0 2026-03-09T17:30:18.107 INFO:tasks.workunit.client.1.vm09.stdout:8/371: mknod d1/da/dd/c76 0 2026-03-09T17:30:18.111 INFO:tasks.workunit.client.1.vm09.stdout:3/352: rename d5/d16/d31/d3d/d12 to d5/d9/d30/d65 0 2026-03-09T17:30:18.111 INFO:tasks.workunit.client.1.vm09.stdout:3/353: chown d5/d16/f54 7793473 1 2026-03-09T17:30:18.114 INFO:tasks.workunit.client.1.vm09.stdout:6/377: rmdir d3/d21/d25/d26 39 2026-03-09T17:30:18.115 INFO:tasks.workunit.client.1.vm09.stdout:1/399: creat d9/f7c x:0 0 0 2026-03-09T17:30:18.116 INFO:tasks.workunit.client.1.vm09.stdout:7/484: link da/f27 da/d11/d2d/d56/d68/faa 0 2026-03-09T17:30:18.117 INFO:tasks.workunit.client.1.vm09.stdout:7/485: stat da/d11/d47/d5b/d6c/d9e/d4e/d4c/l86 0 2026-03-09T17:30:18.118 INFO:tasks.workunit.client.1.vm09.stdout:6/378: dwrite d3/d21/d25/f5f [0,4194304] 0 2026-03-09T17:30:18.130 INFO:tasks.workunit.client.1.vm09.stdout:0/367: rename c4 to d6/d1d/d24/d5e/c72 0 2026-03-09T17:30:18.133 INFO:tasks.workunit.client.1.vm09.stdout:5/395: truncate d0/f60 47123 0 2026-03-09T17:30:18.134 INFO:tasks.workunit.client.1.vm09.stdout:7/486: rmdir da/d11/d47/d5b/d6c/d9e 39 
2026-03-09T17:30:18.134 INFO:tasks.workunit.client.1.vm09.stdout:7/487: fsync da/d11/d77/f79 0 2026-03-09T17:30:18.141 INFO:tasks.workunit.client.1.vm09.stdout:2/357: link d13/d15/c56 d13/d15/d34/d37/c70 0 2026-03-09T17:30:18.141 INFO:tasks.workunit.client.1.vm09.stdout:2/358: chown d13/d15/d21 0 1 2026-03-09T17:30:18.141 INFO:tasks.workunit.client.1.vm09.stdout:8/372: rename d1/da/d23/d34 to d1/da/dd/d77 0 2026-03-09T17:30:18.141 INFO:tasks.workunit.client.1.vm09.stdout:7/488: chown da/d11/d2d/c96 1671568 1 2026-03-09T17:30:18.141 INFO:tasks.workunit.client.1.vm09.stdout:0/368: dwrite d6/d1d/d39/f2e [0,4194304] 0 2026-03-09T17:30:18.146 INFO:tasks.workunit.client.1.vm09.stdout:7/489: read da/d11/d2d/d56/d68/faa [1151605,46442] 0 2026-03-09T17:30:18.147 INFO:tasks.workunit.client.1.vm09.stdout:8/373: dwrite d1/d14/d2a/f62 [0,4194304] 0 2026-03-09T17:30:18.149 INFO:tasks.workunit.client.1.vm09.stdout:8/374: dread d1/da/d23/d6c/f70 [0,4194304] 0 2026-03-09T17:30:18.157 INFO:tasks.workunit.client.1.vm09.stdout:6/379: creat d3/d21/d25/d26/d6b/f79 x:0 0 0 2026-03-09T17:30:18.164 INFO:tasks.workunit.client.1.vm09.stdout:5/396: rename d0/d46/l61 to d0/dc/d21/d6f/d42/l83 0 2026-03-09T17:30:18.173 INFO:tasks.workunit.client.1.vm09.stdout:4/425: truncate d11/f23 3548315 0 2026-03-09T17:30:18.173 INFO:tasks.workunit.client.1.vm09.stdout:9/346: dwrite d5/de/d29/d33/f3b [0,4194304] 0 2026-03-09T17:30:18.174 INFO:tasks.workunit.client.1.vm09.stdout:9/347: dread - d5/de/d29/f73 zero size 2026-03-09T17:30:18.174 INFO:tasks.workunit.client.1.vm09.stdout:9/348: readlink d5/d21/l69 0 2026-03-09T17:30:18.184 INFO:tasks.workunit.client.1.vm09.stdout:0/369: mknod d6/d1d/d24/c73 0 2026-03-09T17:30:18.184 INFO:tasks.workunit.client.1.vm09.stdout:7/490: creat da/d11/d47/d5b/d78/fab x:0 0 0 2026-03-09T17:30:18.185 INFO:tasks.workunit.client.1.vm09.stdout:8/375: dread d1/da/d23/d6c/f1c [4194304,4194304] 0 2026-03-09T17:30:18.186 INFO:tasks.workunit.client.1.vm09.stdout:7/491: truncate 
da/d11/d3e/f60 4972472 0 2026-03-09T17:30:18.186 INFO:tasks.workunit.client.1.vm09.stdout:2/359: creat d13/d15/f71 x:0 0 0 2026-03-09T17:30:18.187 INFO:tasks.workunit.client.1.vm09.stdout:5/397: rmdir d0/d9/d16 39 2026-03-09T17:30:18.189 INFO:tasks.workunit.client.1.vm09.stdout:9/349: write d5/de/d29/f52 [675951,93459] 0 2026-03-09T17:30:18.193 INFO:tasks.workunit.client.1.vm09.stdout:9/350: dwrite d5/f13 [4194304,4194304] 0 2026-03-09T17:30:18.202 INFO:tasks.workunit.client.1.vm09.stdout:0/370: rename d6/d1d/d24/c73 to d6/d1d/d24/d5e/d6c/c74 0 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:8/376: mknod d1/d14/d2a/d42/d43/d44/c78 0 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:8/377: fsync d1/f7 0 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:7/492: rmdir da/d11/d47/d89 39 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:3/354: write d5/d9/d30/d65/f3e [5017911,5652] 0 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:0/371: dread d6/d1d/d39/f44 [0,4194304] 0 2026-03-09T17:30:18.211 INFO:tasks.workunit.client.1.vm09.stdout:2/360: mkdir d13/d15/d36/d72 0 2026-03-09T17:30:18.218 INFO:tasks.workunit.client.1.vm09.stdout:9/351: sync 2026-03-09T17:30:18.219 INFO:tasks.workunit.client.1.vm09.stdout:0/372: dread d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:18.221 INFO:tasks.workunit.client.1.vm09.stdout:3/355: dread d5/d9/f1e [0,4194304] 0 2026-03-09T17:30:18.223 INFO:tasks.workunit.client.1.vm09.stdout:8/378: rename d1/d14/d72 to d1/da/dd/d79 0 2026-03-09T17:30:18.228 INFO:tasks.workunit.client.1.vm09.stdout:8/379: write d1/d14/d2a/f62 [3978490,130224] 0 2026-03-09T17:30:18.228 INFO:tasks.workunit.client.1.vm09.stdout:6/380: creat d3/d21/d76/f7a x:0 0 0 2026-03-09T17:30:18.228 INFO:tasks.workunit.client.1.vm09.stdout:7/493: mkdir da/d11/d64/dac 0 2026-03-09T17:30:18.228 INFO:tasks.workunit.client.1.vm09.stdout:7/494: stat da/d11/f6a 0 2026-03-09T17:30:18.232 
INFO:tasks.workunit.client.1.vm09.stdout:8/380: dwrite d1/da/dd/d63/f1d [0,4194304] 0 2026-03-09T17:30:18.233 INFO:tasks.workunit.client.1.vm09.stdout:9/352: creat d5/de/f76 x:0 0 0 2026-03-09T17:30:18.239 INFO:tasks.workunit.client.1.vm09.stdout:8/381: sync 2026-03-09T17:30:18.255 INFO:tasks.workunit.client.1.vm09.stdout:6/381: symlink d3/d1e/l7b 0 2026-03-09T17:30:18.262 INFO:tasks.workunit.client.1.vm09.stdout:5/398: dwrite d0/dc/f37 [0,4194304] 0 2026-03-09T17:30:18.264 INFO:tasks.workunit.client.1.vm09.stdout:3/356: mkdir d5/d9/d30/d65/d59/d66 0 2026-03-09T17:30:18.290 INFO:tasks.workunit.client.1.vm09.stdout:2/361: rename fb to d13/f73 0 2026-03-09T17:30:18.314 INFO:tasks.workunit.client.1.vm09.stdout:4/426: rename d11/d1e/d45/d60/d69 to d11/d1e/d83/d89/d8b 0 2026-03-09T17:30:18.315 INFO:tasks.workunit.client.1.vm09.stdout:4/427: stat d11/d1e/d29/f2e 0 2026-03-09T17:30:18.320 INFO:tasks.workunit.client.1.vm09.stdout:7/495: getdents da/d11/d2d/d56 0 2026-03-09T17:30:18.327 INFO:tasks.workunit.client.1.vm09.stdout:8/382: getdents d1/da/dd/d47 0 2026-03-09T17:30:18.327 INFO:tasks.workunit.client.1.vm09.stdout:8/383: dread d1/d14/d2a/f2e [0,4194304] 0 2026-03-09T17:30:18.335 INFO:tasks.workunit.client.1.vm09.stdout:2/362: getdents d13/d15/d34/d37/d6f 0 2026-03-09T17:30:18.342 INFO:tasks.workunit.client.1.vm09.stdout:9/353: symlink d5/de/d4e/d6e/l77 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:8/384: dread d1/da/f35 [0,4194304] 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:8/385: fdatasync d1/da/d23/d6c/d32/f6d 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:8/386: write d1/da/d23/d6c/f70 [1895720,100307] 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:2/363: rmdir d13/d15/d34 39 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:2/364: dwrite d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:1/400: rename d9/dc/c2f to 
d9/dc/dd/d40/d21/c7d 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:9/354: dwrite d5/de/f65 [0,4194304] 0 2026-03-09T17:30:18.380 INFO:tasks.workunit.client.1.vm09.stdout:9/355: dread - d5/de/d29/f73 zero size 2026-03-09T17:30:18.394 INFO:tasks.workunit.client.1.vm09.stdout:2/365: rmdir d13/d15/d34 39 2026-03-09T17:30:18.397 INFO:tasks.workunit.client.1.vm09.stdout:1/401: mkdir d9/dc/dd/d40/d21/d6f/d7e 0 2026-03-09T17:30:18.398 INFO:tasks.workunit.client.1.vm09.stdout:2/366: dwrite d13/f4c [0,4194304] 0 2026-03-09T17:30:18.399 INFO:tasks.workunit.client.1.vm09.stdout:0/373: rename d6/d1d/d39/f2e to d6/d1d/d24/f75 0 2026-03-09T17:30:18.413 INFO:tasks.workunit.client.1.vm09.stdout:1/402: fdatasync f3 0 2026-03-09T17:30:18.413 INFO:tasks.workunit.client.1.vm09.stdout:0/374: rmdir d6/d1d/d24/d32/d59 39 2026-03-09T17:30:18.413 INFO:tasks.workunit.client.1.vm09.stdout:0/375: readlink d6/d1d/d24/l6e 0 2026-03-09T17:30:18.418 INFO:tasks.workunit.client.1.vm09.stdout:1/403: stat d9/d38/d61/c7a 0 2026-03-09T17:30:18.420 INFO:tasks.workunit.client.1.vm09.stdout:6/382: rename d3/d21/f4d to d3/d1e/f7c 0 2026-03-09T17:30:18.459 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:18 vm06.local ceph-mon[57307]: pgmap v9: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 39 MiB/s rd, 115 MiB/s wr, 253 op/s 2026-03-09T17:30:18.459 INFO:tasks.workunit.client.1.vm09.stdout:9/356: link d5/de/d29/l3f d5/d2e/l78 0 2026-03-09T17:30:18.459 INFO:tasks.workunit.client.1.vm09.stdout:9/357: fdatasync d5/f5d 0 2026-03-09T17:30:18.459 INFO:tasks.workunit.client.1.vm09.stdout:8/387: rename d1/l19 to d1/da/d23/d6c/d32/l7a 0 2026-03-09T17:30:18.459 INFO:tasks.workunit.client.1.vm09.stdout:8/388: dread d1/d14/f2f [0,4194304] 0 2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:8/389: write d1/da/dd/f22 [752507,37029] 0 2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:9/358: chown d5/de/d29/c4d 3897381 1 
2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:9/359: read d5/f4b [1590837,121221] 0 2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:6/383: dwrite d3/d7/f77 [0,4194304] 0 2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:8/390: symlink d1/da/d3a/l7b 0 2026-03-09T17:30:18.460 INFO:tasks.workunit.client.1.vm09.stdout:1/404: dread f6 [0,4194304] 0 2026-03-09T17:30:18.462 INFO:tasks.workunit.client.1.vm09.stdout:2/367: dread d13/d15/d34/f5b [0,4194304] 0 2026-03-09T17:30:18.482 INFO:tasks.workunit.client.1.vm09.stdout:9/360: rename d5/de/d29/d33/l6c to d5/de/d29/d33/l79 0 2026-03-09T17:30:18.489 INFO:tasks.workunit.client.1.vm09.stdout:0/376: rename d6/l3b to d6/d1d/l76 0 2026-03-09T17:30:18.503 INFO:tasks.workunit.client.1.vm09.stdout:2/368: truncate d13/d15/d3b/f3f 2744156 0 2026-03-09T17:30:18.514 INFO:tasks.workunit.client.1.vm09.stdout:9/361: creat d5/de/d29/d33/f7a x:0 0 0 2026-03-09T17:30:18.515 INFO:tasks.workunit.client.1.vm09.stdout:8/391: symlink d1/da/d23/d71/l7c 0 2026-03-09T17:30:18.516 INFO:tasks.workunit.client.1.vm09.stdout:8/392: chown d1/da/d3a/l7b 745819 1 2026-03-09T17:30:18.517 INFO:tasks.workunit.client.1.vm09.stdout:9/362: dwrite d5/d2e/f5e [0,4194304] 0 2026-03-09T17:30:18.518 INFO:tasks.workunit.client.1.vm09.stdout:9/363: write f2 [1439878,97332] 0 2026-03-09T17:30:18.521 INFO:tasks.workunit.client.1.vm09.stdout:9/364: chown d5/de/d29/f67 29 1 2026-03-09T17:30:18.531 INFO:tasks.workunit.client.1.vm09.stdout:6/384: rename d3/d21/d76/d3f/f67 to d3/d7/d59/d73/f7d 0 2026-03-09T17:30:18.532 INFO:tasks.workunit.client.1.vm09.stdout:6/385: readlink d3/d1e/l47 0 2026-03-09T17:30:18.533 INFO:tasks.workunit.client.1.vm09.stdout:3/357: truncate d5/d9/d30/d65/f18 1663771 0 2026-03-09T17:30:18.534 INFO:tasks.workunit.client.1.vm09.stdout:5/399: dwrite d0/dc/d21/f29 [0,4194304] 0 2026-03-09T17:30:18.537 INFO:tasks.workunit.client.1.vm09.stdout:5/400: dread d0/dc/d21/f29 [0,4194304] 0 2026-03-09T17:30:18.538 
INFO:tasks.workunit.client.1.vm09.stdout:5/401: chown d0/dc/d21/d6f/d42/l6e 280 1 2026-03-09T17:30:18.543 INFO:tasks.workunit.client.1.vm09.stdout:4/428: write d11/d1e/d29/d36/d57/f67 [3400103,5764] 0 2026-03-09T17:30:18.548 INFO:tasks.workunit.client.1.vm09.stdout:7/496: dwrite da/d11/d2d/d56/f53 [0,4194304] 0 2026-03-09T17:30:18.552 INFO:tasks.workunit.client.1.vm09.stdout:4/429: dwrite d11/f1f [0,4194304] 0 2026-03-09T17:30:18.555 INFO:tasks.workunit.client.1.vm09.stdout:3/358: dread d5/d16/d46/f47 [0,4194304] 0 2026-03-09T17:30:18.565 INFO:tasks.workunit.client.1.vm09.stdout:1/405: symlink d9/d3a/l7f 0 2026-03-09T17:30:18.589 INFO:tasks.workunit.client.1.vm09.stdout:8/393: creat d1/da/d23/f7d x:0 0 0 2026-03-09T17:30:18.589 INFO:tasks.workunit.client.1.vm09.stdout:8/394: write d1/da/dd/f22 [3568477,88262] 0 2026-03-09T17:30:18.590 INFO:tasks.workunit.client.1.vm09.stdout:8/395: read d1/da/d23/d6c/f1c [5602708,40741] 0 2026-03-09T17:30:18.590 INFO:tasks.workunit.client.1.vm09.stdout:9/365: creat d5/d2e/f7b x:0 0 0 2026-03-09T17:30:18.591 INFO:tasks.workunit.client.1.vm09.stdout:7/497: sync 2026-03-09T17:30:18.591 INFO:tasks.workunit.client.1.vm09.stdout:8/396: dread - d1/d14/d2a/d42/f46 zero size 2026-03-09T17:30:18.594 INFO:tasks.workunit.client.1.vm09.stdout:6/386: mkdir d3/d21/d76/d5c/d7e 0 2026-03-09T17:30:18.619 INFO:tasks.workunit.client.1.vm09.stdout:4/430: chown d11/d1e/d29/l41 12 1 2026-03-09T17:30:18.621 INFO:tasks.workunit.client.1.vm09.stdout:3/359: mknod d5/d16/d25/c67 0 2026-03-09T17:30:18.622 INFO:tasks.workunit.client.1.vm09.stdout:4/431: dread d11/f18 [0,4194304] 0 2026-03-09T17:30:18.622 INFO:tasks.workunit.client.1.vm09.stdout:4/432: write d11/d1e/d29/d36/d57/f79 [457312,109401] 0 2026-03-09T17:30:18.632 INFO:tasks.workunit.client.1.vm09.stdout:0/377: mknod d6/d64/c77 0 2026-03-09T17:30:18.634 INFO:tasks.workunit.client.1.vm09.stdout:1/406: truncate d9/dc/d63/f67 706161 0 2026-03-09T17:30:18.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:30:18 vm09.local ceph-mon[62061]: pgmap v9: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 39 MiB/s rd, 115 MiB/s wr, 253 op/s 2026-03-09T17:30:18.672 INFO:tasks.workunit.client.1.vm09.stdout:3/360: unlink d5/c42 0 2026-03-09T17:30:18.674 INFO:tasks.workunit.client.1.vm09.stdout:2/369: creat d13/d15/f74 x:0 0 0 2026-03-09T17:30:18.677 INFO:tasks.workunit.client.1.vm09.stdout:4/433: dwrite d11/f25 [4194304,4194304] 0 2026-03-09T17:30:18.683 INFO:tasks.workunit.client.1.vm09.stdout:2/370: dwrite d13/d15/f2b [0,4194304] 0 2026-03-09T17:30:18.692 INFO:tasks.workunit.client.1.vm09.stdout:0/378: symlink d6/d1d/d39/l78 0 2026-03-09T17:30:18.695 INFO:tasks.workunit.client.1.vm09.stdout:1/407: unlink d9/dc/dd/d40/f1a 0 2026-03-09T17:30:18.710 INFO:tasks.workunit.client.1.vm09.stdout:4/434: dread d11/d1e/d29/d36/f3d [0,4194304] 0 2026-03-09T17:30:18.710 INFO:tasks.workunit.client.1.vm09.stdout:4/435: write d11/d1e/d29/d36/d57/f79 [1216353,59466] 0 2026-03-09T17:30:18.716 INFO:tasks.workunit.client.1.vm09.stdout:9/366: rename d5/d2e/c3a to d5/de/d29/c7c 0 2026-03-09T17:30:18.716 INFO:tasks.workunit.client.1.vm09.stdout:9/367: chown d5/d2e/d6b/f74 1856 1 2026-03-09T17:30:18.725 INFO:tasks.workunit.client.1.vm09.stdout:6/387: link d3/d21/d25/c44 d3/d7/d59/d73/c7f 0 2026-03-09T17:30:18.725 INFO:tasks.workunit.client.1.vm09.stdout:6/388: readlink d3/d1e/l47 0 2026-03-09T17:30:18.727 INFO:tasks.workunit.client.1.vm09.stdout:9/368: dread f2 [0,4194304] 0 2026-03-09T17:30:18.731 INFO:tasks.workunit.client.1.vm09.stdout:5/402: getdents d0/d52/d20 0 2026-03-09T17:30:18.734 INFO:tasks.workunit.client.1.vm09.stdout:8/397: dwrite d1/f28 [0,4194304] 0 2026-03-09T17:30:18.745 INFO:tasks.workunit.client.1.vm09.stdout:5/403: dread d0/dc/d21/d26/f36 [0,4194304] 0 2026-03-09T17:30:18.755 INFO:tasks.workunit.client.1.vm09.stdout:1/408: dwrite d9/dc/dd/d40/d22/d37/f41 [0,4194304] 0 2026-03-09T17:30:18.771 INFO:tasks.workunit.client.1.vm09.stdout:2/371: rmdir 
d13/d15 39 2026-03-09T17:30:18.772 INFO:tasks.workunit.client.1.vm09.stdout:2/372: stat d13/d4d/f5c 0 2026-03-09T17:30:18.774 INFO:tasks.workunit.client.1.vm09.stdout:6/389: creat d3/d21/f80 x:0 0 0 2026-03-09T17:30:18.774 INFO:tasks.workunit.client.1.vm09.stdout:9/369: creat d5/d2e/d70/f7d x:0 0 0 2026-03-09T17:30:18.787 INFO:tasks.workunit.client.1.vm09.stdout:8/398: fsync d1/da/f4b 0 2026-03-09T17:30:18.793 INFO:tasks.workunit.client.1.vm09.stdout:0/379: mknod d6/d1d/d24/d32/d59/c79 0 2026-03-09T17:30:18.794 INFO:tasks.workunit.client.1.vm09.stdout:0/380: truncate d6/d1d/d39/f53 347050 0 2026-03-09T17:30:18.801 INFO:tasks.workunit.client.1.vm09.stdout:5/404: dwrite d0/d46/f56 [0,4194304] 0 2026-03-09T17:30:18.816 INFO:tasks.workunit.client.1.vm09.stdout:7/498: getdents da/d11/d47/d5b/d6c/d9e/d4e 0 2026-03-09T17:30:18.816 INFO:tasks.workunit.client.1.vm09.stdout:7/499: read - da/d11/d47/d5b/f82 zero size 2026-03-09T17:30:18.817 INFO:tasks.workunit.client.1.vm09.stdout:7/500: dread - da/d11/d2d/d49/f98 zero size 2026-03-09T17:30:18.821 INFO:tasks.workunit.client.1.vm09.stdout:6/390: write d3/d7/d59/d73/f7d [14127,30910] 0 2026-03-09T17:30:18.826 INFO:tasks.workunit.client.1.vm09.stdout:2/373: dread d13/d15/d21/f24 [0,4194304] 0 2026-03-09T17:30:18.826 INFO:tasks.workunit.client.1.vm09.stdout:3/361: getdents d5 0 2026-03-09T17:30:18.833 INFO:tasks.workunit.client.1.vm09.stdout:0/381: fsync d6/d1d/d24/f50 0 2026-03-09T17:30:18.833 INFO:tasks.workunit.client.1.vm09.stdout:0/382: chown d6/d1d/d24 11521 1 2026-03-09T17:30:18.833 INFO:tasks.workunit.client.1.vm09.stdout:5/405: mknod d0/de/c84 0 2026-03-09T17:30:18.837 INFO:tasks.workunit.client.1.vm09.stdout:4/436: creat d11/d1e/f8c x:0 0 0 2026-03-09T17:30:18.840 INFO:tasks.workunit.client.1.vm09.stdout:5/406: sync 2026-03-09T17:30:18.846 INFO:tasks.workunit.client.1.vm09.stdout:7/501: mknod da/d11/d2d/d56/d68/cad 0 2026-03-09T17:30:18.847 INFO:tasks.workunit.client.1.vm09.stdout:1/409: write f2 [2426053,119549] 0 
2026-03-09T17:30:18.851 INFO:tasks.workunit.client.1.vm09.stdout:9/370: mkdir d5/d7e 0 2026-03-09T17:30:18.851 INFO:tasks.workunit.client.1.vm09.stdout:1/410: chown d9/dc/dd/d40/d22/d37/d3f/l57 150186763 1 2026-03-09T17:30:18.852 INFO:tasks.workunit.client.1.vm09.stdout:3/362: rename d5/d9/c40 to d5/d9/d30/c68 0 2026-03-09T17:30:18.852 INFO:tasks.workunit.client.1.vm09.stdout:3/363: fsync d5/d9/d30/d65/f15 0 2026-03-09T17:30:18.855 INFO:tasks.workunit.client.1.vm09.stdout:2/374: unlink d13/d15/d21/f32 0 2026-03-09T17:30:18.856 INFO:tasks.workunit.client.1.vm09.stdout:8/399: mknod d1/d14/d2a/d49/c7e 0 2026-03-09T17:30:18.859 INFO:tasks.workunit.client.1.vm09.stdout:0/383: unlink d6/d1d/d39/c1a 0 2026-03-09T17:30:18.867 INFO:tasks.workunit.client.1.vm09.stdout:5/407: creat d0/dc/d21/d26/d5e/d68/f85 x:0 0 0 2026-03-09T17:30:18.867 INFO:tasks.workunit.client.1.vm09.stdout:0/384: dread d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:18.873 INFO:tasks.workunit.client.1.vm09.stdout:7/502: creat da/d11/d47/d5b/d6c/d9e/d4e/fae x:0 0 0 2026-03-09T17:30:18.874 INFO:tasks.workunit.client.1.vm09.stdout:9/371: fsync d5/de/d4e/f56 0 2026-03-09T17:30:18.874 INFO:tasks.workunit.client.1.vm09.stdout:7/503: chown da/d11/l9a 216 1 2026-03-09T17:30:18.877 INFO:tasks.workunit.client.1.vm09.stdout:2/375: mknod d13/d4d/c75 0 2026-03-09T17:30:18.878 INFO:tasks.workunit.client.1.vm09.stdout:8/400: dread d1/da/dd/d47/f64 [0,4194304] 0 2026-03-09T17:30:18.884 INFO:tasks.workunit.client.1.vm09.stdout:0/385: dwrite d6/d1d/d39/f53 [0,4194304] 0 2026-03-09T17:30:18.884 INFO:tasks.workunit.client.1.vm09.stdout:7/504: dwrite da/d11/f25 [0,4194304] 0 2026-03-09T17:30:18.887 INFO:tasks.workunit.client.1.vm09.stdout:5/408: dwrite d0/d9/d16/d5c/f70 [0,4194304] 0 2026-03-09T17:30:18.887 INFO:tasks.workunit.client.1.vm09.stdout:7/505: sync 2026-03-09T17:30:18.897 INFO:tasks.workunit.client.1.vm09.stdout:7/506: read da/f1c [1300517,11684] 0 2026-03-09T17:30:18.912 
INFO:tasks.workunit.client.1.vm09.stdout:9/372: creat d5/d2e/d6b/f7f x:0 0 0 2026-03-09T17:30:18.912 INFO:tasks.workunit.client.1.vm09.stdout:4/437: dwrite d11/d1e/d45/d60/f64 [0,4194304] 0 2026-03-09T17:30:18.917 INFO:tasks.workunit.client.1.vm09.stdout:6/391: dwrite d3/d7/f10 [0,4194304] 0 2026-03-09T17:30:18.928 INFO:tasks.workunit.client.1.vm09.stdout:0/386: creat d6/d1d/d39/f7a x:0 0 0 2026-03-09T17:30:18.928 INFO:tasks.workunit.client.1.vm09.stdout:5/409: rename d0/de to d0/d2/d76/d86 0 2026-03-09T17:30:18.928 INFO:tasks.workunit.client.1.vm09.stdout:7/507: sync 2026-03-09T17:30:18.944 INFO:tasks.workunit.client.1.vm09.stdout:9/373: symlink d5/d21/l80 0 2026-03-09T17:30:18.945 INFO:tasks.workunit.client.1.vm09.stdout:4/438: fdatasync d11/d1e/d83/d89/d8b/d58/f75 0 2026-03-09T17:30:18.945 INFO:tasks.workunit.client.1.vm09.stdout:3/364: link d5/d16/d31/d3d/d32/f33 d5/d16/d31/d3d/d32/f69 0 2026-03-09T17:30:18.945 INFO:tasks.workunit.client.1.vm09.stdout:8/401: symlink d1/da/l7f 0 2026-03-09T17:30:18.946 INFO:tasks.workunit.client.1.vm09.stdout:4/439: write d11/d1e/d29/d36/f86 [931219,126407] 0 2026-03-09T17:30:18.946 INFO:tasks.workunit.client.1.vm09.stdout:8/402: write d1/da/d23/d6c/d32/f50 [281968,8259] 0 2026-03-09T17:30:18.954 INFO:tasks.workunit.client.1.vm09.stdout:5/410: mkdir d0/d2/d76/d87 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:3/365: chown f3 150997109 1 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:8/403: creat d1/d14/d2a/d42/d5d/f80 x:0 0 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:3/366: dwrite d5/d16/d31/f56 [0,4194304] 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:4/440: dread d11/d1e/f22 [0,4194304] 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:9/374: mkdir d5/d7e/d81 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:3/367: creat d5/d9/d30/f6a x:0 0 0 2026-03-09T17:30:18.979 INFO:tasks.workunit.client.1.vm09.stdout:0/387: 
truncate d6/d1d/f37 1219219 0 2026-03-09T17:30:18.987 INFO:tasks.workunit.client.1.vm09.stdout:4/441: mknod d11/d1e/d83/d89/d8b/d58/c8d 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:9/375: creat d5/d2e/f82 x:0 0 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:6/392: getdents d3/d1e 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:3/368: readlink d5/d16/d31/d37/d58/l5a 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:0/388: chown d6/d1d/d46/l65 866188917 1 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:7/508: getdents da/d11/d47/d5b/d6c 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:9/376: symlink d5/d7e/l83 0 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:9/377: chown d5 3590 1 2026-03-09T17:30:18.995 INFO:tasks.workunit.client.1.vm09.stdout:8/404: creat d1/d14/d2a/f81 x:0 0 0 2026-03-09T17:30:18.997 INFO:tasks.workunit.client.1.vm09.stdout:6/393: fsync d3/d21/f3c 0 2026-03-09T17:30:18.998 INFO:tasks.workunit.client.1.vm09.stdout:6/394: write d3/d21/d25/d26/d6b/f79 [825309,68503] 0 2026-03-09T17:30:18.998 INFO:tasks.workunit.client.1.vm09.stdout:3/369: creat d5/d16/d46/f6b x:0 0 0 2026-03-09T17:30:19.000 INFO:tasks.workunit.client.1.vm09.stdout:0/389: truncate d6/f27 150510 0 2026-03-09T17:30:19.001 INFO:tasks.workunit.client.1.vm09.stdout:0/390: stat d6/d1d/d24/d5e/d6c 0 2026-03-09T17:30:19.001 INFO:tasks.workunit.client.1.vm09.stdout:8/405: write d1/da/d23/d6c/f6a [893570,27679] 0 2026-03-09T17:30:19.006 INFO:tasks.workunit.client.1.vm09.stdout:7/509: rename da/d11/f19 to da/d11/d47/d89/faf 0 2026-03-09T17:30:19.014 INFO:tasks.workunit.client.1.vm09.stdout:6/395: mkdir d3/d21/d76/d81 0 2026-03-09T17:30:19.014 INFO:tasks.workunit.client.1.vm09.stdout:6/396: chown d3/d7/d59/d5a 42829 1 2026-03-09T17:30:19.017 INFO:tasks.workunit.client.1.vm09.stdout:4/442: getdents d11/d1e/d45/d60/d71 0 2026-03-09T17:30:19.017 
INFO:tasks.workunit.client.1.vm09.stdout:7/510: dread da/d11/d2d/f59 [0,4194304] 0 2026-03-09T17:30:19.017 INFO:tasks.workunit.client.1.vm09.stdout:7/511: dread - da/d11/f6a zero size 2026-03-09T17:30:19.021 INFO:tasks.workunit.client.1.vm09.stdout:0/391: dread d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:19.026 INFO:tasks.workunit.client.1.vm09.stdout:8/406: truncate d1/d14/d2a/f54 2704471 0 2026-03-09T17:30:19.026 INFO:tasks.workunit.client.1.vm09.stdout:6/397: creat d3/d7/d59/d73/f82 x:0 0 0 2026-03-09T17:30:19.029 INFO:tasks.workunit.client.1.vm09.stdout:8/407: dread d1/da/d23/d6c/f70 [0,4194304] 0 2026-03-09T17:30:19.030 INFO:tasks.workunit.client.1.vm09.stdout:8/408: readlink d1/da/d3a/l7b 0 2026-03-09T17:30:19.043 INFO:tasks.workunit.client.1.vm09.stdout:1/411: truncate f8 2163213 0 2026-03-09T17:30:19.043 INFO:tasks.workunit.client.1.vm09.stdout:0/392: rmdir d6/d1d/d24/d5e/d6c 39 2026-03-09T17:30:19.043 INFO:tasks.workunit.client.1.vm09.stdout:1/412: chown d9/dc/dd/d40/d22/f2b 7623 1 2026-03-09T17:30:19.044 INFO:tasks.workunit.client.1.vm09.stdout:9/378: getdents d5/d2e/d70 0 2026-03-09T17:30:19.044 INFO:tasks.workunit.client.1.vm09.stdout:1/413: fsync d9/dc/dd/fe 0 2026-03-09T17:30:19.044 INFO:tasks.workunit.client.1.vm09.stdout:9/379: dread - d5/d2e/f82 zero size 2026-03-09T17:30:19.045 INFO:tasks.workunit.client.1.vm09.stdout:2/376: write d13/f26 [1255616,27653] 0 2026-03-09T17:30:19.045 INFO:tasks.workunit.client.1.vm09.stdout:1/414: chown d9/d38/l5c 47703241 1 2026-03-09T17:30:19.045 INFO:tasks.workunit.client.1.vm09.stdout:1/415: chown d9/d3a 5007748 1 2026-03-09T17:30:19.052 INFO:tasks.workunit.client.1.vm09.stdout:0/393: dwrite d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:19.053 INFO:tasks.workunit.client.1.vm09.stdout:8/409: dwrite d1/da/d23/d6c/f1c [0,4194304] 0 2026-03-09T17:30:19.058 INFO:tasks.workunit.client.1.vm09.stdout:0/394: stat d6/d1d/f70 0 2026-03-09T17:30:19.058 INFO:tasks.workunit.client.1.vm09.stdout:0/395: stat d6/d1d/d24 0 
2026-03-09T17:30:19.072 INFO:tasks.workunit.client.1.vm09.stdout:1/416: truncate d9/dc/dd/d40/d1d/f1e 2787703 0 2026-03-09T17:30:19.074 INFO:tasks.workunit.client.1.vm09.stdout:9/380: mkdir d5/d2e/d70/d84 0 2026-03-09T17:30:19.077 INFO:tasks.workunit.client.1.vm09.stdout:1/417: dwrite d9/dc/dd/d40/d22/d37/f41 [0,4194304] 0 2026-03-09T17:30:19.078 INFO:tasks.workunit.client.1.vm09.stdout:0/396: dread d6/d1d/d24/f5d [4194304,4194304] 0 2026-03-09T17:30:19.079 INFO:tasks.workunit.client.1.vm09.stdout:0/397: stat d6/d1d/d39 0 2026-03-09T17:30:19.084 INFO:tasks.workunit.client.1.vm09.stdout:9/381: dread d5/d21/f46 [0,4194304] 0 2026-03-09T17:30:19.086 INFO:tasks.workunit.client.1.vm09.stdout:9/382: stat d5/de 0 2026-03-09T17:30:19.093 INFO:tasks.workunit.client.1.vm09.stdout:9/383: sync 2026-03-09T17:30:19.093 INFO:tasks.workunit.client.1.vm09.stdout:9/384: fdatasync d5/d2e/f82 0 2026-03-09T17:30:19.093 INFO:tasks.workunit.client.1.vm09.stdout:9/385: chown d5/f47 4 1 2026-03-09T17:30:19.094 INFO:tasks.workunit.client.1.vm09.stdout:9/386: read d5/d2e/f5e [733394,127212] 0 2026-03-09T17:30:19.094 INFO:tasks.workunit.client.1.vm09.stdout:9/387: truncate d5/d2e/f7b 419544 0 2026-03-09T17:30:19.118 INFO:tasks.workunit.client.1.vm09.stdout:0/398: rename d6/lc to d6/d1d/d39/l7b 0 2026-03-09T17:30:19.118 INFO:tasks.workunit.client.1.vm09.stdout:5/411: write d0/d46/f4c [1097388,15286] 0 2026-03-09T17:30:19.119 INFO:tasks.workunit.client.1.vm09.stdout:0/399: chown d6/d1d/d24/d32/d59/d5b 878 1 2026-03-09T17:30:19.122 INFO:tasks.workunit.client.1.vm09.stdout:2/377: mknod d13/d15/d34/c76 0 2026-03-09T17:30:19.124 INFO:tasks.workunit.client.1.vm09.stdout:9/388: symlink d5/d21/l85 0 2026-03-09T17:30:19.127 INFO:tasks.workunit.client.1.vm09.stdout:8/410: dread d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:19.128 INFO:tasks.workunit.client.1.vm09.stdout:8/411: chown d1/d14/d31/l30 18040326 1 2026-03-09T17:30:19.128 INFO:tasks.workunit.client.1.vm09.stdout:7/512: getdents da/d11/d47/d89 0 
2026-03-09T17:30:19.129 INFO:tasks.workunit.client.1.vm09.stdout:8/412: chown d1/d14/d2a/d42/d43/d44 86056 1 2026-03-09T17:30:19.133 INFO:tasks.workunit.client.1.vm09.stdout:3/370: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 2026-03-09T17:30:19.135 INFO:tasks.workunit.client.1.vm09.stdout:2/378: symlink d13/d15/d34/d37/l77 0 2026-03-09T17:30:19.136 INFO:tasks.workunit.client.1.vm09.stdout:2/379: fsync d13/d15/d21/f31 0 2026-03-09T17:30:19.138 INFO:tasks.workunit.client.1.vm09.stdout:8/413: creat d1/da/dd/d47/f82 x:0 0 0 2026-03-09T17:30:19.139 INFO:tasks.workunit.client.1.vm09.stdout:8/414: readlink d1/da/d23/d6c/l40 0 2026-03-09T17:30:19.144 INFO:tasks.workunit.client.1.vm09.stdout:5/412: dread d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:19.169 INFO:tasks.workunit.client.1.vm09.stdout:4/443: dwrite d11/d1e/d29/d36/f6a [0,4194304] 0 2026-03-09T17:30:19.177 INFO:tasks.workunit.client.1.vm09.stdout:1/418: creat d9/dc/dd/d40/d22/d37/d3f/f80 x:0 0 0 2026-03-09T17:30:19.178 INFO:tasks.workunit.client.1.vm09.stdout:1/419: write d9/dc/dd/d40/d22/d37/d3f/f80 [695481,50910] 0 2026-03-09T17:30:19.196 INFO:tasks.workunit.client.1.vm09.stdout:3/371: symlink d5/d16/d46/l6c 0 2026-03-09T17:30:19.212 INFO:tasks.workunit.client.1.vm09.stdout:2/380: rmdir d13/d15/d34/d37 39 2026-03-09T17:30:19.215 INFO:tasks.workunit.client.1.vm09.stdout:2/381: dwrite d13/d15/f2a [4194304,4194304] 0 2026-03-09T17:30:19.222 INFO:tasks.workunit.client.1.vm09.stdout:5/413: write d0/dc/d21/d6f/f5f [1428639,128735] 0 2026-03-09T17:30:19.231 INFO:tasks.workunit.client.1.vm09.stdout:5/414: dread d0/d52/d20/f7c [0,4194304] 0 2026-03-09T17:30:19.231 INFO:tasks.workunit.client.1.vm09.stdout:5/415: fdatasync d0/dc/d21/d33/f65 0 2026-03-09T17:30:19.243 INFO:tasks.workunit.client.1.vm09.stdout:3/372: creat d5/d16/d31/d37/f6d x:0 0 0 2026-03-09T17:30:19.244 INFO:tasks.workunit.client.1.vm09.stdout:7/513: link da/d11/d47/d5b/d6c/d9e/d4e/l46 da/d11/d77/lb0 0 2026-03-09T17:30:19.244 
INFO:tasks.workunit.client.1.vm09.stdout:3/373: chown d5/d16/d31/d37/d58/d64 304911 1 2026-03-09T17:30:19.244 INFO:tasks.workunit.client.1.vm09.stdout:5/416: symlink d0/d52/l88 0 2026-03-09T17:30:19.253 INFO:tasks.workunit.client.1.vm09.stdout:3/374: dread d5/d9/d30/d65/f1d [0,4194304] 0 2026-03-09T17:30:19.265 INFO:tasks.workunit.client.1.vm09.stdout:6/398: truncate d3/d21/d25/d26/f50 2984524 0 2026-03-09T17:30:19.266 INFO:tasks.workunit.client.1.vm09.stdout:9/389: rename d5/de/d29/l2c to d5/de/l86 0 2026-03-09T17:30:19.268 INFO:tasks.workunit.client.1.vm09.stdout:7/514: mkdir da/d11/d64/da7/db1 0 2026-03-09T17:30:19.269 INFO:tasks.workunit.client.1.vm09.stdout:4/444: link d11/d1e/d45/c88 d11/d1e/d45/c8e 0 2026-03-09T17:30:19.271 INFO:tasks.workunit.client.1.vm09.stdout:1/420: creat d9/dc/f81 x:0 0 0 2026-03-09T17:30:19.272 INFO:tasks.workunit.client.1.vm09.stdout:1/421: truncate d9/dc/f81 546089 0 2026-03-09T17:30:19.275 INFO:tasks.workunit.client.1.vm09.stdout:4/445: dwrite d11/f6e [0,4194304] 0 2026-03-09T17:30:19.276 INFO:tasks.workunit.client.1.vm09.stdout:4/446: chown d11/d1e/d29/f3b 127 1 2026-03-09T17:30:19.278 INFO:tasks.workunit.client.1.vm09.stdout:4/447: dread - d11/d1e/d45/f70 zero size 2026-03-09T17:30:19.281 INFO:tasks.workunit.client.1.vm09.stdout:9/390: symlink d5/d2e/d70/l87 0 2026-03-09T17:30:19.284 INFO:tasks.workunit.client.1.vm09.stdout:7/515: dread da/d11/d47/d5b/d6c/f73 [0,4194304] 0 2026-03-09T17:30:19.286 INFO:tasks.workunit.client.1.vm09.stdout:3/375: creat d5/d9/d30/d65/d59/d66/f6e x:0 0 0 2026-03-09T17:30:19.288 INFO:tasks.workunit.client.1.vm09.stdout:3/376: read d5/d16/d31/f44 [11517,42199] 0 2026-03-09T17:30:19.289 INFO:tasks.workunit.client.1.vm09.stdout:3/377: chown d5/d9/d30/d65/d59/d66 33670089 1 2026-03-09T17:30:19.290 INFO:tasks.workunit.client.1.vm09.stdout:5/417: dread d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:19.297 INFO:tasks.workunit.client.1.vm09.stdout:2/382: rename d13/d15/d34/c3d to d13/d15/d3b/c78 0 
2026-03-09T17:30:19.298 INFO:tasks.workunit.client.1.vm09.stdout:2/383: truncate d13/d15/d34/f44 910889 0 2026-03-09T17:30:19.298 INFO:tasks.workunit.client.1.vm09.stdout:2/384: chown d13/f40 5141 1 2026-03-09T17:30:19.299 INFO:tasks.workunit.client.1.vm09.stdout:2/385: write d13/f14 [833948,68519] 0 2026-03-09T17:30:19.301 INFO:tasks.workunit.client.1.vm09.stdout:2/386: stat d13/d15/d36/c3c 0 2026-03-09T17:30:19.334 INFO:tasks.workunit.client.1.vm09.stdout:0/400: dwrite d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:19.339 INFO:tasks.workunit.client.1.vm09.stdout:4/448: creat d11/d1e/d29/d36/d57/f8f x:0 0 0 2026-03-09T17:30:19.341 INFO:tasks.workunit.client.1.vm09.stdout:7/516: truncate da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 805530 0 2026-03-09T17:30:19.393 INFO:tasks.workunit.client.1.vm09.stdout:2/387: dwrite d13/f40 [0,4194304] 0 2026-03-09T17:30:19.395 INFO:tasks.workunit.client.1.vm09.stdout:2/388: dwrite d13/d15/d34/f5e [0,4194304] 0 2026-03-09T17:30:19.405 INFO:tasks.workunit.client.1.vm09.stdout:2/389: dwrite d13/d15/f2f [0,4194304] 0 2026-03-09T17:30:19.406 INFO:tasks.workunit.client.1.vm09.stdout:2/390: stat d13/d15/d34/d45/f6a 0 2026-03-09T17:30:19.421 INFO:tasks.workunit.client.1.vm09.stdout:8/415: write d1/d14/f3d [4792636,30287] 0 2026-03-09T17:30:19.446 INFO:tasks.workunit.client.1.vm09.stdout:8/416: sync 2026-03-09T17:30:19.467 INFO:tasks.workunit.client.1.vm09.stdout:4/449: readlink d11/d1e/d31/l47 0 2026-03-09T17:30:19.486 INFO:tasks.workunit.client.1.vm09.stdout:7/517: rename da/d76 to da/d11/d3e/da2/db2 0 2026-03-09T17:30:19.487 INFO:tasks.workunit.client.1.vm09.stdout:5/418: mknod d0/d9/c89 0 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.514 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: Upgrade: Need to upgrade myself (mgr.vm09.lqzvkh) 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: Upgrade: Need to upgrade myself (mgr.vm09.lqzvkh) 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: Upgrade: Updating mgr.vm06.pbgzei 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 
vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:19.514 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:19 vm06.local ceph-mon[57307]: Deploying daemon mgr.vm06.pbgzei on vm06 2026-03-09T17:30:19.530 INFO:tasks.workunit.client.1.vm09.stdout:8/417: creat d1/da/dd/d79/f83 x:0 0 0 2026-03-09T17:30:19.532 INFO:tasks.workunit.client.1.vm09.stdout:0/401: fsync d6/d1d/f37 0 2026-03-09T17:30:19.533 INFO:tasks.workunit.client.1.vm09.stdout:0/402: stat d6/c29 0 2026-03-09T17:30:19.534 INFO:tasks.workunit.client.1.vm09.stdout:6/399: dwrite d3/d1e/f7c [0,4194304] 0 2026-03-09T17:30:19.535 INFO:tasks.workunit.client.1.vm09.stdout:6/400: readlink d3/d21/d25/d26/l2b 0 2026-03-09T17:30:19.536 INFO:tasks.workunit.client.1.vm09.stdout:7/518: creat da/d11/d47/d5b/d6c/fb3 x:0 0 0 2026-03-09T17:30:19.537 INFO:tasks.workunit.client.1.vm09.stdout:7/519: dread - da/d11/d2d/d49/f98 zero size 2026-03-09T17:30:19.539 INFO:tasks.workunit.client.1.vm09.stdout:4/450: dread d11/d1e/d83/d89/d8b/f5f [0,4194304] 0 2026-03-09T17:30:19.540 INFO:tasks.workunit.client.1.vm09.stdout:4/451: fdatasync d11/d1e/d29/d36/d57/f67 0 2026-03-09T17:30:19.555 INFO:tasks.workunit.client.1.vm09.stdout:5/419: dread d0/d52/f1c [0,4194304] 0 2026-03-09T17:30:19.589 INFO:tasks.workunit.client.1.vm09.stdout:6/401: fdatasync d3/d7/f40 0 2026-03-09T17:30:19.589 INFO:tasks.workunit.client.1.vm09.stdout:7/520: creat da/d11/d47/d89/fb4 x:0 0 0 2026-03-09T17:30:19.589 INFO:tasks.workunit.client.1.vm09.stdout:8/418: mknod d1/d14/d2a/c84 0 2026-03-09T17:30:19.589 INFO:tasks.workunit.client.1.vm09.stdout:2/391: creat d13/f79 x:0 0 0 2026-03-09T17:30:19.591 
INFO:tasks.workunit.client.1.vm09.stdout:1/422: truncate d9/dc/dd/d40/d22/d37/f2e 6701709 0 2026-03-09T17:30:19.591 INFO:tasks.workunit.client.1.vm09.stdout:5/420: mknod d0/dc/d21/d33/c8a 0 2026-03-09T17:30:19.593 INFO:tasks.workunit.client.1.vm09.stdout:9/391: write d5/de/d29/d33/f4a [4041762,45272] 0 2026-03-09T17:30:19.597 INFO:tasks.workunit.client.1.vm09.stdout:3/378: dwrite d5/d16/d46/f47 [0,4194304] 0 2026-03-09T17:30:19.602 INFO:tasks.workunit.client.1.vm09.stdout:6/402: dread - d3/d7/f58 zero size 2026-03-09T17:30:19.604 INFO:tasks.workunit.client.1.vm09.stdout:8/419: dread d1/d14/f2f [0,4194304] 0 2026-03-09T17:30:19.617 INFO:tasks.workunit.client.1.vm09.stdout:7/521: truncate da/d11/d2d/d56/f50 1641388 0 2026-03-09T17:30:19.617 INFO:tasks.workunit.client.1.vm09.stdout:2/392: creat d13/d15/d34/d69/f7a x:0 0 0 2026-03-09T17:30:19.618 INFO:tasks.workunit.client.1.vm09.stdout:2/393: write d13/d15/f71 [881072,122371] 0 2026-03-09T17:30:19.619 INFO:tasks.workunit.client.1.vm09.stdout:2/394: write d13/d15/d34/f3a [3341047,56996] 0 2026-03-09T17:30:19.622 INFO:tasks.workunit.client.1.vm09.stdout:2/395: dwrite d13/d15/d34/d45/f57 [0,4194304] 0 2026-03-09T17:30:19.622 INFO:tasks.workunit.client.1.vm09.stdout:2/396: readlink d13/d15/d2c/l53 0 2026-03-09T17:30:19.627 INFO:tasks.workunit.client.1.vm09.stdout:1/423: creat d9/dc/d63/f82 x:0 0 0 2026-03-09T17:30:19.629 INFO:tasks.workunit.client.1.vm09.stdout:3/379: dread d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:19.632 INFO:tasks.workunit.client.1.vm09.stdout:3/380: dwrite d5/d16/d46/f6b [0,4194304] 0 2026-03-09T17:30:19.634 INFO:tasks.workunit.client.1.vm09.stdout:4/452: write d11/d1e/d31/f5a [170854,14987] 0 2026-03-09T17:30:19.635 INFO:tasks.workunit.client.1.vm09.stdout:4/453: chown d11/d1e/d29/d36/f40 44474196 1 2026-03-09T17:30:19.644 INFO:tasks.workunit.client.1.vm09.stdout:0/403: creat d6/d1d/d24/d32/f7c x:0 0 0 2026-03-09T17:30:19.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local 
ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: Upgrade: Need to upgrade myself (mgr.vm09.lqzvkh) 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: Upgrade: Need to upgrade myself (mgr.vm09.lqzvkh) 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: Upgrade: Updating mgr.vm06.pbgzei 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 
cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:19.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:19 vm09.local ceph-mon[62061]: Deploying daemon mgr.vm06.pbgzei on vm06 2026-03-09T17:30:19.657 INFO:tasks.workunit.client.1.vm09.stdout:6/403: creat d3/d7/d59/d5a/f83 x:0 0 0 2026-03-09T17:30:19.660 INFO:tasks.workunit.client.1.vm09.stdout:8/420: dread d1/da/d23/d6c/f70 [0,4194304] 0 2026-03-09T17:30:19.670 INFO:tasks.workunit.client.1.vm09.stdout:8/421: write d1/da/d23/d6c/f1c [4086463,109901] 0 2026-03-09T17:30:19.674 INFO:tasks.workunit.client.1.vm09.stdout:6/404: sync 2026-03-09T17:30:19.681 INFO:tasks.workunit.client.1.vm09.stdout:1/424: fdatasync d9/f34 0 2026-03-09T17:30:19.684 INFO:tasks.workunit.client.1.vm09.stdout:1/425: chown d9/d3a/l6a 488 1 2026-03-09T17:30:19.688 INFO:tasks.workunit.client.1.vm09.stdout:5/421: mkdir d0/d9/d8b 0 2026-03-09T17:30:19.688 INFO:tasks.workunit.client.1.vm09.stdout:5/422: write d0/d2/f5d [333986,59234] 0 2026-03-09T17:30:19.689 INFO:tasks.workunit.client.1.vm09.stdout:5/423: chown d0/d9/d8b 1 1 2026-03-09T17:30:19.691 INFO:tasks.workunit.client.1.vm09.stdout:4/454: rename d11/f46 to d11/d1e/d83/f90 0 2026-03-09T17:30:19.701 INFO:tasks.workunit.client.1.vm09.stdout:9/392: mkdir d5/de/d88 0 2026-03-09T17:30:19.706 INFO:tasks.workunit.client.1.vm09.stdout:0/404: creat d6/d1d/d46/f7d x:0 0 0 2026-03-09T17:30:19.709 
INFO:tasks.workunit.client.1.vm09.stdout:0/405: dwrite d6/d1d/d46/f4d [4194304,4194304] 0 2026-03-09T17:30:19.713 INFO:tasks.workunit.client.1.vm09.stdout:0/406: dread d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:19.725 INFO:tasks.workunit.client.1.vm09.stdout:7/522: mknod da/d11/d64/da7/db1/cb5 0 2026-03-09T17:30:19.738 INFO:tasks.workunit.client.1.vm09.stdout:3/381: dwrite d5/d9/d30/d65/f43 [0,4194304] 0 2026-03-09T17:30:19.740 INFO:tasks.workunit.client.1.vm09.stdout:3/382: chown d5/d16/d25/c4d 0 1 2026-03-09T17:30:19.740 INFO:tasks.workunit.client.1.vm09.stdout:2/397: dread d13/d15/f20 [4194304,4194304] 0 2026-03-09T17:30:19.755 INFO:tasks.workunit.client.1.vm09.stdout:6/405: rename d3/d21/l69 to d3/d21/d76/d3f/l84 0 2026-03-09T17:30:19.757 INFO:tasks.workunit.client.1.vm09.stdout:6/406: write d3/d21/d25/d26/d6b/f79 [1822621,59992] 0 2026-03-09T17:30:19.757 INFO:tasks.workunit.client.1.vm09.stdout:5/424: read d0/f22 [88020,88508] 0 2026-03-09T17:30:19.762 INFO:tasks.workunit.client.1.vm09.stdout:5/425: dread d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:19.771 INFO:tasks.workunit.client.1.vm09.stdout:0/407: creat d6/d64/f7e x:0 0 0 2026-03-09T17:30:19.772 INFO:tasks.workunit.client.1.vm09.stdout:0/408: fdatasync d6/d1d/d24/d32/f68 0 2026-03-09T17:30:19.784 INFO:tasks.workunit.client.1.vm09.stdout:3/383: rmdir d5/d16 39 2026-03-09T17:30:19.785 INFO:tasks.workunit.client.1.vm09.stdout:2/398: rmdir d13/d15/d21 39 2026-03-09T17:30:19.786 INFO:tasks.workunit.client.1.vm09.stdout:2/399: readlink d13/d15/l19 0 2026-03-09T17:30:19.792 INFO:tasks.workunit.client.1.vm09.stdout:8/422: unlink d1/da/dd/f1e 0 2026-03-09T17:30:19.793 INFO:tasks.workunit.client.1.vm09.stdout:8/423: chown d1/d14/d2a/d42/f46 261618900 1 2026-03-09T17:30:19.796 INFO:tasks.workunit.client.1.vm09.stdout:8/424: dwrite d1/da/dd/d47/f82 [0,4194304] 0 2026-03-09T17:30:19.797 INFO:tasks.workunit.client.1.vm09.stdout:8/425: dread - d1/f6e zero size 2026-03-09T17:30:19.799 
INFO:tasks.workunit.client.1.vm09.stdout:1/426: symlink d9/dc/dd/d40/d22/d37/d3f/l83 0 2026-03-09T17:30:19.821 INFO:tasks.workunit.client.1.vm09.stdout:4/455: write d11/f13 [2399689,103745] 0 2026-03-09T17:30:19.826 INFO:tasks.workunit.client.1.vm09.stdout:4/456: dwrite d11/d1e/d29/d36/f86 [0,4194304] 0 2026-03-09T17:30:19.846 INFO:tasks.workunit.client.1.vm09.stdout:6/407: rmdir d3/d48 39 2026-03-09T17:30:19.867 INFO:tasks.workunit.client.1.vm09.stdout:7/523: stat da/d11/d47/d5b/d6c/f73 0 2026-03-09T17:30:19.884 INFO:tasks.workunit.client.1.vm09.stdout:7/524: dread da/d11/f1a [0,4194304] 0 2026-03-09T17:30:19.888 INFO:tasks.workunit.client.1.vm09.stdout:7/525: dwrite da/d11/d47/d5b/f82 [0,4194304] 0 2026-03-09T17:30:19.910 INFO:tasks.workunit.client.1.vm09.stdout:4/457: chown d11/d1e/f61 665 1 2026-03-09T17:30:19.912 INFO:tasks.workunit.client.1.vm09.stdout:9/393: rename d5/de/d29/f67 to d5/de/d29/f89 0 2026-03-09T17:30:19.914 INFO:tasks.workunit.client.1.vm09.stdout:6/408: dread - d3/d7/f4c zero size 2026-03-09T17:30:19.920 INFO:tasks.workunit.client.1.vm09.stdout:5/426: mknod d0/dc/d21/d26/d5e/d68/d79/c8c 0 2026-03-09T17:30:19.922 INFO:tasks.workunit.client.1.vm09.stdout:0/409: symlink d6/d1d/d24/d32/l7f 0 2026-03-09T17:30:19.923 INFO:tasks.workunit.client.1.vm09.stdout:5/427: dwrite d0/dc/d21/d26/f39 [4194304,4194304] 0 2026-03-09T17:30:19.923 INFO:tasks.workunit.client.1.vm09.stdout:6/409: sync 2026-03-09T17:30:19.924 INFO:tasks.workunit.client.1.vm09.stdout:5/428: chown d0/d52/d20 115087 1 2026-03-09T17:30:19.926 INFO:tasks.workunit.client.1.vm09.stdout:3/384: fdatasync d5/d16/f54 0 2026-03-09T17:30:19.943 INFO:tasks.workunit.client.1.vm09.stdout:5/429: dread d0/dc/d21/f62 [0,4194304] 0 2026-03-09T17:30:19.944 INFO:tasks.workunit.client.1.vm09.stdout:5/430: write d0/dc/d21/d33/f69 [1939524,41115] 0 2026-03-09T17:30:19.962 INFO:tasks.workunit.client.1.vm09.stdout:6/410: symlink d3/d21/d76/d5c/d61/l85 0 2026-03-09T17:30:19.969 
INFO:tasks.workunit.client.1.vm09.stdout:2/400: dwrite d13/d15/d21/f30 [0,4194304] 0 2026-03-09T17:30:19.971 INFO:tasks.workunit.client.1.vm09.stdout:2/401: write d13/d15/d34/f5e [4232667,35616] 0 2026-03-09T17:30:19.972 INFO:tasks.workunit.client.1.vm09.stdout:2/402: chown d13/d15/d21/f31 199 1 2026-03-09T17:30:19.975 INFO:tasks.workunit.client.1.vm09.stdout:0/410: dwrite d6/d1d/f37 [0,4194304] 0 2026-03-09T17:30:19.976 INFO:tasks.workunit.client.1.vm09.stdout:2/403: chown d13/d15/d34/d69/f7a 5 1 2026-03-09T17:30:19.977 INFO:tasks.workunit.client.1.vm09.stdout:8/426: link d1/d14/d2a/d49/c4d d1/da/dd/d63/c85 0 2026-03-09T17:30:19.991 INFO:tasks.workunit.client.1.vm09.stdout:1/427: creat d9/dc/dd/d40/d21/f84 x:0 0 0 2026-03-09T17:30:20.005 INFO:tasks.workunit.client.1.vm09.stdout:9/394: rename d5/de/d4e/f56 to d5/de/d29/d33/f8a 0 2026-03-09T17:30:20.006 INFO:tasks.workunit.client.1.vm09.stdout:9/395: write d5/d2e/d70/f75 [780539,129345] 0 2026-03-09T17:30:20.007 INFO:tasks.workunit.client.1.vm09.stdout:6/411: mkdir d3/d21/d25/d26/d86 0 2026-03-09T17:30:20.007 INFO:tasks.workunit.client.1.vm09.stdout:6/412: write d3/d21/d76/d5c/f65 [5170497,11387] 0 2026-03-09T17:30:20.011 INFO:tasks.workunit.client.1.vm09.stdout:2/404: chown d13/f39 6127305 1 2026-03-09T17:30:20.012 INFO:tasks.workunit.client.1.vm09.stdout:8/427: symlink d1/da/d3a/l86 0 2026-03-09T17:30:20.013 INFO:tasks.workunit.client.1.vm09.stdout:1/428: creat d9/dc/dd/d40/d21/d6f/f85 x:0 0 0 2026-03-09T17:30:20.014 INFO:tasks.workunit.client.1.vm09.stdout:8/428: write d1/d14/d2a/d42/d5d/f80 [810096,30915] 0 2026-03-09T17:30:20.023 INFO:tasks.workunit.client.1.vm09.stdout:4/458: write d11/d1e/d83/f90 [393211,125983] 0 2026-03-09T17:30:20.027 INFO:tasks.workunit.client.1.vm09.stdout:3/385: rename d5/d16/d46/l6c to d5/d9/d30/d65/d59/l6f 0 2026-03-09T17:30:20.035 INFO:tasks.workunit.client.1.vm09.stdout:0/411: symlink d6/d1d/l80 0 2026-03-09T17:30:20.041 INFO:tasks.workunit.client.1.vm09.stdout:7/526: getdents 
da/d11/d2d/d56/d68 0 2026-03-09T17:30:20.042 INFO:tasks.workunit.client.1.vm09.stdout:7/527: fsync da/d11/d47/d89/fb4 0 2026-03-09T17:30:20.042 INFO:tasks.workunit.client.1.vm09.stdout:7/528: chown da/d11/l20 598 1 2026-03-09T17:30:20.047 INFO:tasks.workunit.client.1.vm09.stdout:7/529: dread da/d11/d77/f79 [0,4194304] 0 2026-03-09T17:30:20.048 INFO:tasks.workunit.client.1.vm09.stdout:9/396: dread d5/f1e [0,4194304] 0 2026-03-09T17:30:20.049 INFO:tasks.workunit.client.1.vm09.stdout:9/397: write d5/de/d29/f52 [18278,29502] 0 2026-03-09T17:30:20.055 INFO:tasks.workunit.client.1.vm09.stdout:0/412: mkdir d6/d1d/d24/d32/d59/d81 0 2026-03-09T17:30:20.056 INFO:tasks.workunit.client.1.vm09.stdout:0/413: fsync d6/d1d/d24/d32/d59/f5c 0 2026-03-09T17:30:20.057 INFO:tasks.workunit.client.1.vm09.stdout:5/431: getdents d0/d55 0 2026-03-09T17:30:20.058 INFO:tasks.workunit.client.1.vm09.stdout:5/432: chown d0/d52/c1d 2011831951 1 2026-03-09T17:30:20.063 INFO:tasks.workunit.client.1.vm09.stdout:5/433: dwrite d0/d9/d16/d5c/f70 [0,4194304] 0 2026-03-09T17:30:20.084 INFO:tasks.workunit.client.1.vm09.stdout:6/413: rename d3/d21/d25/c31 to d3/d7/d59/d72/c87 0 2026-03-09T17:30:20.089 INFO:tasks.workunit.client.1.vm09.stdout:2/405: creat d13/d15/d34/d37/d6f/f7b x:0 0 0 2026-03-09T17:30:20.090 INFO:tasks.workunit.client.1.vm09.stdout:2/406: write d13/d15/f2a [3285457,4283] 0 2026-03-09T17:30:20.095 INFO:tasks.workunit.client.1.vm09.stdout:1/429: creat d9/dc/dd/d40/f86 x:0 0 0 2026-03-09T17:30:20.105 INFO:tasks.workunit.client.1.vm09.stdout:9/398: dwrite d5/f1e [0,4194304] 0 2026-03-09T17:30:20.116 INFO:tasks.workunit.client.1.vm09.stdout:3/386: rename d5/d9/d30/d65/f39 to d5/d16/d31/d37/d58/d64/f70 0 2026-03-09T17:30:20.123 INFO:tasks.workunit.client.1.vm09.stdout:0/414: creat d6/d1d/d24/d32/d59/d81/f82 x:0 0 0 2026-03-09T17:30:20.130 INFO:tasks.workunit.client.1.vm09.stdout:1/430: mknod d9/dc/dd/d40/d22/d37/d3f/d42/d55/c87 0 2026-03-09T17:30:20.131 
INFO:tasks.workunit.client.1.vm09.stdout:8/429: getdents d1/d14 0 2026-03-09T17:30:20.137 INFO:tasks.workunit.client.1.vm09.stdout:4/459: getdents d11/d1e 0 2026-03-09T17:30:20.138 INFO:tasks.workunit.client.1.vm09.stdout:1/431: sync 2026-03-09T17:30:20.138 INFO:tasks.workunit.client.1.vm09.stdout:1/432: chown d9/d38/l53 35231554 1 2026-03-09T17:30:20.141 INFO:tasks.workunit.client.1.vm09.stdout:9/399: rmdir d5/de/d29 39 2026-03-09T17:30:20.144 INFO:tasks.workunit.client.1.vm09.stdout:2/407: write d13/d15/d21/f5d [724320,22239] 0 2026-03-09T17:30:20.146 INFO:tasks.workunit.client.1.vm09.stdout:2/408: sync 2026-03-09T17:30:20.154 INFO:tasks.workunit.client.1.vm09.stdout:6/414: mkdir d3/d21/d76/d88 0 2026-03-09T17:30:20.155 INFO:tasks.workunit.client.1.vm09.stdout:3/387: mknod d5/d16/d31/d3d/d32/c71 0 2026-03-09T17:30:20.156 INFO:tasks.workunit.client.1.vm09.stdout:3/388: chown d5/d9/l23 3315 1 2026-03-09T17:30:20.158 INFO:tasks.workunit.client.1.vm09.stdout:0/415: unlink d6/d1d/d46/f5f 0 2026-03-09T17:30:20.158 INFO:tasks.workunit.client.1.vm09.stdout:0/416: write d6/d1d/d24/d32/f45 [3150999,102500] 0 2026-03-09T17:30:20.164 INFO:tasks.workunit.client.1.vm09.stdout:7/530: getdents da/d11/d64/da7/db1 0 2026-03-09T17:30:20.164 INFO:tasks.workunit.client.1.vm09.stdout:7/531: fdatasync da/d11/d3e/da2/db2/fa6 0 2026-03-09T17:30:20.166 INFO:tasks.workunit.client.1.vm09.stdout:4/460: symlink d11/d1e/d83/d89/d8b/l91 0 2026-03-09T17:30:20.168 INFO:tasks.workunit.client.1.vm09.stdout:4/461: sync 2026-03-09T17:30:20.171 INFO:tasks.workunit.client.1.vm09.stdout:1/433: dwrite d9/dc/dd/d40/d22/f2b [4194304,4194304] 0 2026-03-09T17:30:20.188 INFO:tasks.workunit.client.1.vm09.stdout:9/400: mkdir d5/d2e/d8b 0 2026-03-09T17:30:20.196 INFO:tasks.workunit.client.1.vm09.stdout:5/434: rename d0/dc/d21/d6f/d42/l83 to d0/dc/d21/l8d 0 2026-03-09T17:30:20.197 INFO:tasks.workunit.client.1.vm09.stdout:6/415: rmdir d3/d21/d76/d5c/d61/d6a 39 2026-03-09T17:30:20.199 
INFO:tasks.workunit.client.1.vm09.stdout:5/435: sync 2026-03-09T17:30:20.200 INFO:tasks.workunit.client.1.vm09.stdout:3/389: rmdir d5/d16/d31/d37 39 2026-03-09T17:30:20.201 INFO:tasks.workunit.client.1.vm09.stdout:3/390: fdatasync d5/d9/d30/d65/f19 0 2026-03-09T17:30:20.208 INFO:tasks.workunit.client.1.vm09.stdout:0/417: mknod d6/d1d/d39/c83 0 2026-03-09T17:30:20.214 INFO:tasks.workunit.client.1.vm09.stdout:4/462: mknod d11/d1e/d45/d60/d71/c92 0 2026-03-09T17:30:20.227 INFO:tasks.workunit.client.1.vm09.stdout:1/434: mkdir d9/dc/dd/d40/d21/d35/d88 0 2026-03-09T17:30:20.234 INFO:tasks.workunit.client.1.vm09.stdout:9/401: creat d5/de/d4e/d6e/f8c x:0 0 0 2026-03-09T17:30:20.235 INFO:tasks.workunit.client.1.vm09.stdout:2/409: link d13/d15/f71 d13/d15/d34/d69/f7c 0 2026-03-09T17:30:20.242 INFO:tasks.workunit.client.1.vm09.stdout:6/416: chown d3/d21/d76/d3f/l84 119 1 2026-03-09T17:30:20.243 INFO:tasks.workunit.client.1.vm09.stdout:6/417: chown d3/c6e 52 1 2026-03-09T17:30:20.244 INFO:tasks.workunit.client.1.vm09.stdout:5/436: symlink d0/d46/l8e 0 2026-03-09T17:30:20.250 INFO:tasks.workunit.client.1.vm09.stdout:3/391: unlink d5/d16/d31/d3d/l10 0 2026-03-09T17:30:20.255 INFO:tasks.workunit.client.1.vm09.stdout:7/532: write da/d11/d47/d5b/d6c/d9e/d4e/f7d [646141,43707] 0 2026-03-09T17:30:20.256 INFO:tasks.workunit.client.1.vm09.stdout:7/533: chown da/f1c 3 1 2026-03-09T17:30:20.258 INFO:tasks.workunit.client.1.vm09.stdout:7/534: write da/d11/d47/d89/fb4 [407182,130435] 0 2026-03-09T17:30:20.262 INFO:tasks.workunit.client.1.vm09.stdout:4/463: creat d11/d1e/d29/f93 x:0 0 0 2026-03-09T17:30:20.274 INFO:tasks.workunit.client.1.vm09.stdout:6/418: symlink d3/d1e/l89 0 2026-03-09T17:30:20.282 INFO:tasks.workunit.client.1.vm09.stdout:7/535: creat da/d11/d64/da7/db1/fb6 x:0 0 0 2026-03-09T17:30:20.282 INFO:tasks.workunit.client.1.vm09.stdout:7/536: stat da/f27 0 2026-03-09T17:30:20.284 INFO:tasks.workunit.client.1.vm09.stdout:4/464: creat d11/d1e/d83/d89/f94 x:0 0 0 
2026-03-09T17:30:20.289 INFO:tasks.workunit.client.1.vm09.stdout:1/435: creat d9/dc/dd/d40/d21/d6f/d7e/f89 x:0 0 0 2026-03-09T17:30:20.290 INFO:tasks.workunit.client.1.vm09.stdout:1/436: truncate d9/dc/d63/f67 1298604 0 2026-03-09T17:30:20.298 INFO:tasks.workunit.client.1.vm09.stdout:9/402: dread d5/de/d29/f35 [0,4194304] 0 2026-03-09T17:30:20.303 INFO:tasks.workunit.client.1.vm09.stdout:8/430: rename d1/da/dd/d47/l51 to d1/d14/d2a/l87 0 2026-03-09T17:30:20.306 INFO:tasks.workunit.client.1.vm09.stdout:9/403: dread d5/f13 [0,4194304] 0 2026-03-09T17:30:20.307 INFO:tasks.workunit.client.1.vm09.stdout:8/431: read d1/da/d23/d6c/d32/f50 [99529,30241] 0 2026-03-09T17:30:20.309 INFO:tasks.workunit.client.1.vm09.stdout:9/404: dwrite d5/de/d29/f36 [4194304,4194304] 0 2026-03-09T17:30:20.310 INFO:tasks.workunit.client.1.vm09.stdout:9/405: chown d5/de/d29/f36 5747 1 2026-03-09T17:30:20.310 INFO:tasks.workunit.client.1.vm09.stdout:9/406: dread - d5/d2e/f82 zero size 2026-03-09T17:30:20.338 INFO:tasks.workunit.client.1.vm09.stdout:3/392: mknod d5/d16/d31/d3d/c72 0 2026-03-09T17:30:20.352 INFO:tasks.workunit.client.1.vm09.stdout:7/537: creat da/d11/d3e/da2/fb7 x:0 0 0 2026-03-09T17:30:20.356 INFO:tasks.workunit.client.1.vm09.stdout:4/465: read d11/d1e/d31/f3a [32638,82576] 0 2026-03-09T17:30:20.364 INFO:tasks.workunit.client.1.vm09.stdout:4/466: dread d11/f12 [0,4194304] 0 2026-03-09T17:30:20.365 INFO:tasks.workunit.client.1.vm09.stdout:4/467: write d11/d1e/d29/f93 [830511,127438] 0 2026-03-09T17:30:20.372 INFO:tasks.workunit.client.1.vm09.stdout:1/437: creat d9/d3a/f8a x:0 0 0 2026-03-09T17:30:20.376 INFO:tasks.workunit.client.1.vm09.stdout:1/438: dwrite d9/dc/dd/d40/d21/d6f/f85 [0,4194304] 0 2026-03-09T17:30:20.382 INFO:tasks.workunit.client.1.vm09.stdout:1/439: chown d9/dc/dd/d40/d22/d37/d3f/f62 95110092 1 2026-03-09T17:30:20.383 INFO:tasks.workunit.client.1.vm09.stdout:2/410: rmdir d13/d62 0 2026-03-09T17:30:20.384 INFO:tasks.workunit.client.1.vm09.stdout:1/440: read - 
d9/dc/dd/d40/d21/d6f/d7e/f89 zero size 2026-03-09T17:30:20.390 INFO:tasks.workunit.client.1.vm09.stdout:8/432: rmdir d1/da/dd/d79 39 2026-03-09T17:30:20.391 INFO:tasks.workunit.client.1.vm09.stdout:9/407: unlink d5/f47 0 2026-03-09T17:30:20.394 INFO:tasks.workunit.client.1.vm09.stdout:0/418: link d6/d1d/d24/d32/l6b d6/d1d/d24/d5e/d6c/l84 0 2026-03-09T17:30:20.396 INFO:tasks.workunit.client.1.vm09.stdout:7/538: mknod da/d11/d64/cb8 0 2026-03-09T17:30:20.402 INFO:tasks.workunit.client.1.vm09.stdout:9/408: sync 2026-03-09T17:30:20.410 INFO:tasks.workunit.client.1.vm09.stdout:8/433: mknod d1/da/dd/d47/c88 0 2026-03-09T17:30:20.415 INFO:tasks.workunit.client.1.vm09.stdout:3/393: creat d5/d16/d31/d37/d58/f73 x:0 0 0 2026-03-09T17:30:20.416 INFO:tasks.workunit.client.1.vm09.stdout:0/419: mknod d6/d1d/d24/d32/d59/c85 0 2026-03-09T17:30:20.417 INFO:tasks.workunit.client.1.vm09.stdout:4/468: rmdir d11 39 2026-03-09T17:30:20.417 INFO:tasks.workunit.client.1.vm09.stdout:9/409: fdatasync d5/de/f65 0 2026-03-09T17:30:20.418 INFO:tasks.workunit.client.1.vm09.stdout:9/410: read d5/f1e [2058118,94026] 0 2026-03-09T17:30:20.419 INFO:tasks.workunit.client.1.vm09.stdout:2/411: truncate d13/d15/d34/d69/f7c 697634 0 2026-03-09T17:30:20.427 INFO:tasks.workunit.client.1.vm09.stdout:2/412: dread d13/d15/d21/f31 [0,4194304] 0 2026-03-09T17:30:20.431 INFO:tasks.workunit.client.1.vm09.stdout:5/437: truncate d0/d2/f2a 880016 0 2026-03-09T17:30:20.432 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:20 vm06.local ceph-mon[57307]: pgmap v10: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 33 MiB/s rd, 99 MiB/s wr, 217 op/s 2026-03-09T17:30:20.467 INFO:tasks.workunit.client.1.vm09.stdout:4/469: write d11/d1e/d83/f90 [932374,35439] 0 2026-03-09T17:30:20.478 INFO:tasks.workunit.client.1.vm09.stdout:6/419: rename d3/d7/c1d to d3/d21/d76/d5c/d61/c8a 0 2026-03-09T17:30:20.478 INFO:tasks.workunit.client.1.vm09.stdout:3/394: rename d5/d16/d31/d37 to 
d5/d16/d31/d37/d58/d74 22 2026-03-09T17:30:20.478 INFO:tasks.workunit.client.1.vm09.stdout:3/395: fsync d5/d16/d31/f56 0 2026-03-09T17:30:20.479 INFO:tasks.workunit.client.1.vm09.stdout:9/411: symlink d5/de/d4e/l8d 0 2026-03-09T17:30:20.482 INFO:tasks.workunit.client.1.vm09.stdout:9/412: dwrite d5/f1b [0,4194304] 0 2026-03-09T17:30:20.486 INFO:tasks.workunit.client.1.vm09.stdout:1/441: dwrite d9/dc/dd/d40/d22/d37/f2e [4194304,4194304] 0 2026-03-09T17:30:20.488 INFO:tasks.workunit.client.1.vm09.stdout:8/434: mknod d1/da/dd/c89 0 2026-03-09T17:30:20.489 INFO:tasks.workunit.client.1.vm09.stdout:2/413: creat d13/d4d/f7d x:0 0 0 2026-03-09T17:30:20.489 INFO:tasks.workunit.client.1.vm09.stdout:2/414: fdatasync fd 0 2026-03-09T17:30:20.492 INFO:tasks.workunit.client.1.vm09.stdout:0/420: mkdir d6/d1d/d24/d5e/d86 0 2026-03-09T17:30:20.492 INFO:tasks.workunit.client.1.vm09.stdout:8/435: sync 2026-03-09T17:30:20.494 INFO:tasks.workunit.client.1.vm09.stdout:5/438: write d0/d9/d16/d5c/f73 [5190297,61268] 0 2026-03-09T17:30:20.497 INFO:tasks.workunit.client.1.vm09.stdout:4/470: truncate d11/d1e/d29/f3b 1677171 0 2026-03-09T17:30:20.546 INFO:tasks.workunit.client.1.vm09.stdout:1/442: mkdir d9/dc/dd/d40/d22/d8b 0 2026-03-09T17:30:20.554 INFO:tasks.workunit.client.1.vm09.stdout:0/421: mknod d6/d1d/d24/d32/d59/c87 0 2026-03-09T17:30:20.577 INFO:tasks.workunit.client.1.vm09.stdout:8/436: mkdir d1/d14/d2a/d42/d5d/d8a 0 2026-03-09T17:30:20.578 INFO:tasks.workunit.client.1.vm09.stdout:5/439: symlink d0/d2/d76/l8f 0 2026-03-09T17:30:20.579 INFO:tasks.workunit.client.1.vm09.stdout:5/440: write d0/dc/d21/d26/f3d [7905632,89951] 0 2026-03-09T17:30:20.583 INFO:tasks.workunit.client.1.vm09.stdout:5/441: dread d0/d46/f4c [0,4194304] 0 2026-03-09T17:30:20.584 INFO:tasks.workunit.client.1.vm09.stdout:5/442: chown d0/d52/c43 22 1 2026-03-09T17:30:20.585 INFO:tasks.workunit.client.1.vm09.stdout:5/443: write d0/d9/d16/d5c/f73 [2752451,75573] 0 2026-03-09T17:30:20.590 
INFO:tasks.workunit.client.1.vm09.stdout:7/539: link da/c3b da/d11/d64/dac/cb9 0 2026-03-09T17:30:20.596 INFO:tasks.workunit.client.1.vm09.stdout:7/540: dwrite da/d11/d2d/f45 [0,4194304] 0 2026-03-09T17:30:20.619 INFO:tasks.workunit.client.1.vm09.stdout:7/541: dread da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:20.638 INFO:tasks.workunit.client.1.vm09.stdout:6/420: fsync d3/d7/fe 0 2026-03-09T17:30:20.639 INFO:tasks.workunit.client.1.vm09.stdout:3/396: symlink d5/d16/l75 0 2026-03-09T17:30:20.640 INFO:tasks.workunit.client.1.vm09.stdout:3/397: write d5/d16/d31/d37/f5b [374517,37168] 0 2026-03-09T17:30:20.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:20 vm09.local ceph-mon[62061]: pgmap v10: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 33 MiB/s rd, 99 MiB/s wr, 217 op/s 2026-03-09T17:30:20.645 INFO:tasks.workunit.client.1.vm09.stdout:1/443: rmdir d9/dc/dd/d40/d22/d37/d3f/d42 39 2026-03-09T17:30:20.657 INFO:tasks.workunit.client.1.vm09.stdout:5/444: symlink d0/dc/d21/d26/d5e/d68/d79/l90 0 2026-03-09T17:30:20.658 INFO:tasks.workunit.client.1.vm09.stdout:5/445: read d0/d46/d4b/f4f [22235,63939] 0 2026-03-09T17:30:20.658 INFO:tasks.workunit.client.1.vm09.stdout:5/446: fsync d0/d46/d4b/f4f 0 2026-03-09T17:30:20.675 INFO:tasks.workunit.client.1.vm09.stdout:3/398: creat d5/d16/d31/d37/f76 x:0 0 0 2026-03-09T17:30:20.676 INFO:tasks.workunit.client.1.vm09.stdout:3/399: write d5/d16/d31/d37/f76 [391263,18963] 0 2026-03-09T17:30:20.676 INFO:tasks.workunit.client.1.vm09.stdout:9/413: creat d5/f8e x:0 0 0 2026-03-09T17:30:20.676 INFO:tasks.workunit.client.1.vm09.stdout:9/414: chown d5/d7e/d81 31987169 1 2026-03-09T17:30:20.677 INFO:tasks.workunit.client.1.vm09.stdout:3/400: rename d5 to d5/d16/d31/d37/d58/d77 22 2026-03-09T17:30:20.678 INFO:tasks.workunit.client.1.vm09.stdout:1/444: stat d9/dc/dd/d40/d22/d37/d3f/d42 0 2026-03-09T17:30:20.679 INFO:tasks.workunit.client.1.vm09.stdout:3/401: chown d5/d16/d31/d3d/c72 56621 1 
2026-03-09T17:30:20.681 INFO:tasks.workunit.client.1.vm09.stdout:7/542: dread da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [0,4194304] 0 2026-03-09T17:30:20.682 INFO:tasks.workunit.client.1.vm09.stdout:2/415: creat d13/d15/f7e x:0 0 0 2026-03-09T17:30:20.686 INFO:tasks.workunit.client.1.vm09.stdout:2/416: dwrite fd [4194304,4194304] 0 2026-03-09T17:30:20.689 INFO:tasks.workunit.client.1.vm09.stdout:2/417: dread d13/d15/d34/f5e [0,4194304] 0 2026-03-09T17:30:20.690 INFO:tasks.workunit.client.1.vm09.stdout:9/415: dread d5/d21/f2f [4194304,4194304] 0 2026-03-09T17:30:20.697 INFO:tasks.workunit.client.1.vm09.stdout:2/418: sync 2026-03-09T17:30:20.699 INFO:tasks.workunit.client.1.vm09.stdout:0/422: truncate d6/f9 4746108 0 2026-03-09T17:30:20.711 INFO:tasks.workunit.client.1.vm09.stdout:7/543: rmdir da/d11/d47 39 2026-03-09T17:30:20.711 INFO:tasks.workunit.client.1.vm09.stdout:2/419: dread d13/d15/d21/f30 [4194304,4194304] 0 2026-03-09T17:30:20.720 INFO:tasks.workunit.client.1.vm09.stdout:8/437: creat d1/d14/d2a/f8b x:0 0 0 2026-03-09T17:30:20.735 INFO:tasks.workunit.client.1.vm09.stdout:9/416: dread d5/de/d29/d33/f66 [0,4194304] 0 2026-03-09T17:30:20.743 INFO:tasks.workunit.client.1.vm09.stdout:4/471: getdents d11/d1e/d29/d36/d57/d78 0 2026-03-09T17:30:20.744 INFO:tasks.workunit.client.1.vm09.stdout:4/472: readlink d11/d1e/d29/l4f 0 2026-03-09T17:30:20.749 INFO:tasks.workunit.client.1.vm09.stdout:6/421: creat d3/d21/d25/f8b x:0 0 0 2026-03-09T17:30:20.749 INFO:tasks.workunit.client.1.vm09.stdout:6/422: truncate d3/d21/d76/d5c/f65 5360010 0 2026-03-09T17:30:20.750 INFO:tasks.workunit.client.1.vm09.stdout:3/402: symlink d5/d16/l78 0 2026-03-09T17:30:20.752 INFO:tasks.workunit.client.1.vm09.stdout:3/403: read d5/d9/d30/d65/f15 [899678,42877] 0 2026-03-09T17:30:20.752 INFO:tasks.workunit.client.1.vm09.stdout:7/544: creat da/d11/d77/fba x:0 0 0 2026-03-09T17:30:20.753 INFO:tasks.workunit.client.1.vm09.stdout:2/420: chown d13/d15/d2c/c63 26109770 1 2026-03-09T17:30:20.761 
INFO:tasks.workunit.client.1.vm09.stdout:0/423: symlink d6/d1d/d24/d5e/l88 0 2026-03-09T17:30:20.761 INFO:tasks.workunit.client.1.vm09.stdout:0/424: chown d6/f63 2937573 1 2026-03-09T17:30:20.762 INFO:tasks.workunit.client.1.vm09.stdout:5/447: creat d0/f91 x:0 0 0 2026-03-09T17:30:20.763 INFO:tasks.workunit.client.1.vm09.stdout:5/448: write d0/d46/f56 [2565034,5612] 0 2026-03-09T17:30:20.771 INFO:tasks.workunit.client.1.vm09.stdout:6/423: rmdir d3/d21 39 2026-03-09T17:30:20.771 INFO:tasks.workunit.client.1.vm09.stdout:6/424: chown f2 3 1 2026-03-09T17:30:20.776 INFO:tasks.workunit.client.1.vm09.stdout:2/421: creat d13/d15/d21/f7f x:0 0 0 2026-03-09T17:30:20.777 INFO:tasks.workunit.client.1.vm09.stdout:9/417: creat d5/de/d88/f8f x:0 0 0 2026-03-09T17:30:20.779 INFO:tasks.workunit.client.1.vm09.stdout:1/445: getdents d9/dc/dd/d40/d21 0 2026-03-09T17:30:20.781 INFO:tasks.workunit.client.1.vm09.stdout:0/425: symlink d6/d1d/d24/d32/d59/d5b/l89 0 2026-03-09T17:30:20.782 INFO:tasks.workunit.client.1.vm09.stdout:5/449: creat d0/d55/f92 x:0 0 0 2026-03-09T17:30:20.784 INFO:tasks.workunit.client.1.vm09.stdout:5/450: dread d0/d52/d20/f25 [0,4194304] 0 2026-03-09T17:30:20.789 INFO:tasks.workunit.client.1.vm09.stdout:7/545: mknod da/d11/d47/d5b/d6c/d9e/d4e/d5f/cbb 0 2026-03-09T17:30:20.793 INFO:tasks.workunit.client.1.vm09.stdout:3/404: link d5/d16/d31/d37/f5b d5/d9/f79 0 2026-03-09T17:30:20.796 INFO:tasks.workunit.client.1.vm09.stdout:3/405: dwrite d5/d9/f4e [0,4194304] 0 2026-03-09T17:30:20.798 INFO:tasks.workunit.client.1.vm09.stdout:9/418: write d5/de/f20 [3077363,21491] 0 2026-03-09T17:30:20.800 INFO:tasks.workunit.client.1.vm09.stdout:3/406: write d5/d16/d31/f34 [4915126,81384] 0 2026-03-09T17:30:20.801 INFO:tasks.workunit.client.1.vm09.stdout:4/473: creat d11/d1e/d45/d60/f95 x:0 0 0 2026-03-09T17:30:20.803 INFO:tasks.workunit.client.1.vm09.stdout:1/446: unlink d9/dc/dd/d40/d21/f84 0 2026-03-09T17:30:20.814 INFO:tasks.workunit.client.1.vm09.stdout:6/425: creat d3/d21/f8c 
x:0 0 0 2026-03-09T17:30:20.816 INFO:tasks.workunit.client.1.vm09.stdout:6/426: dwrite d3/d7/fe [0,4194304] 0 2026-03-09T17:30:20.817 INFO:tasks.workunit.client.1.vm09.stdout:6/427: dread - d3/d21/f5d zero size 2026-03-09T17:30:20.824 INFO:tasks.workunit.client.1.vm09.stdout:7/546: mknod da/d11/d47/d5b/d6c/d9e/d4e/d5f/cbc 0 2026-03-09T17:30:20.824 INFO:tasks.workunit.client.1.vm09.stdout:8/438: getdents d1/d14/d2a 0 2026-03-09T17:30:20.826 INFO:tasks.workunit.client.1.vm09.stdout:2/422: creat d13/d15/d34/d37/d66/f80 x:0 0 0 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:9/419: rmdir d5/de/d29/d33 39 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:9/420: write d5/de/d29/f73 [524387,51829] 0 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:9/421: chown d5/d21 232519 1 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:3/407: mknod d5/d16/d31/d3d/d32/c7a 0 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:3/408: chown d5/d16/d31/f56 1296 1 2026-03-09T17:30:20.837 INFO:tasks.workunit.client.1.vm09.stdout:4/474: truncate d11/d1e/f61 49273 0 2026-03-09T17:30:20.841 INFO:tasks.workunit.client.1.vm09.stdout:1/447: dwrite d9/dc/dd/d40/d1d/f17 [0,4194304] 0 2026-03-09T17:30:20.855 INFO:tasks.workunit.client.1.vm09.stdout:8/439: truncate d1/d14/d2a/d42/d43/f58 794543 0 2026-03-09T17:30:20.860 INFO:tasks.workunit.client.1.vm09.stdout:2/423: dread d13/d15/d2c/f2d [0,4194304] 0 2026-03-09T17:30:20.865 INFO:tasks.workunit.client.1.vm09.stdout:4/475: chown d11/d1e/d83/d89/d8b/d58/c87 6 1 2026-03-09T17:30:20.870 INFO:tasks.workunit.client.1.vm09.stdout:0/426: creat d6/d1d/d24/d5e/f8a x:0 0 0 2026-03-09T17:30:20.871 INFO:tasks.workunit.client.1.vm09.stdout:5/451: rename d0/dc/d21/d26/l3b to d0/d9/l93 0 2026-03-09T17:30:20.872 INFO:tasks.workunit.client.1.vm09.stdout:0/427: write d6/d1d/d46/f4d [3726157,82513] 0 2026-03-09T17:30:20.873 INFO:tasks.workunit.client.1.vm09.stdout:5/452: dread d0/d52/f1c 
[0,4194304] 0 2026-03-09T17:30:20.876 INFO:tasks.workunit.client.1.vm09.stdout:5/453: dwrite d0/dc/d21/d26/f28 [0,4194304] 0 2026-03-09T17:30:20.883 INFO:tasks.workunit.client.1.vm09.stdout:7/547: mknod da/d11/d47/d5b/d6c/cbd 0 2026-03-09T17:30:20.884 INFO:tasks.workunit.client.1.vm09.stdout:5/454: dwrite d0/d9/d16/d5c/f73 [0,4194304] 0 2026-03-09T17:30:20.894 INFO:tasks.workunit.client.1.vm09.stdout:2/424: chown d13/d15/c25 1058334 1 2026-03-09T17:30:20.896 INFO:tasks.workunit.client.1.vm09.stdout:4/476: creat d11/d1e/d83/f96 x:0 0 0 2026-03-09T17:30:20.896 INFO:tasks.workunit.client.1.vm09.stdout:6/428: creat d3/d21/d76/f8d x:0 0 0 2026-03-09T17:30:20.897 INFO:tasks.workunit.client.1.vm09.stdout:6/429: truncate d3/f41 210390 0 2026-03-09T17:30:20.897 INFO:tasks.workunit.client.1.vm09.stdout:7/548: mkdir da/d11/d47/d89/dbe 0 2026-03-09T17:30:20.898 INFO:tasks.workunit.client.1.vm09.stdout:6/430: dread - d3/d21/f8c zero size 2026-03-09T17:30:20.898 INFO:tasks.workunit.client.1.vm09.stdout:8/440: symlink d1/l8c 0 2026-03-09T17:30:20.901 INFO:tasks.workunit.client.1.vm09.stdout:4/477: dread d11/d1e/d29/d36/d57/f79 [0,4194304] 0 2026-03-09T17:30:20.902 INFO:tasks.workunit.client.1.vm09.stdout:5/455: dread d0/dc/d21/d33/f69 [0,4194304] 0 2026-03-09T17:30:20.903 INFO:tasks.workunit.client.1.vm09.stdout:0/428: mknod d6/d1d/d24/d5e/c8b 0 2026-03-09T17:30:20.903 INFO:tasks.workunit.client.1.vm09.stdout:9/422: getdents d5/de/d4e/d6e 0 2026-03-09T17:30:20.906 INFO:tasks.workunit.client.1.vm09.stdout:4/478: mkdir d11/d1e/d29/d36/d57/d97 0 2026-03-09T17:30:20.906 INFO:tasks.workunit.client.1.vm09.stdout:7/549: symlink da/d11/d2d/d56/da1/lbf 0 2026-03-09T17:30:20.908 INFO:tasks.workunit.client.1.vm09.stdout:0/429: mkdir d6/d1d/d24/d32/d59/d81/d8c 0 2026-03-09T17:30:20.909 INFO:tasks.workunit.client.1.vm09.stdout:6/431: rename d3/d7/ld to d3/d21/d76/d5c/d7e/l8e 0 2026-03-09T17:30:20.912 INFO:tasks.workunit.client.1.vm09.stdout:9/423: mkdir d5/de/d29/d90 0 2026-03-09T17:30:20.915 
INFO:tasks.workunit.client.1.vm09.stdout:9/424: chown d5/c48 10596 1 2026-03-09T17:30:20.915 INFO:tasks.workunit.client.1.vm09.stdout:5/456: rename d0/d46/l8e to d0/d9/d8b/l94 0 2026-03-09T17:30:20.915 INFO:tasks.workunit.client.1.vm09.stdout:0/430: symlink d6/d1d/d46/l8d 0 2026-03-09T17:30:20.916 INFO:tasks.workunit.client.1.vm09.stdout:6/432: dwrite d3/f41 [0,4194304] 0 2026-03-09T17:30:20.917 INFO:tasks.workunit.client.1.vm09.stdout:5/457: write d0/dc/d21/d26/f3d [8156077,49557] 0 2026-03-09T17:30:20.925 INFO:tasks.workunit.client.1.vm09.stdout:0/431: dwrite d6/d1d/d24/d32/f45 [0,4194304] 0 2026-03-09T17:30:20.929 INFO:tasks.workunit.client.1.vm09.stdout:3/409: write d5/d9/d30/d65/f5e [920263,12915] 0 2026-03-09T17:30:20.932 INFO:tasks.workunit.client.1.vm09.stdout:9/425: mkdir d5/d91 0 2026-03-09T17:30:20.932 INFO:tasks.workunit.client.1.vm09.stdout:9/426: read d5/d21/f46 [1438812,40659] 0 2026-03-09T17:30:20.932 INFO:tasks.workunit.client.1.vm09.stdout:9/427: stat d5/de/d4e/d6e/l77 0 2026-03-09T17:30:20.934 INFO:tasks.workunit.client.1.vm09.stdout:6/433: dwrite d3/d21/d76/d5c/d61/d6a/f74 [0,4194304] 0 2026-03-09T17:30:20.936 INFO:tasks.workunit.client.1.vm09.stdout:9/428: truncate d5/f13 8956627 0 2026-03-09T17:30:20.938 INFO:tasks.workunit.client.1.vm09.stdout:6/434: mkdir d3/d21/d76/d3f/d8f 0 2026-03-09T17:30:20.939 INFO:tasks.workunit.client.1.vm09.stdout:6/435: write d3/d7/d59/d5a/f83 [857275,79645] 0 2026-03-09T17:30:20.940 INFO:tasks.workunit.client.1.vm09.stdout:0/432: rename d6/d1d/d39/f7a to d6/d1d/d24/f8e 0 2026-03-09T17:30:20.946 INFO:tasks.workunit.client.1.vm09.stdout:0/433: dread d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:20.949 INFO:tasks.workunit.client.1.vm09.stdout:1/448: dwrite d9/dc/dd/f7b [0,4194304] 0 2026-03-09T17:30:20.950 INFO:tasks.workunit.client.1.vm09.stdout:6/436: link d3/d21/d76/d5c/f78 d3/d21/d76/f90 0 2026-03-09T17:30:20.955 INFO:tasks.workunit.client.1.vm09.stdout:9/429: dread d5/de/f3c [0,4194304] 0 2026-03-09T17:30:20.957 
INFO:tasks.workunit.client.1.vm09.stdout:1/449: dwrite d9/dc/dd/d40/d22/d37/d3f/d42/d55/f69 [0,4194304] 0 2026-03-09T17:30:20.958 INFO:tasks.workunit.client.1.vm09.stdout:0/434: link d6/d1d/d24/f75 d6/d1d/d24/d32/d59/d81/d8c/f8f 0 2026-03-09T17:30:20.964 INFO:tasks.workunit.client.1.vm09.stdout:5/458: dread d0/d2/d76/d86/f50 [0,4194304] 0 2026-03-09T17:30:20.967 INFO:tasks.workunit.client.1.vm09.stdout:1/450: write d9/f6c [1017306,100208] 0 2026-03-09T17:30:20.969 INFO:tasks.workunit.client.1.vm09.stdout:1/451: mknod d9/d3a/c8c 0 2026-03-09T17:30:20.973 INFO:tasks.workunit.client.1.vm09.stdout:1/452: chown d9/d38/l5c 24 1 2026-03-09T17:30:20.980 INFO:tasks.workunit.client.1.vm09.stdout:5/459: dwrite d0/d2/f5d [0,4194304] 0 2026-03-09T17:30:20.980 INFO:tasks.workunit.client.1.vm09.stdout:9/430: dwrite d5/d2e/f53 [0,4194304] 0 2026-03-09T17:30:20.980 INFO:tasks.workunit.client.1.vm09.stdout:5/460: truncate d0/d52/d20/f7c 5232682 0 2026-03-09T17:30:20.986 INFO:tasks.workunit.client.1.vm09.stdout:1/453: dread d9/dc/dd/d40/d22/d37/f2e [4194304,4194304] 0 2026-03-09T17:30:20.990 INFO:tasks.workunit.client.1.vm09.stdout:0/435: read d6/d1d/d24/f4e [346222,91387] 0 2026-03-09T17:30:20.990 INFO:tasks.workunit.client.1.vm09.stdout:0/436: dread d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:20.990 INFO:tasks.workunit.client.1.vm09.stdout:5/461: dwrite d0/dc/d21/d6f/f80 [0,4194304] 0 2026-03-09T17:30:20.990 INFO:tasks.workunit.client.1.vm09.stdout:9/431: fdatasync d5/d2e/f5a 0 2026-03-09T17:30:21.004 INFO:tasks.workunit.client.1.vm09.stdout:9/432: chown d5/de/c23 1542364 1 2026-03-09T17:30:21.015 INFO:tasks.workunit.client.1.vm09.stdout:5/462: dread d0/dc/d21/d33/f35 [0,4194304] 0 2026-03-09T17:30:21.021 INFO:tasks.workunit.client.1.vm09.stdout:0/437: dread d6/d1d/d24/f75 [0,4194304] 0 2026-03-09T17:30:21.021 INFO:tasks.workunit.client.1.vm09.stdout:5/463: dwrite d0/dc/d21/d26/f28 [0,4194304] 0 2026-03-09T17:30:21.029 INFO:tasks.workunit.client.1.vm09.stdout:1/454: creat d9/f8d x:0 
0 0 2026-03-09T17:30:21.039 INFO:tasks.workunit.client.1.vm09.stdout:8/441: write d1/da/dd/f45 [370278,52824] 0 2026-03-09T17:30:21.041 INFO:tasks.workunit.client.1.vm09.stdout:2/425: dwrite d13/f73 [0,4194304] 0 2026-03-09T17:30:21.044 INFO:tasks.workunit.client.1.vm09.stdout:4/479: dwrite d11/d1e/d29/d36/f86 [4194304,4194304] 0 2026-03-09T17:30:21.045 INFO:tasks.workunit.client.1.vm09.stdout:4/480: write d11/f1f [5382454,53971] 0 2026-03-09T17:30:21.046 INFO:tasks.workunit.client.1.vm09.stdout:2/426: dwrite fd [0,4194304] 0 2026-03-09T17:30:21.084 INFO:tasks.workunit.client.1.vm09.stdout:8/442: mkdir d1/da/dd/d47/d4c/d8d 0 2026-03-09T17:30:21.086 INFO:tasks.workunit.client.1.vm09.stdout:4/481: mknod d11/d1e/d45/d60/d71/c98 0 2026-03-09T17:30:21.087 INFO:tasks.workunit.client.1.vm09.stdout:2/427: creat d13/d4d/f81 x:0 0 0 2026-03-09T17:30:21.087 INFO:tasks.workunit.client.1.vm09.stdout:4/482: chown d11/d1e/d29/d36/d57/d97 2904 1 2026-03-09T17:30:21.121 INFO:tasks.workunit.client.1.vm09.stdout:2/428: sync 2026-03-09T17:30:21.125 INFO:tasks.workunit.client.1.vm09.stdout:2/429: creat d13/d15/d34/d45/f82 x:0 0 0 2026-03-09T17:30:21.126 INFO:tasks.workunit.client.1.vm09.stdout:2/430: mkdir d13/d15/d34/d37/d6f/d83 0 2026-03-09T17:30:21.129 INFO:tasks.workunit.client.1.vm09.stdout:2/431: dread d13/d15/d21/f31 [0,4194304] 0 2026-03-09T17:30:21.129 INFO:tasks.workunit.client.1.vm09.stdout:2/432: rmdir d13/d15/d3b 39 2026-03-09T17:30:21.130 INFO:tasks.workunit.client.1.vm09.stdout:2/433: mkdir d13/d15/d34/d45/d84 0 2026-03-09T17:30:21.144 INFO:tasks.workunit.client.1.vm09.stdout:2/434: mkdir d13/d15/d60/d85 0 2026-03-09T17:30:21.149 INFO:tasks.workunit.client.1.vm09.stdout:2/435: dwrite d13/d15/d34/d69/f7a [0,4194304] 0 2026-03-09T17:30:21.150 INFO:tasks.workunit.client.1.vm09.stdout:2/436: write d13/d15/d34/d37/d6f/f7b [983874,12407] 0 2026-03-09T17:30:21.171 INFO:tasks.workunit.client.1.vm09.stdout:7/550: dwrite da/d11/f3f [4194304,4194304] 0 2026-03-09T17:30:21.174 
INFO:tasks.workunit.client.1.vm09.stdout:2/437: symlink d13/d15/d36/d72/l86 0 2026-03-09T17:30:21.179 INFO:tasks.workunit.client.1.vm09.stdout:3/410: dwrite f3 [0,4194304] 0 2026-03-09T17:30:21.199 INFO:tasks.workunit.client.1.vm09.stdout:7/551: getdents da/d11/d64 0 2026-03-09T17:30:21.199 INFO:tasks.workunit.client.1.vm09.stdout:0/438: rmdir d6 39 2026-03-09T17:30:21.200 INFO:tasks.workunit.client.1.vm09.stdout:0/439: write d6/d1d/d24/d32/f45 [4296548,30833] 0 2026-03-09T17:30:21.204 INFO:tasks.workunit.client.1.vm09.stdout:0/440: creat d6/d1d/d24/d32/d59/d81/f90 x:0 0 0 2026-03-09T17:30:21.205 INFO:tasks.workunit.client.1.vm09.stdout:6/437: dwrite d3/d7/ff [0,4194304] 0 2026-03-09T17:30:21.224 INFO:tasks.workunit.client.1.vm09.stdout:6/438: dread d3/d21/d76/f70 [0,4194304] 0 2026-03-09T17:30:21.238 INFO:tasks.workunit.client.1.vm09.stdout:6/439: getdents d3/d21/d25/d26/d6b 0 2026-03-09T17:30:21.255 INFO:tasks.workunit.client.1.vm09.stdout:5/464: write d0/d52/d20/f25 [287593,45944] 0 2026-03-09T17:30:21.256 INFO:tasks.workunit.client.1.vm09.stdout:5/465: fsync d0/dc/d21/f7a 0 2026-03-09T17:30:21.259 INFO:tasks.workunit.client.1.vm09.stdout:5/466: dwrite d0/d52/f81 [0,4194304] 0 2026-03-09T17:30:21.310 INFO:tasks.workunit.client.1.vm09.stdout:8/443: mknod d1/d14/c8e 0 2026-03-09T17:30:21.313 INFO:tasks.workunit.client.1.vm09.stdout:8/444: dwrite d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:21.332 INFO:tasks.workunit.client.1.vm09.stdout:8/445: getdents d1/d14/d2a/d42 0 2026-03-09T17:30:21.350 INFO:tasks.workunit.client.1.vm09.stdout:8/446: creat d1/da/d23/f8f x:0 0 0 2026-03-09T17:30:21.350 INFO:tasks.workunit.client.1.vm09.stdout:8/447: fsync d1/d14/d2a/d42/d5d/f80 0 2026-03-09T17:30:21.350 INFO:tasks.workunit.client.1.vm09.stdout:8/448: chown d1/d14/d2a/d49 579 1 2026-03-09T17:30:21.351 INFO:tasks.workunit.client.1.vm09.stdout:8/449: chown d1/da/dd/f45 153127782 1 2026-03-09T17:30:21.351 INFO:tasks.workunit.client.1.vm09.stdout:8/450: chown d1/da/dd 397 1 
2026-03-09T17:30:21.427 INFO:tasks.workunit.client.1.vm09.stdout:3/411: symlink d5/d16/d31/d3d/l7b 0 2026-03-09T17:30:21.428 INFO:tasks.workunit.client.1.vm09.stdout:3/412: write d5/d16/d25/f60 [4563500,110367] 0 2026-03-09T17:30:21.447 INFO:tasks.workunit.client.1.vm09.stdout:3/413: creat d5/d16/d31/d37/d58/d64/f7c x:0 0 0 2026-03-09T17:30:21.455 INFO:tasks.workunit.client.1.vm09.stdout:3/414: getdents d5/d16 0 2026-03-09T17:30:21.469 INFO:tasks.workunit.client.1.vm09.stdout:3/415: getdents d5/d16/d31/d37 0 2026-03-09T17:30:21.469 INFO:tasks.workunit.client.1.vm09.stdout:3/416: dread - d5/d16/d31/d37/f6d zero size 2026-03-09T17:30:21.499 INFO:tasks.workunit.client.1.vm09.stdout:9/433: truncate d5/de/f2d 985824 0 2026-03-09T17:30:21.501 INFO:tasks.workunit.client.1.vm09.stdout:1/455: write d9/dc/dd/d40/d22/d37/d3f/f62 [811023,109548] 0 2026-03-09T17:30:21.502 INFO:tasks.workunit.client.1.vm09.stdout:9/434: truncate d5/de/d29/d33/f7a 463346 0 2026-03-09T17:30:21.506 INFO:tasks.workunit.client.1.vm09.stdout:9/435: dwrite d5/d2e/d6b/f74 [0,4194304] 0 2026-03-09T17:30:21.527 INFO:tasks.workunit.client.1.vm09.stdout:1/456: write d9/d3a/f8a [667602,2876] 0 2026-03-09T17:30:21.540 INFO:tasks.workunit.client.1.vm09.stdout:9/436: creat d5/d21/f92 x:0 0 0 2026-03-09T17:30:21.540 INFO:tasks.workunit.client.1.vm09.stdout:2/438: write d13/d15/d21/f24 [2064421,117569] 0 2026-03-09T17:30:21.555 INFO:tasks.workunit.client.1.vm09.stdout:2/439: dread d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:21.581 INFO:tasks.workunit.client.1.vm09.stdout:7/552: write da/d11/d47/d5b/d6c/f7b [365848,23800] 0 2026-03-09T17:30:21.601 INFO:tasks.workunit.client.1.vm09.stdout:7/553: dread da/d11/d3e/f88 [0,4194304] 0 2026-03-09T17:30:21.619 INFO:tasks.workunit.client.1.vm09.stdout:0/441: dwrite d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:21.627 INFO:tasks.workunit.client.1.vm09.stdout:6/440: truncate d3/d21/d76/d5c/d61/d6a/f74 2542490 0 2026-03-09T17:30:21.649 
INFO:tasks.workunit.client.1.vm09.stdout:2/440: creat d13/d15/d36/d72/f87 x:0 0 0 2026-03-09T17:30:21.649 INFO:tasks.workunit.client.1.vm09.stdout:2/441: chown d13/d15/d21 61151808 1 2026-03-09T17:30:21.650 INFO:tasks.workunit.client.1.vm09.stdout:7/554: symlink da/d11/d2d/d56/lc0 0 2026-03-09T17:30:21.650 INFO:tasks.workunit.client.1.vm09.stdout:2/442: truncate d13/d4d/f6d 247395 0 2026-03-09T17:30:21.651 INFO:tasks.workunit.client.1.vm09.stdout:2/443: read d13/f40 [201984,60655] 0 2026-03-09T17:30:21.656 INFO:tasks.workunit.client.1.vm09.stdout:0/442: fsync d6/d1d/d24/f5d 0 2026-03-09T17:30:21.656 INFO:tasks.workunit.client.1.vm09.stdout:5/467: write d0/ff [2490313,10233] 0 2026-03-09T17:30:21.666 INFO:tasks.workunit.client.1.vm09.stdout:1/457: fdatasync d9/dc/f3d 0 2026-03-09T17:30:21.666 INFO:tasks.workunit.client.1.vm09.stdout:1/458: chown d9/dc/dd/d40/l27 4729706 1 2026-03-09T17:30:21.673 INFO:tasks.workunit.client.1.vm09.stdout:5/468: sync 2026-03-09T17:30:21.673 INFO:tasks.workunit.client.1.vm09.stdout:5/469: readlink d0/d52/d20/l27 0 2026-03-09T17:30:21.678 INFO:tasks.workunit.client.1.vm09.stdout:7/555: dwrite da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:21.681 INFO:tasks.workunit.client.1.vm09.stdout:7/556: dwrite da/d11/d47/d5b/d6c/f7b [0,4194304] 0 2026-03-09T17:30:21.684 INFO:tasks.workunit.client.1.vm09.stdout:7/557: stat da/d11/d2d/f70 0 2026-03-09T17:30:21.686 INFO:tasks.workunit.client.1.vm09.stdout:2/444: mkdir d13/d15/d21/d88 0 2026-03-09T17:30:21.687 INFO:tasks.workunit.client.1.vm09.stdout:6/441: mkdir d3/d21/d25/d91 0 2026-03-09T17:30:21.700 INFO:tasks.workunit.client.1.vm09.stdout:5/470: write d0/d9/f3e [2581794,77416] 0 2026-03-09T17:30:21.729 INFO:tasks.workunit.client.1.vm09.stdout:9/437: getdents d5/de/d29 0 2026-03-09T17:30:21.729 INFO:tasks.workunit.client.1.vm09.stdout:9/438: fsync d5/de/d29/f52 0 2026-03-09T17:30:21.738 INFO:tasks.workunit.client.1.vm09.stdout:5/471: dread d0/dc/d21/d6f/f5f [0,4194304] 0 2026-03-09T17:30:21.738 
INFO:tasks.workunit.client.1.vm09.stdout:5/472: readlink d0/d55/l71 0 2026-03-09T17:30:21.738 INFO:tasks.workunit.client.1.vm09.stdout:5/473: chown d0/dc/d21/d33 0 1 2026-03-09T17:30:21.739 INFO:tasks.workunit.client.1.vm09.stdout:5/474: dread - d0/dc/d21/d33/f65 zero size 2026-03-09T17:30:21.741 INFO:tasks.workunit.client.1.vm09.stdout:0/443: creat d6/d1d/f91 x:0 0 0 2026-03-09T17:30:21.742 INFO:tasks.workunit.client.1.vm09.stdout:0/444: fsync d6/d1d/d24/d5e/f8a 0 2026-03-09T17:30:21.762 INFO:tasks.workunit.client.1.vm09.stdout:1/459: creat d9/dc/dd/d40/d21/d35/f8e x:0 0 0 2026-03-09T17:30:21.769 INFO:tasks.workunit.client.1.vm09.stdout:1/460: dread d9/dc/dd/fe [0,4194304] 0 2026-03-09T17:30:21.786 INFO:tasks.workunit.client.1.vm09.stdout:0/445: link d6/d1d/d39/l3d d6/d1d/d39/l92 0 2026-03-09T17:30:21.786 INFO:tasks.workunit.client.1.vm09.stdout:0/446: dread - d6/d1d/f91 zero size 2026-03-09T17:30:21.787 INFO:tasks.workunit.client.1.vm09.stdout:4/483: rename d11/d1e/d45/d60/c66 to d11/d1e/d31/c99 0 2026-03-09T17:30:21.789 INFO:tasks.workunit.client.1.vm09.stdout:0/447: dread d6/d1d/d24/f75 [0,4194304] 0 2026-03-09T17:30:21.792 INFO:tasks.workunit.client.1.vm09.stdout:8/451: symlink d1/da/dd/d47/l90 0 2026-03-09T17:30:21.792 INFO:tasks.workunit.client.1.vm09.stdout:3/417: mknod d5/d16/d31/d37/c7d 0 2026-03-09T17:30:21.804 INFO:tasks.workunit.client.1.vm09.stdout:9/439: rename d5/d2e/d6b to d5/de/d4e/d6e/d93 0 2026-03-09T17:30:21.805 INFO:tasks.workunit.client.1.vm09.stdout:9/440: readlink d5/d21/l80 0 2026-03-09T17:30:21.808 INFO:tasks.workunit.client.1.vm09.stdout:9/441: dwrite d5/d2e/f53 [0,4194304] 0 2026-03-09T17:30:21.829 INFO:tasks.workunit.client.1.vm09.stdout:5/475: rename d0/dc/d21/d7e to d0/d2/d76/d87/d95 0 2026-03-09T17:30:21.835 INFO:tasks.workunit.client.1.vm09.stdout:0/448: mkdir d6/d93 0 2026-03-09T17:30:21.835 INFO:tasks.workunit.client.1.vm09.stdout:0/449: chown d6/d1d/d46/l65 2044 1 2026-03-09T17:30:21.838 
INFO:tasks.workunit.client.1.vm09.stdout:6/442: rmdir d3/d21/d76 39 2026-03-09T17:30:21.841 INFO:tasks.workunit.client.1.vm09.stdout:0/450: sync 2026-03-09T17:30:21.845 INFO:tasks.workunit.client.1.vm09.stdout:8/452: mknod d1/da/dd/d79/c91 0 2026-03-09T17:30:21.856 INFO:tasks.workunit.client.1.vm09.stdout:4/484: rename d11/d1e/d31/c5c to d11/d1e/d29/d36/d57/d78/c9a 0 2026-03-09T17:30:21.861 INFO:tasks.workunit.client.1.vm09.stdout:9/442: getdents d5/d2e/d8b 0 2026-03-09T17:30:21.864 INFO:tasks.workunit.client.1.vm09.stdout:1/461: dwrite d9/dc/f3d [0,4194304] 0 2026-03-09T17:30:21.890 INFO:tasks.workunit.client.1.vm09.stdout:9/443: rmdir d5/de 39 2026-03-09T17:30:21.903 INFO:tasks.workunit.client.1.vm09.stdout:7/558: write da/d11/d47/f8d [272280,58146] 0 2026-03-09T17:30:21.912 INFO:tasks.workunit.client.1.vm09.stdout:2/445: dwrite d13/f14 [0,4194304] 0 2026-03-09T17:30:21.919 INFO:tasks.workunit.client.1.vm09.stdout:9/444: fsync d5/de/d29/d33/f3b 0 2026-03-09T17:30:21.919 INFO:tasks.workunit.client.1.vm09.stdout:1/462: rmdir d9 39 2026-03-09T17:30:21.923 INFO:tasks.workunit.client.1.vm09.stdout:5/476: link d0/dc/d21/d33/c8a d0/c96 0 2026-03-09T17:30:21.926 INFO:tasks.workunit.client.1.vm09.stdout:9/445: mkdir d5/de/d29/d33/d94 0 2026-03-09T17:30:21.927 INFO:tasks.workunit.client.1.vm09.stdout:1/463: chown d9/d3a/l7f 945 1 2026-03-09T17:30:21.927 INFO:tasks.workunit.client.1.vm09.stdout:0/451: getdents d6/d64 0 2026-03-09T17:30:21.928 INFO:tasks.workunit.client.1.vm09.stdout:7/559: truncate da/d11/d47/d5b/d6c/d9e/d4e/f74 3736475 0 2026-03-09T17:30:21.928 INFO:tasks.workunit.client.1.vm09.stdout:5/477: creat d0/d52/f97 x:0 0 0 2026-03-09T17:30:21.929 INFO:tasks.workunit.client.1.vm09.stdout:9/446: symlink d5/d21/l95 0 2026-03-09T17:30:21.930 INFO:tasks.workunit.client.1.vm09.stdout:2/446: creat d13/f89 x:0 0 0 2026-03-09T17:30:21.936 INFO:tasks.workunit.client.1.vm09.stdout:0/452: dread d6/d1d/d24/d32/f68 [0,4194304] 0 2026-03-09T17:30:21.941 
INFO:tasks.workunit.client.1.vm09.stdout:0/453: write d6/d1d/d24/d5e/f8a [858947,72995] 0 2026-03-09T17:30:21.941 INFO:tasks.workunit.client.1.vm09.stdout:2/447: rmdir d13/d15/d34/d69 39 2026-03-09T17:30:21.941 INFO:tasks.workunit.client.1.vm09.stdout:2/448: chown d13/d15/d21/d88 14706 1 2026-03-09T17:30:21.942 INFO:tasks.workunit.client.1.vm09.stdout:2/449: readlink d13/d15/d34/d37/l55 0 2026-03-09T17:30:21.964 INFO:tasks.workunit.client.1.vm09.stdout:3/418: write d5/f53 [637912,56473] 0 2026-03-09T17:30:21.964 INFO:tasks.workunit.client.1.vm09.stdout:3/419: readlink d5/d16/d31/d3d/l7b 0 2026-03-09T17:30:21.969 INFO:tasks.workunit.client.1.vm09.stdout:6/443: write d3/d7/f24 [3679746,105552] 0 2026-03-09T17:30:21.970 INFO:tasks.workunit.client.1.vm09.stdout:6/444: fsync d3/d7/ff 0 2026-03-09T17:30:21.975 INFO:tasks.workunit.client.1.vm09.stdout:8/453: dwrite d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:21.976 INFO:tasks.workunit.client.1.vm09.stdout:4/485: truncate d11/f23 3926205 0 2026-03-09T17:30:21.977 INFO:tasks.workunit.client.1.vm09.stdout:1/464: rmdir d9/dc/dd/d40/d21/d6f/d7e 39 2026-03-09T17:30:21.978 INFO:tasks.workunit.client.1.vm09.stdout:4/486: write d11/d1e/d31/f74 [1037827,108176] 0 2026-03-09T17:30:21.978 INFO:tasks.workunit.client.1.vm09.stdout:5/478: symlink d0/d9/l98 0 2026-03-09T17:30:21.986 INFO:tasks.workunit.client.1.vm09.stdout:1/465: dwrite d9/dc/dd/d40/d22/f4a [0,4194304] 0 2026-03-09T17:30:21.990 INFO:tasks.workunit.client.1.vm09.stdout:9/447: creat d5/d7e/d81/f96 x:0 0 0 2026-03-09T17:30:22.007 INFO:tasks.workunit.client.1.vm09.stdout:0/454: fdatasync d6/f21 0 2026-03-09T17:30:22.017 INFO:tasks.workunit.client.1.vm09.stdout:3/420: mknod d5/d16/d25/c7e 0 2026-03-09T17:30:22.022 INFO:tasks.workunit.client.1.vm09.stdout:7/560: creat da/fc1 x:0 0 0 2026-03-09T17:30:22.033 INFO:tasks.workunit.client.1.vm09.stdout:5/479: creat d0/d9/d74/f99 x:0 0 0 2026-03-09T17:30:22.048 INFO:tasks.workunit.client.1.vm09.stdout:1/466: fsync 
d9/dc/dd/d40/d22/d37/f2e 0 2026-03-09T17:30:22.057 INFO:tasks.workunit.client.1.vm09.stdout:2/450: symlink d13/d15/d60/d85/l8a 0 2026-03-09T17:30:22.063 INFO:tasks.workunit.client.1.vm09.stdout:0/455: mkdir d6/d64/d94 0 2026-03-09T17:30:22.064 INFO:tasks.workunit.client.1.vm09.stdout:3/421: mknod d5/d16/c7f 0 2026-03-09T17:30:22.066 INFO:tasks.workunit.client.1.vm09.stdout:8/454: rename d1/c52 to d1/da/d23/d6c/d32/c92 0 2026-03-09T17:30:22.068 INFO:tasks.workunit.client.1.vm09.stdout:7/561: dread da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 [0,4194304] 0 2026-03-09T17:30:22.068 INFO:tasks.workunit.client.1.vm09.stdout:5/480: truncate d0/f22 1199547 0 2026-03-09T17:30:22.069 INFO:tasks.workunit.client.1.vm09.stdout:5/481: chown d0/dc/d21/d6f/f5f 1 1 2026-03-09T17:30:22.070 INFO:tasks.workunit.client.1.vm09.stdout:7/562: truncate da/d11/d3e/da2/db2/fa6 250276 0 2026-03-09T17:30:22.078 INFO:tasks.workunit.client.1.vm09.stdout:9/448: mkdir d5/d2e/d70/d84/d97 0 2026-03-09T17:30:22.079 INFO:tasks.workunit.client.1.vm09.stdout:9/449: fdatasync d5/d21/f2f 0 2026-03-09T17:30:22.083 INFO:tasks.workunit.client.1.vm09.stdout:0/456: mknod d6/d1d/d24/d32/d59/d81/c95 0 2026-03-09T17:30:22.084 INFO:tasks.workunit.client.1.vm09.stdout:0/457: readlink d6/d1d/d24/l69 0 2026-03-09T17:30:22.088 INFO:tasks.workunit.client.1.vm09.stdout:8/455: creat d1/da/d23/f93 x:0 0 0 2026-03-09T17:30:22.094 INFO:tasks.workunit.client.1.vm09.stdout:6/445: rename d3/d21/d76/f7a to d3/d21/d76/d5c/f92 0 2026-03-09T17:30:22.095 INFO:tasks.workunit.client.1.vm09.stdout:4/487: link d11/d1e/d29/d36/f40 d11/d1e/d31/f9b 0 2026-03-09T17:30:22.095 INFO:tasks.workunit.client.1.vm09.stdout:6/446: truncate d3/d21/f80 839762 0 2026-03-09T17:30:22.096 INFO:tasks.workunit.client.1.vm09.stdout:6/447: dread - d3/d21/d76/d5c/f6d zero size 2026-03-09T17:30:22.105 INFO:tasks.workunit.client.1.vm09.stdout:5/482: link d0/f91 d0/d2/d76/d87/d95/f9a 0 2026-03-09T17:30:22.110 INFO:tasks.workunit.client.1.vm09.stdout:7/563: rename 
da/d11/d47/d8f to da/d11/d47/d89/dbe/dc2 0 2026-03-09T17:30:22.112 INFO:tasks.workunit.client.1.vm09.stdout:7/564: read da/d11/d47/d5b/d6c/d9e/d4e/f7c [2670236,109048] 0 2026-03-09T17:30:22.112 INFO:tasks.workunit.client.1.vm09.stdout:7/565: fdatasync da/d11/d47/d5b/d6c/d9e/f35 0 2026-03-09T17:30:22.113 INFO:tasks.workunit.client.1.vm09.stdout:7/566: write da/d11/d64/fa9 [568532,8062] 0 2026-03-09T17:30:22.116 INFO:tasks.workunit.client.1.vm09.stdout:6/448: creat d3/d7/d59/d73/f93 x:0 0 0 2026-03-09T17:30:22.120 INFO:tasks.workunit.client.1.vm09.stdout:6/449: dwrite d3/d7/d59/d73/f75 [0,4194304] 0 2026-03-09T17:30:22.123 INFO:tasks.workunit.client.1.vm09.stdout:6/450: dread d3/d7/d59/d5a/f83 [0,4194304] 0 2026-03-09T17:30:22.132 INFO:tasks.workunit.client.1.vm09.stdout:2/451: creat d13/f8b x:0 0 0 2026-03-09T17:30:22.141 INFO:tasks.workunit.client.1.vm09.stdout:0/458: mknod d6/d93/c96 0 2026-03-09T17:30:22.142 INFO:tasks.workunit.client.1.vm09.stdout:8/456: creat d1/d14/d2a/d42/d5d/d8a/f94 x:0 0 0 2026-03-09T17:30:22.145 INFO:tasks.workunit.client.1.vm09.stdout:8/457: dwrite d1/da/d23/f93 [0,4194304] 0 2026-03-09T17:30:22.158 INFO:tasks.workunit.client.1.vm09.stdout:7/567: mknod da/d11/d64/da7/cc3 0 2026-03-09T17:30:22.164 INFO:tasks.workunit.client.1.vm09.stdout:4/488: mkdir d11/d1e/d45/d60/d9c 0 2026-03-09T17:30:22.179 INFO:tasks.workunit.client.1.vm09.stdout:1/467: truncate d9/f34 4929025 0 2026-03-09T17:30:22.182 INFO:tasks.workunit.client.1.vm09.stdout:5/483: dwrite d0/d2/f31 [0,4194304] 0 2026-03-09T17:30:22.199 INFO:tasks.workunit.client.1.vm09.stdout:6/451: unlink d3/d21/d76/f90 0 2026-03-09T17:30:22.210 INFO:tasks.workunit.client.1.vm09.stdout:8/458: dwrite d1/da/dd/d47/f64 [0,4194304] 0 2026-03-09T17:30:22.212 INFO:tasks.workunit.client.1.vm09.stdout:1/468: mknod d9/d3a/c8f 0 2026-03-09T17:30:22.215 INFO:tasks.workunit.client.1.vm09.stdout:5/484: mkdir d0/d2/d76/d87/d95/d9b 0 2026-03-09T17:30:22.216 INFO:tasks.workunit.client.1.vm09.stdout:5/485: chown 
d0/d46/d4b 201056184 1 2026-03-09T17:30:22.221 INFO:tasks.workunit.client.1.vm09.stdout:7/568: symlink da/d11/d2d/lc4 0 2026-03-09T17:30:22.222 INFO:tasks.workunit.client.1.vm09.stdout:4/489: mknod d11/c9d 0 2026-03-09T17:30:22.223 INFO:tasks.workunit.client.1.vm09.stdout:8/459: chown d1/da/d23/l6f 18134 1 2026-03-09T17:30:22.225 INFO:tasks.workunit.client.1.vm09.stdout:3/422: rename d5/d16/d25/c67 to d5/d16/c80 0 2026-03-09T17:30:22.233 INFO:tasks.workunit.client.1.vm09.stdout:9/450: link d5/de/c28 d5/c98 0 2026-03-09T17:30:22.239 INFO:tasks.workunit.client.1.vm09.stdout:7/569: readlink da/d11/d2d/l4d 0 2026-03-09T17:30:22.240 INFO:tasks.workunit.client.1.vm09.stdout:0/459: rename d6/d1d/d24/d32/d59/d5b to d6/d64/d97 0 2026-03-09T17:30:22.245 INFO:tasks.workunit.client.1.vm09.stdout:1/469: sync 2026-03-09T17:30:22.250 INFO:tasks.workunit.client.1.vm09.stdout:3/423: creat d5/d9/d30/d65/d59/f81 x:0 0 0 2026-03-09T17:30:22.253 INFO:tasks.workunit.client.1.vm09.stdout:3/424: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 2026-03-09T17:30:22.276 INFO:tasks.workunit.client.1.vm09.stdout:9/451: dwrite d5/de/d29/d33/f66 [0,4194304] 0 2026-03-09T17:30:22.276 INFO:tasks.workunit.client.1.vm09.stdout:9/452: chown d5/de/d29/f37 2498 1 2026-03-09T17:30:22.283 INFO:tasks.workunit.client.1.vm09.stdout:7/570: rmdir da/d11/d64 39 2026-03-09T17:30:22.291 INFO:tasks.workunit.client.1.vm09.stdout:4/490: truncate d11/d1e/d31/f9b 108267 0 2026-03-09T17:30:22.298 INFO:tasks.workunit.client.1.vm09.stdout:4/491: dread d11/d1e/d45/d60/d71/f76 [0,4194304] 0 2026-03-09T17:30:22.298 INFO:tasks.workunit.client.1.vm09.stdout:4/492: chown d11/f1c 461806 1 2026-03-09T17:30:22.299 INFO:tasks.workunit.client.1.vm09.stdout:4/493: chown d11/d1e/d29/l5b 360778831 1 2026-03-09T17:30:22.301 INFO:tasks.workunit.client.1.vm09.stdout:4/494: dwrite d11/d1e/d29/f6d [0,4194304] 0 2026-03-09T17:30:22.306 INFO:tasks.workunit.client.1.vm09.stdout:6/452: rename d3/d7/d59/d72 to d3/d21/d76/d5c/d7e/d94 0 
2026-03-09T17:30:22.312 INFO:tasks.workunit.client.1.vm09.stdout:2/452: write d13/d15/d34/d69/f7c [1037695,7845] 0 2026-03-09T17:30:22.314 INFO:tasks.workunit.client.1.vm09.stdout:2/453: chown d13/d15/d60 78756 1 2026-03-09T17:30:22.317 INFO:tasks.workunit.client.1.vm09.stdout:5/486: write d0/d2/d76/d87/d95/f9a [661819,101783] 0 2026-03-09T17:30:22.319 INFO:tasks.workunit.client.1.vm09.stdout:8/460: write d1/da/f35 [1599523,111586] 0 2026-03-09T17:30:22.321 INFO:tasks.workunit.client.1.vm09.stdout:1/470: chown d9/d38/l5c 2 1 2026-03-09T17:30:22.324 INFO:tasks.workunit.client.1.vm09.stdout:8/461: dwrite d1/da/f4b [0,4194304] 0 2026-03-09T17:30:22.327 INFO:tasks.workunit.client.1.vm09.stdout:8/462: write d1/f7 [4824640,77787] 0 2026-03-09T17:30:22.331 INFO:tasks.workunit.client.1.vm09.stdout:8/463: write d1/d14/d2a/d42/d5d/f80 [117755,13017] 0 2026-03-09T17:30:22.333 INFO:tasks.workunit.client.1.vm09.stdout:8/464: fdatasync d1/da/d23/f93 0 2026-03-09T17:30:22.333 INFO:tasks.workunit.client.1.vm09.stdout:8/465: chown d1/da/f4b 40 1 2026-03-09T17:30:22.333 INFO:tasks.workunit.client.1.vm09.stdout:8/466: dread - d1/d14/d2a/f8b zero size 2026-03-09T17:30:22.343 INFO:tasks.workunit.client.1.vm09.stdout:7/571: write da/d11/d64/fa9 [1458773,102090] 0 2026-03-09T17:30:22.346 INFO:tasks.workunit.client.1.vm09.stdout:3/425: dwrite d5/d16/d31/d37/f6d [0,4194304] 0 2026-03-09T17:30:22.357 INFO:tasks.workunit.client.1.vm09.stdout:4/495: fdatasync d11/d1e/d31/f3a 0 2026-03-09T17:30:22.357 INFO:tasks.workunit.client.1.vm09.stdout:2/454: mknod d13/d15/d34/d69/c8c 0 2026-03-09T17:30:22.358 INFO:tasks.workunit.client.1.vm09.stdout:2/455: chown d13/f40 46 1 2026-03-09T17:30:22.358 INFO:tasks.workunit.client.1.vm09.stdout:6/453: write d3/d48/f6c [491075,86281] 0 2026-03-09T17:30:22.361 INFO:tasks.workunit.client.1.vm09.stdout:6/454: write d3/f4f [114513,37767] 0 2026-03-09T17:30:22.364 INFO:tasks.workunit.client.1.vm09.stdout:1/471: fdatasync d9/dc/dd/d40/d22/d37/d3f/d42/f45 0 
2026-03-09T17:30:22.369 INFO:tasks.workunit.client.1.vm09.stdout:8/467: creat d1/d14/d2a/d42/d43/f95 x:0 0 0 2026-03-09T17:30:22.369 INFO:tasks.workunit.client.1.vm09.stdout:9/453: mkdir d5/d91/d99 0 2026-03-09T17:30:22.370 INFO:tasks.workunit.client.1.vm09.stdout:8/468: chown d1/da/dd/d79/f83 795585572 1 2026-03-09T17:30:22.370 INFO:tasks.workunit.client.1.vm09.stdout:9/454: chown d5/de/d29/d33/f7a 481983131 1 2026-03-09T17:30:22.370 INFO:tasks.workunit.client.1.vm09.stdout:9/455: readlink d5/d7e/l83 0 2026-03-09T17:30:22.370 INFO:tasks.workunit.client.1.vm09.stdout:8/469: write d1/da/dd/d47/f82 [3699797,65579] 0 2026-03-09T17:30:22.371 INFO:tasks.workunit.client.1.vm09.stdout:8/470: chown d1/d14/d2a/d42/d43/d44 36809888 1 2026-03-09T17:30:22.371 INFO:tasks.workunit.client.1.vm09.stdout:9/456: write d5/de/d29/d33/f3b [949749,33873] 0 2026-03-09T17:30:22.377 INFO:tasks.workunit.client.1.vm09.stdout:9/457: dread d5/de/d4e/d6e/d93/f74 [0,4194304] 0 2026-03-09T17:30:22.379 INFO:tasks.workunit.client.1.vm09.stdout:9/458: fdatasync d5/d2e/f72 0 2026-03-09T17:30:22.379 INFO:tasks.workunit.client.1.vm09.stdout:7/572: creat da/d11/d64/da7/db1/fc5 x:0 0 0 2026-03-09T17:30:22.379 INFO:tasks.workunit.client.1.vm09.stdout:9/459: read d5/d2e/d70/f75 [109481,95733] 0 2026-03-09T17:30:22.380 INFO:tasks.workunit.client.1.vm09.stdout:9/460: write d5/de/f20 [8771647,37628] 0 2026-03-09T17:30:22.395 INFO:tasks.workunit.client.1.vm09.stdout:3/426: read d5/d9/d30/d65/f4f [617104,122025] 0 2026-03-09T17:30:22.414 INFO:tasks.workunit.client.1.vm09.stdout:8/471: unlink d1/da/dd/f27 0 2026-03-09T17:30:22.415 INFO:tasks.workunit.client.1.vm09.stdout:8/472: write d1/d14/d2a/f81 [606689,129831] 0 2026-03-09T17:30:22.420 INFO:tasks.workunit.client.1.vm09.stdout:7/573: mkdir da/d11/d47/d5b/d6c/d9e/dc6 0 2026-03-09T17:30:22.421 INFO:tasks.workunit.client.1.vm09.stdout:7/574: stat da/d11/d47/d5b/d6c/d9e/d4e/f2b 0 2026-03-09T17:30:22.431 INFO:tasks.workunit.client.1.vm09.stdout:9/461: creat 
d5/de/d29/d33/f9a x:0 0 0 2026-03-09T17:30:22.432 INFO:tasks.workunit.client.1.vm09.stdout:6/455: dread d3/d1e/f20 [0,4194304] 0 2026-03-09T17:30:22.433 INFO:tasks.workunit.client.1.vm09.stdout:0/460: getdents d6/d1d/d39 0 2026-03-09T17:30:22.435 INFO:tasks.workunit.client.1.vm09.stdout:9/462: dwrite d5/de/d29/d33/f7a [0,4194304] 0 2026-03-09T17:30:22.437 INFO:tasks.workunit.client.1.vm09.stdout:9/463: fsync d5/de/f76 0 2026-03-09T17:30:22.443 INFO:tasks.workunit.client.1.vm09.stdout:3/427: mknod d5/d9/d30/d65/d59/c82 0 2026-03-09T17:30:22.443 INFO:tasks.workunit.client.1.vm09.stdout:2/456: symlink d13/d15/d34/l8d 0 2026-03-09T17:30:22.454 INFO:tasks.workunit.client.1.vm09.stdout:8/473: mkdir d1/d14/d96 0 2026-03-09T17:30:22.459 INFO:tasks.workunit.client.1.vm09.stdout:5/487: write d0/d46/f4c [1911463,86427] 0 2026-03-09T17:30:22.459 INFO:tasks.workunit.client.1.vm09.stdout:8/474: dread - d1/d14/d2a/d42/d43/f95 zero size 2026-03-09T17:30:22.459 INFO:tasks.workunit.client.1.vm09.stdout:8/475: dread - d1/d14/d2a/d42/d43/f95 zero size 2026-03-09T17:30:22.479 INFO:tasks.workunit.client.1.vm09.stdout:6/456: chown d3/d21/d25/c44 572 1 2026-03-09T17:30:22.481 INFO:tasks.workunit.client.1.vm09.stdout:0/461: fsync d6/d1d/d24/d32/f49 0 2026-03-09T17:30:22.481 INFO:tasks.workunit.client.1.vm09.stdout:0/462: stat d6/d64/f7e 0 2026-03-09T17:30:22.503 INFO:tasks.workunit.client.1.vm09.stdout:4/496: link d11/d1e/d31/l47 d11/d1e/d83/d89/d8b/d58/l9e 0 2026-03-09T17:30:22.510 INFO:tasks.workunit.client.1.vm09.stdout:5/488: rmdir d0/d2 39 2026-03-09T17:30:22.511 INFO:tasks.workunit.client.1.vm09.stdout:5/489: chown d0/d46/f4c 102 1 2026-03-09T17:30:22.517 INFO:tasks.workunit.client.1.vm09.stdout:0/463: unlink d6/d1d/d24/d32/c48 0 2026-03-09T17:30:22.518 INFO:tasks.workunit.client.1.vm09.stdout:0/464: read - d6/d1d/d24/d32/d59/d81/f90 zero size 2026-03-09T17:30:22.521 INFO:tasks.workunit.client.1.vm09.stdout:0/465: dwrite d6/d1d/d46/f4d [4194304,4194304] 0 2026-03-09T17:30:22.552 
INFO:tasks.workunit.client.1.vm09.stdout:3/428: dwrite d5/d9/f1e [0,4194304] 0 2026-03-09T17:30:22.559 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:22 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:22.559 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:22 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:22.559 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:22 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:22.559 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:22 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:22.559 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:22 vm06.local ceph-mon[57307]: pgmap v11: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 33 MiB/s rd, 99 MiB/s wr, 217 op/s 2026-03-09T17:30:22.582 INFO:tasks.workunit.client.1.vm09.stdout:5/490: creat d0/d9/d16/d5c/f9c x:0 0 0 2026-03-09T17:30:22.629 INFO:tasks.workunit.client.1.vm09.stdout:0/466: rename d6/d1d/d24/d32/d59/c79 to d6/d1d/d24/d32/d59/c98 0 2026-03-09T17:30:22.637 INFO:tasks.workunit.client.1.vm09.stdout:1/472: link d9/dc/dd/d40/d1d/f1e d9/dc/f90 0 2026-03-09T17:30:22.639 INFO:tasks.workunit.client.1.vm09.stdout:7/575: getdents da/d11/d64/da7 0 2026-03-09T17:30:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:22 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:22 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:22 vm09.local ceph-mon[62061]: from='mgr.24477 
192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:22 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:22 vm09.local ceph-mon[62061]: pgmap v11: 65 pgs: 65 active+clean; 3.1 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 33 MiB/s rd, 99 MiB/s wr, 217 op/s 2026-03-09T17:30:22.649 INFO:tasks.workunit.client.1.vm09.stdout:6/457: rename d3/d1e to d3/d21/d76/d5c/d61/d95 0 2026-03-09T17:30:22.649 INFO:tasks.workunit.client.1.vm09.stdout:2/457: getdents d13/d4d 0 2026-03-09T17:30:22.650 INFO:tasks.workunit.client.1.vm09.stdout:0/467: chown d6/c15 129714 1 2026-03-09T17:30:22.651 INFO:tasks.workunit.client.1.vm09.stdout:9/464: link d5/de/c50 d5/de/d29/d90/c9b 0 2026-03-09T17:30:22.651 INFO:tasks.workunit.client.1.vm09.stdout:4/497: rmdir d11/d1e/d29/d36/d57/d97 0 2026-03-09T17:30:22.652 INFO:tasks.workunit.client.1.vm09.stdout:4/498: chown d11/d1e/d83/d89/d8b/c3e 0 1 2026-03-09T17:30:22.654 INFO:tasks.workunit.client.1.vm09.stdout:8/476: getdents d1/da/d23 0 2026-03-09T17:30:22.654 INFO:tasks.workunit.client.1.vm09.stdout:7/576: truncate da/d11/d3e/f88 4489776 0 2026-03-09T17:30:22.654 INFO:tasks.workunit.client.1.vm09.stdout:1/473: fsync d9/f6c 0 2026-03-09T17:30:22.657 INFO:tasks.workunit.client.1.vm09.stdout:5/491: rename d0/f78 to d0/d2/d76/d87/d95/f9d 0 2026-03-09T17:30:22.658 INFO:tasks.workunit.client.1.vm09.stdout:5/492: read d0/dc/d21/f62 [117728,14892] 0 2026-03-09T17:30:22.658 INFO:tasks.workunit.client.1.vm09.stdout:3/429: dwrite d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:22.660 INFO:tasks.workunit.client.1.vm09.stdout:3/430: chown d5/d9/d30/d65/f3e 4 1 2026-03-09T17:30:22.661 INFO:tasks.workunit.client.1.vm09.stdout:5/493: 
write d0/dc/d21/d26/f39 [4971,37880] 0 2026-03-09T17:30:22.676 INFO:tasks.workunit.client.1.vm09.stdout:6/458: mkdir d3/d21/d25/d96 0 2026-03-09T17:30:22.677 INFO:tasks.workunit.client.1.vm09.stdout:4/499: dread d11/d1e/d83/d89/d8b/f38 [0,4194304] 0 2026-03-09T17:30:22.690 INFO:tasks.workunit.client.1.vm09.stdout:7/577: dread da/d11/d47/d5b/d6c/d9e/d4e/f42 [0,4194304] 0 2026-03-09T17:30:22.693 INFO:tasks.workunit.client.1.vm09.stdout:1/474: mkdir d9/dc/dd/d40/d22/d91 0 2026-03-09T17:30:22.694 INFO:tasks.workunit.client.1.vm09.stdout:0/468: write d6/d1d/d24/d32/d59/d81/d8c/f8f [4843904,14419] 0 2026-03-09T17:30:22.696 INFO:tasks.workunit.client.1.vm09.stdout:0/469: chown d6/d1d/f57 193804916 1 2026-03-09T17:30:22.698 INFO:tasks.workunit.client.1.vm09.stdout:2/458: write d13/d15/d21/f30 [6451694,113569] 0 2026-03-09T17:30:22.698 INFO:tasks.workunit.client.1.vm09.stdout:0/470: dread - d6/d1d/f70 zero size 2026-03-09T17:30:22.700 INFO:tasks.workunit.client.1.vm09.stdout:9/465: mknod d5/d2e/d70/d84/d97/c9c 0 2026-03-09T17:30:22.701 INFO:tasks.workunit.client.1.vm09.stdout:2/459: chown d13/d15/d34/d45/f82 105015219 1 2026-03-09T17:30:22.702 INFO:tasks.workunit.client.1.vm09.stdout:2/460: dread - d13/f8b zero size 2026-03-09T17:30:22.703 INFO:tasks.workunit.client.1.vm09.stdout:2/461: fsync d13/d4d/f5c 0 2026-03-09T17:30:22.705 INFO:tasks.workunit.client.1.vm09.stdout:2/462: chown d13/d15/d34/d69 41 1 2026-03-09T17:30:22.708 INFO:tasks.workunit.client.1.vm09.stdout:1/475: dread d9/dc/dd/d40/d22/d37/d3f/d42/d55/f69 [0,4194304] 0 2026-03-09T17:30:22.711 INFO:tasks.workunit.client.1.vm09.stdout:6/459: truncate d3/d21/d76/d5c/d61/f60 788360 0 2026-03-09T17:30:22.712 INFO:tasks.workunit.client.1.vm09.stdout:4/500: rmdir d11/d1e/d83/d89/d8b/d58 39 2026-03-09T17:30:22.715 INFO:tasks.workunit.client.1.vm09.stdout:0/471: dwrite d6/d1d/d24/d32/f68 [0,4194304] 0 2026-03-09T17:30:22.715 INFO:tasks.workunit.client.1.vm09.stdout:3/431: symlink d5/d16/l83 0 2026-03-09T17:30:22.724 
INFO:tasks.workunit.client.1.vm09.stdout:5/494: dread d0/d52/d20/f63 [4194304,4194304] 0 2026-03-09T17:30:22.726 INFO:tasks.workunit.client.1.vm09.stdout:1/476: fsync f6 0 2026-03-09T17:30:22.727 INFO:tasks.workunit.client.1.vm09.stdout:8/477: dwrite d1/d14/d2a/d42/d43/f58 [0,4194304] 0 2026-03-09T17:30:22.728 INFO:tasks.workunit.client.1.vm09.stdout:4/501: symlink d11/d1e/d45/l9f 0 2026-03-09T17:30:22.733 INFO:tasks.workunit.client.1.vm09.stdout:1/477: truncate d9/dc/f76 677094 0 2026-03-09T17:30:22.736 INFO:tasks.workunit.client.1.vm09.stdout:3/432: dread d5/d9/d30/d65/f4f [0,4194304] 0 2026-03-09T17:30:22.741 INFO:tasks.workunit.client.1.vm09.stdout:1/478: dwrite d9/dc/dd/d40/d22/d37/f41 [0,4194304] 0 2026-03-09T17:30:22.743 INFO:tasks.workunit.client.1.vm09.stdout:1/479: chown d9/dc/dd/d40/d21/d6f/f85 108587 1 2026-03-09T17:30:22.749 INFO:tasks.workunit.client.1.vm09.stdout:0/472: creat d6/d1d/d24/d32/d59/f99 x:0 0 0 2026-03-09T17:30:22.769 INFO:tasks.workunit.client.1.vm09.stdout:2/463: symlink d13/d15/d34/d45/d84/l8e 0 2026-03-09T17:30:22.773 INFO:tasks.workunit.client.1.vm09.stdout:2/464: dwrite d13/d15/d21/f31 [0,4194304] 0 2026-03-09T17:30:22.783 INFO:tasks.workunit.client.1.vm09.stdout:8/478: mkdir d1/d14/d31/d97 0 2026-03-09T17:30:22.783 INFO:tasks.workunit.client.1.vm09.stdout:4/502: creat d11/d1e/d29/d36/d57/fa0 x:0 0 0 2026-03-09T17:30:22.787 INFO:tasks.workunit.client.1.vm09.stdout:7/578: truncate da/d11/d47/d5b/d6c/f7b 2297961 0 2026-03-09T17:30:22.787 INFO:tasks.workunit.client.1.vm09.stdout:9/466: truncate d5/f5d 2742463 0 2026-03-09T17:30:22.787 INFO:tasks.workunit.client.1.vm09.stdout:6/460: write d3/d7/d59/d5a/f64 [1267770,88520] 0 2026-03-09T17:30:22.793 INFO:tasks.workunit.client.1.vm09.stdout:3/433: rename d5/d9/d30/d65/d59/d66 to d5/d9/d30/d65/d59/d84 0 2026-03-09T17:30:22.798 INFO:tasks.workunit.client.1.vm09.stdout:0/473: creat d6/d1d/d24/d32/d59/d81/f9a x:0 0 0 2026-03-09T17:30:22.800 INFO:tasks.workunit.client.1.vm09.stdout:5/495: creat 
d0/dc/d21/d26/d5e/d68/d6d/f9e x:0 0 0 2026-03-09T17:30:22.817 INFO:tasks.workunit.client.1.vm09.stdout:2/465: write d13/d15/d36/f59 [761374,29138] 0 2026-03-09T17:30:22.819 INFO:tasks.workunit.client.1.vm09.stdout:2/466: write d13/d15/d34/d45/f57 [2491812,121518] 0 2026-03-09T17:30:22.823 INFO:tasks.workunit.client.1.vm09.stdout:4/503: truncate d11/d1e/d29/d36/f7f 973894 0 2026-03-09T17:30:22.824 INFO:tasks.workunit.client.1.vm09.stdout:8/479: truncate d1/f33 122885 0 2026-03-09T17:30:22.826 INFO:tasks.workunit.client.1.vm09.stdout:6/461: stat d3/d21/d76/d5c/f78 0 2026-03-09T17:30:22.827 INFO:tasks.workunit.client.1.vm09.stdout:4/504: dwrite d11/d1e/d45/d60/f64 [0,4194304] 0 2026-03-09T17:30:22.829 INFO:tasks.workunit.client.1.vm09.stdout:8/480: dread d1/d14/d2a/f2b [0,4194304] 0 2026-03-09T17:30:22.829 INFO:tasks.workunit.client.1.vm09.stdout:8/481: stat d1/d14/d2a/c84 0 2026-03-09T17:30:22.835 INFO:tasks.workunit.client.1.vm09.stdout:8/482: dwrite d1/da/d23/f8f [0,4194304] 0 2026-03-09T17:30:22.836 INFO:tasks.workunit.client.1.vm09.stdout:9/467: creat d5/d21/f9d x:0 0 0 2026-03-09T17:30:22.865 INFO:tasks.workunit.client.1.vm09.stdout:5/496: fdatasync d0/d2/d76/d86/f6b 0 2026-03-09T17:30:22.868 INFO:tasks.workunit.client.1.vm09.stdout:5/497: dwrite d0/d9/d16/d5c/f9c [0,4194304] 0 2026-03-09T17:30:22.872 INFO:tasks.workunit.client.1.vm09.stdout:5/498: chown d0/d9/f3e 160155 1 2026-03-09T17:30:22.872 INFO:tasks.workunit.client.1.vm09.stdout:0/474: dread d6/f21 [0,4194304] 0 2026-03-09T17:30:22.875 INFO:tasks.workunit.client.1.vm09.stdout:0/475: dread d6/d1d/d24/f75 [0,4194304] 0 2026-03-09T17:30:22.876 INFO:tasks.workunit.client.1.vm09.stdout:0/476: chown d6/d1d/d39/c3f 16348 1 2026-03-09T17:30:22.910 INFO:tasks.workunit.client.1.vm09.stdout:7/579: dwrite da/d11/d47/d5b/d6c/d9e/d4e/f8e [0,4194304] 0 2026-03-09T17:30:22.942 INFO:tasks.workunit.client.1.vm09.stdout:8/483: dread - d1/da/dd/d47/f66 zero size 2026-03-09T17:30:22.943 
INFO:tasks.workunit.client.1.vm09.stdout:9/468: mknod d5/d2e/d70/c9e 0 2026-03-09T17:30:22.949 INFO:tasks.workunit.client.1.vm09.stdout:3/434: mkdir d5/d16/d85 0 2026-03-09T17:30:22.950 INFO:tasks.workunit.client.1.vm09.stdout:1/480: creat d9/dc/dd/d40/f92 x:0 0 0 2026-03-09T17:30:22.961 INFO:tasks.workunit.client.1.vm09.stdout:5/499: unlink d0/d9/l98 0 2026-03-09T17:30:22.985 INFO:tasks.workunit.client.1.vm09.stdout:7/580: fdatasync da/d11/d2d/d49/f52 0 2026-03-09T17:30:22.985 INFO:tasks.workunit.client.1.vm09.stdout:7/581: chown da/d11/d47/c5e 89 1 2026-03-09T17:30:22.988 INFO:tasks.workunit.client.1.vm09.stdout:2/467: link d13/d15/d34/f48 d13/d15/d34/d69/f8f 0 2026-03-09T17:30:22.991 INFO:tasks.workunit.client.1.vm09.stdout:2/468: dwrite d13/d15/d21/f24 [4194304,4194304] 0 2026-03-09T17:30:23.005 INFO:tasks.workunit.client.1.vm09.stdout:3/435: readlink d5/l14 0 2026-03-09T17:30:23.017 INFO:tasks.workunit.client.1.vm09.stdout:7/582: creat da/d11/d3e/da2/db2/fc7 x:0 0 0 2026-03-09T17:30:23.017 INFO:tasks.workunit.client.1.vm09.stdout:6/462: rename d3/d21/d76/f8d to d3/f97 0 2026-03-09T17:30:23.027 INFO:tasks.workunit.client.1.vm09.stdout:9/469: mknod d5/de/c9f 0 2026-03-09T17:30:23.027 INFO:tasks.workunit.client.1.vm09.stdout:9/470: dread - d5/de/d4e/d6e/d93/f7f zero size 2026-03-09T17:30:23.028 INFO:tasks.workunit.client.1.vm09.stdout:9/471: write d5/d7e/d81/f96 [272465,115627] 0 2026-03-09T17:30:23.029 INFO:tasks.workunit.client.1.vm09.stdout:1/481: creat d9/dc/dd/d40/d22/d91/f93 x:0 0 0 2026-03-09T17:30:23.045 INFO:tasks.workunit.client.1.vm09.stdout:0/477: truncate d6/d1d/f3c 3254640 0 2026-03-09T17:30:23.046 INFO:tasks.workunit.client.1.vm09.stdout:0/478: chown d6/d1d/l40 827 1 2026-03-09T17:30:23.047 INFO:tasks.workunit.client.1.vm09.stdout:0/479: chown d6/d1d/d24/d32/l43 1153309 1 2026-03-09T17:30:23.047 INFO:tasks.workunit.client.1.vm09.stdout:0/480: stat d6/d1d/d39/c10 0 2026-03-09T17:30:23.049 INFO:tasks.workunit.client.1.vm09.stdout:8/484: truncate 
d1/da/d23/d6c/f1c 3080017 0 2026-03-09T17:30:23.052 INFO:tasks.workunit.client.1.vm09.stdout:3/436: dwrite d5/d16/d31/d3d/d32/f33 [4194304,4194304] 0 2026-03-09T17:30:23.072 INFO:tasks.workunit.client.1.vm09.stdout:5/500: rename d0/d55 to d0/d9/d74/d75/d9f 0 2026-03-09T17:30:23.072 INFO:tasks.workunit.client.1.vm09.stdout:9/472: symlink d5/de/d29/la0 0 2026-03-09T17:30:23.073 INFO:tasks.workunit.client.1.vm09.stdout:4/505: link d11/d1e/d31/l47 d11/d1e/d83/d89/d8b/d58/la1 0 2026-03-09T17:30:23.074 INFO:tasks.workunit.client.1.vm09.stdout:3/437: creat d5/d9/d30/d65/d59/d84/f86 x:0 0 0 2026-03-09T17:30:23.074 INFO:tasks.workunit.client.1.vm09.stdout:2/469: rename d13/d15/d34/d37/d6f/d83 to d13/d15/d60/d90 0 2026-03-09T17:30:23.076 INFO:tasks.workunit.client.1.vm09.stdout:2/470: read d13/d15/d21/f5d [1794300,58863] 0 2026-03-09T17:30:23.078 INFO:tasks.workunit.client.1.vm09.stdout:6/463: dread d3/d21/d76/d3f/f51 [0,4194304] 0 2026-03-09T17:30:23.079 INFO:tasks.workunit.client.1.vm09.stdout:6/464: write d3/d21/d76/d5c/f6d [178530,38303] 0 2026-03-09T17:30:23.089 INFO:tasks.workunit.client.1.vm09.stdout:5/501: mknod d0/dc/d21/d26/d5e/d68/ca0 0 2026-03-09T17:30:23.089 INFO:tasks.workunit.client.1.vm09.stdout:7/583: creat da/d11/d47/fc8 x:0 0 0 2026-03-09T17:30:23.095 INFO:tasks.workunit.client.1.vm09.stdout:2/471: fdatasync d13/d15/d34/f5b 0 2026-03-09T17:30:23.095 INFO:tasks.workunit.client.1.vm09.stdout:2/472: chown d13/d15/d34/d37/l77 1425 1 2026-03-09T17:30:23.098 INFO:tasks.workunit.client.1.vm09.stdout:2/473: dwrite d13/f89 [0,4194304] 0 2026-03-09T17:30:23.100 INFO:tasks.workunit.client.1.vm09.stdout:7/584: write da/d11/d64/da7/fa8 [1491205,128121] 0 2026-03-09T17:30:23.100 INFO:tasks.workunit.client.1.vm09.stdout:3/438: fsync d5/d16/d25/f2c 0 2026-03-09T17:30:23.101 INFO:tasks.workunit.client.1.vm09.stdout:7/585: chown da/d11/d47/d5b/d6c/d9e/d4e/f8e 447 1 2026-03-09T17:30:23.106 INFO:tasks.workunit.client.1.vm09.stdout:5/502: read d0/f22 [888447,42192] 0 
2026-03-09T17:30:23.106 INFO:tasks.workunit.client.1.vm09.stdout:5/503: stat d0/dc/d21/d26/d5e/d68/d6d 0 2026-03-09T17:30:23.107 INFO:tasks.workunit.client.1.vm09.stdout:5/504: write d0/dc/d21/d6f/f80 [4169362,26667] 0 2026-03-09T17:30:23.107 INFO:tasks.workunit.client.1.vm09.stdout:6/465: mkdir d3/d21/d25/d91/d98 0 2026-03-09T17:30:23.113 INFO:tasks.workunit.client.1.vm09.stdout:1/482: write f6 [726388,44758] 0 2026-03-09T17:30:23.114 INFO:tasks.workunit.client.1.vm09.stdout:1/483: readlink d9/dc/dd/d40/d22/d37/d3f/l66 0 2026-03-09T17:30:23.119 INFO:tasks.workunit.client.1.vm09.stdout:0/481: rename d6/d1d/d46/l8d to d6/d1d/d24/d32/l9b 0 2026-03-09T17:30:23.119 INFO:tasks.workunit.client.1.vm09.stdout:0/482: read - d6/f6d zero size 2026-03-09T17:30:23.121 INFO:tasks.workunit.client.1.vm09.stdout:3/439: fsync d5/d16/f45 0 2026-03-09T17:30:23.122 INFO:tasks.workunit.client.1.vm09.stdout:3/440: write d5/d16/d31/d3d/d32/f33 [2674173,98318] 0 2026-03-09T17:30:23.122 INFO:tasks.workunit.client.1.vm09.stdout:7/586: creat da/d11/d47/d5b/fc9 x:0 0 0 2026-03-09T17:30:23.127 INFO:tasks.workunit.client.1.vm09.stdout:4/506: write d11/f26 [2235872,1045] 0 2026-03-09T17:30:23.130 INFO:tasks.workunit.client.1.vm09.stdout:3/441: dread d5/f53 [0,4194304] 0 2026-03-09T17:30:23.138 INFO:tasks.workunit.client.1.vm09.stdout:2/474: write d13/d15/d21/f3e [1064635,48085] 0 2026-03-09T17:30:23.138 INFO:tasks.workunit.client.1.vm09.stdout:7/587: dread da/d11/d47/d5b/d6c/d9e/f35 [0,4194304] 0 2026-03-09T17:30:23.139 INFO:tasks.workunit.client.1.vm09.stdout:6/466: write d3/d21/d76/d5c/f92 [718237,61399] 0 2026-03-09T17:30:23.140 INFO:tasks.workunit.client.1.vm09.stdout:6/467: chown d3/d21/d76/d5c/d7e 11 1 2026-03-09T17:30:23.143 INFO:tasks.workunit.client.1.vm09.stdout:5/505: unlink d0/dc/d21/d6f/d42/l6e 0 2026-03-09T17:30:23.144 INFO:tasks.workunit.client.1.vm09.stdout:5/506: write d0/ff [349760,11875] 0 2026-03-09T17:30:23.144 INFO:tasks.workunit.client.1.vm09.stdout:5/507: chown d0/d9/c89 8 
1 2026-03-09T17:30:23.147 INFO:tasks.workunit.client.1.vm09.stdout:1/484: truncate d9/dc/dd/d40/d1d/f4d 2478397 0 2026-03-09T17:30:23.150 INFO:tasks.workunit.client.1.vm09.stdout:1/485: dwrite d9/dc/dd/d40/d22/f2b [4194304,4194304] 0 2026-03-09T17:30:23.155 INFO:tasks.workunit.client.1.vm09.stdout:8/485: rename d1/da/d23/f93 to d1/d14/d2a/d42/d43/f98 0 2026-03-09T17:30:23.162 INFO:tasks.workunit.client.1.vm09.stdout:0/483: mkdir d6/d1d/d24/d32/d59/d9c 0 2026-03-09T17:30:23.162 INFO:tasks.workunit.client.1.vm09.stdout:0/484: stat d6/d1d/d24/d5e/c8b 0 2026-03-09T17:30:23.163 INFO:tasks.workunit.client.1.vm09.stdout:4/507: symlink d11/d1e/d45/d60/d71/la2 0 2026-03-09T17:30:23.169 INFO:tasks.workunit.client.1.vm09.stdout:7/588: creat da/d11/d47/d5b/d6c/d9e/d4e/d5f/fca x:0 0 0 2026-03-09T17:30:23.170 INFO:tasks.workunit.client.1.vm09.stdout:5/508: symlink d0/d9/d16/d5c/la1 0 2026-03-09T17:30:23.174 INFO:tasks.workunit.client.1.vm09.stdout:7/589: mknod da/d11/d64/da7/ccb 0 2026-03-09T17:30:23.174 INFO:tasks.workunit.client.1.vm09.stdout:5/509: fsync d0/dc/d21/d6f/d42/f82 0 2026-03-09T17:30:23.177 INFO:tasks.workunit.client.1.vm09.stdout:9/473: rename d5/de/d29/d90/c9b to d5/d2e/d8b/ca1 0 2026-03-09T17:30:23.177 INFO:tasks.workunit.client.1.vm09.stdout:0/485: symlink d6/l9d 0 2026-03-09T17:30:23.178 INFO:tasks.workunit.client.1.vm09.stdout:0/486: dread - d6/d64/f7e zero size 2026-03-09T17:30:23.178 INFO:tasks.workunit.client.1.vm09.stdout:7/590: chown da/d11/d47/d5b/d6c/f73 1478239 1 2026-03-09T17:30:23.188 INFO:tasks.workunit.client.1.vm09.stdout:5/510: dwrite d0/d2/d76/d86/f50 [0,4194304] 0 2026-03-09T17:30:23.195 INFO:tasks.workunit.client.1.vm09.stdout:2/475: write d13/f39 [214439,93804] 0 2026-03-09T17:30:23.195 INFO:tasks.workunit.client.1.vm09.stdout:8/486: write d1/d14/d2a/f2e [381749,80283] 0 2026-03-09T17:30:23.196 INFO:tasks.workunit.client.1.vm09.stdout:2/476: chown d13/d4d/f5c 0 1 2026-03-09T17:30:23.199 INFO:tasks.workunit.client.1.vm09.stdout:8/487: dread 
d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:23.200 INFO:tasks.workunit.client.1.vm09.stdout:4/508: write d11/d1e/f22 [991175,83629] 0 2026-03-09T17:30:23.200 INFO:tasks.workunit.client.1.vm09.stdout:3/442: getdents d5/d9/d30/d65/d59/d84 0 2026-03-09T17:30:23.201 INFO:tasks.workunit.client.1.vm09.stdout:4/509: chown d11/d1e/d29/d36/l80 1389 1 2026-03-09T17:30:23.201 INFO:tasks.workunit.client.1.vm09.stdout:8/488: write d1/d14/d2a/f2b [1287661,22005] 0 2026-03-09T17:30:23.202 INFO:tasks.workunit.client.1.vm09.stdout:3/443: write d5/d9/d30/d65/f3e [37210,103601] 0 2026-03-09T17:30:23.210 INFO:tasks.workunit.client.1.vm09.stdout:3/444: dwrite d5/d16/d25/f28 [4194304,4194304] 0 2026-03-09T17:30:23.211 INFO:tasks.workunit.client.1.vm09.stdout:3/445: write d5/d9/d30/d65/f3e [1859674,22093] 0 2026-03-09T17:30:23.216 INFO:tasks.workunit.client.1.vm09.stdout:6/468: getdents d3/d21/d76/d5c/d7e/d94 0 2026-03-09T17:30:23.219 INFO:tasks.workunit.client.1.vm09.stdout:6/469: truncate d3/d7/d59/d73/f93 893644 0 2026-03-09T17:30:23.223 INFO:tasks.workunit.client.1.vm09.stdout:2/477: symlink d13/d15/d34/d37/d66/l91 0 2026-03-09T17:30:23.228 INFO:tasks.workunit.client.1.vm09.stdout:1/486: link d9/dc/l18 d9/dc/dd/d40/l94 0 2026-03-09T17:30:23.228 INFO:tasks.workunit.client.1.vm09.stdout:9/474: dread d5/f13 [4194304,4194304] 0 2026-03-09T17:30:23.230 INFO:tasks.workunit.client.1.vm09.stdout:1/487: write d9/dc/dd/d40/d21/d35/f8e [744332,73063] 0 2026-03-09T17:30:23.235 INFO:tasks.workunit.client.1.vm09.stdout:0/487: creat d6/d1d/d24/d5e/f9e x:0 0 0 2026-03-09T17:30:23.239 INFO:tasks.workunit.client.1.vm09.stdout:3/446: dread d5/d16/d25/f60 [0,4194304] 0 2026-03-09T17:30:23.239 INFO:tasks.workunit.client.1.vm09.stdout:9/475: fsync d5/f4b 0 2026-03-09T17:30:23.241 INFO:tasks.workunit.client.1.vm09.stdout:1/488: fdatasync f2 0 2026-03-09T17:30:23.242 INFO:tasks.workunit.client.1.vm09.stdout:3/447: dwrite d5/d16/f45 [0,4194304] 0 2026-03-09T17:30:23.252 
INFO:tasks.workunit.client.1.vm09.stdout:0/488: mknod d6/d1d/d46/c9f 0 2026-03-09T17:30:23.252 INFO:tasks.workunit.client.1.vm09.stdout:0/489: fdatasync d6/d1d/f70 0 2026-03-09T17:30:23.252 INFO:tasks.workunit.client.1.vm09.stdout:1/489: dread - d9/dc/dd/d40/f73 zero size 2026-03-09T17:30:23.256 INFO:tasks.workunit.client.1.vm09.stdout:8/489: dread d1/da/d23/d6c/d32/f50 [0,4194304] 0 2026-03-09T17:30:23.256 INFO:tasks.workunit.client.1.vm09.stdout:5/511: dread d0/dc/d21/d26/f36 [0,4194304] 0 2026-03-09T17:30:23.257 INFO:tasks.workunit.client.1.vm09.stdout:0/490: dwrite d6/d1d/d24/d32/d59/f5c [0,4194304] 0 2026-03-09T17:30:23.258 INFO:tasks.workunit.client.1.vm09.stdout:3/448: creat d5/d9/d30/d65/d59/f87 x:0 0 0 2026-03-09T17:30:23.259 INFO:tasks.workunit.client.1.vm09.stdout:3/449: dread - d5/d9/d30/d65/d59/d84/f86 zero size 2026-03-09T17:30:23.261 INFO:tasks.workunit.client.1.vm09.stdout:3/450: write d5/d9/d30/d65/f19 [1161388,50754] 0 2026-03-09T17:30:23.266 INFO:tasks.workunit.client.1.vm09.stdout:3/451: dread d5/d16/d31/d3d/d32/f33 [4194304,4194304] 0 2026-03-09T17:30:23.268 INFO:tasks.workunit.client.1.vm09.stdout:0/491: dwrite d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:23.273 INFO:tasks.workunit.client.1.vm09.stdout:0/492: dwrite d6/d1d/f91 [0,4194304] 0 2026-03-09T17:30:23.275 INFO:tasks.workunit.client.1.vm09.stdout:6/470: sync 2026-03-09T17:30:23.287 INFO:tasks.workunit.client.1.vm09.stdout:5/512: rename d0/d52/f81 to d0/dc/d21/d33/fa2 0 2026-03-09T17:30:23.297 INFO:tasks.workunit.client.1.vm09.stdout:5/513: dread d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:23.310 INFO:tasks.workunit.client.1.vm09.stdout:6/471: rmdir d3/d21/d76/d5c 39 2026-03-09T17:30:23.314 INFO:tasks.workunit.client.1.vm09.stdout:9/476: link d5/de/d29/l3f d5/d21/la2 0 2026-03-09T17:30:23.325 INFO:tasks.workunit.client.1.vm09.stdout:7/591: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [0,4194304] 0 2026-03-09T17:30:23.336 INFO:tasks.workunit.client.1.vm09.stdout:0/493: mknod d6/d1d/ca0 0 
2026-03-09T17:30:23.340 INFO:tasks.workunit.client.1.vm09.stdout:9/477: rmdir d5/d2e/d8b 39 2026-03-09T17:30:23.342 INFO:tasks.workunit.client.1.vm09.stdout:6/472: dread d3/d21/d76/d3f/f42 [0,4194304] 0 2026-03-09T17:30:23.344 INFO:tasks.workunit.client.1.vm09.stdout:4/510: write fe [5369514,29560] 0 2026-03-09T17:30:23.352 INFO:tasks.workunit.client.1.vm09.stdout:2/478: truncate d13/d15/f2f 347864 0 2026-03-09T17:30:23.352 INFO:tasks.workunit.client.1.vm09.stdout:2/479: chown d13/f14 1574074630 1 2026-03-09T17:30:23.354 INFO:tasks.workunit.client.1.vm09.stdout:1/490: write f3 [870366,39390] 0 2026-03-09T17:30:23.359 INFO:tasks.workunit.client.1.vm09.stdout:8/490: dwrite d1/f6e [0,4194304] 0 2026-03-09T17:30:23.360 INFO:tasks.workunit.client.1.vm09.stdout:5/514: creat d0/fa3 x:0 0 0 2026-03-09T17:30:23.360 INFO:tasks.workunit.client.1.vm09.stdout:3/452: getdents d5/d16/d31/d3d/d32 0 2026-03-09T17:30:23.362 INFO:tasks.workunit.client.1.vm09.stdout:4/511: symlink d11/d1e/la3 0 2026-03-09T17:30:23.362 INFO:tasks.workunit.client.1.vm09.stdout:7/592: mknod da/d11/d2d/ccc 0 2026-03-09T17:30:23.365 INFO:tasks.workunit.client.1.vm09.stdout:7/593: dread - da/d11/d77/fba zero size 2026-03-09T17:30:23.366 INFO:tasks.workunit.client.1.vm09.stdout:5/515: sync 2026-03-09T17:30:23.366 INFO:tasks.workunit.client.1.vm09.stdout:8/491: dwrite d1/da/d23/f8f [0,4194304] 0 2026-03-09T17:30:23.369 INFO:tasks.workunit.client.1.vm09.stdout:5/516: stat d0/d9/d16/d5c/la1 0 2026-03-09T17:30:23.378 INFO:tasks.workunit.client.1.vm09.stdout:5/517: dwrite d0/d9/d74/d75/d9f/f92 [0,4194304] 0 2026-03-09T17:30:23.382 INFO:tasks.workunit.client.1.vm09.stdout:1/491: rmdir d9/d3a 39 2026-03-09T17:30:23.382 INFO:tasks.workunit.client.1.vm09.stdout:5/518: stat d0/d2/d76/d87 0 2026-03-09T17:30:23.382 INFO:tasks.workunit.client.1.vm09.stdout:0/494: mknod d6/d1d/d24/ca1 0 2026-03-09T17:30:23.382 INFO:tasks.workunit.client.1.vm09.stdout:0/495: readlink d6/d1d/l1f 0 2026-03-09T17:30:23.382 
INFO:tasks.workunit.client.1.vm09.stdout:5/519: write d0/d9/d74/f99 [694650,85605] 0 2026-03-09T17:30:23.382 INFO:tasks.workunit.client.1.vm09.stdout:9/478: symlink d5/de/d29/la3 0 2026-03-09T17:30:23.387 INFO:tasks.workunit.client.1.vm09.stdout:0/496: dread d6/d1d/f91 [0,4194304] 0 2026-03-09T17:30:23.388 INFO:tasks.workunit.client.1.vm09.stdout:0/497: chown d6/d64/c77 12 1 2026-03-09T17:30:23.389 INFO:tasks.workunit.client.1.vm09.stdout:5/520: dwrite d0/d9/d16/d5c/f70 [0,4194304] 0 2026-03-09T17:30:23.404 INFO:tasks.workunit.client.1.vm09.stdout:3/453: creat d5/d9/f88 x:0 0 0 2026-03-09T17:30:23.404 INFO:tasks.workunit.client.1.vm09.stdout:3/454: chown d5/d16/f45 9770462 1 2026-03-09T17:30:23.424 INFO:tasks.workunit.client.1.vm09.stdout:8/492: dread d1/f33 [0,4194304] 0 2026-03-09T17:30:23.424 INFO:tasks.workunit.client.1.vm09.stdout:1/492: rename d9/dc/dd/d40/d22/d37/d3f/d42/f45 to d9/dc/dd/d40/d22/d37/d3f/d42/f95 0 2026-03-09T17:30:23.436 INFO:tasks.workunit.client.1.vm09.stdout:1/493: mkdir d9/dc/d63/d96 0 2026-03-09T17:30:23.436 INFO:tasks.workunit.client.1.vm09.stdout:9/479: creat d5/d91/d99/fa4 x:0 0 0 2026-03-09T17:30:23.437 INFO:tasks.workunit.client.1.vm09.stdout:1/494: truncate d9/f8d 120617 0 2026-03-09T17:30:23.440 INFO:tasks.workunit.client.1.vm09.stdout:8/493: creat d1/d14/d2a/d42/d5d/d8a/f99 x:0 0 0 2026-03-09T17:30:23.440 INFO:tasks.workunit.client.1.vm09.stdout:8/494: readlink d1/da/d23/d71/l7c 0 2026-03-09T17:30:23.441 INFO:tasks.workunit.client.1.vm09.stdout:8/495: dread - d1/da/d23/f7d zero size 2026-03-09T17:30:23.441 INFO:tasks.workunit.client.1.vm09.stdout:8/496: stat d1/d14/f3c 0 2026-03-09T17:30:23.442 INFO:tasks.workunit.client.1.vm09.stdout:8/497: chown d1/d14/d2a/d42/d43/d44/f5c 3386 1 2026-03-09T17:30:23.461 INFO:tasks.workunit.client.1.vm09.stdout:2/480: write d13/d15/f1d [2721570,82353] 0 2026-03-09T17:30:23.466 INFO:tasks.workunit.client.1.vm09.stdout:7/594: creat da/fcd x:0 0 0 2026-03-09T17:30:23.466 
INFO:tasks.workunit.client.1.vm09.stdout:7/595: stat da/d11/d47/d5b/d6c 0 2026-03-09T17:30:23.477 INFO:tasks.workunit.client.1.vm09.stdout:6/473: dwrite d3/d21/d76/d5c/d61/d95/f20 [0,4194304] 0 2026-03-09T17:30:23.479 INFO:tasks.workunit.client.1.vm09.stdout:6/474: chown d3/d21/d76/d5c/f65 127397679 1 2026-03-09T17:30:23.480 INFO:tasks.workunit.client.1.vm09.stdout:3/455: truncate d5/d9/d30/d65/f18 1923723 0 2026-03-09T17:30:23.481 INFO:tasks.workunit.client.1.vm09.stdout:9/480: chown d5/de/d29/f52 897707 1 2026-03-09T17:30:23.486 INFO:tasks.workunit.client.1.vm09.stdout:3/456: dwrite d5/d16/f45 [4194304,4194304] 0 2026-03-09T17:30:23.524 INFO:tasks.workunit.client.1.vm09.stdout:7/596: dwrite da/d11/d47/d5b/d6c/f73 [0,4194304] 0 2026-03-09T17:30:23.525 INFO:tasks.workunit.client.1.vm09.stdout:7/597: write da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [288803,94103] 0 2026-03-09T17:30:23.536 INFO:tasks.workunit.client.1.vm09.stdout:1/495: truncate f8 2557776 0 2026-03-09T17:30:23.536 INFO:tasks.workunit.client.1.vm09.stdout:1/496: write d9/dc/dd/fe [6558040,106675] 0 2026-03-09T17:30:23.552 INFO:tasks.workunit.client.1.vm09.stdout:3/457: rename d5/d16/d31/f56 to d5/d16/d31/d3d/d32/f89 0 2026-03-09T17:30:23.554 INFO:tasks.workunit.client.1.vm09.stdout:3/458: dwrite d5/d9/f1e [0,4194304] 0 2026-03-09T17:30:23.558 INFO:tasks.workunit.client.1.vm09.stdout:0/498: getdents d6/d93 0 2026-03-09T17:30:23.569 INFO:tasks.workunit.client.1.vm09.stdout:0/499: dread d6/d1d/d24/d32/f49 [0,4194304] 0 2026-03-09T17:30:23.580 INFO:tasks.workunit.client.1.vm09.stdout:5/521: getdents d0/dc 0 2026-03-09T17:30:23.600 INFO:tasks.workunit.client.1.vm09.stdout:2/481: write d13/d15/d3b/d43/f46 [457951,10733] 0 2026-03-09T17:30:23.601 INFO:tasks.workunit.client.1.vm09.stdout:2/482: write d13/d15/f71 [767419,84731] 0 2026-03-09T17:30:23.602 INFO:tasks.workunit.client.1.vm09.stdout:4/512: truncate d11/d1e/d29/d36/f82 1004779 0 2026-03-09T17:30:23.672 INFO:tasks.workunit.client.1.vm09.stdout:9/481: dwrite 
d5/de/d29/f89 [0,4194304] 0 2026-03-09T17:30:23.687 INFO:tasks.workunit.client.1.vm09.stdout:3/459: rmdir d5/d9/d30 39 2026-03-09T17:30:23.688 INFO:tasks.workunit.client.1.vm09.stdout:3/460: read d5/d16/d25/f60 [2412269,43015] 0 2026-03-09T17:30:23.694 INFO:tasks.workunit.client.1.vm09.stdout:0/500: rename d6/d1d/d24/d32/f49 to d6/d1d/d24/d32/d59/d9c/fa2 0 2026-03-09T17:30:23.696 INFO:tasks.workunit.client.1.vm09.stdout:8/498: creat d1/da/dd/f9a x:0 0 0 2026-03-09T17:30:23.698 INFO:tasks.workunit.client.1.vm09.stdout:5/522: mkdir d0/d2/d76/d87/da4 0 2026-03-09T17:30:23.701 INFO:tasks.workunit.client.1.vm09.stdout:2/483: creat d13/d15/d60/d90/f92 x:0 0 0 2026-03-09T17:30:23.729 INFO:tasks.workunit.client.1.vm09.stdout:7/598: dwrite da/d11/d47/d5b/d6c/d9e/f38 [0,4194304] 0 2026-03-09T17:30:23.731 INFO:tasks.workunit.client.1.vm09.stdout:7/599: chown da/d11/d47/d5b/d6c/d9e/d4e/f42 36237862 1 2026-03-09T17:30:23.750 INFO:tasks.workunit.client.1.vm09.stdout:5/523: creat d0/d2/d76/d87/fa5 x:0 0 0 2026-03-09T17:30:23.750 INFO:tasks.workunit.client.1.vm09.stdout:5/524: read - d0/fa3 zero size 2026-03-09T17:30:23.750 INFO:tasks.workunit.client.1.vm09.stdout:2/484: mkdir d13/d15/d34/d69/d93 0 2026-03-09T17:30:23.751 INFO:tasks.workunit.client.1.vm09.stdout:6/475: getdents d3/d21/d25/d26/d34 0 2026-03-09T17:30:23.752 INFO:tasks.workunit.client.1.vm09.stdout:5/525: read - d0/dc/d21/d26/d5e/d68/d6d/f9e zero size 2026-03-09T17:30:23.753 INFO:tasks.workunit.client.1.vm09.stdout:8/499: dread d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:23.753 INFO:tasks.workunit.client.1.vm09.stdout:5/526: truncate d0/dc/d21/f62 1539678 0 2026-03-09T17:30:23.753 INFO:tasks.workunit.client.1.vm09.stdout:5/527: chown d0/dc/d21/d33/fa2 2115149477 1 2026-03-09T17:30:23.755 INFO:tasks.workunit.client.1.vm09.stdout:9/482: rename d5/d7e/l83 to d5/d2e/d8b/la5 0 2026-03-09T17:30:23.755 INFO:tasks.workunit.client.1.vm09.stdout:4/513: unlink d11/d1e/f73 0 2026-03-09T17:30:23.756 
INFO:tasks.workunit.client.1.vm09.stdout:4/514: chown d11/c9d 10952 1 2026-03-09T17:30:23.756 INFO:tasks.workunit.client.1.vm09.stdout:1/497: rmdir d9/dc/d63/d96 0 2026-03-09T17:30:23.757 INFO:tasks.workunit.client.1.vm09.stdout:8/500: dwrite d1/d14/d2a/d42/d5d/f80 [0,4194304] 0 2026-03-09T17:30:23.757 INFO:tasks.workunit.client.1.vm09.stdout:7/600: readlink da/d11/d77/lb0 0 2026-03-09T17:30:23.758 INFO:tasks.workunit.client.1.vm09.stdout:8/501: chown d1/da/dd/d63/f1d 104 1 2026-03-09T17:30:23.778 INFO:tasks.workunit.client.1.vm09.stdout:2/485: rmdir d13/d15/d21 39 2026-03-09T17:30:23.797 INFO:tasks.workunit.client.1.vm09.stdout:6/476: dwrite d3/d21/d76/d5c/f78 [0,4194304] 0 2026-03-09T17:30:23.816 INFO:tasks.workunit.client.1.vm09.stdout:7/601: mknod da/d11/d2d/d56/cce 0 2026-03-09T17:30:23.818 INFO:tasks.workunit.client.1.vm09.stdout:7/602: dwrite da/d11/d47/d5b/f82 [0,4194304] 0 2026-03-09T17:30:23.838 INFO:tasks.workunit.client.1.vm09.stdout:5/528: creat d0/d2/d76/d87/da4/fa6 x:0 0 0 2026-03-09T17:30:23.852 INFO:tasks.workunit.client.1.vm09.stdout:3/461: truncate d5/d9/d30/d65/f43 3750844 0 2026-03-09T17:30:23.853 INFO:tasks.workunit.client.1.vm09.stdout:8/502: dwrite d1/f33 [0,4194304] 0 2026-03-09T17:30:23.854 INFO:tasks.workunit.client.1.vm09.stdout:3/462: write d5/d9/f1e [2723648,7132] 0 2026-03-09T17:30:23.867 INFO:tasks.workunit.client.1.vm09.stdout:0/501: getdents d6/d1d/d24 0 2026-03-09T17:30:23.888 INFO:tasks.workunit.client.1.vm09.stdout:5/529: creat d0/dc/d21/d33/fa7 x:0 0 0 2026-03-09T17:30:23.888 INFO:tasks.workunit.client.1.vm09.stdout:5/530: dread - d0/dc/d21/d33/f65 zero size 2026-03-09T17:30:23.895 INFO:tasks.workunit.client.1.vm09.stdout:1/498: rename d9/f7c to d9/f97 0 2026-03-09T17:30:23.895 INFO:tasks.workunit.client.1.vm09.stdout:6/477: mkdir d3/d7/d99 0 2026-03-09T17:30:23.904 INFO:tasks.workunit.client.1.vm09.stdout:4/515: truncate d11/d1e/d29/f93 746204 0 2026-03-09T17:30:23.905 INFO:tasks.workunit.client.1.vm09.stdout:3/463: mkdir 
d5/d16/d31/d37/d58/d8a 0 2026-03-09T17:30:23.908 INFO:tasks.workunit.client.1.vm09.stdout:9/483: getdents d5/d91/d99 0 2026-03-09T17:30:23.909 INFO:tasks.workunit.client.1.vm09.stdout:8/503: dread d1/d14/d2a/f62 [0,4194304] 0 2026-03-09T17:30:23.910 INFO:tasks.workunit.client.1.vm09.stdout:5/531: creat d0/d2/d76/d86/fa8 x:0 0 0 2026-03-09T17:30:23.913 INFO:tasks.workunit.client.1.vm09.stdout:6/478: mkdir d3/d21/d25/d91/d9a 0 2026-03-09T17:30:23.920 INFO:tasks.workunit.client.1.vm09.stdout:4/516: fdatasync d11/d1e/d29/f2f 0 2026-03-09T17:30:23.929 INFO:tasks.workunit.client.1.vm09.stdout:1/499: write d9/dc/d63/f75 [912399,5289] 0 2026-03-09T17:30:23.933 INFO:tasks.workunit.client.1.vm09.stdout:2/486: getdents d13/d15/d3b 0 2026-03-09T17:30:23.955 INFO:tasks.workunit.client.1.vm09.stdout:3/464: write d5/d9/d30/f61 [815267,112732] 0 2026-03-09T17:30:23.956 INFO:tasks.workunit.client.1.vm09.stdout:3/465: dread - d5/d9/d30/d65/d59/f81 zero size 2026-03-09T17:30:23.961 INFO:tasks.workunit.client.1.vm09.stdout:0/502: creat d6/d1d/d24/d32/d59/d81/d8c/fa3 x:0 0 0 2026-03-09T17:30:23.970 INFO:tasks.workunit.client.1.vm09.stdout:9/484: symlink d5/de/d88/la6 0 2026-03-09T17:30:23.980 INFO:tasks.workunit.client.1.vm09.stdout:5/532: symlink d0/dc/d21/d6f/la9 0 2026-03-09T17:30:23.980 INFO:tasks.workunit.client.1.vm09.stdout:5/533: stat d0/d9/d16/d5c/f73 0 2026-03-09T17:30:23.992 INFO:tasks.workunit.client.1.vm09.stdout:4/517: truncate d11/d1e/d45/d60/f95 565572 0 2026-03-09T17:30:23.993 INFO:tasks.workunit.client.1.vm09.stdout:7/603: getdents da/d11/d47/d5b/d6c/d9e 0 2026-03-09T17:30:23.994 INFO:tasks.workunit.client.1.vm09.stdout:7/604: fdatasync da/d11/d64/da7/fa8 0 2026-03-09T17:30:23.998 INFO:tasks.workunit.client.1.vm09.stdout:1/500: rename d9/dc/f3d to d9/dc/dd/d40/d1d/f98 0 2026-03-09T17:30:24.002 INFO:tasks.workunit.client.1.vm09.stdout:1/501: dwrite d9/dc/dd/d40/d1d/f17 [4194304,4194304] 0 2026-03-09T17:30:24.009 INFO:tasks.workunit.client.1.vm09.stdout:2/487: truncate 
d13/d15/f20 1512386 0 2026-03-09T17:30:24.016 INFO:tasks.workunit.client.1.vm09.stdout:0/503: symlink d6/d64/la4 0 2026-03-09T17:30:24.017 INFO:tasks.workunit.client.1.vm09.stdout:0/504: read d6/f21 [3307818,15029] 0 2026-03-09T17:30:24.024 INFO:tasks.workunit.client.1.vm09.stdout:8/504: symlink d1/da/dd/d47/d4c/d8d/l9b 0 2026-03-09T17:30:24.047 INFO:tasks.workunit.client.1.vm09.stdout:6/479: dwrite d3/d7/f23 [0,4194304] 0 2026-03-09T17:30:24.052 INFO:tasks.workunit.client.1.vm09.stdout:6/480: chown d3/d21/d76/d5c/d7e/d94/c87 5566 1 2026-03-09T17:30:24.052 INFO:tasks.workunit.client.1.vm09.stdout:4/518: dread - d11/d1e/d45/d60/f7b zero size 2026-03-09T17:30:24.061 INFO:tasks.workunit.client.1.vm09.stdout:2/488: mkdir d13/d15/d36/d72/d94 0 2026-03-09T17:30:24.061 INFO:tasks.workunit.client.1.vm09.stdout:7/605: creat da/d11/d64/da7/fcf x:0 0 0 2026-03-09T17:30:24.061 INFO:tasks.workunit.client.1.vm09.stdout:7/606: chown da/d11/f25 7153269 1 2026-03-09T17:30:24.069 INFO:tasks.workunit.client.1.vm09.stdout:3/466: mknod d5/d16/d46/c8b 0 2026-03-09T17:30:24.070 INFO:tasks.workunit.client.1.vm09.stdout:9/485: unlink d5/f5d 0 2026-03-09T17:30:24.071 INFO:tasks.workunit.client.1.vm09.stdout:8/505: creat d1/d14/f9c x:0 0 0 2026-03-09T17:30:24.074 INFO:tasks.workunit.client.1.vm09.stdout:6/481: rename d3/d21/d25/d26/d34/l49 to d3/d48/l9b 0 2026-03-09T17:30:24.077 INFO:tasks.workunit.client.1.vm09.stdout:3/467: dwrite d5/d16/d31/d37/d58/d64/f7c [0,4194304] 0 2026-03-09T17:30:24.079 INFO:tasks.workunit.client.1.vm09.stdout:7/607: unlink da/d11/l4b 0 2026-03-09T17:30:24.080 INFO:tasks.workunit.client.1.vm09.stdout:7/608: stat da/d11/d3e/da2 0 2026-03-09T17:30:24.083 INFO:tasks.workunit.client.1.vm09.stdout:0/505: truncate d6/d1d/d24/d32/d59/d9c/fa2 563046 0 2026-03-09T17:30:24.107 INFO:tasks.workunit.client.1.vm09.stdout:8/506: rename d1/d14/d31/l3e to d1/da/d3a/l9d 0 2026-03-09T17:30:24.114 INFO:tasks.workunit.client.1.vm09.stdout:3/468: mkdir d5/d9/d30/d65/d59/d84/d8c 0 
2026-03-09T17:30:24.116 INFO:tasks.workunit.client.1.vm09.stdout:0/506: fdatasync d6/f63 0 2026-03-09T17:30:24.116 INFO:tasks.workunit.client.1.vm09.stdout:5/534: getdents d0/d2/d76/d86 0 2026-03-09T17:30:24.118 INFO:tasks.workunit.client.1.vm09.stdout:8/507: creat d1/d14/d2a/d42/d43/f9e x:0 0 0 2026-03-09T17:30:24.123 INFO:tasks.workunit.client.1.vm09.stdout:5/535: dwrite d0/dc/d21/f7a [0,4194304] 0 2026-03-09T17:30:24.124 INFO:tasks.workunit.client.1.vm09.stdout:8/508: dwrite d1/d14/f9c [0,4194304] 0 2026-03-09T17:30:24.129 INFO:tasks.workunit.client.1.vm09.stdout:7/609: unlink da/d11/d2d/f32 0 2026-03-09T17:30:24.137 INFO:tasks.workunit.client.1.vm09.stdout:5/536: dwrite d0/dc/d21/d33/fa7 [0,4194304] 0 2026-03-09T17:30:24.165 INFO:tasks.workunit.client.1.vm09.stdout:1/502: truncate d9/dc/dd/d40/d22/f2b 8202114 0 2026-03-09T17:30:24.165 INFO:tasks.workunit.client.1.vm09.stdout:4/519: write d11/d1e/f3c [3130231,50668] 0 2026-03-09T17:30:24.177 INFO:tasks.workunit.client.1.vm09.stdout:6/482: truncate d3/d21/d76/d3f/f42 3557931 0 2026-03-09T17:30:24.181 INFO:tasks.workunit.client.1.vm09.stdout:0/507: creat d6/d1d/d24/d5e/d6c/fa5 x:0 0 0 2026-03-09T17:30:24.181 INFO:tasks.workunit.client.1.vm09.stdout:2/489: link d13/d15/d34/d37/c70 d13/d15/d3b/d43/c95 0 2026-03-09T17:30:24.185 INFO:tasks.workunit.client.1.vm09.stdout:8/509: mknod d1/da/dd/d79/c9f 0 2026-03-09T17:30:24.187 INFO:tasks.workunit.client.1.vm09.stdout:7/610: symlink da/d11/d3e/ld0 0 2026-03-09T17:30:24.190 INFO:tasks.workunit.client.1.vm09.stdout:5/537: symlink d0/d9/d74/d75/laa 0 2026-03-09T17:30:24.191 INFO:tasks.workunit.client.1.vm09.stdout:5/538: chown d0/l3 125 1 2026-03-09T17:30:24.191 INFO:tasks.workunit.client.1.vm09.stdout:5/539: fdatasync d0/d9/d16/d5c/f70 0 2026-03-09T17:30:24.199 INFO:tasks.workunit.client.1.vm09.stdout:4/520: dwrite d11/d1e/d31/f5a [0,4194304] 0 2026-03-09T17:30:24.200 INFO:tasks.workunit.client.1.vm09.stdout:4/521: write d11/d1e/d29/f6d [4257266,21093] 0 
2026-03-09T17:30:24.200 INFO:tasks.workunit.client.1.vm09.stdout:4/522: write d11/f6e [2112073,81390] 0 2026-03-09T17:30:24.211 INFO:tasks.workunit.client.1.vm09.stdout:9/486: truncate d5/de/d29/f36 8351384 0 2026-03-09T17:30:24.218 INFO:tasks.workunit.client.1.vm09.stdout:1/503: mkdir d9/dc/dd/d40/d22/d91/d99 0 2026-03-09T17:30:24.220 INFO:tasks.workunit.client.1.vm09.stdout:6/483: mkdir d3/d7/d59/d9c 0 2026-03-09T17:30:24.221 INFO:tasks.workunit.client.1.vm09.stdout:6/484: write d3/f41 [4267559,51718] 0 2026-03-09T17:30:24.223 INFO:tasks.workunit.client.1.vm09.stdout:6/485: dread d3/d7/d59/d73/f93 [0,4194304] 0 2026-03-09T17:30:24.232 INFO:tasks.workunit.client.1.vm09.stdout:8/510: mknod d1/d14/d31/ca0 0 2026-03-09T17:30:24.242 INFO:tasks.workunit.client.1.vm09.stdout:7/611: dread da/fb [0,4194304] 0 2026-03-09T17:30:24.243 INFO:tasks.workunit.client.1.vm09.stdout:9/487: sync 2026-03-09T17:30:24.244 INFO:tasks.workunit.client.1.vm09.stdout:7/612: fdatasync da/d11/d47/d89/fb4 0 2026-03-09T17:30:24.246 INFO:tasks.workunit.client.1.vm09.stdout:9/488: dwrite d5/f11 [0,4194304] 0 2026-03-09T17:30:24.250 INFO:tasks.workunit.client.1.vm09.stdout:5/540: rmdir d0/d2/d76/d87/da4 39 2026-03-09T17:30:24.268 INFO:tasks.workunit.client.1.vm09.stdout:4/523: chown d11/d1e/f61 0 1 2026-03-09T17:30:24.269 INFO:tasks.workunit.client.1.vm09.stdout:1/504: stat d9/dc/dd/d40/d21/d6f/d7e 0 2026-03-09T17:30:24.276 INFO:tasks.workunit.client.1.vm09.stdout:6/486: creat d3/d21/d76/d3f/f9d x:0 0 0 2026-03-09T17:30:24.276 INFO:tasks.workunit.client.1.vm09.stdout:3/469: getdents d5/d16/d31/d3d 0 2026-03-09T17:30:24.277 INFO:tasks.workunit.client.1.vm09.stdout:8/511: rmdir d1/d14/d2a/d42/d43/d44 39 2026-03-09T17:30:24.277 INFO:tasks.workunit.client.1.vm09.stdout:6/487: fdatasync d3/d21/d76/d5c/d61/d95/f20 0 2026-03-09T17:30:24.278 INFO:tasks.workunit.client.1.vm09.stdout:7/613: creat da/d11/d64/da7/db1/fd1 x:0 0 0 2026-03-09T17:30:24.279 INFO:tasks.workunit.client.1.vm09.stdout:7/614: readlink 
da/d11/d2d/lc4 0 2026-03-09T17:30:24.279 INFO:tasks.workunit.client.1.vm09.stdout:7/615: readlink da/l97 0 2026-03-09T17:30:24.279 INFO:tasks.workunit.client.1.vm09.stdout:7/616: stat da/d11/d2d/d56/d68/faa 0 2026-03-09T17:30:24.281 INFO:tasks.workunit.client.1.vm09.stdout:7/617: read da/d11/d3e/f60 [555165,114167] 0 2026-03-09T17:30:24.295 INFO:tasks.workunit.client.1.vm09.stdout:1/505: dread f3 [0,4194304] 0 2026-03-09T17:30:24.301 INFO:tasks.workunit.client.1.vm09.stdout:9/489: dread d5/de/d29/f37 [0,4194304] 0 2026-03-09T17:30:24.311 INFO:tasks.workunit.client.1.vm09.stdout:5/541: rename d0/dc/d21/d26/f39 to d0/d2/d76/d87/d95/d9b/fab 0 2026-03-09T17:30:24.314 INFO:tasks.workunit.client.1.vm09.stdout:0/508: creat d6/fa6 x:0 0 0 2026-03-09T17:30:24.342 INFO:tasks.workunit.client.1.vm09.stdout:7/618: mknod da/d11/d3e/da2/db2/cd2 0 2026-03-09T17:30:24.348 INFO:tasks.workunit.client.1.vm09.stdout:1/506: fdatasync d9/dc/dd/d40/f73 0 2026-03-09T17:30:24.348 INFO:tasks.workunit.client.1.vm09.stdout:8/512: rename d1/da/l25 to d1/d14/d31/la1 0 2026-03-09T17:30:24.348 INFO:tasks.workunit.client.1.vm09.stdout:5/542: creat d0/d2/d76/d86/fac x:0 0 0 2026-03-09T17:30:24.353 INFO:tasks.workunit.client.1.vm09.stdout:0/509: creat d6/d64/fa7 x:0 0 0 2026-03-09T17:30:24.353 INFO:tasks.workunit.client.1.vm09.stdout:2/490: getdents d13/d15/d34/d37/d66 0 2026-03-09T17:30:24.354 INFO:tasks.workunit.client.1.vm09.stdout:6/488: mknod d3/d21/d25/d91/d9a/c9e 0 2026-03-09T17:30:24.354 INFO:tasks.workunit.client.1.vm09.stdout:7/619: symlink da/d11/d47/d89/dbe/dc2/ld3 0 2026-03-09T17:30:24.356 INFO:tasks.workunit.client.1.vm09.stdout:9/490: mkdir d5/de/d29/da7 0 2026-03-09T17:30:24.357 INFO:tasks.workunit.client.1.vm09.stdout:9/491: fsync d5/de/f65 0 2026-03-09T17:30:24.359 INFO:tasks.workunit.client.1.vm09.stdout:8/513: unlink d1/f74 0 2026-03-09T17:30:24.360 INFO:tasks.workunit.client.1.vm09.stdout:8/514: truncate d1/d14/d2a/d42/d5d/d8a/f99 890840 0 2026-03-09T17:30:24.364 
INFO:tasks.workunit.client.1.vm09.stdout:4/524: link d11/d1e/d45/d60/f64 d11/fa4 0 2026-03-09T17:30:24.364 INFO:tasks.workunit.client.1.vm09.stdout:5/543: readlink d0/d9/l30 0 2026-03-09T17:30:24.366 INFO:tasks.workunit.client.1.vm09.stdout:2/491: creat d13/d15/d3b/d43/f96 x:0 0 0 2026-03-09T17:30:24.369 INFO:tasks.workunit.client.1.vm09.stdout:6/489: mkdir d3/d21/d76/d5c/d9f 0 2026-03-09T17:30:24.370 INFO:tasks.workunit.client.1.vm09.stdout:2/492: sync 2026-03-09T17:30:24.371 INFO:tasks.workunit.client.1.vm09.stdout:2/493: dread - d13/d15/d34/d37/d66/f80 zero size 2026-03-09T17:30:24.373 INFO:tasks.workunit.client.1.vm09.stdout:8/515: unlink d1/da/dd/c61 0 2026-03-09T17:30:24.374 INFO:tasks.workunit.client.1.vm09.stdout:8/516: chown d1/d14/d2a/d42/d5d/d8a/f94 29459772 1 2026-03-09T17:30:24.374 INFO:tasks.workunit.client.1.vm09.stdout:8/517: chown d1/da/dd/d47 1036193 1 2026-03-09T17:30:24.376 INFO:tasks.workunit.client.1.vm09.stdout:5/544: mknod d0/d9/d8b/cad 0 2026-03-09T17:30:24.380 INFO:tasks.workunit.client.1.vm09.stdout:1/507: creat d9/dc/dd/d40/d21/d35/d88/f9a x:0 0 0 2026-03-09T17:30:24.382 INFO:tasks.workunit.client.1.vm09.stdout:0/510: symlink d6/d1d/d24/d5e/d86/la8 0 2026-03-09T17:30:24.387 INFO:tasks.workunit.client.1.vm09.stdout:3/470: rename d5/d16/d31/c51 to d5/c8d 0 2026-03-09T17:30:24.391 INFO:tasks.workunit.client.1.vm09.stdout:4/525: creat d11/d1e/d83/fa5 x:0 0 0 2026-03-09T17:30:24.392 INFO:tasks.workunit.client.1.vm09.stdout:8/518: mkdir d1/da/dd/d47/d4c/d8d/da2 0 2026-03-09T17:30:24.394 INFO:tasks.workunit.client.1.vm09.stdout:5/545: unlink d0/d9/d16/d5c/f73 0 2026-03-09T17:30:24.397 INFO:tasks.workunit.client.1.vm09.stdout:0/511: mknod d6/d1d/d39/ca9 0 2026-03-09T17:30:24.398 INFO:tasks.workunit.client.1.vm09.stdout:6/490: mkdir d3/d21/d25/d91/d98/da0 0 2026-03-09T17:30:24.399 INFO:tasks.workunit.client.1.vm09.stdout:3/471: fsync d5/d16/f54 0 2026-03-09T17:30:24.402 INFO:tasks.workunit.client.1.vm09.stdout:7/620: rename da/d11/d47/f6e to 
da/d11/d47/d5b/d6c/d9e/d4e/d5f/fd4 0 2026-03-09T17:30:24.402 INFO:tasks.workunit.client.1.vm09.stdout:9/492: rename d5/de/d29/d33 to d5/de/d29/d33/d94/da8 22 2026-03-09T17:30:24.404 INFO:tasks.workunit.client.1.vm09.stdout:9/493: dread - d5/d2e/f82 zero size 2026-03-09T17:30:24.407 INFO:tasks.workunit.client.1.vm09.stdout:4/526: read - d11/d1e/d29/f8a zero size 2026-03-09T17:30:24.408 INFO:tasks.workunit.client.1.vm09.stdout:6/491: dwrite d3/d21/d76/d5c/f6d [0,4194304] 0 2026-03-09T17:30:24.410 INFO:tasks.workunit.client.1.vm09.stdout:6/492: chown d3/d21/d76/d3f/l84 1995588 1 2026-03-09T17:30:24.410 INFO:tasks.workunit.client.1.vm09.stdout:3/472: dread d5/d9/d30/d65/f3e [0,4194304] 0 2026-03-09T17:30:24.411 INFO:tasks.workunit.client.1.vm09.stdout:3/473: dread - d5/d9/f88 zero size 2026-03-09T17:30:24.425 INFO:tasks.workunit.client.1.vm09.stdout:0/512: mknod d6/d1d/d24/d5e/d6c/caa 0 2026-03-09T17:30:24.425 INFO:tasks.workunit.client.1.vm09.stdout:4/527: mknod d11/d1e/d83/d89/ca6 0 2026-03-09T17:30:24.426 INFO:tasks.workunit.client.1.vm09.stdout:0/513: write d6/d64/f7e [535592,57121] 0 2026-03-09T17:30:24.426 INFO:tasks.workunit.client.1.vm09.stdout:0/514: chown d6/d93 23702396 1 2026-03-09T17:30:24.435 INFO:tasks.workunit.client.1.vm09.stdout:6/493: dread d3/d7/d59/d5a/f83 [0,4194304] 0 2026-03-09T17:30:24.445 INFO:tasks.workunit.client.1.vm09.stdout:9/494: mkdir d5/de/d29/d33/d94/da9 0 2026-03-09T17:30:24.445 INFO:tasks.workunit.client.1.vm09.stdout:4/528: mkdir d11/d1e/d83/d89/da7 0 2026-03-09T17:30:24.445 INFO:tasks.workunit.client.1.vm09.stdout:7/621: dread da/d11/f1a [0,4194304] 0 2026-03-09T17:30:24.446 INFO:tasks.workunit.client.1.vm09.stdout:9/495: dwrite d5/d2e/f7b [0,4194304] 0 2026-03-09T17:30:24.449 INFO:tasks.workunit.client.1.vm09.stdout:6/494: creat d3/d21/d25/d26/fa1 x:0 0 0 2026-03-09T17:30:24.457 INFO:tasks.workunit.client.1.vm09.stdout:7/622: stat da/d11/d47/d5b/d6c/d9e/d4e/d4c/l90 0 2026-03-09T17:30:24.457 
INFO:tasks.workunit.client.1.vm09.stdout:0/515: mknod d6/d1d/cab 0 2026-03-09T17:30:24.457 INFO:tasks.workunit.client.1.vm09.stdout:9/496: fdatasync d5/de/d29/f73 0 2026-03-09T17:30:24.489 INFO:tasks.workunit.client.1.vm09.stdout:8/519: dwrite d1/da/d23/d6c/d32/f6d [0,4194304] 0 2026-03-09T17:30:24.489 INFO:tasks.workunit.client.1.vm09.stdout:7/623: creat da/d11/d77/fd5 x:0 0 0 2026-03-09T17:30:24.490 INFO:tasks.workunit.client.1.vm09.stdout:7/624: read da/d11/d47/d5b/f82 [317052,122929] 0 2026-03-09T17:30:24.490 INFO:tasks.workunit.client.1.vm09.stdout:8/520: fsync d1/da/dd/f45 0 2026-03-09T17:30:24.491 INFO:tasks.workunit.client.1.vm09.stdout:8/521: chown d1/d14/d2a/d42/d5d/d8a/f99 7465363 1 2026-03-09T17:30:24.492 INFO:tasks.workunit.client.1.vm09.stdout:8/522: chown d1/d14/d2a/d42/d5d/d8a/f94 1 1 2026-03-09T17:30:24.493 INFO:tasks.workunit.client.1.vm09.stdout:8/523: truncate d1/d14/d2a/d42/d43/f9e 945349 0 2026-03-09T17:30:24.500 INFO:tasks.workunit.client.1.vm09.stdout:1/508: truncate d9/dc/dd/d40/d22/d37/f41 3299193 0 2026-03-09T17:30:24.500 INFO:tasks.workunit.client.1.vm09.stdout:6/495: creat d3/d21/d76/d81/fa2 x:0 0 0 2026-03-09T17:30:24.504 INFO:tasks.workunit.client.1.vm09.stdout:2/494: truncate d13/d15/d21/f31 117524 0 2026-03-09T17:30:24.509 INFO:tasks.workunit.client.1.vm09.stdout:1/509: dread d9/dc/dd/fe [4194304,4194304] 0 2026-03-09T17:30:24.516 INFO:tasks.workunit.client.1.vm09.stdout:5/546: write d0/f60 [8023,128651] 0 2026-03-09T17:30:24.516 INFO:tasks.workunit.client.1.vm09.stdout:7/625: unlink da/d11/d2d/d56/d68/f81 0 2026-03-09T17:30:24.530 INFO:tasks.workunit.client.1.vm09.stdout:3/474: dwrite d5/d16/d31/d3d/d32/f33 [0,4194304] 0 2026-03-09T17:30:24.538 INFO:tasks.workunit.client.1.vm09.stdout:3/475: read d5/d9/f4e [3222078,19765] 0 2026-03-09T17:30:24.549 INFO:tasks.workunit.client.1.vm09.stdout:4/529: write d11/d1e/d29/d36/f40 [734624,86501] 0 2026-03-09T17:30:24.549 INFO:tasks.workunit.client.1.vm09.stdout:6/496: creat d3/d7/d59/d73/fa3 
x:0 0 0 2026-03-09T17:30:24.554 INFO:tasks.workunit.client.1.vm09.stdout:4/530: dwrite d11/d1e/d83/f96 [0,4194304] 0 2026-03-09T17:30:24.599 INFO:tasks.workunit.client.1.vm09.stdout:2/495: creat d13/d15/d60/f97 x:0 0 0 2026-03-09T17:30:24.616 INFO:tasks.workunit.client.1.vm09.stdout:1/510: dread d9/f34 [0,4194304] 0 2026-03-09T17:30:24.623 INFO:tasks.workunit.client.1.vm09.stdout:0/516: write d6/d1d/f3c [326552,46821] 0 2026-03-09T17:30:24.641 INFO:tasks.workunit.client.1.vm09.stdout:8/524: dwrite d1/da/d23/d6c/d32/f50 [0,4194304] 0 2026-03-09T17:30:24.647 INFO:tasks.workunit.client.1.vm09.stdout:9/497: dwrite d5/de/d29/f35 [0,4194304] 0 2026-03-09T17:30:24.683 INFO:tasks.workunit.client.1.vm09.stdout:3/476: unlink d5/d16/d25/c4d 0 2026-03-09T17:30:24.684 INFO:tasks.workunit.client.1.vm09.stdout:4/531: creat d11/d1e/d83/d89/fa8 x:0 0 0 2026-03-09T17:30:24.696 INFO:tasks.workunit.client.1.vm09.stdout:2/496: unlink d13/d15/d3b/d43/l47 0 2026-03-09T17:30:24.702 INFO:tasks.workunit.client.1.vm09.stdout:1/511: creat d9/dc/dd/d40/d21/d6f/d7e/f9b x:0 0 0 2026-03-09T17:30:24.707 INFO:tasks.workunit.client.1.vm09.stdout:7/626: dwrite da/d11/d47/d5b/d78/fab [0,4194304] 0 2026-03-09T17:30:24.710 INFO:tasks.workunit.client.1.vm09.stdout:6/497: truncate d3/d7/d59/d73/f7d 315278 0 2026-03-09T17:30:24.718 INFO:tasks.workunit.client.1.vm09.stdout:8/525: creat d1/da/d3a/fa3 x:0 0 0 2026-03-09T17:30:24.720 INFO:tasks.workunit.client.1.vm09.stdout:9/498: creat d5/de/d4e/d6e/d93/faa x:0 0 0 2026-03-09T17:30:24.722 INFO:tasks.workunit.client.1.vm09.stdout:3/477: unlink d5/d16/d31/d3d/l4c 0 2026-03-09T17:30:24.729 INFO:tasks.workunit.client.1.vm09.stdout:6/498: creat d3/d21/d25/d91/fa4 x:0 0 0 2026-03-09T17:30:24.729 INFO:tasks.workunit.client.1.vm09.stdout:6/499: stat d3/d21/d76/d5c 0 2026-03-09T17:30:24.730 INFO:tasks.workunit.client.1.vm09.stdout:0/517: fsync d6/d1d/d24/d32/d59/d9c/fa2 0 2026-03-09T17:30:24.733 INFO:tasks.workunit.client.1.vm09.stdout:8/526: unlink d1/da/dd/d63/f1d 0 
2026-03-09T17:30:24.733 INFO:tasks.workunit.client.1.vm09.stdout:8/527: stat d1/da/dd/f22 0 2026-03-09T17:30:24.734 INFO:tasks.workunit.client.1.vm09.stdout:8/528: write d1/da/f4b [1770298,109321] 0 2026-03-09T17:30:24.734 INFO:tasks.workunit.client.1.vm09.stdout:8/529: readlink d1/da/d3a/l7b 0 2026-03-09T17:30:24.735 INFO:tasks.workunit.client.1.vm09.stdout:3/478: creat d5/d16/d31/d37/d58/d64/f8e x:0 0 0 2026-03-09T17:30:24.738 INFO:tasks.workunit.client.1.vm09.stdout:2/497: rename d13/d15/d3b/d43/c4a to d13/d15/d60/d85/c98 0 2026-03-09T17:30:24.740 INFO:tasks.workunit.client.1.vm09.stdout:5/547: getdents d0/d2/d76/d86 0 2026-03-09T17:30:24.744 INFO:tasks.workunit.client.1.vm09.stdout:0/518: mkdir d6/d1d/d24/d32/d59/d9c/dac 0 2026-03-09T17:30:24.745 INFO:tasks.workunit.client.1.vm09.stdout:0/519: write d6/d1d/d24/d32/d59/d81/f90 [264021,54942] 0 2026-03-09T17:30:24.751 INFO:tasks.workunit.client.1.vm09.stdout:9/499: rename d5/d2e/f7b to d5/d7e/d81/fab 0 2026-03-09T17:30:24.765 INFO:tasks.workunit.client.1.vm09.stdout:2/498: rename d13/l1c to d13/d15/d36/d72/l99 0 2026-03-09T17:30:24.766 INFO:tasks.workunit.client.1.vm09.stdout:9/500: truncate d5/f4f 431165 0 2026-03-09T17:30:24.768 INFO:tasks.workunit.client.1.vm09.stdout:6/500: link d3/d7/d59/d5a/f83 d3/d21/d76/d5c/d61/d95/fa5 0 2026-03-09T17:30:24.771 INFO:tasks.workunit.client.1.vm09.stdout:8/530: creat d1/d14/d2a/d42/d43/fa4 x:0 0 0 2026-03-09T17:30:24.771 INFO:tasks.workunit.client.1.vm09.stdout:3/479: link d5/d9/l23 d5/d16/d46/l8f 0 2026-03-09T17:30:24.779 INFO:tasks.workunit.client.1.vm09.stdout:8/531: rmdir d1/da/d23 39 2026-03-09T17:30:24.791 INFO:tasks.workunit.client.1.vm09.stdout:9/501: unlink d5/d2e/d8b/la5 0 2026-03-09T17:30:24.791 INFO:tasks.workunit.client.1.vm09.stdout:8/532: creat d1/d14/d2a/d49/fa5 x:0 0 0 2026-03-09T17:30:24.791 INFO:tasks.workunit.client.1.vm09.stdout:9/502: creat d5/d2e/d8b/fac x:0 0 0 2026-03-09T17:30:24.791 INFO:tasks.workunit.client.1.vm09.stdout:6/501: link 
d3/d21/d76/d5c/d61/d6a/f74 d3/d21/d25/d91/d98/fa6 0 2026-03-09T17:30:24.793 INFO:tasks.workunit.client.1.vm09.stdout:8/533: link d1/da/dd/c76 d1/d14/ca6 0 2026-03-09T17:30:24.796 INFO:tasks.workunit.client.1.vm09.stdout:6/502: fsync d3/d21/d25/d26/f50 0 2026-03-09T17:30:24.797 INFO:tasks.workunit.client.1.vm09.stdout:9/503: dread d5/f13 [0,4194304] 0 2026-03-09T17:30:24.800 INFO:tasks.workunit.client.1.vm09.stdout:8/534: chown d1/da/d23/d6c/d32/f6d 1793 1 2026-03-09T17:30:24.804 INFO:tasks.workunit.client.1.vm09.stdout:6/503: creat d3/d7/d59/d73/fa7 x:0 0 0 2026-03-09T17:30:24.805 INFO:tasks.workunit.client.1.vm09.stdout:9/504: creat d5/d2e/d70/d84/d97/fad x:0 0 0 2026-03-09T17:30:24.806 INFO:tasks.workunit.client.1.vm09.stdout:8/535: dwrite d1/da/dd/f45 [0,4194304] 0 2026-03-09T17:30:24.806 INFO:tasks.workunit.client.1.vm09.stdout:9/505: truncate d5/d21/f92 201224 0 2026-03-09T17:30:24.807 INFO:tasks.workunit.client.1.vm09.stdout:2/499: dread d13/d15/d21/f24 [4194304,4194304] 0 2026-03-09T17:30:24.812 INFO:tasks.workunit.client.1.vm09.stdout:6/504: rename d3/d21/d76/d3f/l84 to d3/d7/d99/la8 0 2026-03-09T17:30:24.815 INFO:tasks.workunit.client.1.vm09.stdout:6/505: read - d3/d7/f58 zero size 2026-03-09T17:30:24.815 INFO:tasks.workunit.client.1.vm09.stdout:8/536: dwrite d1/da/f35 [0,4194304] 0 2026-03-09T17:30:24.816 INFO:tasks.workunit.client.1.vm09.stdout:8/537: chown d1/f16 5194 1 2026-03-09T17:30:24.819 INFO:tasks.workunit.client.1.vm09.stdout:8/538: dwrite d1/d14/d2a/f8b [0,4194304] 0 2026-03-09T17:30:24.824 INFO:tasks.workunit.client.1.vm09.stdout:8/539: write d1/d14/f2f [4241912,46993] 0 2026-03-09T17:30:24.829 INFO:tasks.workunit.client.1.vm09.stdout:8/540: mknod d1/da/dd/d63/ca7 0 2026-03-09T17:30:24.829 INFO:tasks.workunit.client.1.vm09.stdout:8/541: fdatasync d1/f6e 0 2026-03-09T17:30:24.832 INFO:tasks.workunit.client.1.vm09.stdout:2/500: dwrite d13/d15/d34/d69/f7a [0,4194304] 0 2026-03-09T17:30:24.839 INFO:tasks.workunit.client.1.vm09.stdout:2/501: creat 
d13/d15/f9a x:0 0 0 2026-03-09T17:30:24.842 INFO:tasks.workunit.client.1.vm09.stdout:8/542: dread d1/da/dd/d47/f82 [0,4194304] 0 2026-03-09T17:30:24.846 INFO:tasks.workunit.client.1.vm09.stdout:8/543: dwrite d1/da/dd/f9a [0,4194304] 0 2026-03-09T17:30:24.847 INFO:tasks.workunit.client.1.vm09.stdout:8/544: stat d1/da/dd/d63/f36 0 2026-03-09T17:30:24.849 INFO:tasks.workunit.client.1.vm09.stdout:8/545: creat d1/d14/fa8 x:0 0 0 2026-03-09T17:30:24.851 INFO:tasks.workunit.client.1.vm09.stdout:8/546: link d1/da/d3a/l60 d1/d14/d2a/d42/d5d/la9 0 2026-03-09T17:30:24.865 INFO:tasks.workunit.client.1.vm09.stdout:8/547: symlink d1/d14/laa 0 2026-03-09T17:30:24.865 INFO:tasks.workunit.client.1.vm09.stdout:8/548: unlink d1/da/d23/d71/l7c 0 2026-03-09T17:30:24.865 INFO:tasks.workunit.client.1.vm09.stdout:8/549: dwrite d1/d14/d2a/f2b [0,4194304] 0 2026-03-09T17:30:24.873 INFO:tasks.workunit.client.1.vm09.stdout:6/506: dread d3/d21/d25/f5f [0,4194304] 0 2026-03-09T17:30:24.890 INFO:tasks.workunit.client.1.vm09.stdout:6/507: dread d3/d21/f28 [0,4194304] 0 2026-03-09T17:30:24.892 INFO:tasks.workunit.client.1.vm09.stdout:6/508: rename d3/fc to d3/d21/d76/d5c/d9f/fa9 0 2026-03-09T17:30:24.893 INFO:tasks.workunit.client.1.vm09.stdout:6/509: readlink d3/d7/l16 0 2026-03-09T17:30:24.893 INFO:tasks.workunit.client.1.vm09.stdout:6/510: chown d3/d21/d76/d5c/d61 0 1 2026-03-09T17:30:24.893 INFO:tasks.workunit.client.1.vm09.stdout:6/511: chown d3/d7/ff 115 1 2026-03-09T17:30:24.895 INFO:tasks.workunit.client.1.vm09.stdout:6/512: fdatasync d3/d21/d25/f2f 0 2026-03-09T17:30:24.896 INFO:tasks.workunit.client.1.vm09.stdout:6/513: mknod d3/d7/d59/d9c/caa 0 2026-03-09T17:30:24.897 INFO:tasks.workunit.client.1.vm09.stdout:6/514: mknod d3/d21/d76/d88/cab 0 2026-03-09T17:30:24.900 INFO:tasks.workunit.client.1.vm09.stdout:9/506: sync 2026-03-09T17:30:24.902 INFO:tasks.workunit.client.1.vm09.stdout:9/507: chown d5/d91/d99/fa4 3 1 2026-03-09T17:30:24.906 INFO:tasks.workunit.client.1.vm09.stdout:6/515: 
dwrite d3/d21/d76/f70 [0,4194304] 0 2026-03-09T17:30:24.907 INFO:tasks.workunit.client.1.vm09.stdout:9/508: dwrite d5/d2e/d8b/fac [0,4194304] 0 2026-03-09T17:30:24.908 INFO:tasks.workunit.client.1.vm09.stdout:6/516: fdatasync d3/d21/d76/d5c/f65 0 2026-03-09T17:30:24.908 INFO:tasks.workunit.client.1.vm09.stdout:6/517: fsync d3/d7/f23 0 2026-03-09T17:30:24.914 INFO:tasks.workunit.client.1.vm09.stdout:6/518: rename d3/d21/d76/c3b to d3/d21/d76/d5c/d9f/cac 0 2026-03-09T17:30:24.915 INFO:tasks.workunit.client.1.vm09.stdout:6/519: chown d3/d21/d25/f8b 4511300 1 2026-03-09T17:30:24.920 INFO:tasks.workunit.client.1.vm09.stdout:9/509: creat d5/de/fae x:0 0 0 2026-03-09T17:30:24.924 INFO:tasks.workunit.client.1.vm09.stdout:9/510: dread - d5/de/fae zero size 2026-03-09T17:30:24.924 INFO:tasks.workunit.client.1.vm09.stdout:9/511: fdatasync d5/de/f76 0 2026-03-09T17:30:24.924 INFO:tasks.workunit.client.1.vm09.stdout:6/520: creat d3/d21/d76/d5c/d61/fad x:0 0 0 2026-03-09T17:30:24.924 INFO:tasks.workunit.client.1.vm09.stdout:6/521: dread - d3/d21/d25/d26/fa1 zero size 2026-03-09T17:30:24.957 INFO:tasks.workunit.client.1.vm09.stdout:9/512: dread d5/f14 [0,4194304] 0 2026-03-09T17:30:24.959 INFO:tasks.workunit.client.1.vm09.stdout:9/513: mknod d5/de/d29/da7/caf 0 2026-03-09T17:30:24.964 INFO:tasks.workunit.client.1.vm09.stdout:9/514: link d5/f8e d5/de/d4e/d6e/d93/fb0 0 2026-03-09T17:30:24.974 INFO:tasks.workunit.client.1.vm09.stdout:4/532: rmdir d11/d1e/d83 39 2026-03-09T17:30:24.976 INFO:tasks.workunit.client.1.vm09.stdout:1/512: rmdir d9/dc/dd/d40/d21/d6f/d7e 39 2026-03-09T17:30:24.985 INFO:tasks.workunit.client.1.vm09.stdout:1/513: dwrite d9/dc/dd/d40/f92 [0,4194304] 0 2026-03-09T17:30:24.991 INFO:tasks.workunit.client.1.vm09.stdout:4/533: dread d11/f23 [0,4194304] 0 2026-03-09T17:30:24.993 INFO:tasks.workunit.client.1.vm09.stdout:7/627: write da/d11/d47/d5b/f82 [4717327,101641] 0 2026-03-09T17:30:24.996 INFO:tasks.workunit.client.1.vm09.stdout:0/520: write d6/d1d/d24/f75 
[5530733,27161] 0 2026-03-09T17:30:24.996 INFO:tasks.workunit.client.1.vm09.stdout:0/521: chown d6/c29 4076 1 2026-03-09T17:30:24.996 INFO:tasks.workunit.client.1.vm09.stdout:5/548: write d0/d2/d76/d86/f6b [296868,61466] 0 2026-03-09T17:30:24.998 INFO:tasks.workunit.client.1.vm09.stdout:0/522: truncate d6/d1d/d24/d32/d59/d81/f82 47130 0 2026-03-09T17:30:24.999 INFO:tasks.workunit.client.1.vm09.stdout:5/549: dwrite d0/d2/f31 [0,4194304] 0 2026-03-09T17:30:25.035 INFO:tasks.workunit.client.1.vm09.stdout:4/534: rename d11/d1e/d29/c68 to d11/d1e/d29/d36/d57/d78/ca9 0 2026-03-09T17:30:25.053 INFO:tasks.workunit.client.1.vm09.stdout:1/514: dread d9/dc/dd/f7b [0,4194304] 0 2026-03-09T17:30:25.064 INFO:tasks.workunit.client.1.vm09.stdout:7/628: mknod da/d11/d47/d89/cd6 0 2026-03-09T17:30:25.066 INFO:tasks.workunit.client.1.vm09.stdout:7/629: write da/d11/d47/d5b/d6c/d9e/d4e/f8e [2974331,43149] 0 2026-03-09T17:30:25.069 INFO:tasks.workunit.client.1.vm09.stdout:2/502: dwrite d13/d15/d34/f48 [0,4194304] 0 2026-03-09T17:30:25.073 INFO:tasks.workunit.client.1.vm09.stdout:2/503: fsync d13/d15/d34/d69/f7a 0 2026-03-09T17:30:25.082 INFO:tasks.workunit.client.1.vm09.stdout:0/523: rename d6/d1d/d24/d5e/l88 to d6/d1d/d24/d32/d59/d9c/dac/lad 0 2026-03-09T17:30:25.096 INFO:tasks.workunit.client.1.vm09.stdout:7/630: creat da/d11/d2d/d49/fd7 x:0 0 0 2026-03-09T17:30:25.096 INFO:tasks.workunit.client.1.vm09.stdout:7/631: stat da/d11/d2d/d49/f52 0 2026-03-09T17:30:25.097 INFO:tasks.workunit.client.1.vm09.stdout:7/632: truncate da/d11/d47/f8d 701559 0 2026-03-09T17:30:25.100 INFO:tasks.workunit.client.1.vm09.stdout:4/535: dread d11/f18 [0,4194304] 0 2026-03-09T17:30:25.100 INFO:tasks.workunit.client.1.vm09.stdout:2/504: unlink d13/d15/d36/d72/l86 0 2026-03-09T17:30:25.103 INFO:tasks.workunit.client.1.vm09.stdout:2/505: dwrite d13/f89 [0,4194304] 0 2026-03-09T17:30:25.109 INFO:tasks.workunit.client.1.vm09.stdout:5/550: dread d0/d2/f2a [0,4194304] 0 2026-03-09T17:30:25.115 
INFO:tasks.workunit.client.1.vm09.stdout:2/506: dwrite d13/f79 [0,4194304] 0 2026-03-09T17:30:25.117 INFO:tasks.workunit.client.1.vm09.stdout:2/507: stat d13/d15/d36/f59 0 2026-03-09T17:30:25.124 INFO:tasks.workunit.client.1.vm09.stdout:0/524: creat d6/d1d/d46/fae x:0 0 0 2026-03-09T17:30:25.124 INFO:tasks.workunit.client.1.vm09.stdout:7/633: mkdir da/d11/d3e/dd8 0 2026-03-09T17:30:25.131 INFO:tasks.workunit.client.1.vm09.stdout:4/536: dwrite d11/d1e/d83/d89/f94 [0,4194304] 0 2026-03-09T17:30:25.134 INFO:tasks.workunit.client.1.vm09.stdout:2/508: chown d13/f14 1 1 2026-03-09T17:30:25.134 INFO:tasks.workunit.client.1.vm09.stdout:4/537: truncate d11/d1e/d83/d89/d8b/f5f 4207029 0 2026-03-09T17:30:25.135 INFO:tasks.workunit.client.1.vm09.stdout:4/538: read d11/f1f [690281,1928] 0 2026-03-09T17:30:25.137 INFO:tasks.workunit.client.1.vm09.stdout:1/515: rename d9/dc/dd/d40/d21/d6f/d7e to d9/d3a/d9c 0 2026-03-09T17:30:25.139 INFO:tasks.workunit.client.1.vm09.stdout:1/516: dread - d9/d3a/d9c/f89 zero size 2026-03-09T17:30:25.139 INFO:tasks.workunit.client.1.vm09.stdout:1/517: readlink d9/d3a/l7f 0 2026-03-09T17:30:25.139 INFO:tasks.workunit.client.1.vm09.stdout:0/525: creat d6/faf x:0 0 0 2026-03-09T17:30:25.140 INFO:tasks.workunit.client.1.vm09.stdout:5/551: dread d0/d2/d76/d86/f50 [0,4194304] 0 2026-03-09T17:30:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:25 vm06.local ceph-mon[57307]: pgmap v12: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 53 MiB/s rd, 144 MiB/s wr, 318 op/s 2026-03-09T17:30:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:25 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:25 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:25 vm06.local ceph-mon[57307]: from='mgr.24477 ' 
entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:25 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' 2026-03-09T17:30:25.147 INFO:tasks.workunit.client.1.vm09.stdout:0/526: dwrite d6/d1d/f70 [0,4194304] 0 2026-03-09T17:30:25.158 INFO:tasks.workunit.client.1.vm09.stdout:7/634: rename da/f26 to da/d11/d47/d5b/d6c/d9e/d4e/fd9 0 2026-03-09T17:30:25.167 INFO:tasks.workunit.client.1.vm09.stdout:4/539: symlink d11/d1e/d83/d89/d8b/laa 0 2026-03-09T17:30:25.169 INFO:tasks.workunit.client.1.vm09.stdout:1/518: creat d9/dc/dd/f9d x:0 0 0 2026-03-09T17:30:25.192 INFO:tasks.workunit.client.1.vm09.stdout:0/527: rename d6/d1d/d46/fae to d6/d1d/d24/d32/d59/fb0 0 2026-03-09T17:30:25.200 INFO:tasks.workunit.client.1.vm09.stdout:8/550: dwrite d1/d14/d2a/d42/d5d/f80 [4194304,4194304] 0 2026-03-09T17:30:25.203 INFO:tasks.workunit.client.1.vm09.stdout:6/522: write d3/d7/f11 [519132,123702] 0 2026-03-09T17:30:25.204 INFO:tasks.workunit.client.1.vm09.stdout:8/551: read d1/f7 [4401285,73522] 0 2026-03-09T17:30:25.211 INFO:tasks.workunit.client.1.vm09.stdout:9/515: dwrite d5/de/d29/f57 [0,4194304] 0 2026-03-09T17:30:25.213 INFO:tasks.workunit.client.1.vm09.stdout:9/516: write d5/f1b [2190792,73336] 0 2026-03-09T17:30:25.241 INFO:tasks.workunit.client.1.vm09.stdout:3/480: truncate d5/d16/d31/d3d/d32/f89 1673312 0 2026-03-09T17:30:25.263 INFO:tasks.workunit.client.1.vm09.stdout:6/523: creat d3/d21/d76/d5c/d9f/fae x:0 0 0 2026-03-09T17:30:25.266 INFO:tasks.workunit.client.1.vm09.stdout:2/509: dwrite d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:25.267 INFO:tasks.workunit.client.1.vm09.stdout:2/510: chown d13/d15/d60 10906 1 2026-03-09T17:30:25.290 INFO:tasks.workunit.client.1.vm09.stdout:1/519: mkdir d9/d9e 0 2026-03-09T17:30:25.302 INFO:tasks.workunit.client.1.vm09.stdout:5/552: link d0/d9/l72 d0/d2/lae 0 2026-03-09T17:30:25.303 INFO:tasks.workunit.client.1.vm09.stdout:5/553: fdatasync d0/d9/d16/d5c/f70 0 
2026-03-09T17:30:25.304 INFO:tasks.workunit.client.1.vm09.stdout:5/554: read d0/d2/d76/d86/f50 [1140267,109334] 0
2026-03-09T17:30:25.331 INFO:tasks.workunit.client.1.vm09.stdout:7/635: rename da/d11/d47/d5b/d6c/d9e/d4e/f8e to da/d11/d47/d89/dbe/fda 0
2026-03-09T17:30:25.353 INFO:tasks.workunit.client.1.vm09.stdout:0/528: link d6/d1d/d24/d32/f68 d6/d1d/d24/d32/d59/d81/d8c/fb1 0
2026-03-09T17:30:25.355 INFO:tasks.workunit.client.1.vm09.stdout:4/540: creat d11/fab x:0 0 0
2026-03-09T17:30:25.390 INFO:tasks.workunit.client.1.vm09.stdout:8/552: link d1/da/d23/f8f d1/d14/d31/d97/fab 0
2026-03-09T17:30:25.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:25 vm09.local ceph-mon[62061]: pgmap v12: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 53 MiB/s rd, 144 MiB/s wr, 318 op/s
2026-03-09T17:30:25.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:25 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:25.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:25 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:25.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:25 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:25.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:25 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:25.396 INFO:tasks.workunit.client.1.vm09.stdout:1/520: rename d9/d3a to d9/dc/dd/d9f 0
2026-03-09T17:30:25.400 INFO:tasks.workunit.client.1.vm09.stdout:7/636: dread - da/d11/d47/d89/dbe/dc2/f95 zero size
2026-03-09T17:30:25.405 INFO:tasks.workunit.client.1.vm09.stdout:7/637: read da/d11/d47/d5b/d6c/d9e/f38 [87228,54173] 0
2026-03-09T17:30:25.412 INFO:tasks.workunit.client.1.vm09.stdout:0/529: fdatasync d6/d1d/f91 0
2026-03-09T17:30:25.415 INFO:tasks.workunit.client.1.vm09.stdout:3/481: mkdir d5/d9/d90 0
2026-03-09T17:30:25.420 INFO:tasks.workunit.client.1.vm09.stdout:6/524: write d3/d21/d76/d3f/f51 [998160,61014] 0
2026-03-09T17:30:25.423 INFO:tasks.workunit.client.1.vm09.stdout:9/517: creat d5/de/fb1 x:0 0 0
2026-03-09T17:30:25.426 INFO:tasks.workunit.client.1.vm09.stdout:7/638: sync
2026-03-09T17:30:25.427 INFO:tasks.workunit.client.1.vm09.stdout:7/639: readlink da/d11/d2d/lc4 0
2026-03-09T17:30:25.429 INFO:tasks.workunit.client.1.vm09.stdout:7/640: sync
2026-03-09T17:30:25.432 INFO:tasks.workunit.client.1.vm09.stdout:2/511: write d13/d15/d34/d45/f49 [353089,7169] 0
2026-03-09T17:30:25.460 INFO:tasks.workunit.client.1.vm09.stdout:5/555: truncate d0/dc/d21/d33/f35 1491608 0
2026-03-09T17:30:25.460 INFO:tasks.workunit.client.1.vm09.stdout:1/521: write f3 [1257487,39225] 0
2026-03-09T17:30:25.461 INFO:tasks.workunit.client.1.vm09.stdout:3/482: creat d5/d16/d31/d37/d58/f91 x:0 0 0
2026-03-09T17:30:25.461 INFO:tasks.workunit.client.1.vm09.stdout:3/483: stat d5/d9/d30/d65/d59/c82 0
2026-03-09T17:30:25.461 INFO:tasks.workunit.client.1.vm09.stdout:9/518: symlink d5/de/d88/lb2 0
2026-03-09T17:30:25.462 INFO:tasks.workunit.client.1.vm09.stdout:9/519: read d5/f11 [3378104,15560] 0
2026-03-09T17:30:25.472 INFO:tasks.workunit.client.1.vm09.stdout:0/530: mkdir d6/d1d/d24/d5e/db2 0
2026-03-09T17:30:25.472 INFO:tasks.workunit.client.1.vm09.stdout:2/512: readlink d13/d15/d36/d72/l99 0
2026-03-09T17:30:25.474 INFO:tasks.workunit.client.1.vm09.stdout:4/541: rmdir d11/d1e/d83/d89/da7 0
2026-03-09T17:30:25.476 INFO:tasks.workunit.client.1.vm09.stdout:7/641: mkdir da/d11/d47/d5b/d6c/d9e/dc6/ddb 0
2026-03-09T17:30:25.479 INFO:tasks.workunit.client.1.vm09.stdout:7/642: dread da/d11/d47/d5b/d78/fab [0,4194304] 0
2026-03-09T17:30:25.505 INFO:tasks.workunit.client.1.vm09.stdout:9/520: creat d5/de/d29/da7/fb3 x:0 0 0
2026-03-09T17:30:25.521 INFO:tasks.workunit.client.1.vm09.stdout:8/553: truncate d1/da/dd/d47/f82 310694 0
2026-03-09T17:30:25.581 INFO:tasks.workunit.client.1.vm09.stdout:2/513: dread d13/d15/d34/f44 [0,4194304] 0
2026-03-09T17:30:25.581 INFO:tasks.workunit.client.1.vm09.stdout:2/514: readlink d13/d15/d36/l52 0
2026-03-09T17:30:25.589 INFO:tasks.workunit.client.1.vm09.stdout:4/542: symlink d11/d1e/d29/d36/d57/d78/lac 0
2026-03-09T17:30:25.591 INFO:tasks.workunit.client.1.vm09.stdout:6/525: rename d3/d7/f10 to d3/faf 0
2026-03-09T17:30:25.591 INFO:tasks.workunit.client.1.vm09.stdout:3/484: symlink d5/d9/d90/l92 0
2026-03-09T17:30:25.601 INFO:tasks.workunit.client.1.vm09.stdout:3/485: read d5/d9/f79 [56478,25572] 0
2026-03-09T17:30:25.603 INFO:tasks.workunit.client.1.vm09.stdout:9/521: rmdir d5/de/d29 39
2026-03-09T17:30:25.625 INFO:tasks.workunit.client.1.vm09.stdout:8/554: write d1/f16 [996649,44615] 0
2026-03-09T17:30:25.709 INFO:tasks.workunit.client.1.vm09.stdout:5/556: creat d0/d2/d76/d87/faf x:0 0 0
2026-03-09T17:30:25.711 INFO:tasks.workunit.client.1.vm09.stdout:4/543: creat d11/d1e/d29/d36/fad x:0 0 0
2026-03-09T17:30:25.711 INFO:tasks.workunit.client.1.vm09.stdout:1/522: creat d9/d5a/fa0 x:0 0 0
2026-03-09T17:30:25.713 INFO:tasks.workunit.client.1.vm09.stdout:2/515: dread d13/d15/d34/d45/f6a [0,4194304] 0
2026-03-09T17:30:25.713 INFO:tasks.workunit.client.1.vm09.stdout:2/516: write d13/d4d/f81 [613671,40212] 0
2026-03-09T17:30:25.714 INFO:tasks.workunit.client.1.vm09.stdout:6/526: mkdir d3/d7/d59/d73/db0 0
2026-03-09T17:30:25.715 INFO:tasks.workunit.client.1.vm09.stdout:7/643: unlink da/d11/d2d/d56/f50 0
2026-03-09T17:30:25.720 INFO:tasks.workunit.client.1.vm09.stdout:3/486: fdatasync d5/d9/d30/d65/f3e 0
2026-03-09T17:30:25.720 INFO:tasks.workunit.client.1.vm09.stdout:9/522: mkdir d5/d2e/d8b/db4 0
2026-03-09T17:30:25.739 INFO:tasks.workunit.client.1.vm09.stdout:4/544: dread - d11/d1e/d45/d60/f6c zero size
2026-03-09T17:30:25.741 INFO:tasks.workunit.client.1.vm09.stdout:2/517: rmdir d13/d15/d60/d85 39
2026-03-09T17:30:25.742 INFO:tasks.workunit.client.1.vm09.stdout:7/644: creat da/d11/d64/da7/fdc x:0 0 0
2026-03-09T17:30:25.744 INFO:tasks.workunit.client.1.vm09.stdout:9/523: symlink d5/d7e/d81/lb5 0
2026-03-09T17:30:25.745 INFO:tasks.workunit.client.1.vm09.stdout:4/545: sync
2026-03-09T17:30:25.747 INFO:tasks.workunit.client.1.vm09.stdout:0/531: getdents d6/d1d/d24/d32/d59/d9c 0
2026-03-09T17:30:25.748 INFO:tasks.workunit.client.1.vm09.stdout:0/532: write d6/d1d/d24/d32/d59/f99 [7085,120673] 0
2026-03-09T17:30:25.749 INFO:tasks.workunit.client.1.vm09.stdout:0/533: dread - d6/f6d zero size
2026-03-09T17:30:25.750 INFO:tasks.workunit.client.1.vm09.stdout:0/534: readlink d6/d1d/l1f 0
2026-03-09T17:30:25.751 INFO:tasks.workunit.client.1.vm09.stdout:0/535: fsync d6/d64/fa7 0
2026-03-09T17:30:25.753 INFO:tasks.workunit.client.1.vm09.stdout:0/536: write d6/d1d/d24/d32/d59/f5c [3771694,110057] 0
2026-03-09T17:30:25.757 INFO:tasks.workunit.client.1.vm09.stdout:5/557: link d0/ff d0/dc/d21/d26/d5e/fb0 0
2026-03-09T17:30:25.760 INFO:tasks.workunit.client.1.vm09.stdout:0/537: dread d6/d1d/f3c [0,4194304] 0
2026-03-09T17:30:25.778 INFO:tasks.workunit.client.1.vm09.stdout:6/527: mkdir d3/d21/db1 0
2026-03-09T17:30:25.783 INFO:tasks.workunit.client.1.vm09.stdout:1/523: dwrite d9/dc/dd/d40/d22/d37/f41 [0,4194304] 0
2026-03-09T17:30:25.835 INFO:tasks.workunit.client.1.vm09.stdout:3/487: symlink d5/d9/d30/d65/d59/d84/d8c/l93 0
2026-03-09T17:30:25.835 INFO:tasks.workunit.client.1.vm09.stdout:2/518: creat d13/d15/d36/d72/f9b x:0 0 0
2026-03-09T17:30:25.836 INFO:tasks.workunit.client.1.vm09.stdout:9/524: creat d5/d2e/d8b/fb6 x:0 0 0
2026-03-09T17:30:25.837 INFO:tasks.workunit.client.1.vm09.stdout:3/488: dread - d5/d9/d30/d65/d59/f81 zero size
2026-03-09T17:30:25.837 INFO:tasks.workunit.client.1.vm09.stdout:3/489: write d5/d16/f45 [8398008,65412] 0
2026-03-09T17:30:25.839 INFO:tasks.workunit.client.1.vm09.stdout:0/538: mknod d6/d1d/d46/cb3 0
2026-03-09T17:30:25.840 INFO:tasks.workunit.client.1.vm09.stdout:9/525: sync
2026-03-09T17:30:25.842 INFO:tasks.workunit.client.1.vm09.stdout:1/524: dread d9/f11 [4194304,4194304] 0
2026-03-09T17:30:25.850 INFO:tasks.workunit.client.1.vm09.stdout:2/519: creat d13/d15/d3b/f9c x:0 0 0
2026-03-09T17:30:25.852 INFO:tasks.workunit.client.1.vm09.stdout:2/520: sync
2026-03-09T17:30:25.854 INFO:tasks.workunit.client.1.vm09.stdout:8/555: getdents d1/d14/d2a/d49 0
2026-03-09T17:30:25.870 INFO:tasks.workunit.client.1.vm09.stdout:7/645: write da/d11/f6a [116758,101653] 0
2026-03-09T17:30:25.872 INFO:tasks.workunit.client.1.vm09.stdout:3/490: unlink d5/d16/d25/l3b 0
2026-03-09T17:30:25.876 INFO:tasks.workunit.client.1.vm09.stdout:4/546: dwrite d11/d1e/d83/d89/d8b/f53 [0,4194304] 0
2026-03-09T17:30:25.878 INFO:tasks.workunit.client.1.vm09.stdout:7/646: dwrite da/d11/d47/fc8 [0,4194304] 0
2026-03-09T17:30:25.922 INFO:tasks.workunit.client.1.vm09.stdout:1/525: creat d9/dc/dd/d40/d1d/fa1 x:0 0 0
2026-03-09T17:30:25.924 INFO:tasks.workunit.client.1.vm09.stdout:1/526: chown d9/dc/dd/d40/d22/d37/f2e 210943189 1
2026-03-09T17:30:25.934 INFO:tasks.workunit.client.1.vm09.stdout:8/556: creat d1/d14/d2a/d49/fac x:0 0 0
2026-03-09T17:30:25.938 INFO:tasks.workunit.client.1.vm09.stdout:5/558: getdents d0/dc 0
2026-03-09T17:30:25.944 INFO:tasks.workunit.client.1.vm09.stdout:6/528: write d3/faf [2214603,107934] 0
2026-03-09T17:30:25.946 INFO:tasks.workunit.client.1.vm09.stdout:9/526: write d5/de/d4e/d6e/d93/fb0 [409275,266] 0
2026-03-09T17:30:25.954 INFO:tasks.workunit.client.1.vm09.stdout:2/521: dwrite d13/d15/f74 [0,4194304] 0
2026-03-09T17:30:25.962 INFO:tasks.workunit.client.1.vm09.stdout:2/522: read d13/d15/d21/f3e [2806150,67663] 0
2026-03-09T17:30:25.967 INFO:tasks.workunit.client.1.vm09.stdout:2/523: dwrite d13/d15/f7e [0,4194304] 0
2026-03-09T17:30:25.976 INFO:tasks.workunit.client.1.vm09.stdout:2/524: dwrite d13/f8b [0,4194304] 0
2026-03-09T17:30:25.986 INFO:tasks.workunit.client.1.vm09.stdout:2/525: sync
2026-03-09T17:30:25.987 INFO:tasks.workunit.client.1.vm09.stdout:0/539: symlink d6/lb4 0
2026-03-09T17:30:25.987 INFO:tasks.workunit.client.1.vm09.stdout:0/540: stat d6/d1d/l47 0
2026-03-09T17:30:25.988 INFO:tasks.workunit.client.1.vm09.stdout:0/541: chown d6/d1d/f3c 9886505 1
2026-03-09T17:30:25.988 INFO:tasks.workunit.client.1.vm09.stdout:0/542: readlink d6/d1d/d24/l6e 0
2026-03-09T17:30:25.990 INFO:tasks.workunit.client.1.vm09.stdout:7/647: creat da/d11/d47/d5b/d78/fdd x:0 0 0
2026-03-09T17:30:25.991 INFO:tasks.workunit.client.1.vm09.stdout:7/648: chown da/d11/d2d/d56/d68/cad 8201725 1
2026-03-09T17:30:25.998 INFO:tasks.workunit.client.1.vm09.stdout:1/527: rename d9/d38/d61/c7a to d9/dc/dd/d40/d22/d37/d3f/d42/ca2 0
2026-03-09T17:30:26.002 INFO:tasks.workunit.client.1.vm09.stdout:8/557: truncate d1/f7 6910031 0
2026-03-09T17:30:26.002 INFO:tasks.workunit.client.1.vm09.stdout:8/558: dread - d1/d14/d2a/d42/d43/fa4 zero size
2026-03-09T17:30:26.019 INFO:tasks.workunit.client.1.vm09.stdout:5/559: rmdir d0 39
2026-03-09T17:30:26.028 INFO:tasks.workunit.client.1.vm09.stdout:9/527: mkdir d5/d2e/d70/d84/db7 0
2026-03-09T17:30:26.030 INFO:tasks.workunit.client.1.vm09.stdout:3/491: link d5/d16/d31/d37/f76 d5/d16/d31/d37/f94 0
2026-03-09T17:30:26.032 INFO:tasks.workunit.client.1.vm09.stdout:9/528: dread d5/f13 [4194304,4194304] 0
2026-03-09T17:30:26.198 INFO:tasks.workunit.client.1.vm09.stdout:4/547: fsync d11/d1e/f8c 0
2026-03-09T17:30:26.204 INFO:tasks.workunit.client.1.vm09.stdout:2/526: symlink d13/d15/d60/d90/l9d 0
2026-03-09T17:30:26.205 INFO:tasks.workunit.client.1.vm09.stdout:2/527: write d13/d15/d3b/d43/f96 [136984,81064] 0
2026-03-09T17:30:26.206 INFO:tasks.workunit.client.1.vm09.stdout:0/543: mkdir d6/d64/db5 0
2026-03-09T17:30:26.208 INFO:tasks.workunit.client.1.vm09.stdout:0/544: dread d6/d1d/d24/d32/d59/d81/f82 [0,4194304] 0
2026-03-09T17:30:26.210 INFO:tasks.workunit.client.1.vm09.stdout:1/528: creat d9/d5a/fa3 x:0 0 0
2026-03-09T17:30:26.217 INFO:tasks.workunit.client.1.vm09.stdout:8/559: creat d1/da/dd/d77/fad x:0 0 0
2026-03-09T17:30:26.218 INFO:tasks.workunit.client.1.vm09.stdout:6/529: getdents d3/d21/d25/d91/d98/da0 0
2026-03-09T17:30:26.221 INFO:tasks.workunit.client.1.vm09.stdout:2/528: dread d13/d15/f1d [0,4194304] 0
2026-03-09T17:30:26.239 INFO:tasks.workunit.client.1.vm09.stdout:9/529: dwrite d5/de/d29/f52 [0,4194304] 0
2026-03-09T17:30:26.260 INFO:tasks.workunit.client.1.vm09.stdout:0/545: dwrite d6/d1d/d24/d32/d59/fb0 [0,4194304] 0
2026-03-09T17:30:26.272 INFO:tasks.workunit.client.1.vm09.stdout:0/546: dwrite d6/d1d/d24/d5e/f8a [0,4194304] 0
2026-03-09T17:30:26.279 INFO:tasks.workunit.client.1.vm09.stdout:0/547: dread d6/d1d/d24/d32/d59/fb0 [0,4194304] 0
2026-03-09T17:30:26.290 INFO:tasks.workunit.client.1.vm09.stdout:0/548: dwrite d6/d1d/d24/d32/d59/d81/f90 [0,4194304] 0
2026-03-09T17:30:26.292 INFO:tasks.workunit.client.1.vm09.stdout:2/529: mknod d13/d15/d34/d45/c9e 0
2026-03-09T17:30:26.301 INFO:tasks.workunit.client.1.vm09.stdout:0/549: readlink d6/d1d/d39/l2f 0
2026-03-09T17:30:26.301 INFO:tasks.workunit.client.1.vm09.stdout:0/550: readlink d6/d1d/d24/d32/l5a 0
2026-03-09T17:30:26.301 INFO:tasks.workunit.client.1.vm09.stdout:2/530: dwrite d13/d15/d34/d37/d66/f80 [0,4194304] 0
2026-03-09T17:30:26.332 INFO:tasks.workunit.client.1.vm09.stdout:2/531: creat d13/d15/d60/d90/f9f x:0 0 0
2026-03-09T17:30:26.336 INFO:tasks.workunit.client.1.vm09.stdout:5/560: truncate d0/dc/f37 1052543 0
2026-03-09T17:30:26.341 INFO:tasks.workunit.client.1.vm09.stdout:8/560: link d1/c1a d1/d14/d2a/d42/d5d/d8a/cae 0
2026-03-09T17:30:26.342 INFO:tasks.workunit.client.1.vm09.stdout:5/561: symlink d0/dc/d21/d26/d5e/d68/d79/lb1 0
2026-03-09T17:30:26.343 INFO:tasks.workunit.client.1.vm09.stdout:9/530: getdents d5/d2e/d70 0
2026-03-09T17:30:26.350 INFO:tasks.workunit.client.1.vm09.stdout:9/531: mkdir d5/de/d29/d33/db8 0
2026-03-09T17:30:26.354 INFO:tasks.workunit.client.1.vm09.stdout:9/532: dwrite d5/de/d4e/d6e/d93/faa [0,4194304] 0
2026-03-09T17:30:26.357 INFO:tasks.workunit.client.1.vm09.stdout:9/533: creat d5/de/d29/d90/fb9 x:0 0 0
2026-03-09T17:30:26.368 INFO:tasks.workunit.client.1.vm09.stdout:9/534: symlink d5/de/d4e/lba 0
2026-03-09T17:30:26.373 INFO:tasks.workunit.client.1.vm09.stdout:9/535: getdents d5/d2e/d70 0
2026-03-09T17:30:26.432 INFO:tasks.workunit.client.1.vm09.stdout:7/649: rmdir da/d11/d2d/d56 39
2026-03-09T17:30:26.434 INFO:tasks.workunit.client.1.vm09.stdout:7/650: truncate da/d11/d2d/d49/f52 743556 0
2026-03-09T17:30:26.437 INFO:tasks.workunit.client.1.vm09.stdout:7/651: link da/d11/d47/d5b/d78/fdd da/d11/d47/d5b/d6c/d9e/d4e/d4c/fde 0
2026-03-09T17:30:26.448 INFO:tasks.workunit.client.1.vm09.stdout:7/652: dread da/d11/d47/d5b/d6c/d9e/d4e/fd9 [0,4194304] 0
2026-03-09T17:30:26.449 INFO:tasks.workunit.client.1.vm09.stdout:7/653: chown da/d11/d64/da7/db1/fb6 357429 1
2026-03-09T17:30:26.453 INFO:tasks.workunit.client.1.vm09.stdout:7/654: dwrite da/d11/d47/d5b/d6c/d9e/f35 [0,4194304] 0
2026-03-09T17:30:26.462 INFO:tasks.workunit.client.1.vm09.stdout:7/655: dread da/f36 [0,4194304] 0
2026-03-09T17:30:26.464 INFO:tasks.workunit.client.1.vm09.stdout:7/656: symlink da/d11/d47/d89/dbe/ldf 0
2026-03-09T17:30:26.466 INFO:tasks.workunit.client.1.vm09.stdout:7/657: getdents da/d11/d64/da7/db1 0
2026-03-09T17:30:26.561 INFO:tasks.workunit.client.1.vm09.stdout:4/548: dwrite d11/fa4 [0,4194304] 0
2026-03-09T17:30:26.702 INFO:tasks.workunit.client.1.vm09.stdout:0/551: dwrite d6/d1d/d24/f50 [0,4194304] 0
2026-03-09T17:30:26.702 INFO:tasks.workunit.client.1.vm09.stdout:6/530: dwrite d3/d21/d76/d5c/d61/f60 [0,4194304] 0
2026-03-09T17:30:26.710 INFO:tasks.workunit.client.1.vm09.stdout:6/531: creat d3/d21/d25/d26/fb2 x:0 0 0
2026-03-09T17:30:26.710 INFO:tasks.workunit.client.1.vm09.stdout:8/561: write d1/da/dd/d79/f83 [763335,62333] 0
2026-03-09T17:30:26.716 INFO:tasks.workunit.client.1.vm09.stdout:6/532: mknod d3/d7/d59/d5a/cb3 0
2026-03-09T17:30:26.716 INFO:tasks.workunit.client.1.vm09.stdout:5/562: truncate d0/d2/f31 1663461 0
2026-03-09T17:30:26.716 INFO:tasks.workunit.client.1.vm09.stdout:5/563: chown d0/d2 193 1
2026-03-09T17:30:26.721 INFO:tasks.workunit.client.1.vm09.stdout:3/492: rename d5/d9/d30/c68 to d5/d9/d30/c95 0
2026-03-09T17:30:26.722 INFO:tasks.workunit.client.1.vm09.stdout:5/564: read d0/d52/d20/f25 [658221,18428] 0
2026-03-09T17:30:26.723 INFO:tasks.workunit.client.1.vm09.stdout:6/533: link d3/d21/d76/d3f/f42 d3/d21/d25/d91/d9a/fb4 0
2026-03-09T17:30:26.726 INFO:tasks.workunit.client.1.vm09.stdout:3/493: dread d5/d16/d46/f63 [0,4194304] 0
2026-03-09T17:30:26.727 INFO:tasks.workunit.client.1.vm09.stdout:6/534: creat d3/d7/d99/fb5 x:0 0 0
2026-03-09T17:30:26.728 INFO:tasks.workunit.client.1.vm09.stdout:6/535: write d3/d21/d76/d5c/f6d [2269131,124898] 0
2026-03-09T17:30:26.729 INFO:tasks.workunit.client.1.vm09.stdout:2/532: link d13/f89 d13/d15/d34/d37/fa0 0
2026-03-09T17:30:26.733 INFO:tasks.workunit.client.1.vm09.stdout:3/494: dwrite d5/d16/d31/d3d/fe [4194304,4194304] 0
2026-03-09T17:30:26.734 INFO:tasks.workunit.client.1.vm09.stdout:3/495: dread - d5/d9/d30/d65/d59/d84/f86 zero size
2026-03-09T17:30:26.737 INFO:tasks.workunit.client.1.vm09.stdout:2/533: creat d13/d15/d36/fa1 x:0 0 0
2026-03-09T17:30:26.741 INFO:tasks.workunit.client.1.vm09.stdout:2/534: rename d13/d15/d34/d69/d93 to d13/d15/d2c/da2 0
2026-03-09T17:30:26.746 INFO:tasks.workunit.client.1.vm09.stdout:3/496: getdents d5/d9/d30/d65 0
2026-03-09T17:30:26.747 INFO:tasks.workunit.client.1.vm09.stdout:3/497: write d5/d16/f45 [1881166,65674] 0
2026-03-09T17:30:26.748 INFO:tasks.workunit.client.1.vm09.stdout:3/498: rename d5/d16/d31 to d5/d16/d31/d37/d58/d64/d96 22
2026-03-09T17:30:26.748 INFO:tasks.workunit.client.1.vm09.stdout:2/535: link d13/d15/d34/d45/f57 d13/fa3 0
2026-03-09T17:30:26.749 INFO:tasks.workunit.client.1.vm09.stdout:3/499: symlink d5/d9/d30/d65/d59/l97 0
2026-03-09T17:30:26.750 INFO:tasks.workunit.client.1.vm09.stdout:3/500: write d5/d16/d31/d37/d58/f91 [439665,85187] 0
2026-03-09T17:30:26.760 INFO:tasks.workunit.client.1.vm09.stdout:9/536: dwrite d5/d21/f2f [4194304,4194304] 0
2026-03-09T17:30:26.760 INFO:tasks.workunit.client.1.vm09.stdout:9/537: fdatasync d5/de/d29/f57 0
2026-03-09T17:30:26.763 INFO:tasks.workunit.client.1.vm09.stdout:9/538: truncate d5/de/d29/da7/fb3 909155 0
2026-03-09T17:30:26.767 INFO:tasks.workunit.client.1.vm09.stdout:0/552: rmdir d6/d1d/d24/d5e/d86 39
2026-03-09T17:30:26.768 INFO:tasks.workunit.client.1.vm09.stdout:3/501: dread d5/d9/d30/d65/f43 [0,4194304] 0
2026-03-09T17:30:26.768 INFO:tasks.workunit.client.1.vm09.stdout:0/553: mknod d6/d64/d94/cb6 0
2026-03-09T17:30:26.772 INFO:tasks.workunit.client.1.vm09.stdout:3/502: creat d5/d16/d85/f98 x:0 0 0
2026-03-09T17:30:26.774 INFO:tasks.workunit.client.1.vm09.stdout:8/562: dread d1/d14/d2a/d42/d5d/d8a/f99 [0,4194304] 0
2026-03-09T17:30:26.774 INFO:tasks.workunit.client.1.vm09.stdout:8/563: chown d1/da/d23/d6c/d32 6905257 1
2026-03-09T17:30:26.783 INFO:tasks.workunit.client.1.vm09.stdout:8/564: dwrite d1/d14/f2f [0,4194304] 0
2026-03-09T17:30:26.783 INFO:tasks.workunit.client.1.vm09.stdout:0/554: getdents d6/d1d/d24/d5e/d86 0
2026-03-09T17:30:26.783 INFO:tasks.workunit.client.1.vm09.stdout:8/565: write d1/d14/d2a/d42/f46 [4044400,94449] 0
2026-03-09T17:30:26.784 INFO:tasks.workunit.client.1.vm09.stdout:0/555: dread d6/d1d/d24/d32/d59/d81/f82 [0,4194304] 0
2026-03-09T17:30:26.785 INFO:tasks.workunit.client.1.vm09.stdout:0/556: write d6/d1d/d24/d32/d59/d81/f90 [265492,28363] 0
2026-03-09T17:30:26.792 INFO:tasks.workunit.client.1.vm09.stdout:0/557: creat d6/d93/fb7 x:0 0 0
2026-03-09T17:30:26.804 INFO:tasks.workunit.client.1.vm09.stdout:8/566: dread d1/d14/f3d [0,4194304] 0
2026-03-09T17:30:26.809 INFO:tasks.workunit.client.1.vm09.stdout:8/567: creat d1/da/dd/faf x:0 0 0
2026-03-09T17:30:26.812 INFO:tasks.workunit.client.1.vm09.stdout:9/539: sync
2026-03-09T17:30:26.812 INFO:tasks.workunit.client.1.vm09.stdout:0/558: sync
2026-03-09T17:30:26.813 INFO:tasks.workunit.client.1.vm09.stdout:9/540: write d5/de/d29/d33/f3b [2349928,81925] 0
2026-03-09T17:30:26.818 INFO:tasks.workunit.client.1.vm09.stdout:8/568: symlink d1/d14/d2a/d49/lb0 0
2026-03-09T17:30:26.818 INFO:tasks.workunit.client.1.vm09.stdout:8/569: chown d1/d14/d31/d97 32021 1
2026-03-09T17:30:26.819 INFO:tasks.workunit.client.1.vm09.stdout:7/658: write da/d11/d77/f79 [1682637,37085] 0
2026-03-09T17:30:26.820 INFO:tasks.workunit.client.1.vm09.stdout:7/659: chown da/d11/d47/d5b/d6c/d9e/d4e/f2b 6015 1
2026-03-09T17:30:26.824 INFO:tasks.workunit.client.1.vm09.stdout:7/660: fsync da/d11/d77/f79 0
2026-03-09T17:30:26.830 INFO:tasks.workunit.client.1.vm09.stdout:8/570: dread d1/d14/d31/d97/fab [0,4194304] 0
2026-03-09T17:30:26.844 INFO:tasks.workunit.client.1.vm09.stdout:1/529: mkdir d9/dc/dd/d40/d22/d37/da4 0
2026-03-09T17:30:26.845 INFO:tasks.workunit.client.1.vm09.stdout:7/661: rmdir da/d11/d47/d5b/d6c 39
2026-03-09T17:30:26.849 INFO:tasks.workunit.client.1.vm09.stdout:8/571: chown d1/da/d3a/l9d 0 1
2026-03-09T17:30:26.849 INFO:tasks.workunit.client.1.vm09.stdout:8/572: fdatasync d1/da/dd/f9a 0
2026-03-09T17:30:26.850 INFO:tasks.workunit.client.1.vm09.stdout:8/573: chown l0 22896 1
2026-03-09T17:30:26.855 INFO:tasks.workunit.client.1.vm09.stdout:1/530: unlink d9/dc/dd/d40/f73 0
2026-03-09T17:30:26.861 INFO:tasks.workunit.client.1.vm09.stdout:7/662: creat da/d11/d2d/d49/fe0 x:0 0 0
2026-03-09T17:30:26.868 INFO:tasks.workunit.client.1.vm09.stdout:8/574: symlink d1/da/d23/d6c/d32/lb1 0
2026-03-09T17:30:26.884 INFO:tasks.workunit.client.1.vm09.stdout:8/575: creat d1/da/d23/d6c/fb2 x:0 0 0
2026-03-09T17:30:26.885 INFO:tasks.workunit.client.1.vm09.stdout:7/663: dwrite da/d11/d47/d5b/d6c/f73 [0,4194304] 0
2026-03-09T17:30:26.887 INFO:tasks.workunit.client.1.vm09.stdout:1/531: creat d9/dc/dd/d40/d22/d91/d99/fa5 x:0 0 0
2026-03-09T17:30:26.892 INFO:tasks.workunit.client.1.vm09.stdout:8/576: unlink d1/d14/d2a/d49/c7e 0
2026-03-09T17:30:26.897 INFO:tasks.workunit.client.1.vm09.stdout:8/577: write d1/da/dd/f9a [2844384,61265] 0
2026-03-09T17:30:26.901 INFO:tasks.workunit.client.1.vm09.stdout:1/532: dread - d9/dc/dd/d40/d22/f50 zero size
2026-03-09T17:30:26.908 INFO:tasks.workunit.client.1.vm09.stdout:1/533: dwrite d9/dc/dd/d40/d22/d37/d3f/d42/d55/f69 [0,4194304] 0
2026-03-09T17:30:26.914 INFO:tasks.workunit.client.1.vm09.stdout:1/534: dwrite d9/dc/dd/d40/d1d/f17 [4194304,4194304] 0
2026-03-09T17:30:26.932 INFO:tasks.workunit.client.1.vm09.stdout:4/549: truncate d11/d1e/d29/f6d 3458651 0
2026-03-09T17:30:26.954 INFO:tasks.workunit.client.1.vm09.stdout:5/565: dwrite d0/d2/f31 [0,4194304] 0
2026-03-09T17:30:26.958 INFO:tasks.workunit.client.1.vm09.stdout:5/566: symlink d0/dc/d21/d26/d5e/d68/d6d/lb2 0
2026-03-09T17:30:26.959 INFO:tasks.workunit.client.1.vm09.stdout:5/567: chown d0/dc/d21/d6f/d42 4565868 1
2026-03-09T17:30:26.965 INFO:tasks.workunit.client.1.vm09.stdout:5/568: link d0/d52/d20/f63 d0/d9/d8b/fb3 0
2026-03-09T17:30:26.966 INFO:tasks.workunit.client.1.vm09.stdout:5/569: write d0/dc/d21/d6f/f80 [1870338,11988] 0
2026-03-09T17:30:26.967 INFO:tasks.workunit.client.1.vm09.stdout:5/570: mknod d0/d9/d8b/cb4 0
2026-03-09T17:30:26.968 INFO:tasks.workunit.client.1.vm09.stdout:5/571: chown d0/d2/d76/d87/fa5 47 1
2026-03-09T17:30:26.970 INFO:tasks.workunit.client.1.vm09.stdout:5/572: symlink d0/dc/d21/d26/d5e/d68/d79/lb5 0
2026-03-09T17:30:26.971 INFO:tasks.workunit.client.1.vm09.stdout:2/536: getdents d13/d15/d36 0
2026-03-09T17:30:26.972 INFO:tasks.workunit.client.1.vm09.stdout:5/573: write d0/d2/d76/d87/d95/f9a [358483,3143] 0
2026-03-09T17:30:26.973 INFO:tasks.workunit.client.1.vm09.stdout:6/536: truncate d3/d21/d76/d3f/f51 513585 0
2026-03-09T17:30:26.982 INFO:tasks.workunit.client.1.vm09.stdout:5/574: rmdir d0/dc 39
2026-03-09T17:30:26.990 INFO:tasks.workunit.client.1.vm09.stdout:2/537: unlink d13/c16 0
2026-03-09T17:30:26.991 INFO:tasks.workunit.client.1.vm09.stdout:5/575: mkdir d0/d9/d74/d75/d9f/db6 0
2026-03-09T17:30:26.996 INFO:tasks.workunit.client.1.vm09.stdout:4/550: dread d11/f25 [4194304,4194304] 0
2026-03-09T17:30:26.997 INFO:tasks.workunit.client.1.vm09.stdout:5/576: mkdir d0/d46/d4b/db7 0
2026-03-09T17:30:26.998 INFO:tasks.workunit.client.1.vm09.stdout:6/537: link d3/d21/d76/d3f/c4e d3/d21/db1/cb6 0
2026-03-09T17:30:26.998 INFO:tasks.workunit.client.1.vm09.stdout:3/503: write d5/d9/d30/f41 [70261,106509] 0
2026-03-09T17:30:27.001 INFO:tasks.workunit.client.1.vm09.stdout:5/577: write d0/dc/d21/f62 [215083,56657] 0
2026-03-09T17:30:27.003 INFO:tasks.workunit.client.1.vm09.stdout:6/538: dwrite d3/d7/f23 [4194304,4194304] 0
2026-03-09T17:30:27.011 INFO:tasks.workunit.client.1.vm09.stdout:9/541: write d5/d2e/f5e [3929141,41845] 0
2026-03-09T17:30:27.011 INFO:tasks.workunit.client.1.vm09.stdout:6/539: write d3/d7/d59/d73/fa3 [672603,3598] 0
2026-03-09T17:30:27.011 INFO:tasks.workunit.client.1.vm09.stdout:0/559: dwrite d6/d1d/f57 [0,4194304] 0
2026-03-09T17:30:27.011 INFO:tasks.workunit.client.1.vm09.stdout:5/578: symlink d0/d46/lb8 0
2026-03-09T17:30:27.011 INFO:tasks.workunit.client.1.vm09.stdout:5/579: stat d0/dc/d21/d6f/d42 0
2026-03-09T17:30:27.012 INFO:tasks.workunit.client.1.vm09.stdout:3/504: sync
2026-03-09T17:30:27.014 INFO:tasks.workunit.client.1.vm09.stdout:2/538: getdents d13/d15/d60/d90 0
2026-03-09T17:30:27.017 INFO:tasks.workunit.client.1.vm09.stdout:9/542: unlink d5/de/d29/f57 0
2026-03-09T17:30:27.024 INFO:tasks.workunit.client.1.vm09.stdout:4/551: creat d11/fae x:0 0 0
2026-03-09T17:30:27.035 INFO:tasks.workunit.client.1.vm09.stdout:9/543: rmdir d5/de/d29/da7 39
2026-03-09T17:30:27.040 INFO:tasks.workunit.client.1.vm09.stdout:7/664: dwrite da/d11/d47/d5b/d78/f80 [0,4194304] 0
2026-03-09T17:30:27.041 INFO:tasks.workunit.client.1.vm09.stdout:8/578: dwrite d1/da/dd/d47/f66 [0,4194304] 0
2026-03-09T17:30:27.048 INFO:tasks.workunit.client.1.vm09.stdout:5/580: symlink d0/d2/d76/d87/lb9 0
2026-03-09T17:30:27.052 INFO:tasks.workunit.client.1.vm09.stdout:3/505: link d5/d16/d31/d37/d58/d64/f7c d5/d9/d30/d65/d59/d84/d8c/f99 0
2026-03-09T17:30:27.060 INFO:tasks.workunit.client.1.vm09.stdout:1/535: truncate d9/dc/dd/d40/d1d/f17 517438 0
2026-03-09T17:30:27.064 INFO:tasks.workunit.client.1.vm09.stdout:2/539: mkdir d13/da4 0
2026-03-09T17:30:27.073 INFO:tasks.workunit.client.1.vm09.stdout:6/540: dwrite d3/d21/d76/d3f/f51 [0,4194304] 0
2026-03-09T17:30:27.074 INFO:tasks.workunit.client.1.vm09.stdout:4/552: mkdir d11/d1e/d45/daf 0
2026-03-09T17:30:27.079 INFO:tasks.workunit.client.1.vm09.stdout:5/581: sync
2026-03-09T17:30:27.084 INFO:tasks.workunit.client.1.vm09.stdout:9/544: truncate d5/d2e/d70/f75 1475198 0
2026-03-09T17:30:27.095 INFO:tasks.workunit.client.1.vm09.stdout:8/579: creat d1/da/d23/fb3 x:0 0 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:2/540: symlink d13/d15/d34/d45/la5 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:6/541: mkdir d3/d21/d25/d26/db7 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:5/582: symlink d0/dc/d21/d33/lba 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:0/560: getdents d6/d64/d97 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:8/580: stat d1/d14/d31/c41 0
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:0/561: chown d6/d64/db5 5 1
2026-03-09T17:30:27.103 INFO:tasks.workunit.client.1.vm09.stdout:0/562: read d6/d1d/d24/d5e/f8a [1563898,87025] 0
2026-03-09T17:30:27.106 INFO:tasks.workunit.client.1.vm09.stdout:0/563: sync
2026-03-09T17:30:27.111 INFO:tasks.workunit.client.1.vm09.stdout:1/536: mknod d9/dc/dd/d40/d21/ca6 0
2026-03-09T17:30:27.120 INFO:tasks.workunit.client.1.vm09.stdout:6/542: dwrite d3/f97 [0,4194304] 0
2026-03-09T17:30:27.121 INFO:tasks.workunit.client.1.vm09.stdout:2/541: dwrite d13/d15/d21/f24 [0,4194304] 0
2026-03-09T17:30:27.122 INFO:tasks.workunit.client.1.vm09.stdout:1/537: dwrite d9/d5a/fa0 [0,4194304] 0
2026-03-09T17:30:27.124 INFO:tasks.workunit.client.1.vm09.stdout:2/542: write d13/d15/d34/d69/f7c [1134419,117486] 0
2026-03-09T17:30:27.132 INFO:tasks.workunit.client.1.vm09.stdout:9/545: rename d5/de/c50 to d5/de/d29/d33/d94/cbb 0
2026-03-09T17:30:27.134 INFO:tasks.workunit.client.1.vm09.stdout:9/546: write d5/de/d4e/d6e/d93/faa [3837634,92056] 0
2026-03-09T17:30:27.137 INFO:tasks.workunit.client.1.vm09.stdout:7/665: write da/d11/d47/d5b/d6c/d9e/d4e/f33 [4521424,76565] 0
2026-03-09T17:30:27.137 INFO:tasks.workunit.client.1.vm09.stdout:7/666: readlink da/d11/l9a 0
2026-03-09T17:30:27.140 INFO:tasks.workunit.client.1.vm09.stdout:3/506: dwrite d5/d16/d31/f57 [0,4194304] 0
2026-03-09T17:30:27.147 INFO:tasks.workunit.client.1.vm09.stdout:3/507: truncate d5/d16/d31/f34 9261353 0
2026-03-09T17:30:27.147 INFO:tasks.workunit.client.1.vm09.stdout:3/508: fsync d5/d9/d30/f41 0
2026-03-09T17:30:27.151 INFO:tasks.workunit.client.1.vm09.stdout:7/667: dwrite da/d11/d2d/d49/fe0 [0,4194304] 0
2026-03-09T17:30:27.152 INFO:tasks.workunit.client.1.vm09.stdout:7/668: dread - da/d11/d47/d5b/d6c/d9e/d4e/d4c/fde zero size
2026-03-09T17:30:27.155 INFO:tasks.workunit.client.1.vm09.stdout:5/583: unlink d0/d2/f2a 0
2026-03-09T17:30:27.160 INFO:tasks.workunit.client.1.vm09.stdout:5/584: dwrite d0/d9/d16/d5c/f70 [0,4194304] 0
2026-03-09T17:30:27.162 INFO:tasks.workunit.client.1.vm09.stdout:0/564: unlink d6/d1d/d24/f8e 0
2026-03-09T17:30:27.183 INFO:tasks.workunit.client.1.vm09.stdout:4/553: creat d11/fb0 x:0 0 0
2026-03-09T17:30:27.187 INFO:tasks.workunit.client.1.vm09.stdout:4/554: dwrite d11/d1e/d83/d89/d8b/f53 [0,4194304] 0
2026-03-09T17:30:27.187 INFO:tasks.workunit.client.1.vm09.stdout:4/555: read - d11/d1e/d83/d89/fa8 zero size
2026-03-09T17:30:27.193 INFO:tasks.workunit.client.1.vm09.stdout:4/556: sync
2026-03-09T17:30:27.198 INFO:tasks.workunit.client.1.vm09.stdout:6/543: chown d3/d21/db1/cb6 60 1
2026-03-09T17:30:27.200 INFO:tasks.workunit.client.1.vm09.stdout:1/538: mknod d9/dc/dd/d9f/ca7 0
2026-03-09T17:30:27.205 INFO:tasks.workunit.client.1.vm09.stdout:2/543: creat d13/d15/d2c/da2/fa6 x:0 0 0
2026-03-09T17:30:27.216 INFO:tasks.workunit.client.1.vm09.stdout:8/581: mknod d1/cb4 0
2026-03-09T17:30:27.219 INFO:tasks.workunit.client.1.vm09.stdout:8/582: dwrite d1/da/f4b [4194304,4194304] 0
2026-03-09T17:30:27.228 INFO:tasks.workunit.client.1.vm09.stdout:5/585: dread d0/dc/d21/d33/f35 [0,4194304] 0
2026-03-09T17:30:27.241 INFO:tasks.workunit.client.1.vm09.stdout:6/544: creat d3/d21/d76/d3f/fb8 x:0 0 0
2026-03-09T17:30:27.242 INFO:tasks.workunit.client.1.vm09.stdout:6/545: chown d3/d21/d25/c44 0 1
2026-03-09T17:30:27.250 INFO:tasks.workunit.client.1.vm09.stdout:1/539: dread d9/dc/dd/d40/d1d/f4d [0,4194304] 0
2026-03-09T17:30:27.254 INFO:tasks.workunit.client.1.vm09.stdout:7/669: dread da/d11/f25 [0,4194304] 0
2026-03-09T17:30:27.258 INFO:tasks.workunit.client.1.vm09.stdout:2/544: rename d13/d15/d34/d69 to d13/d15/d36/d72/d94/da7 0
2026-03-09T17:30:27.266 INFO:tasks.workunit.client.1.vm09.stdout:8/583: rmdir d1/da/dd/d63 39
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:8/584: chown d1/d14/d2a/d42/d5d/c5f 14 1
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:6/546: rmdir d3/d21/d76/d5c/d7e/d94 39
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:2/545: mknod d13/da4/ca8 0
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:6/547: mknod d3/d21/d25/d91/d98/cb9 0
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:6/548: chown d3/d21/d25/d91/d98/da0 369 1
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:2/546: dread d13/f8b [0,4194304] 0
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:6/549: read d3/d21/d76/d5c/d61/f53 [143698,1390] 0
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:2/547: write d13/d15/d3b/d43/f96 [614097,41940] 0
2026-03-09T17:30:27.285 INFO:tasks.workunit.client.1.vm09.stdout:2/548: rmdir d13/d15/d3b/d43 39
2026-03-09T17:30:27.286 INFO:tasks.workunit.client.1.vm09.stdout:6/550: dread d3/d21/d76/d5c/f92 [0,4194304] 0
2026-03-09T17:30:27.288 INFO:tasks.workunit.client.1.vm09.stdout:7/670: dread da/d11/d47/d5b/d6c/d9e/d4e/d5f/fd4 [0,4194304] 0
2026-03-09T17:30:27.289 INFO:tasks.workunit.client.1.vm09.stdout:6/551: read d3/d21/d25/d26/d6b/f79 [894968,108820] 0
2026-03-09T17:30:27.289 INFO:tasks.workunit.client.1.vm09.stdout:7/671: mkdir da/d11/d64/d84/de1 0
2026-03-09T17:30:27.291 INFO:tasks.workunit.client.1.vm09.stdout:7/672: readlink da/d11/d2d/d56/da1/lbf 0
2026-03-09T17:30:27.292 INFO:tasks.workunit.client.1.vm09.stdout:7/673: read da/d11/d77/f79 [1449132,68551] 0
2026-03-09T17:30:27.300 INFO:tasks.workunit.client.1.vm09.stdout:6/552: dread d3/faf [0,4194304] 0
2026-03-09T17:30:27.312 INFO:tasks.workunit.client.1.vm09.stdout:6/553: chown d3/d21/d25/d26/db7 1 1
2026-03-09T17:30:27.312 INFO:tasks.workunit.client.1.vm09.stdout:6/554: truncate d3/d21/d76/d5c/d9f/fa9 2553861 0
2026-03-09T17:30:27.312 INFO:tasks.workunit.client.1.vm09.stdout:6/555: rename d3/d21/d76/d5c/d61/l52 to d3/d21/d25/d26/db7/lba 0
2026-03-09T17:30:27.342 INFO:tasks.workunit.client.1.vm09.stdout:1/540: sync
2026-03-09T17:30:27.346 INFO:tasks.workunit.client.1.vm09.stdout:1/541: dwrite d9/dc/dd/f7b [0,4194304] 0
2026-03-09T17:30:27.354 INFO:tasks.workunit.client.1.vm09.stdout:1/542: mknod d9/dc/dd/d40/d21/d35/ca8 0
2026-03-09T17:30:27.355 INFO:tasks.workunit.client.1.vm09.stdout:1/543: creat d9/dc/fa9 x:0 0 0
2026-03-09T17:30:27.371 INFO:tasks.workunit.client.1.vm09.stdout:2/549: sync
2026-03-09T17:30:27.371 INFO:tasks.workunit.client.1.vm09.stdout:2/550: mknod d13/d4d/ca9 0
2026-03-09T17:30:27.371 INFO:tasks.workunit.client.1.vm09.stdout:1/544: mkdir d9/dc/d63/daa 0
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: pgmap v13: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 40 MiB/s rd, 96 MiB/s wr, 215 op/s
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]: dispatch
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]: dispatch
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: osdmap e40: 6 total, 6 up, 6 in
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd='[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]': finished
2026-03-09T17:30:27.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:27 vm06.local ceph-mon[57307]: mgrmap e26: vm06.pbgzei(active, starting, since 0.0309603s)
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: pgmap v13: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 40 MiB/s rd, 96 MiB/s wr, 215 op/s
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh'
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 192.168.123.109:0/3313592546' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]: dispatch
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]: dispatch
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: osdmap e40: 6 total, 6 up, 6 in
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: from='mgr.24477 ' entity='mgr.vm09.lqzvkh' cmd='[{"prefix": "mgr fail", "who": "vm09.lqzvkh"}]': finished
2026-03-09T17:30:27.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:27 vm09.local ceph-mon[62061]: mgrmap e26: vm06.pbgzei(active, starting, since 0.0309603s)
2026-03-09T17:30:27.409 INFO:tasks.workunit.client.1.vm09.stdout:8/585: dread d1/da/d23/d6c/f70 [0,4194304] 0
2026-03-09T17:30:27.411 INFO:tasks.workunit.client.1.vm09.stdout:6/556: dread d3/d21/d76/d5c/f78 [0,4194304] 0
2026-03-09T17:30:27.419 INFO:tasks.workunit.client.1.vm09.stdout:6/557: fdatasync d3/d21/d25/d91/fa4 0
2026-03-09T17:30:27.419 INFO:tasks.workunit.client.1.vm09.stdout:6/558: fdatasync d3/d7/fe 0
2026-03-09T17:30:27.419 INFO:tasks.workunit.client.1.vm09.stdout:6/559: mknod d3/d21/d76/cbb 0
2026-03-09T17:30:27.419 INFO:tasks.workunit.client.1.vm09.stdout:6/560: dwrite d3/d21/d76/d3f/fb8 [0,4194304] 0
2026-03-09T17:30:27.428 INFO:tasks.workunit.client.1.vm09.stdout:8/586: dread d1/da/d23/d6c/d32/f50 [0,4194304] 0
2026-03-09T17:30:27.428 INFO:tasks.workunit.client.1.vm09.stdout:6/561: write d3/d21/d76/d5c/f78 [368615,41588] 0
2026-03-09T17:30:27.431 INFO:tasks.workunit.client.1.vm09.stdout:8/587: link d1/d14/f9c d1/da/d23/d6c/d32/fb5 0
2026-03-09T17:30:27.432 INFO:tasks.workunit.client.1.vm09.stdout:8/588: chown d1/da/dd/d79/f83 57 1
2026-03-09T17:30:27.432 INFO:tasks.workunit.client.1.vm09.stdout:6/562: mkdir d3/d21/d25/d26/d86/dbc 0
2026-03-09T17:30:27.446 INFO:tasks.workunit.client.1.vm09.stdout:0/565: rename d6 to d6/d1d/d24/d32/d59/d81/db8 22
2026-03-09T17:30:27.494 INFO:tasks.workunit.client.1.vm09.stdout:4/557: write d11/f23 [525476,99526] 0
2026-03-09T17:30:27.496 INFO:tasks.workunit.client.1.vm09.stdout:3/509: dwrite d5/d16/d31/d37/f5b [0,4194304] 0 2026-03-09T17:30:27.502 INFO:tasks.workunit.client.1.vm09.stdout:9/547: truncate d5/de/d29/d33/f3b 2447180 0 2026-03-09T17:30:27.502 INFO:tasks.workunit.client.1.vm09.stdout:5/586: write d0/dc/d21/d6f/d42/f82 [618127,68484] 0 2026-03-09T17:30:27.515 INFO:tasks.workunit.client.1.vm09.stdout:7/674: dwrite da/d11/d2d/d56/f9f [0,4194304] 0 2026-03-09T17:30:27.530 INFO:tasks.workunit.client.1.vm09.stdout:0/566: write d6/d1d/d24/d32/d59/d81/f82 [655490,3914] 0 2026-03-09T17:30:27.537 INFO:tasks.workunit.client.1.vm09.stdout:2/551: dwrite d13/d15/d21/f3e [0,4194304] 0 2026-03-09T17:30:27.544 INFO:tasks.workunit.client.1.vm09.stdout:1/545: dwrite d9/dc/dd/d40/d21/d6f/f85 [4194304,4194304] 0 2026-03-09T17:30:27.545 INFO:tasks.workunit.client.1.vm09.stdout:8/589: dwrite d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:27.545 INFO:tasks.workunit.client.1.vm09.stdout:6/563: dwrite d3/d21/d25/f8b [0,4194304] 0 2026-03-09T17:30:27.558 INFO:tasks.workunit.client.1.vm09.stdout:0/567: read d6/d1d/d39/f44 [410388,67531] 0 2026-03-09T17:30:27.558 INFO:tasks.workunit.client.1.vm09.stdout:2/552: mkdir d13/d4d/daa 0 2026-03-09T17:30:27.559 INFO:tasks.workunit.client.1.vm09.stdout:1/546: creat d9/dc/dd/d40/d1d/fab x:0 0 0 2026-03-09T17:30:27.576 INFO:tasks.workunit.client.1.vm09.stdout:1/547: unlink d9/dc/dd/d40/d21/d35/f8e 0 2026-03-09T17:30:27.576 INFO:tasks.workunit.client.1.vm09.stdout:0/568: creat d6/d1d/d24/d5e/db2/fb9 x:0 0 0 2026-03-09T17:30:27.577 INFO:tasks.workunit.client.1.vm09.stdout:0/569: fsync d6/d1d/d24/d5e/d6c/fa5 0 2026-03-09T17:30:27.579 INFO:tasks.workunit.client.1.vm09.stdout:2/553: creat d13/d15/d3b/d43/fab x:0 0 0 2026-03-09T17:30:27.581 INFO:tasks.workunit.client.1.vm09.stdout:0/570: mkdir d6/d1d/d24/d32/d59/dba 0 2026-03-09T17:30:27.586 INFO:tasks.workunit.client.1.vm09.stdout:1/548: rmdir d9/dc/d63/daa 0 2026-03-09T17:30:27.594 
INFO:tasks.workunit.client.1.vm09.stdout:1/549: fdatasync d9/f59 0 2026-03-09T17:30:27.597 INFO:tasks.workunit.client.1.vm09.stdout:0/571: dread d6/d1d/d24/d32/f68 [0,4194304] 0 2026-03-09T17:30:27.598 INFO:tasks.workunit.client.1.vm09.stdout:0/572: fdatasync d6/d1d/d39/f44 0 2026-03-09T17:30:27.601 INFO:tasks.workunit.client.1.vm09.stdout:0/573: dread d6/f27 [0,4194304] 0 2026-03-09T17:30:27.607 INFO:tasks.workunit.client.1.vm09.stdout:8/590: read d1/f28 [3557673,51694] 0 2026-03-09T17:30:27.608 INFO:tasks.workunit.client.1.vm09.stdout:8/591: fdatasync d1/da/dd/d47/f66 0 2026-03-09T17:30:27.618 INFO:tasks.workunit.client.1.vm09.stdout:4/558: write d11/d1e/d29/f3b [2307401,120057] 0 2026-03-09T17:30:27.618 INFO:tasks.workunit.client.1.vm09.stdout:4/559: write d11/d1e/d29/f3b [1504878,6949] 0 2026-03-09T17:30:27.644 INFO:tasks.workunit.client.1.vm09.stdout:1/550: dread d9/f11 [0,4194304] 0 2026-03-09T17:30:27.644 INFO:tasks.workunit.client.1.vm09.stdout:1/551: stat d9/dc/dd/f7b 0 2026-03-09T17:30:27.661 INFO:tasks.workunit.client.1.vm09.stdout:9/548: write d5/de/d29/d33/f66 [7780803,130962] 0 2026-03-09T17:30:27.661 INFO:tasks.workunit.client.1.vm09.stdout:7/675: write da/d11/d3e/f88 [1210405,88062] 0 2026-03-09T17:30:27.663 INFO:tasks.workunit.client.1.vm09.stdout:1/552: link d9/dc/dd/d40/d22/d37/d3f/d42/d55/c64 d9/dc/dd/d40/d22/d37/d3f/d42/cac 0 2026-03-09T17:30:27.665 INFO:tasks.workunit.client.1.vm09.stdout:3/510: dwrite d5/f22 [4194304,4194304] 0 2026-03-09T17:30:27.677 INFO:tasks.workunit.client.1.vm09.stdout:1/553: mkdir d9/dc/dd/d40/d1d/dad 0 2026-03-09T17:30:27.677 INFO:tasks.workunit.client.1.vm09.stdout:6/564: truncate d3/d21/d76/d5c/d61/f60 1647118 0 2026-03-09T17:30:27.677 INFO:tasks.workunit.client.1.vm09.stdout:3/511: rmdir d5/d9/d30/d65/d59/d84/d8c 39 2026-03-09T17:30:27.685 INFO:tasks.workunit.client.1.vm09.stdout:5/587: read d0/d46/f56 [2517832,110218] 0 2026-03-09T17:30:27.686 INFO:tasks.workunit.client.1.vm09.stdout:5/588: dread - 
d0/d2/d76/d86/fac zero size 2026-03-09T17:30:27.686 INFO:tasks.workunit.client.1.vm09.stdout:0/574: rmdir d6/d1d/d24/d32/d59 39 2026-03-09T17:30:27.686 INFO:tasks.workunit.client.1.vm09.stdout:9/549: link d5/d2e/c43 d5/d2e/d70/d84/cbc 0 2026-03-09T17:30:27.687 INFO:tasks.workunit.client.1.vm09.stdout:2/554: write d13/d15/d34/f3a [5142941,15545] 0 2026-03-09T17:30:27.688 INFO:tasks.workunit.client.1.vm09.stdout:3/512: unlink d5/d16/d46/l8f 0 2026-03-09T17:30:27.698 INFO:tasks.workunit.client.1.vm09.stdout:0/575: dread d6/d1d/d46/f4d [4194304,4194304] 0 2026-03-09T17:30:27.698 INFO:tasks.workunit.client.1.vm09.stdout:0/576: dread - d6/d1d/d24/d5e/f9e zero size 2026-03-09T17:30:27.701 INFO:tasks.workunit.client.1.vm09.stdout:5/589: symlink d0/dc/d21/d26/lbb 0 2026-03-09T17:30:27.709 INFO:tasks.workunit.client.1.vm09.stdout:9/550: dwrite d5/de/d29/da7/fb3 [0,4194304] 0 2026-03-09T17:30:27.711 INFO:tasks.workunit.client.1.vm09.stdout:9/551: read - d5/de/d4e/d6e/d93/f7f zero size 2026-03-09T17:30:27.714 INFO:tasks.workunit.client.1.vm09.stdout:2/555: mknod d13/d15/d34/d45/d84/cac 0 2026-03-09T17:30:27.716 INFO:tasks.workunit.client.1.vm09.stdout:3/513: creat d5/d16/d31/d37/d58/d64/f9a x:0 0 0 2026-03-09T17:30:27.717 INFO:tasks.workunit.client.1.vm09.stdout:1/554: link d9/dc/dd/d9f/c8f d9/dc/dd/d40/d21/d35/cae 0 2026-03-09T17:30:27.722 INFO:tasks.workunit.client.1.vm09.stdout:0/577: unlink d6/d1d/d24/d32/f68 0 2026-03-09T17:30:27.726 INFO:tasks.workunit.client.1.vm09.stdout:5/590: rename d0/d9/f3e to d0/dc/d21/d26/d5e/fbc 0 2026-03-09T17:30:27.726 INFO:tasks.workunit.client.1.vm09.stdout:5/591: chown d0/d2 3 1 2026-03-09T17:30:27.727 INFO:tasks.workunit.client.1.vm09.stdout:4/560: write d11/f19 [704777,15723] 0 2026-03-09T17:30:27.733 INFO:tasks.workunit.client.1.vm09.stdout:8/592: dwrite d1/da/dd/d63/f36 [0,4194304] 0 2026-03-09T17:30:27.746 INFO:tasks.workunit.client.1.vm09.stdout:3/514: truncate d5/d9/d30/d65/f4f 1705158 0 2026-03-09T17:30:27.754 
INFO:tasks.workunit.client.1.vm09.stdout:7/676: dwrite da/d11/d47/d89/dbe/dc2/f95 [0,4194304] 0 2026-03-09T17:30:27.755 INFO:tasks.workunit.client.1.vm09.stdout:0/578: creat d6/d64/d97/fbb x:0 0 0 2026-03-09T17:30:27.757 INFO:tasks.workunit.client.1.vm09.stdout:7/677: dwrite da/fcd [0,4194304] 0 2026-03-09T17:30:27.766 INFO:tasks.workunit.client.1.vm09.stdout:5/592: mkdir d0/d9/d74/d75/dbd 0 2026-03-09T17:30:27.766 INFO:tasks.workunit.client.1.vm09.stdout:5/593: truncate d0/f60 951218 0 2026-03-09T17:30:27.766 INFO:tasks.workunit.client.1.vm09.stdout:5/594: chown d0/d2 414761 1 2026-03-09T17:30:27.767 INFO:tasks.workunit.client.1.vm09.stdout:4/561: chown d11/d1e/d29/f2e 442329621 1 2026-03-09T17:30:27.768 INFO:tasks.workunit.client.1.vm09.stdout:6/565: write d3/d7/f24 [356508,29871] 0 2026-03-09T17:30:27.768 INFO:tasks.workunit.client.1.vm09.stdout:6/566: readlink d3/d21/d76/d5c/d61/d95/l7b 0 2026-03-09T17:30:27.777 INFO:tasks.workunit.client.1.vm09.stdout:8/593: mkdir d1/da/d23/d71/db6 0 2026-03-09T17:30:27.778 INFO:tasks.workunit.client.1.vm09.stdout:8/594: readlink d1/da/dd/l15 0 2026-03-09T17:30:27.781 INFO:tasks.workunit.client.1.vm09.stdout:8/595: dwrite d1/d14/d2a/d49/fac [0,4194304] 0 2026-03-09T17:30:27.791 INFO:tasks.workunit.client.1.vm09.stdout:3/515: rmdir d5/d16/d31/d37/d58 39 2026-03-09T17:30:27.794 INFO:tasks.workunit.client.1.vm09.stdout:2/556: dwrite d13/f73 [0,4194304] 0 2026-03-09T17:30:27.796 INFO:tasks.workunit.client.1.vm09.stdout:2/557: read d13/f8b [3421105,119171] 0 2026-03-09T17:30:27.802 INFO:tasks.workunit.client.1.vm09.stdout:1/555: write d9/dc/dd/d40/d1d/f77 [674701,109963] 0 2026-03-09T17:30:27.808 INFO:tasks.workunit.client.1.vm09.stdout:1/556: chown d9/dc/dd/d40/d22 93 1 2026-03-09T17:30:27.808 INFO:tasks.workunit.client.1.vm09.stdout:7/678: creat da/d11/d47/d5b/d6c/d9e/dc6/fe2 x:0 0 0 2026-03-09T17:30:27.808 INFO:tasks.workunit.client.1.vm09.stdout:7/679: dread - da/d11/d47/d5b/d6c/d9e/d4e/d4c/fde zero size 2026-03-09T17:30:27.808 
INFO:tasks.workunit.client.1.vm09.stdout:4/562: creat d11/d1e/d29/fb1 x:0 0 0 2026-03-09T17:30:27.808 INFO:tasks.workunit.client.1.vm09.stdout:6/567: rmdir d3/d7 39 2026-03-09T17:30:27.811 INFO:tasks.workunit.client.1.vm09.stdout:1/557: dread f6 [0,4194304] 0 2026-03-09T17:30:27.820 INFO:tasks.workunit.client.1.vm09.stdout:0/579: read d6/d1d/f41 [2920639,42255] 0 2026-03-09T17:30:27.828 INFO:tasks.workunit.client.1.vm09.stdout:8/596: dread d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:27.832 INFO:tasks.workunit.client.1.vm09.stdout:5/595: mkdir d0/d2/d76/d87/da4/dbe 0 2026-03-09T17:30:27.832 INFO:tasks.workunit.client.1.vm09.stdout:7/680: symlink da/d11/d64/da7/le3 0 2026-03-09T17:30:27.832 INFO:tasks.workunit.client.1.vm09.stdout:9/552: link d5/de/l24 d5/de/d29/d33/db8/lbd 0 2026-03-09T17:30:27.833 INFO:tasks.workunit.client.1.vm09.stdout:7/681: truncate da/d11/d2d/d49/fd7 815150 0 2026-03-09T17:30:27.841 INFO:tasks.workunit.client.1.vm09.stdout:8/597: mkdir d1/da/dd/d47/db7 0 2026-03-09T17:30:27.847 INFO:tasks.workunit.client.1.vm09.stdout:5/596: mkdir d0/d2/d76/d87/da4/dbf 0 2026-03-09T17:30:27.847 INFO:tasks.workunit.client.1.vm09.stdout:7/682: rename da/d11/d2d/d49 to da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4 0 2026-03-09T17:30:27.848 INFO:tasks.workunit.client.1.vm09.stdout:4/563: sync 2026-03-09T17:30:27.852 INFO:tasks.workunit.client.1.vm09.stdout:7/683: dwrite da/d11/d64/da7/fdc [0,4194304] 0 2026-03-09T17:30:27.853 INFO:tasks.workunit.client.1.vm09.stdout:1/558: symlink d9/dc/laf 0 2026-03-09T17:30:27.860 INFO:tasks.workunit.client.1.vm09.stdout:8/598: fsync d1/d14/d2a/d42/d43/f98 0 2026-03-09T17:30:27.862 INFO:tasks.workunit.client.1.vm09.stdout:5/597: mkdir d0/d2/d76/d87/d95/d9b/dc0 0 2026-03-09T17:30:27.863 INFO:tasks.workunit.client.1.vm09.stdout:5/598: stat d0/d2/d76/d87/d95 0 2026-03-09T17:30:27.864 INFO:tasks.workunit.client.1.vm09.stdout:8/599: dread d1/f16 [0,4194304] 0 2026-03-09T17:30:27.875 INFO:tasks.workunit.client.1.vm09.stdout:9/553: dread d5/f11 
[0,4194304] 0 2026-03-09T17:30:27.882 INFO:tasks.workunit.client.1.vm09.stdout:3/516: dwrite d5/d9/d30/d65/f3e [0,4194304] 0 2026-03-09T17:30:27.883 INFO:tasks.workunit.client.1.vm09.stdout:0/580: dread d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:27.883 INFO:tasks.workunit.client.1.vm09.stdout:0/581: chown d6/d1d/d46/c52 385163 1 2026-03-09T17:30:27.884 INFO:tasks.workunit.client.1.vm09.stdout:0/582: write d6/d64/fa7 [567708,89330] 0 2026-03-09T17:30:27.884 INFO:tasks.workunit.client.1.vm09.stdout:0/583: readlink d6/d64/la4 0 2026-03-09T17:30:27.892 INFO:tasks.workunit.client.1.vm09.stdout:6/568: write d3/d21/d76/d5c/d61/f53 [1448427,43417] 0 2026-03-09T17:30:27.892 INFO:tasks.workunit.client.1.vm09.stdout:2/558: write d13/d15/f2f [777348,85640] 0 2026-03-09T17:30:27.893 INFO:tasks.workunit.client.1.vm09.stdout:2/559: write d13/d15/f2f [1153882,3803] 0 2026-03-09T17:30:27.894 INFO:tasks.workunit.client.1.vm09.stdout:2/560: readlink d13/d15/d36/d72/l99 0 2026-03-09T17:30:27.902 INFO:tasks.workunit.client.1.vm09.stdout:4/564: write d11/f15 [4962633,124179] 0 2026-03-09T17:30:27.903 INFO:tasks.workunit.client.1.vm09.stdout:5/599: mkdir d0/d2/d76/dc1 0 2026-03-09T17:30:27.905 INFO:tasks.workunit.client.1.vm09.stdout:9/554: creat d5/de/d29/d33/d94/fbe x:0 0 0 2026-03-09T17:30:27.909 INFO:tasks.workunit.client.1.vm09.stdout:0/584: symlink d6/d64/d94/lbc 0 2026-03-09T17:30:27.911 INFO:tasks.workunit.client.1.vm09.stdout:6/569: creat d3/d21/d76/d5c/fbd x:0 0 0 2026-03-09T17:30:27.912 INFO:tasks.workunit.client.1.vm09.stdout:0/585: dread d6/d1d/f57 [0,4194304] 0 2026-03-09T17:30:27.913 INFO:tasks.workunit.client.1.vm09.stdout:0/586: fdatasync d6/d1d/f57 0 2026-03-09T17:30:27.918 INFO:tasks.workunit.client.1.vm09.stdout:1/559: getdents d9/dc/dd/d40/d22/d37/da4 0 2026-03-09T17:30:27.920 INFO:tasks.workunit.client.1.vm09.stdout:1/560: dread d9/dc/dd/d40/d1d/f4d [0,4194304] 0 2026-03-09T17:30:27.927 INFO:tasks.workunit.client.1.vm09.stdout:9/555: symlink d5/d2e/d70/d84/lbf 0 
2026-03-09T17:30:27.929 INFO:tasks.workunit.client.1.vm09.stdout:7/684: dwrite da/d11/d47/d5b/d6c/d9e/d4e/fd9 [4194304,4194304] 0 2026-03-09T17:30:27.930 INFO:tasks.workunit.client.1.vm09.stdout:3/517: dwrite d5/d16/d31/f44 [0,4194304] 0 2026-03-09T17:30:27.931 INFO:tasks.workunit.client.1.vm09.stdout:3/518: truncate d5/d9/d30/d65/d59/d84/f86 783693 0 2026-03-09T17:30:27.931 INFO:tasks.workunit.client.1.vm09.stdout:3/519: fsync d5/d16/d31/d3d/fe 0 2026-03-09T17:30:27.937 INFO:tasks.workunit.client.1.vm09.stdout:3/520: fdatasync d5/d9/f4e 0 2026-03-09T17:30:27.945 INFO:tasks.workunit.client.1.vm09.stdout:8/600: truncate d1/da/f4b 4860203 0 2026-03-09T17:30:27.950 INFO:tasks.workunit.client.1.vm09.stdout:2/561: link d13/d15/d34/d37/d66/f80 d13/d15/d21/d88/fad 0 2026-03-09T17:30:27.959 INFO:tasks.workunit.client.1.vm09.stdout:0/587: mkdir d6/d64/dbd 0 2026-03-09T17:30:27.959 INFO:tasks.workunit.client.1.vm09.stdout:4/565: rename d11/d1e/f22 to d11/d1e/d45/fb2 0 2026-03-09T17:30:27.959 INFO:tasks.workunit.client.1.vm09.stdout:1/561: rmdir d9/dc/dd/d40/d22/d37/d3f/d42/d55 39 2026-03-09T17:30:27.959 INFO:tasks.workunit.client.1.vm09.stdout:4/566: dwrite d11/f19 [0,4194304] 0 2026-03-09T17:30:27.973 INFO:tasks.workunit.client.1.vm09.stdout:7/685: mkdir da/d11/d77/de5 0 2026-03-09T17:30:27.976 INFO:tasks.workunit.client.1.vm09.stdout:1/562: sync 2026-03-09T17:30:27.976 INFO:tasks.workunit.client.1.vm09.stdout:1/563: dread - d9/dc/dd/d9f/d9c/f89 zero size 2026-03-09T17:30:27.981 INFO:tasks.workunit.client.1.vm09.stdout:8/601: creat d1/d14/d2a/d42/d5d/d8a/fb8 x:0 0 0 2026-03-09T17:30:27.985 INFO:tasks.workunit.client.1.vm09.stdout:4/567: stat d11/d1e/d29/d36/f7f 0 2026-03-09T17:30:27.992 INFO:tasks.workunit.client.1.vm09.stdout:9/556: chown d5/d21/l95 287969309 1 2026-03-09T17:30:27.992 INFO:tasks.workunit.client.1.vm09.stdout:9/557: chown d5/de/f3c 205152 1 2026-03-09T17:30:27.992 INFO:tasks.workunit.client.1.vm09.stdout:1/564: truncate f2 2486672 0 2026-03-09T17:30:27.993 
INFO:tasks.workunit.client.1.vm09.stdout:1/565: write d9/dc/dd/d40/d22/d37/f41 [4545004,68000] 0 2026-03-09T17:30:27.993 INFO:tasks.workunit.client.1.vm09.stdout:3/521: symlink d5/d9/d30/d65/d59/d84/l9b 0 2026-03-09T17:30:27.993 INFO:tasks.workunit.client.1.vm09.stdout:1/566: write d9/dc/dd/d40/d21/d6f/f85 [6604714,36234] 0 2026-03-09T17:30:27.995 INFO:tasks.workunit.client.1.vm09.stdout:1/567: chown d9/dc/dd/d40/c24 32781888 1 2026-03-09T17:30:27.997 INFO:tasks.workunit.client.1.vm09.stdout:1/568: readlink d9/dc/dd/d40/d21/d35/l3e 0 2026-03-09T17:30:27.998 INFO:tasks.workunit.client.1.vm09.stdout:1/569: readlink d9/dc/dd/l3c 0 2026-03-09T17:30:27.999 INFO:tasks.workunit.client.1.vm09.stdout:8/602: rename d1/da/f35 to d1/da/dd/d79/fb9 0 2026-03-09T17:30:28.004 INFO:tasks.workunit.client.1.vm09.stdout:4/568: creat d11/d1e/d45/fb3 x:0 0 0 2026-03-09T17:30:28.004 INFO:tasks.workunit.client.1.vm09.stdout:4/569: write d11/fab [1018114,68769] 0 2026-03-09T17:30:28.005 INFO:tasks.workunit.client.1.vm09.stdout:4/570: read - d11/d1e/d45/fb3 zero size 2026-03-09T17:30:28.006 INFO:tasks.workunit.client.1.vm09.stdout:1/570: sync 2026-03-09T17:30:28.013 INFO:tasks.workunit.client.1.vm09.stdout:5/600: link d0/dc/d21/d26/d5e/fbc d0/d9/d8b/fc2 0 2026-03-09T17:30:28.040 INFO:tasks.workunit.client.1.vm09.stdout:6/570: write d3/d21/d25/d26/d34/f66 [1711987,87833] 0 2026-03-09T17:30:28.054 INFO:tasks.workunit.client.1.vm09.stdout:4/571: creat d11/d1e/d45/fb4 x:0 0 0 2026-03-09T17:30:28.055 INFO:tasks.workunit.client.1.vm09.stdout:1/571: mkdir d9/dc/dd/d9f/d9c/db0 0 2026-03-09T17:30:28.057 INFO:tasks.workunit.client.1.vm09.stdout:2/562: write d13/f4c [3878774,113456] 0 2026-03-09T17:30:28.066 INFO:tasks.workunit.client.1.vm09.stdout:9/558: write d5/f4b [5083525,58060] 0 2026-03-09T17:30:28.070 INFO:tasks.workunit.client.1.vm09.stdout:9/559: dwrite d5/f8e [0,4194304] 0 2026-03-09T17:30:28.081 INFO:tasks.workunit.client.1.vm09.stdout:5/601: truncate d0/d46/f4c 1908259 0 
2026-03-09T17:30:28.082 INFO:tasks.workunit.client.1.vm09.stdout:5/602: chown d0/l5b 182348 1 2026-03-09T17:30:28.083 INFO:tasks.workunit.client.1.vm09.stdout:7/686: rmdir da/d11/d64/d84/de1 0 2026-03-09T17:30:28.087 INFO:tasks.workunit.client.1.vm09.stdout:5/603: dwrite d0/d2/d76/d86/fa8 [0,4194304] 0 2026-03-09T17:30:28.090 INFO:tasks.workunit.client.1.vm09.stdout:5/604: read d0/dc/d21/f29 [833680,58474] 0 2026-03-09T17:30:28.101 INFO:tasks.workunit.client.1.vm09.stdout:8/603: truncate d1/da/dd/d47/f82 1149363 0 2026-03-09T17:30:28.103 INFO:tasks.workunit.client.1.vm09.stdout:8/604: readlink d1/d14/d2a/d49/lb0 0 2026-03-09T17:30:28.104 INFO:tasks.workunit.client.1.vm09.stdout:4/572: chown d11/d1e/c2a 57811097 1 2026-03-09T17:30:28.107 INFO:tasks.workunit.client.1.vm09.stdout:7/687: fdatasync da/f15 0 2026-03-09T17:30:28.113 INFO:tasks.workunit.client.1.vm09.stdout:6/571: dread d3/d7/f23 [0,4194304] 0 2026-03-09T17:30:28.119 INFO:tasks.workunit.client.1.vm09.stdout:8/605: readlink d1/d14/d2a/d42/d5d/la9 0 2026-03-09T17:30:28.120 INFO:tasks.workunit.client.1.vm09.stdout:8/606: stat d1/f33 0 2026-03-09T17:30:28.120 INFO:tasks.workunit.client.1.vm09.stdout:4/573: mkdir d11/d1e/d29/db5 0 2026-03-09T17:30:28.120 INFO:tasks.workunit.client.1.vm09.stdout:2/563: truncate d13/d15/d34/d45/f57 1003300 0 2026-03-09T17:30:28.136 INFO:tasks.workunit.client.1.vm09.stdout:3/522: dwrite d5/d16/d31/d3d/d32/f89 [0,4194304] 0 2026-03-09T17:30:28.137 INFO:tasks.workunit.client.1.vm09.stdout:1/572: dread d9/dc/dd/d40/d22/f4a [0,4194304] 0 2026-03-09T17:30:28.139 INFO:tasks.workunit.client.1.vm09.stdout:7/688: creat da/d11/d3e/da2/db2/fe6 x:0 0 0 2026-03-09T17:30:28.156 INFO:tasks.workunit.client.1.vm09.stdout:0/588: rename d6/d1d/d24/d32/d59/d81/d8c/f8f to d6/d1d/d24/d32/fbe 0 2026-03-09T17:30:28.156 INFO:tasks.workunit.client.1.vm09.stdout:5/605: dread d0/d9/f7f [4194304,4194304] 0 2026-03-09T17:30:28.157 INFO:tasks.workunit.client.1.vm09.stdout:5/606: write d0/dc/d21/d6f/f80 
[5008442,30548] 0 2026-03-09T17:30:28.160 INFO:tasks.workunit.client.1.vm09.stdout:8/607: symlink d1/d14/d2a/d42/d5d/lba 0 2026-03-09T17:30:28.172 INFO:tasks.workunit.client.1.vm09.stdout:8/608: dread d1/d14/d2a/d42/d43/f98 [0,4194304] 0 2026-03-09T17:30:28.178 INFO:tasks.workunit.client.1.vm09.stdout:2/564: mkdir d13/d15/d36/d72/d94/dae 0 2026-03-09T17:30:28.185 INFO:tasks.workunit.client.1.vm09.stdout:9/560: creat d5/de/d29/fc0 x:0 0 0 2026-03-09T17:30:28.185 INFO:tasks.workunit.client.1.vm09.stdout:2/565: sync 2026-03-09T17:30:28.191 INFO:tasks.workunit.client.1.vm09.stdout:6/572: write d3/d21/f28 [3811869,47421] 0 2026-03-09T17:30:28.201 INFO:tasks.workunit.client.1.vm09.stdout:4/574: write d11/f1c [408486,111013] 0 2026-03-09T17:30:28.211 INFO:tasks.workunit.client.1.vm09.stdout:7/689: rename da/d11/d2d/d56/d68/faa to da/d11/d47/d5b/d6c/d9e/d4e/fe7 0 2026-03-09T17:30:28.211 INFO:tasks.workunit.client.1.vm09.stdout:0/589: truncate d6/d1d/d24/d5e/f67 5020141 0 2026-03-09T17:30:28.211 INFO:tasks.workunit.client.1.vm09.stdout:5/607: mkdir d0/dc/dc3 0 2026-03-09T17:30:28.212 INFO:tasks.workunit.client.1.vm09.stdout:3/523: dread d5/d16/d31/d37/f6d [0,4194304] 0 2026-03-09T17:30:28.212 INFO:tasks.workunit.client.1.vm09.stdout:9/561: fsync d5/de/f20 0 2026-03-09T17:30:28.214 INFO:tasks.workunit.client.1.vm09.stdout:3/524: dread - d5/d16/d85/f98 zero size 2026-03-09T17:30:28.216 INFO:tasks.workunit.client.1.vm09.stdout:3/525: fdatasync d5/d16/d31/f44 0 2026-03-09T17:30:28.225 INFO:tasks.workunit.client.1.vm09.stdout:6/573: unlink f2 0 2026-03-09T17:30:28.229 INFO:tasks.workunit.client.1.vm09.stdout:5/608: dread d0/d2/d76/d87/d95/d9b/fab [0,4194304] 0 2026-03-09T17:30:28.229 INFO:tasks.workunit.client.1.vm09.stdout:5/609: fdatasync d0/d2/d76/d87/faf 0 2026-03-09T17:30:28.231 INFO:tasks.workunit.client.1.vm09.stdout:9/562: dread d5/de/d29/d33/f66 [4194304,4194304] 0 2026-03-09T17:30:28.234 INFO:tasks.workunit.client.1.vm09.stdout:1/573: mkdir 
d9/dc/dd/d40/d22/d37/d3f/d42/d55/db1 0 2026-03-09T17:30:28.235 INFO:tasks.workunit.client.1.vm09.stdout:1/574: stat d9/dc/dd/d9f/l6a 0 2026-03-09T17:30:28.237 INFO:tasks.workunit.client.1.vm09.stdout:4/575: mkdir d11/d1e/d31/db6 0 2026-03-09T17:30:28.239 INFO:tasks.workunit.client.1.vm09.stdout:0/590: symlink d6/d1d/d24/d5e/d86/lbf 0 2026-03-09T17:30:28.243 INFO:tasks.workunit.client.1.vm09.stdout:4/576: dwrite d11/fa4 [0,4194304] 0 2026-03-09T17:30:28.248 INFO:tasks.workunit.client.1.vm09.stdout:2/566: creat d13/d15/d60/d85/faf x:0 0 0 2026-03-09T17:30:28.249 INFO:tasks.workunit.client.1.vm09.stdout:4/577: chown d11/d1e/d29/d36/d57/f8f 651552925 1 2026-03-09T17:30:28.251 INFO:tasks.workunit.client.1.vm09.stdout:5/610: symlink d0/dc/lc4 0 2026-03-09T17:30:28.252 INFO:tasks.workunit.client.1.vm09.stdout:9/563: symlink d5/de/d29/d33/d94/lc1 0 2026-03-09T17:30:28.253 INFO:tasks.workunit.client.1.vm09.stdout:8/609: write d1/da/f4b [3310662,124454] 0 2026-03-09T17:30:28.261 INFO:tasks.workunit.client.1.vm09.stdout:9/564: sync 2026-03-09T17:30:28.262 INFO:tasks.workunit.client.1.vm09.stdout:0/591: creat d6/d1d/d24/d32/d59/d81/fc0 x:0 0 0 2026-03-09T17:30:28.262 INFO:tasks.workunit.client.1.vm09.stdout:9/565: stat d5/de/d4e/d6e/d93 0 2026-03-09T17:30:28.263 INFO:tasks.workunit.client.1.vm09.stdout:2/567: mkdir d13/d15/d36/d72/d94/da7/db0 0 2026-03-09T17:30:28.266 INFO:tasks.workunit.client.1.vm09.stdout:3/526: mkdir d5/d9c 0 2026-03-09T17:30:28.267 INFO:tasks.workunit.client.1.vm09.stdout:5/611: creat d0/d2/fc5 x:0 0 0 2026-03-09T17:30:28.268 INFO:tasks.workunit.client.1.vm09.stdout:4/578: rename d11/d1e/d83 to d11/d1e/d45/d60/d71/db7 0 2026-03-09T17:30:28.270 INFO:tasks.workunit.client.1.vm09.stdout:7/690: creat da/d11/d47/d5b/fe8 x:0 0 0 2026-03-09T17:30:28.271 INFO:tasks.workunit.client.1.vm09.stdout:9/566: chown d5/de/d4e/c68 7397874 1 2026-03-09T17:30:28.272 INFO:tasks.workunit.client.1.vm09.stdout:2/568: mkdir d13/d15/d2c/db1 0 2026-03-09T17:30:28.277 
INFO:tasks.workunit.client.1.vm09.stdout:1/575: dread d9/dc/dd/f4f [0,4194304] 0 2026-03-09T17:30:28.278 INFO:tasks.workunit.client.1.vm09.stdout:3/527: symlink d5/d16/d46/l9d 0 2026-03-09T17:30:28.278 INFO:tasks.workunit.client.1.vm09.stdout:3/528: readlink d5/d9/d30/d65/l49 0 2026-03-09T17:30:28.279 INFO:tasks.workunit.client.1.vm09.stdout:5/612: rmdir d0/dc/d21/d6f 39 2026-03-09T17:30:28.279 INFO:tasks.workunit.client.1.vm09.stdout:5/613: fdatasync d0/d2/fc5 0 2026-03-09T17:30:28.279 INFO:tasks.workunit.client.1.vm09.stdout:5/614: fdatasync d0/dc/d21/f7a 0 2026-03-09T17:30:28.284 INFO:tasks.workunit.client.1.vm09.stdout:0/592: rename d6/d1d/d24/d32/f7c to d6/d1d/d24/d32/d59/d81/fc1 0 2026-03-09T17:30:28.290 INFO:tasks.workunit.client.1.vm09.stdout:7/691: creat da/d11/d3e/fe9 x:0 0 0 2026-03-09T17:30:28.297 INFO:tasks.workunit.client.1.vm09.stdout:6/574: getdents d3/d21/d76/d5c/d9f 0 2026-03-09T17:30:28.306 INFO:tasks.workunit.client.1.vm09.stdout:2/569: dread d13/d15/f2b [0,4194304] 0 2026-03-09T17:30:28.309 INFO:tasks.workunit.client.1.vm09.stdout:5/615: creat d0/dc/d21/d26/fc6 x:0 0 0 2026-03-09T17:30:28.318 INFO:tasks.workunit.client.1.vm09.stdout:0/593: dread d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:28.318 INFO:tasks.workunit.client.1.vm09.stdout:8/610: link d1/d14/d2a/d49/fac d1/d14/d2a/d42/d43/fbb 0 2026-03-09T17:30:28.320 INFO:tasks.workunit.client.1.vm09.stdout:7/692: creat da/d11/d47/d5b/d78/fea x:0 0 0 2026-03-09T17:30:28.323 INFO:tasks.workunit.client.1.vm09.stdout:9/567: symlink d5/de/d29/d33/d94/da9/lc2 0 2026-03-09T17:30:28.324 INFO:tasks.workunit.client.1.vm09.stdout:9/568: dread - d5/de/d29/d33/d94/fbe zero size 2026-03-09T17:30:28.325 INFO:tasks.workunit.client.1.vm09.stdout:6/575: mkdir d3/d21/d25/d26/d86/dbe 0 2026-03-09T17:30:28.327 INFO:tasks.workunit.client.1.vm09.stdout:3/529: symlink d5/d9/d30/d65/d59/d84/l9e 0 2026-03-09T17:30:28.335 INFO:tasks.workunit.client.1.vm09.stdout:3/530: dread d5/d16/d46/f63 [0,4194304] 0 
2026-03-09T17:30:28.336 INFO:tasks.workunit.client.1.vm09.stdout:3/531: readlink d5/d16/d25/l3a 0 2026-03-09T17:30:28.338 INFO:tasks.workunit.client.1.vm09.stdout:9/569: dread d5/de/d29/f89 [0,4194304] 0 2026-03-09T17:30:28.341 INFO:tasks.workunit.client.1.vm09.stdout:6/576: dread d3/d7/d59/d5a/f64 [0,4194304] 0 2026-03-09T17:30:28.341 INFO:tasks.workunit.client.1.vm09.stdout:3/532: dwrite d5/d16/d31/d3d/d32/f89 [0,4194304] 0 2026-03-09T17:30:28.344 INFO:tasks.workunit.client.1.vm09.stdout:0/594: write d6/d1d/d24/d32/d59/d81/d8c/fa3 [972252,69196] 0 2026-03-09T17:30:28.354 INFO:tasks.workunit.client.1.vm09.stdout:5/616: dread d0/dc/d21/d6f/f80 [0,4194304] 0 2026-03-09T17:30:28.355 INFO:tasks.workunit.client.1.vm09.stdout:7/693: rename da/d11/d47/f8d to da/d11/d64/d84/feb 0 2026-03-09T17:30:28.365 INFO:tasks.workunit.client.1.vm09.stdout:7/694: read - da/d11/d47/d5b/d6c/d9e/f57 zero size 2026-03-09T17:30:28.365 INFO:tasks.workunit.client.1.vm09.stdout:1/576: creat d9/dc/fb2 x:0 0 0 2026-03-09T17:30:28.365 INFO:tasks.workunit.client.1.vm09.stdout:1/577: write d9/dc/dd/f7b [1393973,87532] 0 2026-03-09T17:30:28.371 INFO:tasks.workunit.client.1.vm09.stdout:2/570: symlink d13/d15/d3b/lb2 0 2026-03-09T17:30:28.371 INFO:tasks.workunit.client.1.vm09.stdout:2/571: chown d13/d15/d21/f5d 28042 1 2026-03-09T17:30:28.391 INFO:tasks.workunit.client.1.vm09.stdout:8/611: mknod d1/da/dd/cbc 0 2026-03-09T17:30:28.398 INFO:tasks.workunit.client.1.vm09.stdout:9/570: dwrite d5/f8e [4194304,4194304] 0 2026-03-09T17:30:28.402 INFO:tasks.workunit.client.1.vm09.stdout:0/595: fdatasync d6/d1d/f41 0 2026-03-09T17:30:28.410 INFO:tasks.workunit.client.1.vm09.stdout:5/617: creat d0/dc/d21/d26/d5e/d68/d79/fc7 x:0 0 0 2026-03-09T17:30:28.414 INFO:tasks.workunit.client.1.vm09.stdout:6/577: rename d3/d21/d25/d26/d34 to d3/d21/d25/d26/d6b/dbf 0 2026-03-09T17:30:28.420 INFO:tasks.workunit.client.1.vm09.stdout:1/578: creat d9/dc/dd/d9f/d9c/fb3 x:0 0 0 2026-03-09T17:30:28.424 
INFO:tasks.workunit.client.1.vm09.stdout:2/572: fsync d13/d15/f1d 0 2026-03-09T17:30:28.430 INFO:tasks.workunit.client.1.vm09.stdout:4/579: link d11/d1e/d31/c99 d11/d1e/cb8 0 2026-03-09T17:30:28.430 INFO:tasks.workunit.client.1.vm09.stdout:4/580: chown d11/f26 2638 1 2026-03-09T17:30:28.430 INFO:tasks.workunit.client.1.vm09.stdout:3/533: mkdir d5/d16/d31/d3d/d9f 0 2026-03-09T17:30:28.435 INFO:tasks.workunit.client.1.vm09.stdout:0/596: write d6/d1d/d24/d32/d59/fb0 [3714328,55336] 0 2026-03-09T17:30:28.435 INFO:tasks.workunit.client.1.vm09.stdout:0/597: readlink d6/lb4 0 2026-03-09T17:30:28.440 INFO:tasks.workunit.client.1.vm09.stdout:7/695: rename da/d11/d47/d89/dbe/dc2 to da/d11/d77/de5/dec 0 2026-03-09T17:30:28.443 INFO:tasks.workunit.client.1.vm09.stdout:9/571: write d5/de/d29/d33/f4a [1055263,125364] 0 2026-03-09T17:30:28.454 INFO:tasks.workunit.client.1.vm09.stdout:4/581: symlink d11/d1e/d45/lb9 0 2026-03-09T17:30:28.457 INFO:tasks.workunit.client.1.vm09.stdout:1/579: dwrite d9/dc/dd/d40/f86 [0,4194304] 0 2026-03-09T17:30:28.461 INFO:tasks.workunit.client.1.vm09.stdout:8/612: mkdir d1/dbd 0 2026-03-09T17:30:28.463 INFO:tasks.workunit.client.1.vm09.stdout:8/613: write d1/da/dd/f45 [4920969,65157] 0 2026-03-09T17:30:28.475 INFO:tasks.workunit.client.1.vm09.stdout:4/582: dread d11/f23 [0,4194304] 0 2026-03-09T17:30:28.476 INFO:tasks.workunit.client.1.vm09.stdout:5/618: mknod d0/d9/d74/d75/dbd/cc8 0 2026-03-09T17:30:28.477 INFO:tasks.workunit.client.1.vm09.stdout:7/696: creat da/d11/d47/d5b/d6c/d9e/d4e/d4c/fed x:0 0 0 2026-03-09T17:30:28.477 INFO:tasks.workunit.client.1.vm09.stdout:0/598: dread - d6/d1d/d46/f7d zero size 2026-03-09T17:30:28.479 INFO:tasks.workunit.client.1.vm09.stdout:5/619: dread d0/d2/d76/d87/d95/d9b/fab [0,4194304] 0 2026-03-09T17:30:28.480 INFO:tasks.workunit.client.1.vm09.stdout:2/573: mkdir d13/db3 0 2026-03-09T17:30:28.485 INFO:tasks.workunit.client.1.vm09.stdout:4/583: creat d11/d1e/d45/d60/d71/db7/d89/fba x:0 0 0 2026-03-09T17:30:28.485 
INFO:tasks.workunit.client.1.vm09.stdout:3/534: creat d5/d16/d31/d3d/d32/fa0 x:0 0 0 2026-03-09T17:30:28.486 INFO:tasks.workunit.client.1.vm09.stdout:6/578: link d3/d7/f4c d3/d21/d25/d91/d9a/fc0 0 2026-03-09T17:30:28.487 INFO:tasks.workunit.client.1.vm09.stdout:7/697: read da/d11/d47/d5b/d6c/d9e/f38 [1278155,73602] 0 2026-03-09T17:30:28.488 INFO:tasks.workunit.client.1.vm09.stdout:1/580: creat d9/dc/dd/d9f/d9c/db0/fb4 x:0 0 0 2026-03-09T17:30:28.488 INFO:tasks.workunit.client.1.vm09.stdout:0/599: mkdir d6/d1d/d24/d5e/dc2 0 2026-03-09T17:30:28.490 INFO:tasks.workunit.client.1.vm09.stdout:1/581: dread - d9/dc/dd/d40/d21/d35/d88/f9a zero size 2026-03-09T17:30:28.492 INFO:tasks.workunit.client.1.vm09.stdout:4/584: symlink d11/d1e/d45/d60/d71/db7/lbb 0 2026-03-09T17:30:28.492 INFO:tasks.workunit.client.1.vm09.stdout:2/574: symlink d13/d15/d34/lb4 0 2026-03-09T17:30:28.494 INFO:tasks.workunit.client.1.vm09.stdout:4/585: fsync d11/d1e/d45/fb3 0 2026-03-09T17:30:28.496 INFO:tasks.workunit.client.1.vm09.stdout:0/600: dread d6/d64/f7e [0,4194304] 0 2026-03-09T17:30:28.496 INFO:tasks.workunit.client.1.vm09.stdout:0/601: truncate d6/d64/f7e 938334 0 2026-03-09T17:30:28.497 INFO:tasks.workunit.client.1.vm09.stdout:6/579: chown d3/d7/l43 11934959 1 2026-03-09T17:30:28.503 INFO:tasks.workunit.client.1.vm09.stdout:8/614: link d1/da/dd/d47/c88 d1/da/dd/d63/cbe 0 2026-03-09T17:30:28.504 INFO:tasks.workunit.client.1.vm09.stdout:9/572: write d5/de/d4e/d6e/d93/f74 [630895,126758] 0 2026-03-09T17:30:28.516 INFO:tasks.workunit.client.1.vm09.stdout:3/535: write d5/d16/d31/d37/f6d [3507898,64925] 0 2026-03-09T17:30:28.519 INFO:tasks.workunit.client.1.vm09.stdout:5/620: rename d0/d52/c2b to d0/d9/cc9 0 2026-03-09T17:30:28.520 INFO:tasks.workunit.client.1.vm09.stdout:5/621: write d0/dc/d21/f62 [2491603,20768] 0 2026-03-09T17:30:28.526 INFO:tasks.workunit.client.1.vm09.stdout:6/580: truncate d3/d21/d25/f5f 4449583 0 2026-03-09T17:30:28.527 INFO:tasks.workunit.client.1.vm09.stdout:8/615: 
symlink d1/da/dd/d79/lbf 0 2026-03-09T17:30:28.535 INFO:tasks.workunit.client.1.vm09.stdout:0/602: mknod d6/d1d/d24/d5e/dc2/cc3 0 2026-03-09T17:30:28.536 INFO:tasks.workunit.client.1.vm09.stdout:7/698: creat da/d11/d2d/fee x:0 0 0 2026-03-09T17:30:28.537 INFO:tasks.workunit.client.1.vm09.stdout:6/581: creat d3/d21/d76/d88/fc1 x:0 0 0 2026-03-09T17:30:28.546 INFO:tasks.workunit.client.1.vm09.stdout:4/586: dwrite d11/d1e/d29/f8a [0,4194304] 0 2026-03-09T17:30:28.548 INFO:tasks.workunit.client.1.vm09.stdout:3/536: write d5/d16/d31/d37/f94 [212353,61607] 0 2026-03-09T17:30:28.548 INFO:tasks.workunit.client.1.vm09.stdout:3/537: stat d5/d16/f54 0 2026-03-09T17:30:28.549 INFO:tasks.workunit.client.1.vm09.stdout:3/538: write d5/f22 [2938272,66164] 0 2026-03-09T17:30:28.566 INFO:tasks.workunit.client.1.vm09.stdout:8/616: dread d1/da/f4b [0,4194304] 0 2026-03-09T17:30:28.569 INFO:tasks.workunit.client.1.vm09.stdout:9/573: mknod d5/cc3 0 2026-03-09T17:30:28.569 INFO:tasks.workunit.client.1.vm09.stdout:9/574: stat d5/de/d4e/d6e/d93/f74 0 2026-03-09T17:30:28.571 INFO:tasks.workunit.client.1.vm09.stdout:5/622: symlink d0/dc/dc3/lca 0 2026-03-09T17:30:28.572 INFO:tasks.workunit.client.1.vm09.stdout:5/623: truncate d0/dc/d21/f29 4987796 0 2026-03-09T17:30:28.573 INFO:tasks.workunit.client.1.vm09.stdout:2/575: rmdir d13/d15/d36/d72/d94/dae 0 2026-03-09T17:30:28.573 INFO:tasks.workunit.client.1.vm09.stdout:5/624: fsync d0/d2/d76/d87/fa5 0 2026-03-09T17:30:28.585 INFO:tasks.workunit.client.1.vm09.stdout:9/575: creat d5/d2e/d70/d84/fc4 x:0 0 0 2026-03-09T17:30:28.585 INFO:tasks.workunit.client.1.vm09.stdout:1/582: getdents d9/dc/dd/d40/d22/d91/d99 0 2026-03-09T17:30:28.589 INFO:tasks.workunit.client.1.vm09.stdout:2/576: mkdir d13/d15/d34/d45/d84/db5 0 2026-03-09T17:30:28.600 INFO:tasks.workunit.client.1.vm09.stdout:0/603: rename d6/d1d/d24/d32/d59/c98 to d6/d1d/d24/d5e/dc2/cc4 0 2026-03-09T17:30:28.613 INFO:tasks.workunit.client.1.vm09.stdout:7/699: dwrite da/f21 [0,4194304] 0 
2026-03-09T17:30:28.618 INFO:tasks.workunit.client.1.vm09.stdout:9/576: mknod d5/de/d88/cc5 0 2026-03-09T17:30:28.622 INFO:tasks.workunit.client.1.vm09.stdout:9/577: dwrite d5/f11 [0,4194304] 0 2026-03-09T17:30:28.628 INFO:tasks.workunit.client.1.vm09.stdout:2/577: symlink d13/d15/d36/d72/d94/lb6 0 2026-03-09T17:30:28.629 INFO:tasks.workunit.client.1.vm09.stdout:2/578: chown d13/d15/d3b/d43/c5a 10031 1 2026-03-09T17:30:28.630 INFO:tasks.workunit.client.1.vm09.stdout:6/582: rename d3/d7/d59/d73/c7f to d3/d21/d25/d26/d86/dbc/cc2 0 2026-03-09T17:30:28.633 INFO:tasks.workunit.client.1.vm09.stdout:5/625: symlink d0/d9/lcb 0 2026-03-09T17:30:28.634 INFO:tasks.workunit.client.1.vm09.stdout:5/626: fdatasync d0/f91 0 2026-03-09T17:30:28.634 INFO:tasks.workunit.client.1.vm09.stdout:3/539: creat d5/fa1 x:0 0 0 2026-03-09T17:30:28.635 INFO:tasks.workunit.client.1.vm09.stdout:5/627: write d0/f91 [517261,11306] 0 2026-03-09T17:30:28.636 INFO:tasks.workunit.client.1.vm09.stdout:8/617: creat d1/da/dd/fc0 x:0 0 0 2026-03-09T17:30:28.651 INFO:tasks.workunit.client.1.vm09.stdout:9/578: creat d5/de/d29/d33/d94/fc6 x:0 0 0 2026-03-09T17:30:28.655 INFO:tasks.workunit.client.1.vm09.stdout:4/587: getdents d11/d1e/d29/d36 0 2026-03-09T17:30:28.658 INFO:tasks.workunit.client.1.vm09.stdout:5/628: symlink d0/d2/d76/d87/d95/d9b/lcc 0 2026-03-09T17:30:28.658 INFO:tasks.workunit.client.1.vm09.stdout:3/540: chown d5/d16/d31/d37/d58/d64/f8e 1454 1 2026-03-09T17:30:28.663 INFO:tasks.workunit.client.1.vm09.stdout:9/579: dread d5/f4b [0,4194304] 0 2026-03-09T17:30:28.663 INFO:tasks.workunit.client.1.vm09.stdout:1/583: rmdir d9/dc/dd/d40/d1d/dad 0 2026-03-09T17:30:28.663 INFO:tasks.workunit.client.1.vm09.stdout:1/584: fsync d9/dc/dd/d9f/d9c/db0/fb4 0 2026-03-09T17:30:28.666 INFO:tasks.workunit.client.1.vm09.stdout:2/579: creat d13/db3/fb7 x:0 0 0 2026-03-09T17:30:28.667 INFO:tasks.workunit.client.1.vm09.stdout:4/588: write f3 [15036,88930] 0 2026-03-09T17:30:28.668 
INFO:tasks.workunit.client.1.vm09.stdout:5/629: creat d0/dc/d21/d26/fcd x:0 0 0 2026-03-09T17:30:28.669 INFO:tasks.workunit.client.1.vm09.stdout:5/630: write d0/d2/d76/d86/fac [449386,110793] 0 2026-03-09T17:30:28.671 INFO:tasks.workunit.client.1.vm09.stdout:4/589: dwrite d11/d1e/d45/d60/f95 [0,4194304] 0 2026-03-09T17:30:28.674 INFO:tasks.workunit.client.1.vm09.stdout:3/541: dread - d5/d9/d30/d65/d59/f87 zero size 2026-03-09T17:30:28.680 INFO:tasks.workunit.client.1.vm09.stdout:5/631: dwrite d0/d2/fc5 [0,4194304] 0 2026-03-09T17:30:28.682 INFO:tasks.workunit.client.1.vm09.stdout:5/632: read - d0/dc/d21/d26/fcd zero size 2026-03-09T17:30:28.686 INFO:tasks.workunit.client.1.vm09.stdout:5/633: readlink d0/dc/d21/d26/d5e/d68/d79/lb1 0 2026-03-09T17:30:28.690 INFO:tasks.workunit.client.1.vm09.stdout:0/604: write d6/f21 [483601,113349] 0 2026-03-09T17:30:28.704 INFO:tasks.workunit.client.1.vm09.stdout:7/700: dwrite da/d11/d47/d5b/d6c/d9e/d4e/f63 [0,4194304] 0 2026-03-09T17:30:28.705 INFO:tasks.workunit.client.1.vm09.stdout:9/580: rename d5/de/d29/d33/d94 to d5/de/d29/d90/dc7 0 2026-03-09T17:30:28.707 INFO:tasks.workunit.client.1.vm09.stdout:3/542: unlink d5/d9/c50 0 2026-03-09T17:30:28.709 INFO:tasks.workunit.client.1.vm09.stdout:6/583: write d3/d21/d25/d26/f2a [8124893,88136] 0 2026-03-09T17:30:28.711 INFO:tasks.workunit.client.1.vm09.stdout:7/701: dwrite da/d11/d47/d5b/d78/f80 [4194304,4194304] 0 2026-03-09T17:30:28.716 INFO:tasks.workunit.client.1.vm09.stdout:5/634: unlink d0/d52/d20/l5a 0 2026-03-09T17:30:28.717 INFO:tasks.workunit.client.1.vm09.stdout:5/635: write d0/fa3 [541648,43928] 0 2026-03-09T17:30:28.722 INFO:tasks.workunit.client.1.vm09.stdout:0/605: stat d6/d1d/d24/f4e 0 2026-03-09T17:30:28.723 INFO:tasks.workunit.client.1.vm09.stdout:4/590: read d11/d1e/d45/d60/d71/f76 [1048032,4191] 0 2026-03-09T17:30:28.730 INFO:tasks.workunit.client.1.vm09.stdout:9/581: creat d5/de/d29/d90/dc7/fc8 x:0 0 0 2026-03-09T17:30:28.738 
INFO:tasks.workunit.client.1.vm09.stdout:3/543: creat d5/d9/d30/d65/d59/fa2 x:0 0 0 2026-03-09T17:30:28.738 INFO:tasks.workunit.client.1.vm09.stdout:7/702: dread da/f15 [0,4194304] 0 2026-03-09T17:30:28.743 INFO:tasks.workunit.client.1.vm09.stdout:6/584: rmdir d3/d21/d25/d26/db7 39 2026-03-09T17:30:28.754 INFO:tasks.workunit.client.1.vm09.stdout:8/618: write d1/d14/f9c [2667566,66989] 0 2026-03-09T17:30:28.755 INFO:tasks.workunit.client.1.vm09.stdout:1/585: dwrite d9/f34 [0,4194304] 0 2026-03-09T17:30:28.770 INFO:tasks.workunit.client.1.vm09.stdout:9/582: mkdir d5/d91/d99/dc9 0 2026-03-09T17:30:28.779 INFO:tasks.workunit.client.1.vm09.stdout:2/580: dwrite d13/d15/d36/d72/f87 [0,4194304] 0 2026-03-09T17:30:28.788 INFO:tasks.workunit.client.1.vm09.stdout:7/703: symlink da/d11/d47/d5b/d6c/d9e/d4e/d5f/lef 0 2026-03-09T17:30:28.791 INFO:tasks.workunit.client.1.vm09.stdout:6/585: rename d3/d21/d25/d26/c35 to d3/d21/db1/cc3 0 2026-03-09T17:30:28.793 INFO:tasks.workunit.client.1.vm09.stdout:5/636: mkdir d0/d2/d76/d87/d95/d9b/dc0/dce 0 2026-03-09T17:30:28.798 INFO:tasks.workunit.client.1.vm09.stdout:1/586: unlink d9/dc/dd/d40/d22/d37/d3f/l5d 0 2026-03-09T17:30:28.817 INFO:tasks.workunit.client.1.vm09.stdout:9/583: rename d5/d2e/d70 to d5/de/d4e/dca 0 2026-03-09T17:30:28.818 INFO:tasks.workunit.client.1.vm09.stdout:6/586: read - d3/d21/d25/d91/d9a/fc0 zero size 2026-03-09T17:30:28.818 INFO:tasks.workunit.client.1.vm09.stdout:7/704: read da/d11/d2d/d56/f53 [4093257,17054] 0 2026-03-09T17:30:28.819 INFO:tasks.workunit.client.1.vm09.stdout:9/584: fdatasync d5/de/d29/d33/f9a 0 2026-03-09T17:30:28.821 INFO:tasks.workunit.client.1.vm09.stdout:9/585: write d5/de/d4e/d6e/d93/fb0 [6883349,61313] 0 2026-03-09T17:30:28.822 INFO:tasks.workunit.client.1.vm09.stdout:6/587: mkdir d3/d21/d25/d26/d6b/dbf/dc4 0 2026-03-09T17:30:28.825 INFO:tasks.workunit.client.1.vm09.stdout:0/606: getdents d6/d64/d94 0 2026-03-09T17:30:28.826 INFO:tasks.workunit.client.1.vm09.stdout:4/591: getdents 
d11/d1e/d45/d60/d71/db7/d89 0 2026-03-09T17:30:28.830 INFO:tasks.workunit.client.1.vm09.stdout:7/705: fsync da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/f7f 0 2026-03-09T17:30:28.830 INFO:tasks.workunit.client.1.vm09.stdout:2/581: sync 2026-03-09T17:30:28.830 INFO:tasks.workunit.client.1.vm09.stdout:7/706: chown da/d11 227317349 1 2026-03-09T17:30:28.831 INFO:tasks.workunit.client.1.vm09.stdout:8/619: getdents d1/d14/d2a/d49 0 2026-03-09T17:30:28.833 INFO:tasks.workunit.client.1.vm09.stdout:6/588: chown d3/d21/d25/c44 214650150 1 2026-03-09T17:30:28.834 INFO:tasks.workunit.client.1.vm09.stdout:5/637: link d0/dc/d21/d6f/f80 d0/d2/fcf 0 2026-03-09T17:30:28.835 INFO:tasks.workunit.client.1.vm09.stdout:5/638: write d0/dc/d21/d33/fa7 [2889531,13107] 0 2026-03-09T17:30:28.836 INFO:tasks.workunit.client.1.vm09.stdout:4/592: dread - d11/d1e/d45/f70 zero size 2026-03-09T17:30:28.842 INFO:tasks.workunit.client.1.vm09.stdout:8/620: read d1/d14/d2a/f81 [544579,103094] 0 2026-03-09T17:30:28.844 INFO:tasks.workunit.client.1.vm09.stdout:5/639: mknod d0/d2/d76/d86/cd0 0 2026-03-09T17:30:28.848 INFO:tasks.workunit.client.1.vm09.stdout:7/707: mknod da/d11/d47/d5b/cf0 0 2026-03-09T17:30:28.848 INFO:tasks.workunit.client.1.vm09.stdout:8/621: mknod d1/da/d3a/cc1 0 2026-03-09T17:30:28.848 INFO:tasks.workunit.client.1.vm09.stdout:7/708: chown da/d11/d64/cb8 60716934 1 2026-03-09T17:30:28.850 INFO:tasks.workunit.client.1.vm09.stdout:0/607: rename d6/d1d/d39/f44 to d6/d1d/fc5 0 2026-03-09T17:30:28.850 INFO:tasks.workunit.client.1.vm09.stdout:4/593: dread f10 [0,4194304] 0 2026-03-09T17:30:28.850 INFO:tasks.workunit.client.1.vm09.stdout:4/594: readlink d11/d1e/la3 0 2026-03-09T17:30:28.853 INFO:tasks.workunit.client.1.vm09.stdout:3/544: dread d5/d16/f45 [8388608,4194304] 0 2026-03-09T17:30:28.854 INFO:tasks.workunit.client.1.vm09.stdout:3/545: chown d5/d9/d30/d65/d59 574358317 1 2026-03-09T17:30:28.855 INFO:tasks.workunit.client.1.vm09.stdout:5/640: unlink d0/d2/d76/d86/fac 0 2026-03-09T17:30:28.859 
INFO:tasks.workunit.client.1.vm09.stdout:7/709: dread da/d11/d2d/d56/f9f [0,4194304] 0 2026-03-09T17:30:28.860 INFO:tasks.workunit.client.1.vm09.stdout:0/608: symlink d6/d1d/d24/d32/d59/d9c/lc6 0 2026-03-09T17:30:28.860 INFO:tasks.workunit.client.1.vm09.stdout:0/609: chown d6/d1d/d24/d32/d59/d9c/dac 44060 1 2026-03-09T17:30:28.861 INFO:tasks.workunit.client.1.vm09.stdout:0/610: fsync d6/d1d/d24/d32/d59/fb0 0 2026-03-09T17:30:28.866 INFO:tasks.workunit.client.1.vm09.stdout:0/611: dwrite d6/d93/fb7 [0,4194304] 0 2026-03-09T17:30:28.867 INFO:tasks.workunit.client.1.vm09.stdout:3/546: rmdir d5/d9/d30 39 2026-03-09T17:30:28.867 INFO:tasks.workunit.client.1.vm09.stdout:5/641: rmdir d0/d2/d76/d87/d95/d9b 39 2026-03-09T17:30:28.871 INFO:tasks.workunit.client.1.vm09.stdout:5/642: dread d0/dc/d21/d26/f3d [0,4194304] 0 2026-03-09T17:30:28.885 INFO:tasks.workunit.client.1.vm09.stdout:6/589: rename d3/d21/d25/d91 to d3/d21/d76/d5c/d7e/dc5 0 2026-03-09T17:30:28.886 INFO:tasks.workunit.client.1.vm09.stdout:3/547: creat d5/d16/d31/d3d/d32/fa3 x:0 0 0 2026-03-09T17:30:28.896 INFO:tasks.workunit.client.1.vm09.stdout:0/612: mknod d6/d1d/cc7 0 2026-03-09T17:30:28.900 INFO:tasks.workunit.client.1.vm09.stdout:8/622: rename d1/da/dd/d47/d4c/d8d to d1/da/d23/dc2 0 2026-03-09T17:30:28.904 INFO:tasks.workunit.client.1.vm09.stdout:6/590: fsync d3/d7/f77 0 2026-03-09T17:30:28.909 INFO:tasks.workunit.client.1.vm09.stdout:5/643: fsync d0/d9/d8b/fb3 0 2026-03-09T17:30:28.914 INFO:tasks.workunit.client.1.vm09.stdout:6/591: sync 2026-03-09T17:30:28.917 INFO:tasks.workunit.client.1.vm09.stdout:8/623: rename d1/da/dd/d47/db7 to d1/d14/d96/dc3 0 2026-03-09T17:30:28.918 INFO:tasks.workunit.client.1.vm09.stdout:8/624: write d1/d14/d2a/f8b [2924427,22878] 0 2026-03-09T17:30:28.920 INFO:tasks.workunit.client.1.vm09.stdout:3/548: symlink d5/d16/d31/d37/d58/d8a/la4 0 2026-03-09T17:30:28.931 INFO:tasks.workunit.client.1.vm09.stdout:0/613: mkdir d6/d1d/d24/d5e/dc8 0 2026-03-09T17:30:28.934 
INFO:tasks.workunit.client.1.vm09.stdout:0/614: dread - d6/fa6 zero size 2026-03-09T17:30:28.935 INFO:tasks.workunit.client.1.vm09.stdout:3/549: dwrite d5/d16/d31/d37/f6d [0,4194304] 0 2026-03-09T17:30:28.939 INFO:tasks.workunit.client.1.vm09.stdout:6/592: dread d3/d7/fe [0,4194304] 0 2026-03-09T17:30:28.941 INFO:tasks.workunit.client.1.vm09.stdout:1/587: dwrite f8 [0,4194304] 0 2026-03-09T17:30:28.947 INFO:tasks.workunit.client.1.vm09.stdout:9/586: write d5/de/f2d [1111863,10461] 0 2026-03-09T17:30:28.948 INFO:tasks.workunit.client.1.vm09.stdout:2/582: dwrite d13/d15/d3b/f3f [0,4194304] 0 2026-03-09T17:30:28.949 INFO:tasks.workunit.client.1.vm09.stdout:5/644: dread d0/dc/d21/d6f/f80 [0,4194304] 0 2026-03-09T17:30:28.951 INFO:tasks.workunit.client.1.vm09.stdout:5/645: dread d0/d2/fc5 [0,4194304] 0 2026-03-09T17:30:28.951 INFO:tasks.workunit.client.1.vm09.stdout:6/593: readlink d3/d21/l45 0 2026-03-09T17:30:28.956 INFO:tasks.workunit.client.1.vm09.stdout:6/594: stat d3/c14 0 2026-03-09T17:30:28.961 INFO:tasks.workunit.client.1.vm09.stdout:8/625: rename d1/da/d23/d6c/d32/f50 to d1/da/d23/fc4 0 2026-03-09T17:30:28.969 INFO:tasks.workunit.client.1.vm09.stdout:9/587: rmdir d5/d91/d99 39 2026-03-09T17:30:28.979 INFO:tasks.workunit.client.1.vm09.stdout:9/588: read d5/de/d4e/d6e/d93/f74 [2113195,112776] 0 2026-03-09T17:30:28.980 INFO:tasks.workunit.client.1.vm09.stdout:1/588: creat d9/dc/dd/d40/d22/d37/d3f/d42/d55/fb5 x:0 0 0 2026-03-09T17:30:28.984 INFO:tasks.workunit.client.1.vm09.stdout:5/646: unlink d0/d46/d4b/f4f 0 2026-03-09T17:30:28.989 INFO:tasks.workunit.client.1.vm09.stdout:0/615: dread d6/d1d/d24/d32/f45 [0,4194304] 0 2026-03-09T17:30:28.989 INFO:tasks.workunit.client.1.vm09.stdout:6/595: rename d3/d7/f24 to d3/d21/d25/d26/d86/dbe/fc6 0 2026-03-09T17:30:28.990 INFO:tasks.workunit.client.1.vm09.stdout:6/596: truncate d3/d7/f11 834605 0 2026-03-09T17:30:29.001 INFO:tasks.workunit.client.1.vm09.stdout:0/616: mkdir d6/d64/d97/dc9 0 2026-03-09T17:30:29.003 
INFO:tasks.workunit.client.1.vm09.stdout:8/626: mkdir d1/d14/d31/d97/dc5 0 2026-03-09T17:30:29.004 INFO:tasks.workunit.client.1.vm09.stdout:6/597: creat d3/d48/fc7 x:0 0 0 2026-03-09T17:30:29.007 INFO:tasks.workunit.client.1.vm09.stdout:3/550: creat d5/d16/d31/d37/fa5 x:0 0 0 2026-03-09T17:30:29.007 INFO:tasks.workunit.client.1.vm09.stdout:3/551: fsync d5/d16/f54 0 2026-03-09T17:30:29.011 INFO:tasks.workunit.client.1.vm09.stdout:9/589: truncate d5/f4f 31825 0 2026-03-09T17:30:29.013 INFO:tasks.workunit.client.1.vm09.stdout:5/647: symlink d0/d2/d76/d87/da4/dbf/ld1 0 2026-03-09T17:30:29.014 INFO:tasks.workunit.client.1.vm09.stdout:5/648: fsync d0/dc/d21/d26/d5e/d68/d79/fc7 0 2026-03-09T17:30:29.016 INFO:tasks.workunit.client.1.vm09.stdout:6/598: truncate d3/d21/d76/d5c/d7e/dc5/d9a/fc0 663046 0 2026-03-09T17:30:29.022 INFO:tasks.workunit.client.1.vm09.stdout:9/590: sync 2026-03-09T17:30:29.024 INFO:tasks.workunit.client.1.vm09.stdout:9/591: read - d5/de/d29/d90/fb9 zero size 2026-03-09T17:30:29.024 INFO:tasks.workunit.client.1.vm09.stdout:9/592: write d5/de/d29/d90/dc7/fbe [367357,2311] 0 2026-03-09T17:30:29.027 INFO:tasks.workunit.client.1.vm09.stdout:6/599: creat d3/d21/db1/fc8 x:0 0 0 2026-03-09T17:30:29.027 INFO:tasks.workunit.client.1.vm09.stdout:4/595: write d11/f1f [260121,6637] 0 2026-03-09T17:30:29.030 INFO:tasks.workunit.client.1.vm09.stdout:4/596: read - d11/d1e/d45/d60/f7b zero size 2026-03-09T17:30:29.030 INFO:tasks.workunit.client.1.vm09.stdout:9/593: fdatasync d5/de/d29/d33/f9a 0 2026-03-09T17:30:29.032 INFO:tasks.workunit.client.1.vm09.stdout:7/710: write da/d11/d47/d5b/d6c/d9e/d4e/d4c/fde [649154,57548] 0 2026-03-09T17:30:29.038 INFO:tasks.workunit.client.1.vm09.stdout:3/552: creat d5/d9/fa6 x:0 0 0 2026-03-09T17:30:29.039 INFO:tasks.workunit.client.1.vm09.stdout:6/600: dread d3/d7/f40 [0,4194304] 0 2026-03-09T17:30:29.047 INFO:tasks.workunit.client.1.vm09.stdout:5/649: rename d0/dc/d21/f62 to d0/d9/fd2 0 2026-03-09T17:30:29.065 
INFO:tasks.workunit.client.1.vm09.stdout:2/583: dwrite d13/d15/d21/f30 [4194304,4194304] 0 2026-03-09T17:30:29.068 INFO:tasks.workunit.client.1.vm09.stdout:1/589: rmdir d9/dc/dd/d40 39 2026-03-09T17:30:29.071 INFO:tasks.workunit.client.1.vm09.stdout:4/597: read fe [389241,105734] 0 2026-03-09T17:30:29.075 INFO:tasks.workunit.client.1.vm09.stdout:9/594: dread d5/de/d4e/dca/f75 [0,4194304] 0 2026-03-09T17:30:29.075 INFO:tasks.workunit.client.1.vm09.stdout:9/595: write d5/de/f76 [603771,35103] 0 2026-03-09T17:30:29.076 INFO:tasks.workunit.client.1.vm09.stdout:9/596: chown d5/f34 4100 1 2026-03-09T17:30:29.078 INFO:tasks.workunit.client.1.vm09.stdout:9/597: dread d5/f11 [4194304,4194304] 0 2026-03-09T17:30:29.081 INFO:tasks.workunit.client.1.vm09.stdout:8/627: write d1/da/d23/d6c/f1c [3804106,71150] 0 2026-03-09T17:30:29.082 INFO:tasks.workunit.client.1.vm09.stdout:8/628: chown d1/da/d3a 1 1 2026-03-09T17:30:29.083 INFO:tasks.workunit.client.1.vm09.stdout:8/629: readlink d1/da/d23/l6f 0 2026-03-09T17:30:29.085 INFO:tasks.workunit.client.1.vm09.stdout:8/630: chown d1/d14/d2a/d42/d5d/lba 30553285 1 2026-03-09T17:30:29.085 INFO:tasks.workunit.client.1.vm09.stdout:0/617: dwrite d6/d1d/d24/d32/d59/d81/d8c/fb1 [4194304,4194304] 0 2026-03-09T17:30:29.105 INFO:tasks.workunit.client.1.vm09.stdout:7/711: mknod da/d11/d2d/cf1 0 2026-03-09T17:30:29.106 INFO:tasks.workunit.client.1.vm09.stdout:4/598: dread - d11/d1e/d29/d36/d57/f8f zero size 2026-03-09T17:30:29.110 INFO:tasks.workunit.client.1.vm09.stdout:6/601: readlink d3/d21/d25/d26/db7/lba 0 2026-03-09T17:30:29.115 INFO:tasks.workunit.client.1.vm09.stdout:0/618: symlink d6/d64/d94/lca 0 2026-03-09T17:30:29.115 INFO:tasks.workunit.client.1.vm09.stdout:0/619: chown d6/f27 1769843 1 2026-03-09T17:30:29.117 INFO:tasks.workunit.client.1.vm09.stdout:5/650: mknod d0/cd3 0 2026-03-09T17:30:29.121 INFO:tasks.workunit.client.1.vm09.stdout:2/584: mkdir d13/d15/d21/d88/db8 0 2026-03-09T17:30:29.130 
INFO:tasks.workunit.client.1.vm09.stdout:1/590: rmdir d9/dc/dd/d40/d22/d91/d99 39 2026-03-09T17:30:29.145 INFO:tasks.workunit.client.1.vm09.stdout:4/599: dread d11/d1e/d29/d36/f6a [0,4194304] 0 2026-03-09T17:30:29.145 INFO:tasks.workunit.client.1.vm09.stdout:7/712: truncate da/d11/d3e/da2/db2/fc7 551166 0 2026-03-09T17:30:29.145 INFO:tasks.workunit.client.1.vm09.stdout:9/598: symlink d5/lcb 0 2026-03-09T17:30:29.146 INFO:tasks.workunit.client.1.vm09.stdout:6/602: creat d3/d21/d76/d81/fc9 x:0 0 0 2026-03-09T17:30:29.147 INFO:tasks.workunit.client.1.vm09.stdout:0/620: unlink d6/d64/la4 0 2026-03-09T17:30:29.152 INFO:tasks.workunit.client.1.vm09.stdout:6/603: sync 2026-03-09T17:30:29.153 INFO:tasks.workunit.client.1.vm09.stdout:8/631: truncate d1/d14/d2a/d42/d43/d44/f5c 1327108 0 2026-03-09T17:30:29.155 INFO:tasks.workunit.client.1.vm09.stdout:2/585: mknod d13/d15/d36/d72/cb9 0 2026-03-09T17:30:29.156 INFO:tasks.workunit.client.1.vm09.stdout:3/553: creat d5/d9/d30/d65/d59/d84/fa7 x:0 0 0 2026-03-09T17:30:29.157 INFO:tasks.workunit.client.1.vm09.stdout:3/554: chown d5/d9/d30/d65/d59/d84/l9b 36 1 2026-03-09T17:30:29.157 INFO:tasks.workunit.client.1.vm09.stdout:9/599: creat d5/d2e/d8b/fcc x:0 0 0 2026-03-09T17:30:29.158 INFO:tasks.workunit.client.1.vm09.stdout:3/555: write d5/d16/d85/f98 [898381,16250] 0 2026-03-09T17:30:29.158 INFO:tasks.workunit.client.1.vm09.stdout:7/713: unlink da/d11/d64/da7/fdc 0 2026-03-09T17:30:29.159 INFO:tasks.workunit.client.1.vm09.stdout:4/600: creat d11/d1e/d29/d36/d57/fbc x:0 0 0 2026-03-09T17:30:29.160 INFO:tasks.workunit.client.1.vm09.stdout:2/586: dwrite d13/d15/d34/f48 [0,4194304] 0 2026-03-09T17:30:29.165 INFO:tasks.workunit.client.1.vm09.stdout:5/651: rmdir d0/d2/d76/d87/d95/d9b 39 2026-03-09T17:30:29.167 INFO:tasks.workunit.client.1.vm09.stdout:8/632: dread - d1/d14/d2a/d42/d5d/d8a/f94 zero size 2026-03-09T17:30:29.177 INFO:tasks.workunit.client.1.vm09.stdout:3/556: rmdir d5/d16 39 2026-03-09T17:30:29.204 
INFO:tasks.workunit.client.1.vm09.stdout:4/601: dwrite d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f72 [0,4194304] 0 2026-03-09T17:30:29.220 INFO:tasks.workunit.client.1.vm09.stdout:5/652: mkdir d0/dc/d21/d26/d5e/dd4 0 2026-03-09T17:30:29.221 INFO:tasks.workunit.client.1.vm09.stdout:9/600: symlink d5/lcd 0 2026-03-09T17:30:29.221 INFO:tasks.workunit.client.1.vm09.stdout:1/591: creat d9/dc/dd/d40/d21/fb6 x:0 0 0 2026-03-09T17:30:29.222 INFO:tasks.workunit.client.1.vm09.stdout:1/592: dread - d9/dc/dd/d40/d22/f50 zero size 2026-03-09T17:30:29.222 INFO:tasks.workunit.client.1.vm09.stdout:7/714: chown da/d11/d3e/da2/db2/fc7 26 1 2026-03-09T17:30:29.223 INFO:tasks.workunit.client.1.vm09.stdout:2/587: mknod d13/d15/cba 0 2026-03-09T17:30:29.224 INFO:tasks.workunit.client.1.vm09.stdout:2/588: read d13/d15/d34/f3a [1967719,33303] 0 2026-03-09T17:30:29.231 INFO:tasks.workunit.client.1.vm09.stdout:3/557: rmdir d5/d16/d31/d37/d58/d64 39 2026-03-09T17:30:29.236 INFO:tasks.workunit.client.1.vm09.stdout:5/653: dread - d0/dc/d21/d26/d5e/d68/f85 zero size 2026-03-09T17:30:29.236 INFO:tasks.workunit.client.1.vm09.stdout:5/654: write d0/dc/d21/d26/fc6 [904409,45933] 0 2026-03-09T17:30:29.239 INFO:tasks.workunit.client.1.vm09.stdout:0/621: getdents d6/d1d/d46 0 2026-03-09T17:30:29.240 INFO:tasks.workunit.client.1.vm09.stdout:0/622: chown d6/d1d/d24/d32/d59/d81/c95 54571 1 2026-03-09T17:30:29.244 INFO:tasks.workunit.client.1.vm09.stdout:3/558: unlink d5/d16/d25/l3a 0 2026-03-09T17:30:29.250 INFO:tasks.workunit.client.1.vm09.stdout:5/655: write d0/d2/d76/d86/f50 [542446,67392] 0 2026-03-09T17:30:29.251 INFO:tasks.workunit.client.1.vm09.stdout:3/559: sync 2026-03-09T17:30:29.252 INFO:tasks.workunit.client.1.vm09.stdout:9/601: write d5/de/d29/f89 [3444797,82132] 0 2026-03-09T17:30:29.255 INFO:tasks.workunit.client.1.vm09.stdout:7/715: getdents da/d11/d47/d5b/d6c/d9e/dc6/ddb 0 2026-03-09T17:30:29.255 INFO:tasks.workunit.client.1.vm09.stdout:8/633: link d1/d14/f3d d1/da/dd/fc6 0 
2026-03-09T17:30:29.256 INFO:tasks.workunit.client.1.vm09.stdout:6/604: dwrite d3/d21/d76/d5c/d61/f60 [0,4194304] 0 2026-03-09T17:30:29.260 INFO:tasks.workunit.client.1.vm09.stdout:4/602: dwrite d11/d1e/d45/d60/f7b [0,4194304] 0 2026-03-09T17:30:29.263 INFO:tasks.workunit.client.1.vm09.stdout:5/656: dread - d0/d52/f97 zero size 2026-03-09T17:30:29.264 INFO:tasks.workunit.client.1.vm09.stdout:5/657: stat d0/d52/d20/f25 0 2026-03-09T17:30:29.272 INFO:tasks.workunit.client.1.vm09.stdout:3/560: mkdir d5/d16/d31/d37/d58/d8a/da8 0 2026-03-09T17:30:29.273 INFO:tasks.workunit.client.1.vm09.stdout:3/561: chown d5/d16/d31/d37/fa5 1428378598 1 2026-03-09T17:30:29.276 INFO:tasks.workunit.client.1.vm09.stdout:3/562: dwrite d5/d16/d31/d37/fa5 [0,4194304] 0 2026-03-09T17:30:29.280 INFO:tasks.workunit.client.1.vm09.stdout:0/623: symlink d6/d1d/d24/d32/d59/d81/d8c/lcb 0 2026-03-09T17:30:29.293 INFO:tasks.workunit.client.1.vm09.stdout:9/602: symlink d5/de/d29/d90/dc7/lce 0 2026-03-09T17:30:29.293 INFO:tasks.workunit.client.1.vm09.stdout:9/603: readlink d5/de/d4e/dca/l87 0 2026-03-09T17:30:29.295 INFO:tasks.workunit.client.1.vm09.stdout:8/634: mknod d1/da/dd/d63/cc7 0 2026-03-09T17:30:29.303 INFO:tasks.workunit.client.1.vm09.stdout:5/658: mknod d0/dc/d21/d6f/cd5 0 2026-03-09T17:30:29.314 INFO:tasks.workunit.client.1.vm09.stdout:0/624: unlink d6/d1d/d24/d32/d59/d81/c95 0 2026-03-09T17:30:29.318 INFO:tasks.workunit.client.1.vm09.stdout:1/593: rename d9/dc/dd/d40/d22/d37/d3f/d42/ca2 to d9/dc/dd/d40/cb7 0 2026-03-09T17:30:29.321 INFO:tasks.workunit.client.1.vm09.stdout:9/604: symlink d5/de/d29/da7/lcf 0 2026-03-09T17:30:29.325 INFO:tasks.workunit.client.1.vm09.stdout:1/594: sync 2026-03-09T17:30:29.325 INFO:tasks.workunit.client.1.vm09.stdout:1/595: write d9/dc/dd/d9f/d9c/f9b [160518,12268] 0 2026-03-09T17:30:29.332 INFO:tasks.workunit.client.1.vm09.stdout:8/635: mkdir d1/da/d23/d6c/d32/dc8 0 2026-03-09T17:30:29.338 INFO:tasks.workunit.client.1.vm09.stdout:7/716: mkdir da/d11/d47/d5b/df2 
0 2026-03-09T17:30:29.342 INFO:tasks.workunit.client.1.vm09.stdout:7/717: read da/d11/d47/d5b/d6c/d9e/f38 [2488965,104423] 0 2026-03-09T17:30:29.342 INFO:tasks.workunit.client.1.vm09.stdout:7/718: readlink da/d11/d2d/d56/l62 0 2026-03-09T17:30:29.346 INFO:tasks.workunit.client.1.vm09.stdout:3/563: mkdir d5/d9/da9 0 2026-03-09T17:30:29.351 INFO:tasks.workunit.client.1.vm09.stdout:2/589: rename d13/d15/d2c/da2/fa6 to d13/d15/d34/d37/fbb 0 2026-03-09T17:30:29.357 INFO:tasks.workunit.client.1.vm09.stdout:9/605: dread d5/f4b [4194304,4194304] 0 2026-03-09T17:30:29.358 INFO:tasks.workunit.client.1.vm09.stdout:2/590: dread d13/d15/d21/f24 [4194304,4194304] 0 2026-03-09T17:30:29.369 INFO:tasks.workunit.client.1.vm09.stdout:1/596: dwrite d9/dc/dd/d9f/f8a [0,4194304] 0 2026-03-09T17:30:29.370 INFO:tasks.workunit.client.1.vm09.stdout:6/605: rmdir d3/d21/d25/d26/d6b/dbf/dc4 0 2026-03-09T17:30:29.371 INFO:tasks.workunit.client.1.vm09.stdout:6/606: dread - d3/d21/d25/d26/fa1 zero size 2026-03-09T17:30:29.387 INFO:tasks.workunit.client.1.vm09.stdout:4/603: creat d11/fbd x:0 0 0 2026-03-09T17:30:29.394 INFO:tasks.workunit.client.1.vm09.stdout:3/564: write d5/d16/d31/d37/d58/d64/f9a [144319,4479] 0 2026-03-09T17:30:29.397 INFO:tasks.workunit.client.1.vm09.stdout:9/606: read d5/de/d29/d33/f3b [324872,37209] 0 2026-03-09T17:30:29.403 INFO:tasks.workunit.client.1.vm09.stdout:2/591: dread d13/d15/f20 [0,4194304] 0 2026-03-09T17:30:29.413 INFO:tasks.workunit.client.1.vm09.stdout:7/719: dwrite da/d11/d2d/f69 [0,4194304] 0 2026-03-09T17:30:29.438 INFO:tasks.workunit.client.1.vm09.stdout:6/607: fsync d3/d21/d76/d5c/d61/d95/fa5 0 2026-03-09T17:30:29.439 INFO:tasks.workunit.client.1.vm09.stdout:6/608: readlink d3/d7/l43 0 2026-03-09T17:30:29.450 INFO:tasks.workunit.client.1.vm09.stdout:8/636: write d1/f7 [664065,62587] 0 2026-03-09T17:30:29.453 INFO:tasks.workunit.client.1.vm09.stdout:0/625: getdents d6/d1d/d24/d5e/dc2 0 2026-03-09T17:30:29.455 INFO:tasks.workunit.client.1.vm09.stdout:5/659: 
rename d0/d9/c40 to d0/d2/d76/d87/cd6 0 2026-03-09T17:30:29.456 INFO:tasks.workunit.client.1.vm09.stdout:8/637: dwrite d1/d14/d2a/f2b [0,4194304] 0 2026-03-09T17:30:29.469 INFO:tasks.workunit.client.1.vm09.stdout:3/565: mknod d5/d9/d30/d65/d59/d84/caa 0 2026-03-09T17:30:29.479 INFO:tasks.workunit.client.1.vm09.stdout:2/592: symlink d13/d15/d60/lbc 0 2026-03-09T17:30:29.487 INFO:tasks.workunit.client.1.vm09.stdout:7/720: unlink da/c1b 0 2026-03-09T17:30:29.487 INFO:tasks.workunit.client.1.vm09.stdout:7/721: readlink da/d11/d47/d5b/l87 0 2026-03-09T17:30:29.488 INFO:tasks.workunit.client.1.vm09.stdout:1/597: creat d9/dc/dd/d40/d21/fb8 x:0 0 0 2026-03-09T17:30:29.488 INFO:tasks.workunit.client.1.vm09.stdout:4/604: creat d11/d1e/fbe x:0 0 0 2026-03-09T17:30:29.489 INFO:tasks.workunit.client.1.vm09.stdout:0/626: mkdir d6/d1d/d24/d32/d59/d9c/dac/dcc 0 2026-03-09T17:30:29.490 INFO:tasks.workunit.client.1.vm09.stdout:6/609: rename d3/d21/f3c to d3/d21/d76/d5c/d7e/dc5/fca 0 2026-03-09T17:30:29.492 INFO:tasks.workunit.client.1.vm09.stdout:3/566: truncate d5/d9/d30/f6a 451004 0 2026-03-09T17:30:29.494 INFO:tasks.workunit.client.1.vm09.stdout:3/567: stat d5/d9/fa6 0 2026-03-09T17:30:29.496 INFO:tasks.workunit.client.1.vm09.stdout:7/722: dwrite da/d11/d77/fd5 [0,4194304] 0 2026-03-09T17:30:29.498 INFO:tasks.workunit.client.1.vm09.stdout:6/610: dread d3/d21/d76/d5c/d61/f60 [0,4194304] 0 2026-03-09T17:30:29.503 INFO:tasks.workunit.client.1.vm09.stdout:5/660: dread d0/d9/f77 [0,4194304] 0 2026-03-09T17:30:29.508 INFO:tasks.workunit.client.1.vm09.stdout:4/605: creat d11/d1e/d31/fbf x:0 0 0 2026-03-09T17:30:29.508 INFO:tasks.workunit.client.1.vm09.stdout:0/627: fsync d6/d1d/f3c 0 2026-03-09T17:30:29.509 INFO:tasks.workunit.client.1.vm09.stdout:2/593: dread d13/fa3 [0,4194304] 0 2026-03-09T17:30:29.510 INFO:tasks.workunit.client.1.vm09.stdout:2/594: chown d13/d15/d36 27013 1 2026-03-09T17:30:29.511 INFO:tasks.workunit.client.1.vm09.stdout:7/723: creat da/d11/d47/d5b/d6c/d9e/ff3 x:0 0 
0 2026-03-09T17:30:29.512 INFO:tasks.workunit.client.1.vm09.stdout:7/724: fdatasync da/d11/d3e/da2/db2/fe6 0 2026-03-09T17:30:29.512 INFO:tasks.workunit.client.1.vm09.stdout:7/725: chown da/d11/d47/d5b/d6c/d9e/d4e/c2a 8814 1 2026-03-09T17:30:29.513 INFO:tasks.workunit.client.1.vm09.stdout:3/568: read d5/d16/d31/d3d/d32/f33 [8170759,61069] 0 2026-03-09T17:30:29.514 INFO:tasks.workunit.client.1.vm09.stdout:6/611: symlink d3/d21/db1/lcb 0 2026-03-09T17:30:29.520 INFO:tasks.workunit.client.1.vm09.stdout:0/628: creat d6/d93/fcd x:0 0 0 2026-03-09T17:30:29.532 INFO:tasks.workunit.client.1.vm09.stdout:4/606: rename d11/d1e/d45/d60/d71/la2 to d11/d1e/d45/d60/d71/db7/d89/d8b/d58/lc0 0 2026-03-09T17:30:29.543 INFO:tasks.workunit.client.1.vm09.stdout:9/607: getdents d5/d91/d99 0 2026-03-09T17:30:29.559 INFO:tasks.workunit.client.1.vm09.stdout:1/598: dwrite f6 [0,4194304] 0 2026-03-09T17:30:29.561 INFO:tasks.workunit.client.1.vm09.stdout:1/599: write d9/dc/dd/d40/d22/d37/f41 [1273100,26813] 0 2026-03-09T17:30:29.594 INFO:tasks.workunit.client.1.vm09.stdout:5/661: mknod d0/d2/d76/d87/d95/cd7 0 2026-03-09T17:30:29.597 INFO:tasks.workunit.client.1.vm09.stdout:8/638: getdents d1/da/dd/d47/d4c 0 2026-03-09T17:30:29.605 INFO:tasks.workunit.client.1.vm09.stdout:3/569: dwrite d5/d9/d30/d65/f19 [0,4194304] 0 2026-03-09T17:30:29.608 INFO:tasks.workunit.client.1.vm09.stdout:4/607: fdatasync fe 0 2026-03-09T17:30:29.610 INFO:tasks.workunit.client.1.vm09.stdout:6/612: rename d3/d21/d76/d3f/c4e to d3/d7/d59/d5a/ccc 0 2026-03-09T17:30:29.656 INFO:tasks.workunit.client.1.vm09.stdout:2/595: truncate d13/f4c 2896937 0 2026-03-09T17:30:29.664 INFO:tasks.workunit.client.1.vm09.stdout:2/596: dread d13/f79 [0,4194304] 0 2026-03-09T17:30:29.697 INFO:tasks.workunit.client.1.vm09.stdout:9/608: write d5/d2e/f6f [6808724,72482] 0 2026-03-09T17:30:29.717 INFO:tasks.workunit.client.1.vm09.stdout:1/600: dwrite d9/dc/dd/d40/d1d/f4d [0,4194304] 0 2026-03-09T17:30:29.724 
INFO:tasks.workunit.client.1.vm09.stdout:8/639: rmdir d1/d14/d2a/d42/d5d/d8a 39 2026-03-09T17:30:29.729 INFO:tasks.workunit.client.1.vm09.stdout:0/629: link d6/d64/d97/fbb d6/d1d/d24/d32/d59/d9c/fce 0 2026-03-09T17:30:29.730 INFO:tasks.workunit.client.1.vm09.stdout:0/630: write d6/d1d/d24/d5e/d6c/fa5 [1016775,130693] 0 2026-03-09T17:30:29.738 INFO:tasks.workunit.client.1.vm09.stdout:4/608: mkdir d11/d1e/d29/d36/d57/d78/dc1 0 2026-03-09T17:30:29.742 INFO:tasks.workunit.client.1.vm09.stdout:2/597: truncate d13/d4d/f6d 380147 0 2026-03-09T17:30:29.748 INFO:tasks.workunit.client.1.vm09.stdout:5/662: mkdir d0/d46/d4b/db7/dd8 0 2026-03-09T17:30:29.750 INFO:tasks.workunit.client.1.vm09.stdout:1/601: mkdir d9/dc/dd/d40/d21/d35/db9 0 2026-03-09T17:30:29.751 INFO:tasks.workunit.client.1.vm09.stdout:8/640: symlink d1/da/d23/d6c/d32/lc9 0 2026-03-09T17:30:29.755 INFO:tasks.workunit.client.1.vm09.stdout:6/613: mkdir d3/d21/d76/d5c/d7e/dcd 0 2026-03-09T17:30:29.755 INFO:tasks.workunit.client.1.vm09.stdout:6/614: chown d3/f97 21621 1 2026-03-09T17:30:29.756 INFO:tasks.workunit.client.1.vm09.stdout:6/615: write d3/d21/d76/d5c/d61/fad [225930,100737] 0 2026-03-09T17:30:29.758 INFO:tasks.workunit.client.1.vm09.stdout:7/726: link da/d11/d47/d5b/d6c/d9e/d4e/d4c/l4f da/d11/d2d/d56/d68/lf4 0 2026-03-09T17:30:29.759 INFO:tasks.workunit.client.1.vm09.stdout:7/727: stat da/fc1 0 2026-03-09T17:30:29.761 INFO:tasks.workunit.client.1.vm09.stdout:5/663: mknod d0/d2/d76/cd9 0 2026-03-09T17:30:29.763 INFO:tasks.workunit.client.1.vm09.stdout:5/664: truncate d0/d2/d76/d87/d95/f9a 823974 0 2026-03-09T17:30:29.763 INFO:tasks.workunit.client.1.vm09.stdout:1/602: fdatasync d9/dc/dd/d40/d22/d91/d99/fa5 0 2026-03-09T17:30:29.764 INFO:tasks.workunit.client.1.vm09.stdout:8/641: rmdir d1/da/dd/d79 39 2026-03-09T17:30:29.764 INFO:tasks.workunit.client.1.vm09.stdout:3/570: creat d5/d9/d30/d65/d59/d84/fab x:0 0 0 2026-03-09T17:30:29.765 INFO:tasks.workunit.client.1.vm09.stdout:3/571: chown d5 6937 1 
2026-03-09T17:30:29.765 INFO:tasks.workunit.client.1.vm09.stdout:3/572: fdatasync d5/d16/d31/d37/d58/f91 0 2026-03-09T17:30:29.766 INFO:tasks.workunit.client.1.vm09.stdout:4/609: mknod d11/d1e/d45/daf/cc2 0 2026-03-09T17:30:29.771 INFO:tasks.workunit.client.1.vm09.stdout:4/610: dread d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f72 [0,4194304] 0 2026-03-09T17:30:29.775 INFO:tasks.workunit.client.1.vm09.stdout:1/603: rename d9/dc/dd/d9f/d9c/db0 to d9/dc/d63/dba 0 2026-03-09T17:30:29.775 INFO:tasks.workunit.client.1.vm09.stdout:1/604: dread - d9/dc/dd/d40/d22/d91/d99/fa5 zero size 2026-03-09T17:30:29.781 INFO:tasks.workunit.client.1.vm09.stdout:9/609: link d5/de/d4e/dca/c9e d5/d21/cd0 0 2026-03-09T17:30:29.782 INFO:tasks.workunit.client.1.vm09.stdout:5/665: fdatasync d0/d2/d76/d87/d95/d9b/fab 0 2026-03-09T17:30:29.786 INFO:tasks.workunit.client.1.vm09.stdout:8/642: creat d1/da/dd/d79/fca x:0 0 0 2026-03-09T17:30:29.791 INFO:tasks.workunit.client.1.vm09.stdout:1/605: mknod d9/dc/d63/dba/cbb 0 2026-03-09T17:30:29.791 INFO:tasks.workunit.client.1.vm09.stdout:6/616: rmdir d3/d21/d76/d5c/d7e/dc5/d98/da0 0 2026-03-09T17:30:29.791 INFO:tasks.workunit.client.1.vm09.stdout:2/598: getdents d13/d15/d60/d85 0 2026-03-09T17:30:29.792 INFO:tasks.workunit.client.1.vm09.stdout:1/606: dread d9/dc/dd/d9f/f8a [0,4194304] 0 2026-03-09T17:30:29.792 INFO:tasks.workunit.client.1.vm09.stdout:5/666: symlink d0/d2/d76/d86/lda 0 2026-03-09T17:30:29.797 INFO:tasks.workunit.client.1.vm09.stdout:6/617: symlink d3/d21/db1/lce 0 2026-03-09T17:30:29.797 INFO:tasks.workunit.client.1.vm09.stdout:1/607: dwrite f3 [4194304,4194304] 0 2026-03-09T17:30:29.798 INFO:tasks.workunit.client.1.vm09.stdout:6/618: write d3/d21/db1/fc8 [389956,24359] 0 2026-03-09T17:30:29.802 INFO:tasks.workunit.client.1.vm09.stdout:2/599: symlink d13/d15/d36/lbd 0 2026-03-09T17:30:29.805 INFO:tasks.workunit.client.1.vm09.stdout:9/610: symlink d5/d2e/ld1 0 2026-03-09T17:30:29.809 INFO:tasks.workunit.client.1.vm09.stdout:2/600: dwrite 
d13/d15/d21/f30 [0,4194304] 0 2026-03-09T17:30:29.813 INFO:tasks.workunit.client.1.vm09.stdout:3/573: rename d5/d16/d31/f44 to d5/fac 0 2026-03-09T17:30:29.816 INFO:tasks.workunit.client.1.vm09.stdout:3/574: dwrite d5/d16/d31/f57 [4194304,4194304] 0 2026-03-09T17:30:29.817 INFO:tasks.workunit.client.1.vm09.stdout:2/601: write d13/d15/f7e [64054,64853] 0 2026-03-09T17:30:29.819 INFO:tasks.workunit.client.1.vm09.stdout:2/602: stat d13/d15/d34/f44 0 2026-03-09T17:30:29.836 INFO:tasks.workunit.client.1.vm09.stdout:6/619: dread d3/d21/d25/d26/d6b/dbf/f66 [0,4194304] 0 2026-03-09T17:30:29.841 INFO:tasks.workunit.client.1.vm09.stdout:5/667: mknod d0/d2/d76/d87/da4/dbe/cdb 0 2026-03-09T17:30:29.851 INFO:tasks.workunit.client.1.vm09.stdout:5/668: read d0/d9/f77 [3307992,20676] 0 2026-03-09T17:30:29.855 INFO:tasks.workunit.client.1.vm09.stdout:9/611: dread d5/d21/f2b [0,4194304] 0 2026-03-09T17:30:29.867 INFO:tasks.workunit.client.1.vm09.stdout:3/575: truncate d5/d16/d25/f2b 632426 0 2026-03-09T17:30:29.867 INFO:tasks.workunit.client.1.vm09.stdout:2/603: truncate d13/d15/d60/d90/f92 948954 0 2026-03-09T17:30:29.867 INFO:tasks.workunit.client.1.vm09.stdout:2/604: dread - d13/db3/fb7 zero size 2026-03-09T17:30:29.869 INFO:tasks.workunit.client.1.vm09.stdout:9/612: mknod d5/de/d4e/d6e/d93/cd2 0 2026-03-09T17:30:29.873 INFO:tasks.workunit.client.1.vm09.stdout:3/576: mkdir d5/d16/d31/d37/d58/d64/dad 0 2026-03-09T17:30:29.874 INFO:tasks.workunit.client.1.vm09.stdout:3/577: write d5/d16/d31/d37/d58/d64/f8e [286808,51363] 0 2026-03-09T17:30:29.874 INFO:tasks.workunit.client.1.vm09.stdout:9/613: mknod d5/de/d4e/dca/d84/d97/cd3 0 2026-03-09T17:30:29.875 INFO:tasks.workunit.client.1.vm09.stdout:9/614: chown d5/f13 431 1 2026-03-09T17:30:29.876 INFO:tasks.workunit.client.1.vm09.stdout:5/669: truncate d0/dc/d21/d26/f3d 379696 0 2026-03-09T17:30:29.876 INFO:tasks.workunit.client.1.vm09.stdout:5/670: chown d0/dc/d21/d26/l38 16104 1 2026-03-09T17:30:29.885 
INFO:tasks.workunit.client.1.vm09.stdout:0/631: write d6/f63 [823450,84683] 0 2026-03-09T17:30:29.886 INFO:tasks.workunit.client.1.vm09.stdout:7/728: write da/d11/f25 [3871361,71632] 0 2026-03-09T17:30:29.890 INFO:tasks.workunit.client.1.vm09.stdout:4/611: write d11/d1e/d29/d36/f3d [1009012,55105] 0 2026-03-09T17:30:29.893 INFO:tasks.workunit.client.1.vm09.stdout:9/615: mkdir d5/de/d29/dd4 0 2026-03-09T17:30:29.893 INFO:tasks.workunit.client.1.vm09.stdout:0/632: mknod d6/d1d/d39/ccf 0 2026-03-09T17:30:29.894 INFO:tasks.workunit.client.1.vm09.stdout:7/729: mknod da/d11/d2d/d56/cf5 0 2026-03-09T17:30:29.897 INFO:tasks.workunit.client.1.vm09.stdout:0/633: creat d6/d1d/d46/fd0 x:0 0 0 2026-03-09T17:30:29.911 INFO:tasks.workunit.client.1.vm09.stdout:7/730: truncate da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 1748673 0 2026-03-09T17:30:29.912 INFO:tasks.workunit.client.1.vm09.stdout:7/731: dread da/d11/d47/d89/fb4 [0,4194304] 0 2026-03-09T17:30:29.955 INFO:tasks.workunit.client.1.vm09.stdout:8/643: write d1/d14/d2a/d42/d43/f95 [655893,107638] 0 2026-03-09T17:30:29.959 INFO:tasks.workunit.client.1.vm09.stdout:8/644: dwrite d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:29.962 INFO:tasks.workunit.client.1.vm09.stdout:8/645: rename d1/d14/d31 to d1/d14/d96/dc3/dcb 0 2026-03-09T17:30:29.976 INFO:tasks.workunit.client.1.vm09.stdout:5/671: sync 2026-03-09T17:30:29.977 INFO:tasks.workunit.client.1.vm09.stdout:0/634: sync 2026-03-09T17:30:29.977 INFO:tasks.workunit.client.1.vm09.stdout:7/732: sync 2026-03-09T17:30:29.978 INFO:tasks.workunit.client.1.vm09.stdout:7/733: stat da/d11/d2d/f71 0 2026-03-09T17:30:29.979 INFO:tasks.workunit.client.1.vm09.stdout:7/734: creat da/d11/d64/da7/ff6 x:0 0 0 2026-03-09T17:30:29.981 INFO:tasks.workunit.client.1.vm09.stdout:5/672: rename d0/d2/d76/d87/da4/dbf/ld1 to d0/d52/d20/ldc 0 2026-03-09T17:30:29.985 INFO:tasks.workunit.client.1.vm09.stdout:1/608: dwrite d9/dc/dd/d40/d22/d37/d3f/f68 [0,4194304] 0 2026-03-09T17:30:29.988 
INFO:tasks.workunit.client.1.vm09.stdout:0/635: rename d6/d1d/d24/d32/d59/dba to d6/d1d/d24/d32/d59/d9c/dac/dd1 0 2026-03-09T17:30:29.996 INFO:tasks.workunit.client.1.vm09.stdout:2/605: write d13/d15/d34/f44 [1477831,33641] 0 2026-03-09T17:30:29.996 INFO:tasks.workunit.client.1.vm09.stdout:4/612: write d11/f13 [5153517,68036] 0 2026-03-09T17:30:29.996 INFO:tasks.workunit.client.1.vm09.stdout:6/620: write d3/d21/d76/d5c/d61/f60 [2285194,15546] 0 2026-03-09T17:30:29.999 INFO:tasks.workunit.client.1.vm09.stdout:3/578: dwrite d5/d16/d46/f6b [0,4194304] 0 2026-03-09T17:30:30.001 INFO:tasks.workunit.client.1.vm09.stdout:7/735: rename da/d11/d47/d5b/d6c/d9e/f57 to da/d11/d47/d5b/d6c/d9e/d4e/ff7 0 2026-03-09T17:30:30.016 INFO:tasks.workunit.client.1.vm09.stdout:6/621: rename d3/d21/d76/d5c/d9f/fae to d3/d7/d99/fcf 0 2026-03-09T17:30:30.019 INFO:tasks.workunit.client.1.vm09.stdout:9/616: write d5/de/f65 [2526925,99346] 0 2026-03-09T17:30:30.027 INFO:tasks.workunit.client.1.vm09.stdout:0/636: mkdir d6/d64/dbd/dd2 0 2026-03-09T17:30:30.027 INFO:tasks.workunit.client.1.vm09.stdout:0/637: dwrite d6/d1d/d24/d32/d59/f5c [0,4194304] 0 2026-03-09T17:30:30.028 INFO:tasks.workunit.client.1.vm09.stdout:1/609: sync 2026-03-09T17:30:30.037 INFO:tasks.workunit.client.1.vm09.stdout:5/673: dread d0/d9/d74/d75/d9f/f92 [0,4194304] 0 2026-03-09T17:30:30.037 INFO:tasks.workunit.client.1.vm09.stdout:4/613: mknod d11/d1e/d45/d60/d71/db7/d89/d8b/cc3 0 2026-03-09T17:30:30.041 INFO:tasks.workunit.client.1.vm09.stdout:5/674: dwrite d0/d2/d76/d87/fa5 [0,4194304] 0 2026-03-09T17:30:30.049 INFO:tasks.workunit.client.1.vm09.stdout:6/622: dwrite d3/d7/d59/d73/f93 [0,4194304] 0 2026-03-09T17:30:30.064 INFO:tasks.workunit.client.1.vm09.stdout:9/617: dread d5/d7e/d81/f96 [0,4194304] 0 2026-03-09T17:30:30.066 INFO:tasks.workunit.client.1.vm09.stdout:9/618: read d5/de/d29/d33/f3b [1453597,67137] 0 2026-03-09T17:30:30.072 INFO:tasks.workunit.client.1.vm09.stdout:6/623: mknod d3/d21/d25/d26/d86/dbe/cd0 0 
2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:6/624: readlink d3/l55 0 2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:4/614: dread d11/f12 [0,4194304] 0 2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:0/638: symlink d6/d64/d97/dc9/ld3 0 2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:4/615: stat d11/d1e/d29/d36/c56 0 2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:2/606: getdents d13/db3 0 2026-03-09T17:30:30.075 INFO:tasks.workunit.client.1.vm09.stdout:4/616: stat d11/d1e/d31/f9b 0 2026-03-09T17:30:30.077 INFO:tasks.workunit.client.1.vm09.stdout:6/625: creat d3/d21/db1/fd1 x:0 0 0 2026-03-09T17:30:30.078 INFO:tasks.workunit.client.1.vm09.stdout:6/626: dread - d3/d48/fc7 zero size 2026-03-09T17:30:30.081 INFO:tasks.workunit.client.1.vm09.stdout:5/675: dread d0/d46/f56 [0,4194304] 0 2026-03-09T17:30:30.082 INFO:tasks.workunit.client.1.vm09.stdout:2/607: creat d13/d15/d21/d88/fbe x:0 0 0 2026-03-09T17:30:30.085 INFO:tasks.workunit.client.1.vm09.stdout:4/617: symlink d11/d1e/d29/d36/d57/lc4 0 2026-03-09T17:30:30.085 INFO:tasks.workunit.client.1.vm09.stdout:6/627: dwrite d3/d21/d76/d3f/fb8 [0,4194304] 0 2026-03-09T17:30:30.097 INFO:tasks.workunit.client.1.vm09.stdout:6/628: fdatasync d3/d21/d25/f54 0 2026-03-09T17:30:30.100 INFO:tasks.workunit.client.1.vm09.stdout:0/639: dread d6/d1d/d24/d5e/f67 [0,4194304] 0 2026-03-09T17:30:30.102 INFO:tasks.workunit.client.1.vm09.stdout:5/676: mknod d0/dc/d21/d26/d5e/dd4/cdd 0 2026-03-09T17:30:30.105 INFO:tasks.workunit.client.1.vm09.stdout:7/736: dread - da/d11/d47/d5b/d6c/d9e/d4e/ff7 zero size 2026-03-09T17:30:30.108 INFO:tasks.workunit.client.1.vm09.stdout:8/646: dwrite d1/da/dd/d47/f82 [0,4194304] 0 2026-03-09T17:30:30.113 INFO:tasks.workunit.client.1.vm09.stdout:3/579: dwrite d5/d9/d30/d65/f5e [0,4194304] 0 2026-03-09T17:30:30.118 INFO:tasks.workunit.client.1.vm09.stdout:5/677: dread d0/d2/d76/d87/d95/d9b/fab [0,4194304] 0 
2026-03-09T17:30:30.118 INFO:tasks.workunit.client.1.vm09.stdout:9/619: link d5/d2e/c43 d5/de/cd5 0 2026-03-09T17:30:30.119 INFO:tasks.workunit.client.1.vm09.stdout:7/737: write da/d11/d2d/d56/f9f [4537718,111849] 0 2026-03-09T17:30:30.119 INFO:tasks.workunit.client.1.vm09.stdout:9/620: dread - d5/de/d4e/dca/d84/fc4 zero size 2026-03-09T17:30:30.125 INFO:tasks.workunit.client.1.vm09.stdout:8/647: dwrite d1/d14/d2a/f8b [0,4194304] 0 2026-03-09T17:30:30.127 INFO:tasks.workunit.client.1.vm09.stdout:8/648: stat d1/da/dd/d63/f36 0 2026-03-09T17:30:30.128 INFO:tasks.workunit.client.1.vm09.stdout:0/640: rename d6/d1d/d24/l6e to d6/d1d/d24/d32/d59/ld4 0 2026-03-09T17:30:30.136 INFO:tasks.workunit.client.1.vm09.stdout:2/608: getdents d13/d15/d3b 0 2026-03-09T17:30:30.137 INFO:tasks.workunit.client.1.vm09.stdout:2/609: stat d13/d15/d21/f30 0 2026-03-09T17:30:30.138 INFO:tasks.workunit.client.1.vm09.stdout:9/621: fsync d5/de/d29/f36 0 2026-03-09T17:30:30.139 INFO:tasks.workunit.client.1.vm09.stdout:9/622: chown d5/de/d29/fc0 14069 1 2026-03-09T17:30:30.142 INFO:tasks.workunit.client.1.vm09.stdout:5/678: mkdir d0/d2/d76/d87/d95/d9b/dc0/dde 0 2026-03-09T17:30:30.144 INFO:tasks.workunit.client.1.vm09.stdout:0/641: symlink d6/d93/ld5 0 2026-03-09T17:30:30.146 INFO:tasks.workunit.client.1.vm09.stdout:3/580: mkdir d5/d16/d31/d37/dae 0 2026-03-09T17:30:30.150 INFO:tasks.workunit.client.1.vm09.stdout:7/738: mkdir da/d11/d47/d5b/d6c/df8 0 2026-03-09T17:30:30.154 INFO:tasks.workunit.client.1.vm09.stdout:0/642: dwrite d6/d64/fa7 [0,4194304] 0 2026-03-09T17:30:30.157 INFO:tasks.workunit.client.1.vm09.stdout:7/739: dwrite da/d11/d77/de5/dec/f95 [0,4194304] 0 2026-03-09T17:30:30.174 INFO:tasks.workunit.client.1.vm09.stdout:3/581: rename d5/d16/d31/d3d/d32/f89 to d5/d16/d31/d37/d58/d8a/da8/faf 0 2026-03-09T17:30:30.178 INFO:tasks.workunit.client.1.vm09.stdout:0/643: mkdir d6/d64/d97/dd6 0 2026-03-09T17:30:30.187 INFO:tasks.workunit.client.1.vm09.stdout:7/740: creat da/d11/d2d/d56/d68/ff9 
x:0 0 0 2026-03-09T17:30:30.187 INFO:tasks.workunit.client.1.vm09.stdout:7/741: readlink da/d11/l9a 0 2026-03-09T17:30:30.204 INFO:tasks.workunit.client.1.vm09.stdout:1/610: write d9/dc/dd/d40/d22/f2b [3054896,5628] 0 2026-03-09T17:30:30.205 INFO:tasks.workunit.client.1.vm09.stdout:1/611: write d9/dc/dd/d40/d21/d6f/f85 [2765103,6270] 0 2026-03-09T17:30:30.205 INFO:tasks.workunit.client.1.vm09.stdout:1/612: fsync f3 0 2026-03-09T17:30:30.223 INFO:tasks.workunit.client.1.vm09.stdout:8/649: link d1/c11 d1/d14/ccc 0 2026-03-09T17:30:30.230 INFO:tasks.workunit.client.1.vm09.stdout:4/618: dwrite d11/d1e/d29/d36/f6a [0,4194304] 0 2026-03-09T17:30:30.233 INFO:tasks.workunit.client.1.vm09.stdout:9/623: write d5/de/d4e/d6e/d93/f74 [3517815,12830] 0 2026-03-09T17:30:30.234 INFO:tasks.workunit.client.1.vm09.stdout:2/610: write d13/d4d/f6d [1092315,56936] 0 2026-03-09T17:30:30.236 INFO:tasks.workunit.client.1.vm09.stdout:2/611: fdatasync d13/d15/d60/f97 0 2026-03-09T17:30:30.236 INFO:tasks.workunit.client.1.vm09.stdout:9/624: chown d5/f1b 52463 1 2026-03-09T17:30:30.236 INFO:tasks.workunit.client.1.vm09.stdout:3/582: rename d5/d16/d31/d3d/d32 to d5/d9/d90/db0 0 2026-03-09T17:30:30.239 INFO:tasks.workunit.client.1.vm09.stdout:6/629: dwrite d3/d7/d59/d5a/f83 [0,4194304] 0 2026-03-09T17:30:30.241 INFO:tasks.workunit.client.1.vm09.stdout:2/612: chown d13/d15/d34/d45/f6a 2027 1 2026-03-09T17:30:30.242 INFO:tasks.workunit.client.1.vm09.stdout:6/630: rename d3/d21/d76/d5c/d7e to d3/d21/d76/d5c/d7e/dcd/dd2 22 2026-03-09T17:30:30.254 INFO:tasks.workunit.client.1.vm09.stdout:1/613: creat d9/dc/dd/d40/d22/d91/d99/fbc x:0 0 0 2026-03-09T17:30:30.257 INFO:tasks.workunit.client.1.vm09.stdout:5/679: getdents d0/d9/d74/d75/d9f 0 2026-03-09T17:30:30.260 INFO:tasks.workunit.client.1.vm09.stdout:4/619: dread d11/f16 [4194304,4194304] 0 2026-03-09T17:30:30.266 INFO:tasks.workunit.client.1.vm09.stdout:1/614: dread d9/dc/dd/f4f [0,4194304] 0 2026-03-09T17:30:30.267 
INFO:tasks.workunit.client.1.vm09.stdout:1/615: read - d9/dc/dd/d40/d22/d91/d99/fbc zero size 2026-03-09T17:30:30.270 INFO:tasks.workunit.client.1.vm09.stdout:9/625: read d5/f11 [2820178,93267] 0 2026-03-09T17:30:30.274 INFO:tasks.workunit.client.1.vm09.stdout:2/613: rename d13/d15/d34/d37/fbb to d13/db3/fbf 0 2026-03-09T17:30:30.274 INFO:tasks.workunit.client.1.vm09.stdout:6/631: chown d3/d21/d76/d5c/d7e/l8e 80 1 2026-03-09T17:30:30.275 INFO:tasks.workunit.client.1.vm09.stdout:2/614: write d13/d15/d3b/d43/fab [785443,64144] 0 2026-03-09T17:30:30.285 INFO:tasks.workunit.client.1.vm09.stdout:5/680: rename d0/d2/d76/d86/lda to d0/dc/d21/d33/ldf 0 2026-03-09T17:30:30.291 INFO:tasks.workunit.client.1.vm09.stdout:8/650: write d1/da/d23/d6c/d32/f56 [98515,31843] 0 2026-03-09T17:30:30.291 INFO:tasks.workunit.client.1.vm09.stdout:9/626: dread d5/d2e/f53 [0,4194304] 0 2026-03-09T17:30:30.292 INFO:tasks.workunit.client.1.vm09.stdout:8/651: fsync d1/d14/d2a/f2b 0 2026-03-09T17:30:30.293 INFO:tasks.workunit.client.1.vm09.stdout:9/627: write d5/d2e/d8b/fcc [162410,103694] 0 2026-03-09T17:30:30.293 INFO:tasks.workunit.client.1.vm09.stdout:0/644: symlink d6/d1d/d24/d5e/dc8/ld7 0 2026-03-09T17:30:30.295 INFO:tasks.workunit.client.1.vm09.stdout:8/652: read d1/da/dd/f45 [911337,831] 0 2026-03-09T17:30:30.301 INFO:tasks.workunit.client.1.vm09.stdout:8/653: dwrite d1/d14/d2a/d42/d43/f95 [0,4194304] 0 2026-03-09T17:30:30.311 INFO:tasks.workunit.client.1.vm09.stdout:4/620: creat d11/d1e/d31/db6/fc5 x:0 0 0 2026-03-09T17:30:30.311 INFO:tasks.workunit.client.1.vm09.stdout:9/628: dwrite d5/de/d29/da7/fb3 [0,4194304] 0 2026-03-09T17:30:30.311 INFO:tasks.workunit.client.1.vm09.stdout:9/629: write d5/f8e [6875434,107463] 0 2026-03-09T17:30:30.311 INFO:tasks.workunit.client.1.vm09.stdout:7/742: getdents da/d11/d47/d5b/d6c 0 2026-03-09T17:30:30.313 INFO:tasks.workunit.client.1.vm09.stdout:2/615: creat d13/d15/d36/d72/d94/fc0 x:0 0 0 2026-03-09T17:30:30.333 
INFO:tasks.workunit.client.1.vm09.stdout:3/583: dwrite d5/d9/d30/d65/f43 [0,4194304] 0 2026-03-09T17:30:30.339 INFO:tasks.workunit.client.1.vm09.stdout:5/681: rmdir d0/dc/d21/d26/d5e/d68/d79 39 2026-03-09T17:30:30.361 INFO:tasks.workunit.client.1.vm09.stdout:0/645: dread d6/f9 [0,4194304] 0 2026-03-09T17:30:30.369 INFO:tasks.workunit.client.1.vm09.stdout:9/630: rmdir d5/de/d4e/dca/d84 39 2026-03-09T17:30:30.370 INFO:tasks.workunit.client.1.vm09.stdout:9/631: write d5/d2e/d8b/fb6 [965536,23649] 0 2026-03-09T17:30:30.371 INFO:tasks.workunit.client.1.vm09.stdout:8/654: dread d1/d14/d2a/d42/d43/f9e [0,4194304] 0 2026-03-09T17:30:30.378 INFO:tasks.workunit.client.1.vm09.stdout:2/616: mknod d13/d15/d34/d37/cc1 0 2026-03-09T17:30:30.387 INFO:tasks.workunit.client.1.vm09.stdout:6/632: creat d3/d7/fd3 x:0 0 0 2026-03-09T17:30:30.387 INFO:tasks.workunit.client.1.vm09.stdout:6/633: fdatasync d3/d21/d76/d5c/d61/f60 0 2026-03-09T17:30:30.391 INFO:tasks.workunit.client.1.vm09.stdout:6/634: dwrite d3/d21/d76/d5c/d61/f53 [0,4194304] 0 2026-03-09T17:30:30.396 INFO:tasks.workunit.client.1.vm09.stdout:4/621: dwrite f10 [0,4194304] 0 2026-03-09T17:30:30.409 INFO:tasks.workunit.client.1.vm09.stdout:1/616: getdents d9/dc/dd/d40/d22/d37/d3f 0 2026-03-09T17:30:30.409 INFO:tasks.workunit.client.1.vm09.stdout:9/632: truncate d5/d2e/f72 372427 0 2026-03-09T17:30:30.425 INFO:tasks.workunit.client.1.vm09.stdout:8/655: unlink d1/da/d23/d6c/f6a 0 2026-03-09T17:30:30.427 INFO:tasks.workunit.client.1.vm09.stdout:7/743: mkdir da/d11/d47/dfa 0 2026-03-09T17:30:30.446 INFO:tasks.workunit.client.1.vm09.stdout:2/617: dwrite d13/d15/d34/f5b [4194304,4194304] 0 2026-03-09T17:30:30.448 INFO:tasks.workunit.client.1.vm09.stdout:2/618: chown d13/d15/d34/d45/d84/db5 1005880 1 2026-03-09T17:30:30.448 INFO:tasks.workunit.client.1.vm09.stdout:2/619: write d13/d15/d36/d72/d94/da7/f7a [3710897,109047] 0 2026-03-09T17:30:30.449 INFO:tasks.workunit.client.1.vm09.stdout:2/620: fsync d13/d15/f9a 0 
2026-03-09T17:30:30.450 INFO:tasks.workunit.client.1.vm09.stdout:6/635: truncate d3/d21/f80 378003 0 2026-03-09T17:30:30.451 INFO:tasks.workunit.client.1.vm09.stdout:6/636: chown d3/d21/d76/d5c/d7e/dc5/fa4 1168322473 1 2026-03-09T17:30:30.452 INFO:tasks.workunit.client.1.vm09.stdout:0/646: mknod d6/d64/d97/dd6/cd8 0 2026-03-09T17:30:30.458 INFO:tasks.workunit.client.1.vm09.stdout:4/622: symlink d11/d1e/d45/lc6 0 2026-03-09T17:30:30.462 INFO:tasks.workunit.client.1.vm09.stdout:2/621: dread d13/d15/f2f [0,4194304] 0 2026-03-09T17:30:30.462 INFO:tasks.workunit.client.1.vm09.stdout:2/622: chown d13/d15/f9a 0 1 2026-03-09T17:30:30.472 INFO:tasks.workunit.client.1.vm09.stdout:1/617: stat d9/f97 0 2026-03-09T17:30:30.472 INFO:tasks.workunit.client.1.vm09.stdout:1/618: dread - d9/d5a/fa3 zero size 2026-03-09T17:30:30.473 INFO:tasks.workunit.client.1.vm09.stdout:1/619: chown d9/dc/f76 0 1 2026-03-09T17:30:30.473 INFO:tasks.workunit.client.1.vm09.stdout:1/620: write f3 [389101,47189] 0 2026-03-09T17:30:30.491 INFO:tasks.workunit.client.1.vm09.stdout:8/656: creat d1/d14/fcd x:0 0 0 2026-03-09T17:30:30.493 INFO:tasks.workunit.client.1.vm09.stdout:7/744: dread - da/d11/d3e/da2/fb7 zero size 2026-03-09T17:30:30.535 INFO:tasks.workunit.client.1.vm09.stdout:6/637: mknod d3/d21/d76/d5c/d9f/cd4 0 2026-03-09T17:30:30.560 INFO:tasks.workunit.client.1.vm09.stdout:0/647: write d6/d64/f7e [1638023,97488] 0 2026-03-09T17:30:30.566 INFO:tasks.workunit.client.1.vm09.stdout:4/623: creat d11/d1e/d29/d36/fc7 x:0 0 0 2026-03-09T17:30:30.572 INFO:tasks.workunit.client.1.vm09.stdout:2/623: creat d13/d15/d60/d85/fc2 x:0 0 0 2026-03-09T17:30:30.572 INFO:tasks.workunit.client.1.vm09.stdout:1/621: creat d9/d38/fbd x:0 0 0 2026-03-09T17:30:30.573 INFO:tasks.workunit.client.1.vm09.stdout:1/622: dread - d9/dc/d63/dba/fb4 zero size 2026-03-09T17:30:30.573 INFO:tasks.workunit.client.1.vm09.stdout:1/623: write d9/dc/dd/d40/d1d/f4d [5205014,83207] 0 2026-03-09T17:30:30.595 
INFO:tasks.workunit.client.1.vm09.stdout:8/657: unlink d1/da/dd/d63/c85 0 2026-03-09T17:30:30.595 INFO:tasks.workunit.client.1.vm09.stdout:8/658: readlink d1/da/d23/dc2/l9b 0 2026-03-09T17:30:30.600 INFO:tasks.workunit.client.1.vm09.stdout:5/682: getdents d0/d9 0 2026-03-09T17:30:30.603 INFO:tasks.workunit.client.1.vm09.stdout:3/584: link d5/d9/d30/d65/d59/l6f d5/d16/d31/d37/d58/d8a/da8/lb1 0 2026-03-09T17:30:30.606 INFO:tasks.workunit.client.1.vm09.stdout:3/585: dwrite d5/d16/d31/d37/f76 [0,4194304] 0 2026-03-09T17:30:30.614 INFO:tasks.workunit.client.1.vm09.stdout:3/586: dwrite d5/d16/d46/f6b [0,4194304] 0 2026-03-09T17:30:30.621 INFO:tasks.workunit.client.1.vm09.stdout:3/587: dwrite d5/d9/d90/db0/fa3 [0,4194304] 0 2026-03-09T17:30:30.623 INFO:tasks.workunit.client.1.vm09.stdout:3/588: write d5/d9/d90/db0/fa0 [980591,36112] 0 2026-03-09T17:30:30.659 INFO:tasks.workunit.client.1.vm09.stdout:0/648: mkdir d6/d64/dd9 0 2026-03-09T17:30:30.660 INFO:tasks.workunit.client.1.vm09.stdout:2/624: rmdir d13/d15/d60/d90 39 2026-03-09T17:30:30.661 INFO:tasks.workunit.client.1.vm09.stdout:0/649: dread d6/f9 [0,4194304] 0 2026-03-09T17:30:30.667 INFO:tasks.workunit.client.1.vm09.stdout:9/633: creat d5/de/fd6 x:0 0 0 2026-03-09T17:30:30.680 INFO:tasks.workunit.client.1.vm09.stdout:5/683: truncate d0/d46/f56 1427081 0 2026-03-09T17:30:30.694 INFO:tasks.workunit.client.1.vm09.stdout:9/634: sync 2026-03-09T17:30:30.696 INFO:tasks.workunit.client.1.vm09.stdout:3/589: chown d5/c8d 0 1 2026-03-09T17:30:30.704 INFO:tasks.workunit.client.1.vm09.stdout:6/638: dwrite d3/d7/f18 [0,4194304] 0 2026-03-09T17:30:30.706 INFO:tasks.workunit.client.1.vm09.stdout:2/625: mkdir d13/d15/d36/d72/dc3 0 2026-03-09T17:30:30.708 INFO:tasks.workunit.client.1.vm09.stdout:4/624: mkdir d11/dc8 0 2026-03-09T17:30:30.708 INFO:tasks.workunit.client.1.vm09.stdout:4/625: chown d11/f19 76024451 1 2026-03-09T17:30:30.729 INFO:tasks.workunit.client.1.vm09.stdout:1/624: symlink d9/dc/dd/d40/d22/d8b/lbe 0 
2026-03-09T17:30:30.758 INFO:tasks.workunit.client.1.vm09.stdout:8/659: dwrite d1/d14/d2a/d42/d5d/d8a/f94 [0,4194304] 0 2026-03-09T17:30:30.765 INFO:tasks.workunit.client.1.vm09.stdout:9/635: rmdir d5/de/d29/d33 39 2026-03-09T17:30:30.784 INFO:tasks.workunit.client.1.vm09.stdout:2/626: mknod d13/da4/cc4 0 2026-03-09T17:30:30.786 INFO:tasks.workunit.client.1.vm09.stdout:1/625: stat d9/dc/dd/c5f 0 2026-03-09T17:30:30.786 INFO:tasks.workunit.client.1.vm09.stdout:1/626: truncate d9/dc/fa9 975763 0 2026-03-09T17:30:30.787 INFO:tasks.workunit.client.1.vm09.stdout:7/745: link da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 da/d11/d77/ffb 0 2026-03-09T17:30:30.803 INFO:tasks.workunit.client.1.vm09.stdout:4/626: write d11/d1e/d29/d36/f7f [23876,118491] 0 2026-03-09T17:30:30.804 INFO:tasks.workunit.client.1.vm09.stdout:3/590: rename d5/d16/c80 to d5/d9/d30/cb2 0 2026-03-09T17:30:30.810 INFO:tasks.workunit.client.1.vm09.stdout:8/660: dwrite d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:30.812 INFO:tasks.workunit.client.1.vm09.stdout:8/661: rename d1/da/d23/d71 to d1/da/d23/d71/db6/dce 22 2026-03-09T17:30:30.813 INFO:tasks.workunit.client.1.vm09.stdout:6/639: fdatasync d3/d21/f80 0 2026-03-09T17:30:30.824 INFO:tasks.workunit.client.1.vm09.stdout:0/650: creat d6/d1d/d24/fda x:0 0 0 2026-03-09T17:30:30.840 INFO:tasks.workunit.client.1.vm09.stdout:7/746: symlink da/d11/d2d/d56/lfc 0 2026-03-09T17:30:30.856 INFO:tasks.workunit.client.1.vm09.stdout:4/627: creat d11/d1e/d45/d60/d71/db7/fc9 x:0 0 0 2026-03-09T17:30:30.856 INFO:tasks.workunit.client.1.vm09.stdout:4/628: dread - d11/d1e/d45/d60/f6c zero size 2026-03-09T17:30:30.873 INFO:tasks.workunit.client.1.vm09.stdout:8/662: rename d1/da/dd/d79/f83 to d1/d14/d96/dc3/dcb/fcf 0 2026-03-09T17:30:30.879 INFO:tasks.workunit.client.1.vm09.stdout:9/636: dwrite d5/d21/f38 [0,4194304] 0 2026-03-09T17:30:30.886 INFO:tasks.workunit.client.1.vm09.stdout:6/640: unlink d3/d7/l43 0 2026-03-09T17:30:30.887 INFO:tasks.workunit.client.1.vm09.stdout:2/627: creat 
d13/d15/d36/d72/dc3/fc5 x:0 0 0 2026-03-09T17:30:30.895 INFO:tasks.workunit.client.1.vm09.stdout:1/627: unlink d9/c4c 0 2026-03-09T17:30:30.900 INFO:tasks.workunit.client.1.vm09.stdout:1/628: chown d9/f59 678 1 2026-03-09T17:30:30.900 INFO:tasks.workunit.client.1.vm09.stdout:5/684: getdents d0 0 2026-03-09T17:30:30.905 INFO:tasks.workunit.client.1.vm09.stdout:1/629: sync 2026-03-09T17:30:30.906 INFO:tasks.workunit.client.1.vm09.stdout:4/629: fdatasync d11/d1e/d45/d60/d71/db7/d89/f94 0 2026-03-09T17:30:30.908 INFO:tasks.workunit.client.1.vm09.stdout:3/591: mkdir d5/d16/d31/d3d/db3 0 2026-03-09T17:30:30.923 INFO:tasks.workunit.client.1.vm09.stdout:7/747: rename da/d11/d47/d5b/d6c/d9e/d4e/d5f/lef to da/d11/d77/de5/dec/lfd 0 2026-03-09T17:30:30.926 INFO:tasks.workunit.client.1.vm09.stdout:8/663: unlink d1/l8c 0 2026-03-09T17:30:30.963 INFO:tasks.workunit.client.1.vm09.stdout:9/637: rmdir d5/d91 39 2026-03-09T17:30:30.963 INFO:tasks.workunit.client.1.vm09.stdout:2/628: mknod d13/db3/cc6 0 2026-03-09T17:30:30.966 INFO:tasks.workunit.client.1.vm09.stdout:6/641: rename d3/d21/d76/f70 to d3/d7/d59/d9c/fd5 0 2026-03-09T17:30:30.968 INFO:tasks.workunit.client.1.vm09.stdout:0/651: dwrite d6/d1d/d24/d32/f45 [4194304,4194304] 0 2026-03-09T17:30:30.971 INFO:tasks.workunit.client.1.vm09.stdout:8/664: read d1/d14/d2a/f2b [3511763,94910] 0 2026-03-09T17:30:30.991 INFO:tasks.workunit.client.1.vm09.stdout:4/630: dread d11/f3f [0,4194304] 0 2026-03-09T17:30:30.999 INFO:tasks.workunit.client.1.vm09.stdout:9/638: fsync d5/de/d29/f36 0 2026-03-09T17:30:31.043 INFO:tasks.workunit.client.1.vm09.stdout:5/685: rename d0/dc/d21/f29 to d0/d46/d4b/fe0 0 2026-03-09T17:30:31.044 INFO:tasks.workunit.client.1.vm09.stdout:5/686: chown d0/d9/d16/d5c/la1 56676315 1 2026-03-09T17:30:31.046 INFO:tasks.workunit.client.1.vm09.stdout:6/642: mknod d3/d7/d59/d5a/cd6 0 2026-03-09T17:30:31.048 INFO:tasks.workunit.client.1.vm09.stdout:7/748: mknod da/d11/d47/d5b/d6c/d9e/dc6/ddb/cfe 0 2026-03-09T17:30:31.049 
INFO:tasks.workunit.client.1.vm09.stdout:7/749: chown da/d11/d47/d5b/d6c/d9e/dc6/fe2 248779 1 2026-03-09T17:30:31.080 INFO:tasks.workunit.client.1.vm09.stdout:8/665: creat d1/d14/d96/dc3/fd0 x:0 0 0 2026-03-09T17:30:31.087 INFO:tasks.workunit.client.1.vm09.stdout:4/631: chown c7 1820871 1 2026-03-09T17:30:31.088 INFO:tasks.workunit.client.1.vm09.stdout:4/632: stat d11/d1e/d31/fbf 0 2026-03-09T17:30:31.090 INFO:tasks.workunit.client.1.vm09.stdout:9/639: stat d5/f4f 0 2026-03-09T17:30:31.113 INFO:tasks.workunit.client.1.vm09.stdout:3/592: truncate d5/d9/d30/f41 2080927 0 2026-03-09T17:30:31.123 INFO:tasks.workunit.client.1.vm09.stdout:1/630: rename d9/dc/fb2 to d9/d5a/fbf 0 2026-03-09T17:30:31.129 INFO:tasks.workunit.client.1.vm09.stdout:1/631: dread d9/dc/dd/d40/d1d/f1e [0,4194304] 0 2026-03-09T17:30:31.139 INFO:tasks.workunit.client.1.vm09.stdout:6/643: read d3/d7/d59/d9c/fd5 [2176583,11294] 0 2026-03-09T17:30:31.151 INFO:tasks.workunit.client.1.vm09.stdout:7/750: truncate da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/f98 691474 0 2026-03-09T17:30:31.154 INFO:tasks.workunit.client.1.vm09.stdout:8/666: creat d1/d14/d2a/fd1 x:0 0 0 2026-03-09T17:30:31.187 INFO:tasks.workunit.client.1.vm09.stdout:9/640: fdatasync d5/de/d4e/dca/f75 0 2026-03-09T17:30:31.189 INFO:tasks.workunit.client.1.vm09.stdout:9/641: dwrite d5/d2e/d8b/fb6 [0,4194304] 0 2026-03-09T17:30:31.191 INFO:tasks.workunit.client.1.vm09.stdout:2/629: link d13/d15/d36/c3c d13/db3/cc7 0 2026-03-09T17:30:31.199 INFO:tasks.workunit.client.1.vm09.stdout:3/593: rename d5/d16/d85 to d5/d16/d31/d37/dae/db4 0 2026-03-09T17:30:31.210 INFO:tasks.workunit.client.1.vm09.stdout:6/644: mknod d3/d21/d76/d3f/cd7 0 2026-03-09T17:30:31.225 INFO:tasks.workunit.client.1.vm09.stdout:0/652: creat d6/d1d/fdb x:0 0 0 2026-03-09T17:30:31.225 INFO:tasks.workunit.client.1.vm09.stdout:0/653: readlink d6/d93/ld5 0 2026-03-09T17:30:31.226 INFO:tasks.workunit.client.1.vm09.stdout:0/654: write d6/d1d/d24/d32/d59/f99 [404343,18905] 0 
2026-03-09T17:30:31.226 INFO:tasks.workunit.client.1.vm09.stdout:0/655: chown d6/d1d/l47 2103803 1 2026-03-09T17:30:31.227 INFO:tasks.workunit.client.1.vm09.stdout:7/751: creat da/d11/d3e/da2/fff x:0 0 0 2026-03-09T17:30:31.233 INFO:tasks.workunit.client.1.vm09.stdout:8/667: truncate d1/da/dd/f22 1485451 0 2026-03-09T17:30:31.239 INFO:tasks.workunit.client.1.vm09.stdout:9/642: mknod d5/de/d29/d90/dc7/cd7 0 2026-03-09T17:30:31.257 INFO:tasks.workunit.client.1.vm09.stdout:6/645: creat d3/d21/d25/d26/d86/dbc/fd8 x:0 0 0 2026-03-09T17:30:31.270 INFO:tasks.workunit.client.1.vm09.stdout:3/594: symlink d5/d16/d31/d37/d58/lb5 0 2026-03-09T17:30:31.279 INFO:tasks.workunit.client.1.vm09.stdout:7/752: dread da/d11/d47/d5b/d6c/d9e/d4e/f7c [0,4194304] 0 2026-03-09T17:30:31.280 INFO:tasks.workunit.client.1.vm09.stdout:8/668: mknod d1/d14/d2a/d42/d43/cd2 0 2026-03-09T17:30:31.289 INFO:tasks.workunit.client.1.vm09.stdout:0/656: dread d6/d1d/f41 [0,4194304] 0 2026-03-09T17:30:31.290 INFO:tasks.workunit.client.1.vm09.stdout:0/657: read d6/d1d/d24/d32/d59/fb0 [1813377,113693] 0 2026-03-09T17:30:31.298 INFO:tasks.workunit.client.1.vm09.stdout:8/669: symlink d1/d14/d2a/d42/d43/d44/ld3 0 2026-03-09T17:30:31.309 INFO:tasks.workunit.client.1.vm09.stdout:9/643: link d5/de/d29/d90/dc7/cd7 d5/d7e/d81/cd8 0 2026-03-09T17:30:31.311 INFO:tasks.workunit.client.1.vm09.stdout:3/595: creat d5/d9/da9/fb6 x:0 0 0 2026-03-09T17:30:31.315 INFO:tasks.workunit.client.1.vm09.stdout:7/753: unlink da/d11/d77/l83 0 2026-03-09T17:30:31.317 INFO:tasks.workunit.client.1.vm09.stdout:0/658: creat d6/d1d/d39/fdc x:0 0 0 2026-03-09T17:30:31.317 INFO:tasks.workunit.client.1.vm09.stdout:8/670: truncate d1/d14/d2a/d42/d5d/d8a/f99 11102 0 2026-03-09T17:30:31.318 INFO:tasks.workunit.client.1.vm09.stdout:9/644: rmdir d5/de/d4e/dca 39 2026-03-09T17:30:31.318 INFO:tasks.workunit.client.1.vm09.stdout:6/646: link d3/d21/d25/f5f d3/d21/db1/fd9 0 2026-03-09T17:30:31.320 INFO:tasks.workunit.client.1.vm09.stdout:7/754: creat 
da/d11/d47/d5b/d6c/d9e/d4e/d5f/f100 x:0 0 0 2026-03-09T17:30:31.322 INFO:tasks.workunit.client.1.vm09.stdout:0/659: truncate d6/d1d/d24/d5e/f9e 16716 0 2026-03-09T17:30:31.330 INFO:tasks.workunit.client.1.vm09.stdout:8/671: unlink d1/d14/d2a/d42/d43/d44/ld3 0 2026-03-09T17:30:31.330 INFO:tasks.workunit.client.1.vm09.stdout:8/672: write d1/d14/f2f [4419372,9785] 0 2026-03-09T17:30:31.331 INFO:tasks.workunit.client.1.vm09.stdout:8/673: fdatasync d1/d14/d2a/f54 0 2026-03-09T17:30:31.332 INFO:tasks.workunit.client.1.vm09.stdout:3/596: symlink d5/d16/d31/d3d/db3/lb7 0 2026-03-09T17:30:31.335 INFO:tasks.workunit.client.1.vm09.stdout:6/647: rename d3/d21/d76/d5c/d61/d95/c3d to d3/d21/d25/d26/d86/dbe/cda 0 2026-03-09T17:30:31.336 INFO:tasks.workunit.client.1.vm09.stdout:7/755: mkdir da/d11/d77/d101 0 2026-03-09T17:30:31.339 INFO:tasks.workunit.client.1.vm09.stdout:7/756: dwrite da/d11/d47/d5b/d6c/f7b [0,4194304] 0 2026-03-09T17:30:31.342 INFO:tasks.workunit.client.1.vm09.stdout:5/687: dwrite d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:31.356 INFO:tasks.workunit.client.1.vm09.stdout:3/597: mknod d5/d16/d31/d37/d58/d64/cb8 0 2026-03-09T17:30:31.359 INFO:tasks.workunit.client.1.vm09.stdout:4/633: write d11/d1e/d45/d60/d71/db7/f90 [1764785,97136] 0 2026-03-09T17:30:31.360 INFO:tasks.workunit.client.1.vm09.stdout:6/648: dread - d3/d21/f8c zero size 2026-03-09T17:30:31.361 INFO:tasks.workunit.client.1.vm09.stdout:6/649: dread - d3/d7/d99/fcf zero size 2026-03-09T17:30:31.362 INFO:tasks.workunit.client.1.vm09.stdout:3/598: dread d5/d16/d31/d37/d58/d64/f70 [0,4194304] 0 2026-03-09T17:30:31.363 INFO:tasks.workunit.client.1.vm09.stdout:7/757: stat da/d11/d2d/c6f 0 2026-03-09T17:30:31.367 INFO:tasks.workunit.client.1.vm09.stdout:8/674: mknod d1/d14/d2a/d42/d43/cd4 0 2026-03-09T17:30:31.368 INFO:tasks.workunit.client.1.vm09.stdout:5/688: unlink d0/fa3 0 2026-03-09T17:30:31.376 INFO:tasks.workunit.client.1.vm09.stdout:2/630: dwrite d13/d15/d21/d88/fad [0,4194304] 0 2026-03-09T17:30:31.380 
INFO:tasks.workunit.client.1.vm09.stdout:9/645: rmdir d5/d91 39 2026-03-09T17:30:31.384 INFO:tasks.workunit.client.1.vm09.stdout:1/632: write d9/dc/dd/d40/d22/d37/d3f/f80 [549998,22107] 0 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: Activating manager daemon vm06.pbgzei 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: mgrmap e27: vm06.pbgzei(active, starting, since 0.0105488s) 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/crt"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:31.402 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:31.403 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:31.403 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:31.403 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:31.403 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/key"}]: dispatch 2026-03-09T17:30:31.403 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:31 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:31.403 INFO:tasks.workunit.client.1.vm09.stdout:1/633: rename d9/dc/dd/d40/d22 to d9/d9e/dc0 0 2026-03-09T17:30:31.404 INFO:tasks.workunit.client.1.vm09.stdout:7/758: mknod da/d11/c102 0 2026-03-09T17:30:31.405 INFO:tasks.workunit.client.1.vm09.stdout:0/660: dwrite d6/d1d/d46/f4d [0,4194304] 0 2026-03-09T17:30:31.413 INFO:tasks.workunit.client.1.vm09.stdout:0/661: dwrite d6/d1d/d46/fd0 [0,4194304] 0 2026-03-09T17:30:31.422 INFO:tasks.workunit.client.1.vm09.stdout:8/675: dread d1/da/dd/fc6 [0,4194304] 0 2026-03-09T17:30:31.449 INFO:tasks.workunit.client.1.vm09.stdout:7/759: symlink da/d11/d3e/da2/l103 0 2026-03-09T17:30:31.450 INFO:tasks.workunit.client.1.vm09.stdout:4/634: getdents d11/d1e/d45/d60/d71/db7/d89/d8b 0 
2026-03-09T17:30:31.453 INFO:tasks.workunit.client.1.vm09.stdout:4/635: dwrite d11/f1c [4194304,4194304] 0 2026-03-09T17:30:31.453 INFO:tasks.workunit.client.1.vm09.stdout:4/636: chown d11/d1e/d29/l4f 38632 1 2026-03-09T17:30:31.454 INFO:tasks.workunit.client.1.vm09.stdout:0/662: mknod d6/d1d/d24/d32/d59/d9c/dac/cdd 0 2026-03-09T17:30:31.455 INFO:tasks.workunit.client.1.vm09.stdout:8/676: creat d1/d14/fd5 x:0 0 0 2026-03-09T17:30:31.455 INFO:tasks.workunit.client.1.vm09.stdout:6/650: getdents d3/d21/d76/d5c/d61 0 2026-03-09T17:30:31.459 INFO:tasks.workunit.client.1.vm09.stdout:4/637: rmdir d11/d1e/d45/d60/d71/db7/d89/d8b 39 2026-03-09T17:30:31.462 INFO:tasks.workunit.client.1.vm09.stdout:8/677: unlink d1/l4 0 2026-03-09T17:30:31.463 INFO:tasks.workunit.client.1.vm09.stdout:8/678: stat d1/da/d23/d6c/d32/dc8 0 2026-03-09T17:30:31.465 INFO:tasks.workunit.client.1.vm09.stdout:9/646: link d5/d2e/ld1 d5/de/d4e/dca/d84/d97/ld9 0 2026-03-09T17:30:31.465 INFO:tasks.workunit.client.1.vm09.stdout:4/638: creat d11/d1e/d29/db5/fca x:0 0 0 2026-03-09T17:30:31.465 INFO:tasks.workunit.client.1.vm09.stdout:9/647: write d5/d21/f38 [2776748,46001] 0 2026-03-09T17:30:31.467 INFO:tasks.workunit.client.1.vm09.stdout:4/639: write d11/d1e/d29/d36/f3d [1485344,27189] 0 2026-03-09T17:30:31.470 INFO:tasks.workunit.client.1.vm09.stdout:9/648: write d5/d2e/f5e [5239493,40305] 0 2026-03-09T17:30:31.470 INFO:tasks.workunit.client.1.vm09.stdout:0/663: dwrite d6/d1d/d24/d32/fbe [0,4194304] 0 2026-03-09T17:30:31.476 INFO:tasks.workunit.client.1.vm09.stdout:8/679: unlink d1/d14/d2a/fd1 0 2026-03-09T17:30:31.484 INFO:tasks.workunit.client.1.vm09.stdout:8/680: rmdir d1/da/dd/d77 39 2026-03-09T17:30:31.489 INFO:tasks.workunit.client.1.vm09.stdout:0/664: creat d6/d1d/d24/d32/fde x:0 0 0 2026-03-09T17:30:31.490 INFO:tasks.workunit.client.1.vm09.stdout:0/665: symlink d6/d64/db5/ldf 0 2026-03-09T17:30:31.490 INFO:tasks.workunit.client.1.vm09.stdout:9/649: sync 2026-03-09T17:30:31.492 
INFO:tasks.workunit.client.1.vm09.stdout:9/650: mknod d5/de/d29/dd4/cda 0 2026-03-09T17:30:31.493 INFO:tasks.workunit.client.1.vm09.stdout:5/689: dread d0/dc/d21/d6f/f80 [4194304,4194304] 0 2026-03-09T17:30:31.494 INFO:tasks.workunit.client.1.vm09.stdout:5/690: creat d0/d9/d16/fe1 x:0 0 0 2026-03-09T17:30:31.500 INFO:tasks.workunit.client.1.vm09.stdout:9/651: creat d5/d91/fdb x:0 0 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:5/691: truncate d0/d2/f5d 2450301 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:5/692: chown d0/d9/f77 273 1 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:9/652: creat d5/de/d29/da7/fdc x:0 0 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:5/693: unlink d0/d2/fc5 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:9/653: creat d5/de/d29/d90/dc7/fdd x:0 0 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:7/760: dread da/d11/d64/d84/feb [0,4194304] 0 2026-03-09T17:30:31.505 INFO:tasks.workunit.client.1.vm09.stdout:7/761: rmdir da/d11 39 2026-03-09T17:30:31.513 INFO:tasks.workunit.client.1.vm09.stdout:7/762: link da/d11/d47/d89/dbe/ldf da/d11/d47/d5b/d6c/l104 0 2026-03-09T17:30:31.530 INFO:tasks.workunit.client.1.vm09.stdout:6/651: dread d3/d21/d25/d26/f2a [0,4194304] 0 2026-03-09T17:30:31.531 INFO:tasks.workunit.client.1.vm09.stdout:6/652: creat d3/d21/d25/fdb x:0 0 0 2026-03-09T17:30:31.533 INFO:tasks.workunit.client.1.vm09.stdout:6/653: creat d3/d7/d59/d9c/fdc x:0 0 0 2026-03-09T17:30:31.534 INFO:tasks.workunit.client.1.vm09.stdout:6/654: mknod d3/d7/d59/d9c/cdd 0 2026-03-09T17:30:31.535 INFO:tasks.workunit.client.1.vm09.stdout:6/655: chown d3/d7/d59/d73 19 1 2026-03-09T17:30:31.538 INFO:tasks.workunit.client.1.vm09.stdout:6/656: dwrite d3/d21/d76/d3f/f51 [0,4194304] 0 2026-03-09T17:30:31.540 INFO:tasks.workunit.client.1.vm09.stdout:6/657: symlink d3/d7/d59/d73/lde 0 2026-03-09T17:30:31.542 
INFO:tasks.workunit.client.1.vm09.stdout:6/658: rmdir d3/d21/d76/d5c/d7e/dcd 0 2026-03-09T17:30:31.563 INFO:tasks.workunit.client.1.vm09.stdout:3/599: truncate d5/d16/d25/f28 7780677 0 2026-03-09T17:30:31.567 INFO:tasks.workunit.client.1.vm09.stdout:3/600: creat d5/d9/d90/fb9 x:0 0 0 2026-03-09T17:30:31.571 INFO:tasks.workunit.client.1.vm09.stdout:3/601: rmdir d5/d16/d31/d37/d58/d64/dad 0 2026-03-09T17:30:31.572 INFO:tasks.workunit.client.1.vm09.stdout:3/602: read d5/d9/d30/f61 [879734,44538] 0 2026-03-09T17:30:31.575 INFO:tasks.workunit.client.1.vm09.stdout:3/603: dwrite d5/d16/d31/d37/f94 [0,4194304] 0 2026-03-09T17:30:31.580 INFO:tasks.workunit.client.1.vm09.stdout:3/604: dwrite d5/d16/d31/d37/d58/f91 [0,4194304] 0 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: Activating manager daemon vm06.pbgzei 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: mgrmap e27: vm06.pbgzei(active, starting, since 0.0105488s) 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.? 
192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/crt"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: 
from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.pbgzei/key"}]: dispatch 2026-03-09T17:30:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:31 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:31.659 INFO:tasks.workunit.client.1.vm09.stdout:2/631: truncate d13/d15/f7e 3089196 0 2026-03-09T17:30:31.663 INFO:tasks.workunit.client.1.vm09.stdout:2/632: mkdir d13/dc8 0 2026-03-09T17:30:31.666 INFO:tasks.workunit.client.1.vm09.stdout:2/633: readlink d13/d15/d60/lbc 0 2026-03-09T17:30:31.666 INFO:tasks.workunit.client.1.vm09.stdout:1/634: write d9/dc/f76 [100353,97995] 0 2026-03-09T17:30:31.666 INFO:tasks.workunit.client.1.vm09.stdout:1/635: symlink d9/d9e/dc0/d37/lc1 0 2026-03-09T17:30:31.667 INFO:tasks.workunit.client.1.vm09.stdout:1/636: read d9/d9e/dc0/d37/d3f/f68 [3390405,116145] 0 2026-03-09T17:30:31.669 INFO:tasks.workunit.client.1.vm09.stdout:2/634: getdents d13/d15/d34/d37 0 2026-03-09T17:30:31.670 INFO:tasks.workunit.client.1.vm09.stdout:1/637: symlink d9/dc/dd/d9f/d9c/lc2 0 2026-03-09T17:30:31.673 INFO:tasks.workunit.client.1.vm09.stdout:2/635: creat d13/d15/d60/fc9 x:0 0 0 2026-03-09T17:30:31.673 INFO:tasks.workunit.client.1.vm09.stdout:2/636: chown d13/d15/d34/d37/c41 6 1 2026-03-09T17:30:31.675 INFO:tasks.workunit.client.1.vm09.stdout:2/637: mknod d13/d15/d36/d72/d94/da7/cca 0 2026-03-09T17:30:31.675 INFO:tasks.workunit.client.1.vm09.stdout:2/638: dread - d13/d15/f9a zero size 2026-03-09T17:30:31.680 INFO:tasks.workunit.client.1.vm09.stdout:2/639: rename d13/d15/d2c to d13/d15/d34/d45/d84/dcb 0 2026-03-09T17:30:31.681 INFO:tasks.workunit.client.1.vm09.stdout:2/640: creat d13/d15/d36/d72/dc3/fcc x:0 0 0 2026-03-09T17:30:31.685 INFO:tasks.workunit.client.1.vm09.stdout:2/641: creat d13/d15/d60/fcd x:0 0 0 2026-03-09T17:30:31.702 INFO:tasks.workunit.client.1.vm09.stdout:8/681: write 
d1/d14/d2a/d42/d43/d44/f5c [2203194,31256] 0 2026-03-09T17:30:31.702 INFO:tasks.workunit.client.1.vm09.stdout:0/666: write d6/f27 [13137,39712] 0 2026-03-09T17:30:31.703 INFO:tasks.workunit.client.1.vm09.stdout:0/667: chown d6/l9d 90776 1 2026-03-09T17:30:31.704 INFO:tasks.workunit.client.1.vm09.stdout:4/640: dwrite d11/d1e/d29/f2f [0,4194304] 0 2026-03-09T17:30:31.705 INFO:tasks.workunit.client.1.vm09.stdout:4/641: fsync d11/d1e/d29/d36/fc7 0 2026-03-09T17:30:31.708 INFO:tasks.workunit.client.1.vm09.stdout:9/654: write d5/f1b [893558,58026] 0 2026-03-09T17:30:31.713 INFO:tasks.workunit.client.1.vm09.stdout:7/763: write da/d11/f1a [8171799,47435] 0 2026-03-09T17:30:31.714 INFO:tasks.workunit.client.1.vm09.stdout:4/642: write d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f7d [2113241,117877] 0 2026-03-09T17:30:31.717 INFO:tasks.workunit.client.1.vm09.stdout:0/668: sync 2026-03-09T17:30:31.720 INFO:tasks.workunit.client.1.vm09.stdout:9/655: mkdir d5/d91/d99/dc9/dde 0 2026-03-09T17:30:31.723 INFO:tasks.workunit.client.1.vm09.stdout:9/656: dread - d5/d2e/f82 zero size 2026-03-09T17:30:31.725 INFO:tasks.workunit.client.1.vm09.stdout:8/682: dwrite d1/da/dd/d77/fad [0,4194304] 0 2026-03-09T17:30:31.725 INFO:tasks.workunit.client.1.vm09.stdout:7/764: creat da/d11/d47/d5b/d6c/d9e/f105 x:0 0 0 2026-03-09T17:30:31.726 INFO:tasks.workunit.client.1.vm09.stdout:7/765: chown da/fb 73541 1 2026-03-09T17:30:31.743 INFO:tasks.workunit.client.1.vm09.stdout:7/766: mkdir da/d11/d47/d89/dbe/d106 0 2026-03-09T17:30:31.743 INFO:tasks.workunit.client.1.vm09.stdout:0/669: fsync d6/d1d/d24/f5d 0 2026-03-09T17:30:31.743 INFO:tasks.workunit.client.1.vm09.stdout:4/643: dread fe [0,4194304] 0 2026-03-09T17:30:31.743 INFO:tasks.workunit.client.1.vm09.stdout:8/683: rename d1/d14/d2a/d42/d5d/l6b to d1/da/d23/d6c/d32/ld6 0 2026-03-09T17:30:31.743 INFO:tasks.workunit.client.1.vm09.stdout:9/657: creat d5/d91/d99/dc9/dde/fdf x:0 0 0 2026-03-09T17:30:31.745 INFO:tasks.workunit.client.1.vm09.stdout:4/644: chown 
d11/d1e/c2a 123865 1 2026-03-09T17:30:31.746 INFO:tasks.workunit.client.1.vm09.stdout:4/645: stat d11/d1e/d45/d60/d71/db7/d89/d8b/f38 0 2026-03-09T17:30:31.751 INFO:tasks.workunit.client.1.vm09.stdout:8/684: dwrite d1/da/dd/faf [0,4194304] 0 2026-03-09T17:30:31.752 INFO:tasks.workunit.client.1.vm09.stdout:8/685: chown d1/da/d23/d6c/d32/f6d 26901 1 2026-03-09T17:30:31.753 INFO:tasks.workunit.client.1.vm09.stdout:7/767: mkdir da/d11/d77/d101/d107 0 2026-03-09T17:30:31.755 INFO:tasks.workunit.client.1.vm09.stdout:9/658: readlink d5/de/d4e/dca/d84/d97/ld9 0 2026-03-09T17:30:31.756 INFO:tasks.workunit.client.1.vm09.stdout:0/670: symlink d6/d1d/d46/le0 0 2026-03-09T17:30:31.756 INFO:tasks.workunit.client.1.vm09.stdout:4/646: mknod d11/d1e/d45/ccb 0 2026-03-09T17:30:31.756 INFO:tasks.workunit.client.1.vm09.stdout:4/647: stat d11/f26 0 2026-03-09T17:30:31.764 INFO:tasks.workunit.client.1.vm09.stdout:8/686: fsync d1/da/dd/f22 0 2026-03-09T17:30:31.766 INFO:tasks.workunit.client.1.vm09.stdout:8/687: creat d1/da/dd/d79/fd7 x:0 0 0 2026-03-09T17:30:31.767 INFO:tasks.workunit.client.1.vm09.stdout:9/659: getdents d5/de/d4e/dca/d84/d97 0 2026-03-09T17:30:31.768 INFO:tasks.workunit.client.1.vm09.stdout:8/688: truncate d1/d14/d2a/d42/d43/f58 2507873 0 2026-03-09T17:30:31.772 INFO:tasks.workunit.client.1.vm09.stdout:0/671: rename d6/d1d/d24/d32/l6b to d6/le1 0 2026-03-09T17:30:31.792 INFO:tasks.workunit.client.1.vm09.stdout:6/659: write d3/d21/d25/d26/f50 [454392,14126] 0 2026-03-09T17:30:31.801 INFO:tasks.workunit.client.1.vm09.stdout:3/605: dread d5/d9/f4e [0,4194304] 0 2026-03-09T17:30:31.804 INFO:tasks.workunit.client.1.vm09.stdout:3/606: dwrite d5/d16/d31/d37/fa5 [0,4194304] 0 2026-03-09T17:30:31.810 INFO:tasks.workunit.client.1.vm09.stdout:8/689: dread d1/d14/d2a/d49/fac [0,4194304] 0 2026-03-09T17:30:31.823 INFO:tasks.workunit.client.1.vm09.stdout:3/607: truncate d5/d9/d30/d65/d59/d84/f6e 550313 0 2026-03-09T17:30:31.825 INFO:tasks.workunit.client.1.vm09.stdout:3/608: fsync 
d5/d16/d25/f28 0 2026-03-09T17:30:31.826 INFO:tasks.workunit.client.1.vm09.stdout:1/638: dwrite d9/f29 [0,4194304] 0 2026-03-09T17:30:31.833 INFO:tasks.workunit.client.1.vm09.stdout:1/639: creat d9/d38/fc3 x:0 0 0 2026-03-09T17:30:31.840 INFO:tasks.workunit.client.1.vm09.stdout:1/640: link d9/d9e/dc0/d37/lc1 d9/dc/d63/dba/lc4 0 2026-03-09T17:30:31.841 INFO:tasks.workunit.client.1.vm09.stdout:1/641: readlink d9/dc/dd/d9f/l7f 0 2026-03-09T17:30:31.841 INFO:tasks.workunit.client.1.vm09.stdout:1/642: dread - d9/d38/fbd zero size 2026-03-09T17:30:31.842 INFO:tasks.workunit.client.1.vm09.stdout:1/643: read d9/d9e/dc0/f2b [2499514,125131] 0 2026-03-09T17:30:31.844 INFO:tasks.workunit.client.1.vm09.stdout:1/644: unlink d9/d5a/fa3 0 2026-03-09T17:30:31.847 INFO:tasks.workunit.client.1.vm09.stdout:2/642: dread d13/d15/d21/f30 [4194304,4194304] 0 2026-03-09T17:30:31.856 INFO:tasks.workunit.client.1.vm09.stdout:2/643: truncate d13/d15/f74 3645046 0 2026-03-09T17:30:31.857 INFO:tasks.workunit.client.1.vm09.stdout:2/644: fdatasync d13/f40 0 2026-03-09T17:30:31.858 INFO:tasks.workunit.client.1.vm09.stdout:8/690: read d1/da/d23/d6c/d32/f56 [105652,87093] 0 2026-03-09T17:30:31.858 INFO:tasks.workunit.client.1.vm09.stdout:8/691: fsync d1/da/d23/d6c/fb2 0 2026-03-09T17:30:31.860 INFO:tasks.workunit.client.1.vm09.stdout:2/645: rename d13/d4d/ca9 to d13/d15/d34/d37/cce 0 2026-03-09T17:30:31.864 INFO:tasks.workunit.client.1.vm09.stdout:8/692: mknod d1/d14/d2a/d42/d43/cd8 0 2026-03-09T17:30:31.864 INFO:tasks.workunit.client.1.vm09.stdout:8/693: chown d1/f28 10 1 2026-03-09T17:30:31.865 INFO:tasks.workunit.client.1.vm09.stdout:2/646: mkdir d13/d15/d34/d45/d84/db5/dcf 0 2026-03-09T17:30:31.868 INFO:tasks.workunit.client.1.vm09.stdout:2/647: dwrite d13/d15/d21/d88/fad [0,4194304] 0 2026-03-09T17:30:31.873 INFO:tasks.workunit.client.1.vm09.stdout:7/768: dread da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [0,4194304] 0 2026-03-09T17:30:31.876 INFO:tasks.workunit.client.1.vm09.stdout:8/694: rename 
d1/d14/d2a/d42/d43/d44/c78 to d1/da/dd/d47/cd9 0 2026-03-09T17:30:31.884 INFO:tasks.workunit.client.1.vm09.stdout:8/695: fdatasync d1/f16 0 2026-03-09T17:30:31.900 INFO:tasks.workunit.client.1.vm09.stdout:7/769: link da/c43 da/d11/d77/de5/c108 0 2026-03-09T17:30:31.901 INFO:tasks.workunit.client.1.vm09.stdout:7/770: truncate da/d11/d47/d5b/d6c/d9e/d4e/d4c/fed 181772 0 2026-03-09T17:30:31.904 INFO:tasks.workunit.client.1.vm09.stdout:7/771: mknod da/d11/d47/d89/c109 0 2026-03-09T17:30:31.909 INFO:tasks.workunit.client.1.vm09.stdout:7/772: link da/d11/d47/d89/la3 da/d11/d47/d5b/d6c/df8/l10a 0 2026-03-09T17:30:31.915 INFO:tasks.workunit.client.1.vm09.stdout:7/773: getdents da/d11/d3e/da2 0 2026-03-09T17:30:31.916 INFO:tasks.workunit.client.1.vm09.stdout:7/774: dread - da/d11/d64/da7/db1/fc5 zero size 2026-03-09T17:30:31.918 INFO:tasks.workunit.client.1.vm09.stdout:7/775: truncate da/d11/d2d/f71 420207 0 2026-03-09T17:30:31.919 INFO:tasks.workunit.client.1.vm09.stdout:7/776: chown da/d11/d47/d5b/d6c/fb3 905 1 2026-03-09T17:30:31.971 INFO:tasks.workunit.client.1.vm09.stdout:5/694: dwrite d0/d2/f5d [0,4194304] 0 2026-03-09T17:30:31.972 INFO:tasks.workunit.client.1.vm09.stdout:5/695: dread - d0/d52/f97 zero size 2026-03-09T17:30:31.984 INFO:tasks.workunit.client.1.vm09.stdout:4/648: dwrite d11/d1e/d29/d36/d57/fa0 [0,4194304] 0 2026-03-09T17:30:31.987 INFO:tasks.workunit.client.1.vm09.stdout:9/660: write d5/d91/d99/fa4 [145797,94048] 0 2026-03-09T17:30:31.989 INFO:tasks.workunit.client.1.vm09.stdout:0/672: write d6/f9 [3649822,111876] 0 2026-03-09T17:30:31.992 INFO:tasks.workunit.client.1.vm09.stdout:5/696: read d0/d46/f56 [16163,42404] 0 2026-03-09T17:30:31.997 INFO:tasks.workunit.client.1.vm09.stdout:6/660: dwrite d3/d7/d59/d73/f82 [0,4194304] 0 2026-03-09T17:30:31.997 INFO:tasks.workunit.client.1.vm09.stdout:5/697: dwrite d0/dc/d21/d26/fcd [0,4194304] 0 2026-03-09T17:30:32.008 INFO:tasks.workunit.client.1.vm09.stdout:3/609: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 
2026-03-09T17:30:32.008 INFO:tasks.workunit.client.1.vm09.stdout:3/610: chown d5/d16/d31/d37/d58/d8a 334388882 1 2026-03-09T17:30:32.016 INFO:tasks.workunit.client.1.vm09.stdout:3/611: dwrite d5/d9/d90/db0/fa3 [0,4194304] 0 2026-03-09T17:30:32.016 INFO:tasks.workunit.client.1.vm09.stdout:1/645: write d9/f8d [771734,87416] 0 2026-03-09T17:30:32.024 INFO:tasks.workunit.client.1.vm09.stdout:3/612: dwrite d5/d16/d31/d37/fa5 [4194304,4194304] 0 2026-03-09T17:30:32.031 INFO:tasks.workunit.client.1.vm09.stdout:9/661: dread d5/f14 [4194304,4194304] 0 2026-03-09T17:30:32.031 INFO:tasks.workunit.client.1.vm09.stdout:2/648: write d13/f79 [1773546,20517] 0 2026-03-09T17:30:32.034 INFO:tasks.workunit.client.1.vm09.stdout:8/696: dwrite d1/da/f4b [4194304,4194304] 0 2026-03-09T17:30:32.037 INFO:tasks.workunit.client.1.vm09.stdout:8/697: fsync d1/da/d23/d6c/f1c 0 2026-03-09T17:30:32.040 INFO:tasks.workunit.client.1.vm09.stdout:7/777: dwrite da/f36 [0,4194304] 0 2026-03-09T17:30:32.042 INFO:tasks.workunit.client.1.vm09.stdout:7/778: readlink da/d11/l9a 0 2026-03-09T17:30:32.045 INFO:tasks.workunit.client.1.vm09.stdout:3/613: mknod d5/d16/d25/cba 0 2026-03-09T17:30:32.047 INFO:tasks.workunit.client.1.vm09.stdout:4/649: link d11/d1e/d45/d60/d71/db7/d89/d8b/f38 d11/d1e/d29/fcc 0 2026-03-09T17:30:32.050 INFO:tasks.workunit.client.1.vm09.stdout:2/649: read d13/d15/f2f [502935,30414] 0 2026-03-09T17:30:32.050 INFO:tasks.workunit.client.1.vm09.stdout:0/673: creat d6/d1d/d24/d32/d59/d81/d8c/fe2 x:0 0 0 2026-03-09T17:30:32.051 INFO:tasks.workunit.client.1.vm09.stdout:6/661: creat d3/d21/d25/fdf x:0 0 0 2026-03-09T17:30:32.052 INFO:tasks.workunit.client.1.vm09.stdout:3/614: unlink d5/d16/d31/d37/d58/d64/f8e 0 2026-03-09T17:30:32.053 INFO:tasks.workunit.client.1.vm09.stdout:5/698: rename d0/dc/d21/d6f/d42 to d0/dc/d21/de2 0 2026-03-09T17:30:32.057 INFO:tasks.workunit.client.1.vm09.stdout:8/698: unlink d1/da/dd/d47/cd9 0 2026-03-09T17:30:32.061 INFO:tasks.workunit.client.1.vm09.stdout:5/699: 
creat d0/d52/fe3 x:0 0 0 2026-03-09T17:30:32.061 INFO:tasks.workunit.client.1.vm09.stdout:3/615: read d5/d9/d30/f6a [207455,58417] 0 2026-03-09T17:30:32.061 INFO:tasks.workunit.client.1.vm09.stdout:8/699: symlink d1/d14/d96/lda 0 2026-03-09T17:30:32.061 INFO:tasks.workunit.client.1.vm09.stdout:5/700: chown d0/dc/d21/d33/c51 477353 1 2026-03-09T17:30:32.061 INFO:tasks.workunit.client.1.vm09.stdout:6/662: fsync d3/d21/d76/d5c/d7e/dc5/fca 0 2026-03-09T17:30:32.063 INFO:tasks.workunit.client.1.vm09.stdout:3/616: mkdir d5/d9/d90/db0/dbb 0 2026-03-09T17:30:32.063 INFO:tasks.workunit.client.1.vm09.stdout:4/650: sync 2026-03-09T17:30:32.064 INFO:tasks.workunit.client.1.vm09.stdout:8/700: creat d1/d14/d2a/fdb x:0 0 0 2026-03-09T17:30:32.065 INFO:tasks.workunit.client.1.vm09.stdout:6/663: write d3/d21/d76/d5c/f92 [1612581,57657] 0 2026-03-09T17:30:32.069 INFO:tasks.workunit.client.1.vm09.stdout:4/651: rename d11/d1e/d29/d36/l59 to d11/d1e/d29/d36/lcd 0 2026-03-09T17:30:32.071 INFO:tasks.workunit.client.1.vm09.stdout:6/664: truncate d3/d21/f80 52948 0 2026-03-09T17:30:32.071 INFO:tasks.workunit.client.1.vm09.stdout:3/617: mknod d5/d16/cbc 0 2026-03-09T17:30:32.073 INFO:tasks.workunit.client.1.vm09.stdout:5/701: creat d0/d2/d76/d87/da4/dbe/fe4 x:0 0 0 2026-03-09T17:30:32.079 INFO:tasks.workunit.client.1.vm09.stdout:4/652: mkdir d11/d1e/d29/d36/d57/dce 0 2026-03-09T17:30:32.079 INFO:tasks.workunit.client.1.vm09.stdout:4/653: fdatasync d11/d1e/d29/d36/f6a 0 2026-03-09T17:30:32.080 INFO:tasks.workunit.client.1.vm09.stdout:8/701: symlink d1/d14/d2a/d42/d43/ldc 0 2026-03-09T17:30:32.081 INFO:tasks.workunit.client.1.vm09.stdout:0/674: dread d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:32.106 INFO:tasks.workunit.client.1.vm09.stdout:0/675: dread d6/d64/f7e [0,4194304] 0 2026-03-09T17:30:32.113 INFO:tasks.workunit.client.1.vm09.stdout:8/702: rename d1/d14/d96/dc3 to d1/da/d23/d6c/ddd 0 2026-03-09T17:30:32.113 INFO:tasks.workunit.client.1.vm09.stdout:8/703: write d1/d14/f3c [117695,33805] 
0 2026-03-09T17:30:32.114 INFO:tasks.workunit.client.1.vm09.stdout:8/704: fdatasync d1/da/d23/d6c/d32/fb5 0 2026-03-09T17:30:32.118 INFO:tasks.workunit.client.1.vm09.stdout:6/665: mkdir d3/d21/d25/d96/de0 0 2026-03-09T17:30:32.119 INFO:tasks.workunit.client.1.vm09.stdout:6/666: chown d3/d21/d76/d5c/d7e/dc5/d98/cb9 0 1 2026-03-09T17:30:32.120 INFO:tasks.workunit.client.1.vm09.stdout:3/618: creat d5/d9/d90/db0/dbb/fbd x:0 0 0 2026-03-09T17:30:32.121 INFO:tasks.workunit.client.1.vm09.stdout:3/619: write d5/d16/d46/f63 [951118,93956] 0 2026-03-09T17:30:32.123 INFO:tasks.workunit.client.1.vm09.stdout:5/702: mkdir d0/d2/d76/d87/de5 0 2026-03-09T17:30:32.127 INFO:tasks.workunit.client.1.vm09.stdout:8/705: sync 2026-03-09T17:30:32.129 INFO:tasks.workunit.client.1.vm09.stdout:8/706: write d1/da/d23/d6c/ddd/fd0 [402209,114555] 0 2026-03-09T17:30:32.131 INFO:tasks.workunit.client.1.vm09.stdout:5/703: dwrite d0/dc/d21/de2/f82 [0,4194304] 0 2026-03-09T17:30:32.133 INFO:tasks.workunit.client.1.vm09.stdout:0/676: rename d6/d1d/d46/f7d to d6/d1d/d46/fe3 0 2026-03-09T17:30:32.144 INFO:tasks.workunit.client.1.vm09.stdout:6/667: rmdir d3/d7/d59/d9c 39 2026-03-09T17:30:32.149 INFO:tasks.workunit.client.1.vm09.stdout:6/668: dwrite d3/d21/d76/d5c/d61/f60 [0,4194304] 0 2026-03-09T17:30:32.160 INFO:tasks.workunit.client.1.vm09.stdout:3/620: rename d5/d16/d31/f57 to d5/d9/d90/db0/dbb/fbe 0 2026-03-09T17:30:32.160 INFO:tasks.workunit.client.1.vm09.stdout:3/621: dread - d5/fa1 zero size 2026-03-09T17:30:32.166 INFO:tasks.workunit.client.1.vm09.stdout:8/707: mkdir d1/da/d23/d71/dde 0 2026-03-09T17:30:32.167 INFO:tasks.workunit.client.1.vm09.stdout:5/704: readlink d0/d2/lae 0 2026-03-09T17:30:32.168 INFO:tasks.workunit.client.1.vm09.stdout:8/708: read d1/da/d23/f8f [311918,111820] 0 2026-03-09T17:30:32.169 INFO:tasks.workunit.client.1.vm09.stdout:4/654: rmdir d11/d1e/d45/d60/d9c 0 2026-03-09T17:30:32.169 INFO:tasks.workunit.client.1.vm09.stdout:4/655: write d11/fab [145891,54299] 0 
2026-03-09T17:30:32.170 INFO:tasks.workunit.client.1.vm09.stdout:4/656: stat d11/d1e/d29/d36/f7f 0 2026-03-09T17:30:32.174 INFO:tasks.workunit.client.1.vm09.stdout:5/705: dwrite d0/d9/d16/d5c/f70 [0,4194304] 0 2026-03-09T17:30:32.184 INFO:tasks.workunit.client.1.vm09.stdout:0/677: unlink d6/d64/d94/lbc 0 2026-03-09T17:30:32.184 INFO:tasks.workunit.client.1.vm09.stdout:0/678: fdatasync d6/d1d/d24/d5e/f9e 0 2026-03-09T17:30:32.184 INFO:tasks.workunit.client.1.vm09.stdout:0/679: stat d6/d1d/d24/d5e/d6c/fa5 0 2026-03-09T17:30:32.184 INFO:tasks.workunit.client.1.vm09.stdout:6/669: mkdir d3/d7/d59/d73/de1 0 2026-03-09T17:30:32.184 INFO:tasks.workunit.client.1.vm09.stdout:5/706: readlink d0/dc/d21/d33/ldf 0 2026-03-09T17:30:32.188 INFO:tasks.workunit.client.1.vm09.stdout:0/680: stat d6/d1d/d24/d5e/d6c/l84 0 2026-03-09T17:30:32.192 INFO:tasks.workunit.client.1.vm09.stdout:6/670: mknod d3/d21/d25/d26/d86/dbe/ce2 0 2026-03-09T17:30:32.197 INFO:tasks.workunit.client.1.vm09.stdout:8/709: mkdir d1/da/d23/dc2/da2/ddf 0 2026-03-09T17:30:32.211 INFO:tasks.workunit.client.1.vm09.stdout:5/707: mkdir d0/d2/d76/d87/d95/d9b/dc0/de6 0 2026-03-09T17:30:32.217 INFO:tasks.workunit.client.1.vm09.stdout:6/671: fsync d3/d21/d76/d5c/d61/d95/f20 0 2026-03-09T17:30:32.217 INFO:tasks.workunit.client.1.vm09.stdout:6/672: dread - d3/d21/d76/d5c/d7e/dc5/fa4 zero size 2026-03-09T17:30:32.219 INFO:tasks.workunit.client.1.vm09.stdout:4/657: link d11/d1e/d29/d36/d57/f79 d11/d1e/d29/db5/fcf 0 2026-03-09T17:30:32.219 INFO:tasks.workunit.client.1.vm09.stdout:4/658: write d11/d1e/d31/fbf [274305,93571] 0 2026-03-09T17:30:32.232 INFO:tasks.workunit.client.1.vm09.stdout:0/681: dread d6/d1d/f57 [0,4194304] 0 2026-03-09T17:30:32.232 INFO:tasks.workunit.client.1.vm09.stdout:0/682: chown d6/d1d/d39/ccf 103 1 2026-03-09T17:30:32.245 INFO:tasks.workunit.client.1.vm09.stdout:5/708: rename d0/f60 to d0/d9/d8b/fe7 0 2026-03-09T17:30:32.251 INFO:tasks.workunit.client.1.vm09.stdout:0/683: rmdir d6/d1d/d24/d5e/dc8 39 
2026-03-09T17:30:32.253 INFO:tasks.workunit.client.1.vm09.stdout:4/659: symlink d11/d1e/d29/d36/d57/dce/ld0 0 2026-03-09T17:30:32.257 INFO:tasks.workunit.client.1.vm09.stdout:1/646: dwrite d9/dc/dd/d40/d1d/fa1 [0,4194304] 0 2026-03-09T17:30:32.258 INFO:tasks.workunit.client.1.vm09.stdout:1/647: fsync d9/dc/dd/d40/d1d/fab 0 2026-03-09T17:30:32.261 INFO:tasks.workunit.client.1.vm09.stdout:9/662: write d5/de/d4e/dca/d84/d97/fad [741851,67225] 0 2026-03-09T17:30:32.261 INFO:tasks.workunit.client.1.vm09.stdout:9/663: fsync d5/de/d29/da7/fdc 0 2026-03-09T17:30:32.261 INFO:tasks.workunit.client.1.vm09.stdout:9/664: write d5/de/d29/fc0 [497174,6325] 0 2026-03-09T17:30:32.262 INFO:tasks.workunit.client.1.vm09.stdout:9/665: truncate d5/d21/f38 4531977 0 2026-03-09T17:30:32.265 INFO:tasks.workunit.client.1.vm09.stdout:7/779: dwrite da/d11/d2d/f45 [0,4194304] 0 2026-03-09T17:30:32.274 INFO:tasks.workunit.client.1.vm09.stdout:6/673: rename d3/d21/d76/d5c/d61/d95/f20 to d3/d7/d59/d9c/fe3 0 2026-03-09T17:30:32.275 INFO:tasks.workunit.client.1.vm09.stdout:0/684: mknod d6/d1d/d24/d5e/d86/ce4 0 2026-03-09T17:30:32.276 INFO:tasks.workunit.client.1.vm09.stdout:8/710: getdents d1 0 2026-03-09T17:30:32.279 INFO:tasks.workunit.client.1.vm09.stdout:4/660: mknod d11/d1e/d29/d36/d57/cd1 0 2026-03-09T17:30:32.285 INFO:tasks.workunit.client.1.vm09.stdout:9/666: mkdir d5/d2e/d8b/de0 0 2026-03-09T17:30:32.291 INFO:tasks.workunit.client.1.vm09.stdout:2/650: dread d13/d15/f74 [0,4194304] 0 2026-03-09T17:30:32.291 INFO:tasks.workunit.client.1.vm09.stdout:2/651: chown d13/dc8 2720 1 2026-03-09T17:30:32.292 INFO:tasks.workunit.client.1.vm09.stdout:8/711: creat d1/d14/d96/fe0 x:0 0 0 2026-03-09T17:30:32.304 INFO:tasks.workunit.client.1.vm09.stdout:1/648: creat d9/d9e/dc0/d37/d3f/d42/d55/db1/fc5 x:0 0 0 2026-03-09T17:30:32.308 INFO:tasks.workunit.client.1.vm09.stdout:7/780: symlink da/d11/d47/d5b/d6c/df8/l10b 0 2026-03-09T17:30:32.316 INFO:tasks.workunit.client.1.vm09.stdout:2/652: stat 
d13/d15/d34/d37/cce 0 2026-03-09T17:30:32.317 INFO:tasks.workunit.client.1.vm09.stdout:0/685: symlink d6/d1d/d24/d5e/le5 0 2026-03-09T17:30:32.344 INFO:tasks.workunit.client.1.vm09.stdout:2/653: symlink d13/d15/d36/d72/d94/da7/ld0 0 2026-03-09T17:30:32.345 INFO:tasks.workunit.client.1.vm09.stdout:2/654: dread d13/d15/f2f [0,4194304] 0 2026-03-09T17:30:32.348 INFO:tasks.workunit.client.1.vm09.stdout:3/622: dwrite d5/d16/d25/f2b [0,4194304] 0 2026-03-09T17:30:32.363 INFO:tasks.workunit.client.1.vm09.stdout:7/781: dread da/d11/f3f [0,4194304] 0 2026-03-09T17:30:32.376 INFO:tasks.workunit.client.1.vm09.stdout:2/655: mkdir d13/d15/d21/d88/db8/dd1 0 2026-03-09T17:30:32.389 INFO:tasks.workunit.client.1.vm09.stdout:0/686: link d6/d1d/d24/d5e/f8a d6/d1d/d24/d32/d59/d9c/dac/fe6 0 2026-03-09T17:30:32.391 INFO:tasks.workunit.client.1.vm09.stdout:0/687: dwrite d6/d1d/d24/d5e/f9e [0,4194304] 0 2026-03-09T17:30:32.392 INFO:tasks.workunit.client.1.vm09.stdout:1/649: write d9/dc/dd/d40/d1d/fa1 [4148165,106478] 0 2026-03-09T17:30:32.399 INFO:tasks.workunit.client.1.vm09.stdout:4/661: write d11/d1e/d31/f65 [55617,125855] 0 2026-03-09T17:30:32.401 INFO:tasks.workunit.client.1.vm09.stdout:6/674: truncate d3/f97 3921366 0 2026-03-09T17:30:32.419 INFO:tasks.workunit.client.1.vm09.stdout:9/667: dwrite d5/de/d29/d33/f66 [0,4194304] 0 2026-03-09T17:30:32.446 INFO:tasks.workunit.client.1.vm09.stdout:4/662: mknod d11/d1e/d29/d36/d57/dce/cd2 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:5/709: unlink d0/d9/d16/d5c/f70 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:3/623: truncate d5/d16/d31/d3d/fe 7225226 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:3/624: chown d5/d16 634 1 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:9/668: truncate d5/de/d29/d33/f3b 831892 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:9/669: write d5/de/f65 [4348289,122358] 0 2026-03-09T17:30:32.465 
INFO:tasks.workunit.client.1.vm09.stdout:9/670: dread - d5/de/d4e/d6e/d93/f7f zero size 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:1/650: symlink d9/dc/dd/d40/d21/d35/db9/lc6 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:1/651: write d9/dc/dd/f7b [3180322,84486] 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:2/656: creat d13/d15/fd2 x:0 0 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:2/657: dwrite d13/d15/d21/d88/fbe [0,4194304] 0 2026-03-09T17:30:32.465 INFO:tasks.workunit.client.1.vm09.stdout:1/652: unlink d9/dc/dd/f9d 0 2026-03-09T17:30:32.474 INFO:tasks.workunit.client.1.vm09.stdout:1/653: mknod d9/d9e/cc7 0 2026-03-09T17:30:32.476 INFO:tasks.workunit.client.1.vm09.stdout:5/710: mknod d0/dc/d21/ce8 0 2026-03-09T17:30:32.479 INFO:tasks.workunit.client.1.vm09.stdout:3/625: link d5/f22 d5/d16/d31/d37/fbf 0 2026-03-09T17:30:32.480 INFO:tasks.workunit.client.1.vm09.stdout:9/671: creat d5/de/d29/fe1 x:0 0 0 2026-03-09T17:30:32.480 INFO:tasks.workunit.client.1.vm09.stdout:9/672: chown d5/de/d4e/d6e/d93/cd2 5734 1 2026-03-09T17:30:32.481 INFO:tasks.workunit.client.1.vm09.stdout:2/658: mkdir d13/d15/d34/dd3 0 2026-03-09T17:30:32.482 INFO:tasks.workunit.client.1.vm09.stdout:1/654: creat d9/dc/d63/fc8 x:0 0 0 2026-03-09T17:30:32.483 INFO:tasks.workunit.client.1.vm09.stdout:5/711: creat d0/dc/d21/d33/fe9 x:0 0 0 2026-03-09T17:30:32.485 INFO:tasks.workunit.client.1.vm09.stdout:3/626: unlink d5/d16/f45 0 2026-03-09T17:30:32.486 INFO:tasks.workunit.client.1.vm09.stdout:9/673: creat d5/de/d88/fe2 x:0 0 0 2026-03-09T17:30:32.486 INFO:tasks.workunit.client.1.vm09.stdout:2/659: mknod d13/d15/d34/d45/d84/db5/cd4 0 2026-03-09T17:30:32.487 INFO:tasks.workunit.client.1.vm09.stdout:5/712: mknod d0/dc/cea 0 2026-03-09T17:30:32.489 INFO:tasks.workunit.client.1.vm09.stdout:9/674: rename f2 to d5/de/d29/d33/fe3 0 2026-03-09T17:30:32.491 INFO:tasks.workunit.client.1.vm09.stdout:3/627: truncate d5/d9/d30/f41 
221726 0 2026-03-09T17:30:32.491 INFO:tasks.workunit.client.1.vm09.stdout:9/675: mknod d5/de/d4e/dca/d84/d97/ce4 0 2026-03-09T17:30:32.493 INFO:tasks.workunit.client.1.vm09.stdout:3/628: write d5/d9/d90/db0/dbb/fbe [7593336,73526] 0 2026-03-09T17:30:32.497 INFO:tasks.workunit.client.1.vm09.stdout:9/676: dwrite d5/de/d4e/d6e/d93/f74 [0,4194304] 0 2026-03-09T17:30:32.500 INFO:tasks.workunit.client.1.vm09.stdout:9/677: creat d5/d2e/d8b/db4/fe5 x:0 0 0 2026-03-09T17:30:32.514 INFO:tasks.workunit.client.1.vm09.stdout:1/655: dread d9/d9e/dc0/d37/d3f/d42/f95 [0,4194304] 0 2026-03-09T17:30:32.520 INFO:tasks.workunit.client.1.vm09.stdout:1/656: read d9/dc/fa9 [567173,36509] 0 2026-03-09T17:30:32.522 INFO:tasks.workunit.client.1.vm09.stdout:1/657: chown d9/dc/dd/d40/d1d/fab 57 1 2026-03-09T17:30:32.522 INFO:tasks.workunit.client.1.vm09.stdout:1/658: write d9/dc/dd/f7b [5152452,9694] 0 2026-03-09T17:30:32.524 INFO:tasks.workunit.client.1.vm09.stdout:5/713: dread d0/f22 [0,4194304] 0 2026-03-09T17:30:32.525 INFO:tasks.workunit.client.1.vm09.stdout:8/712: write d1/d14/d2a/d42/d5d/d8a/f99 [966696,103345] 0 2026-03-09T17:30:32.528 INFO:tasks.workunit.client.1.vm09.stdout:7/782: dwrite da/d11/d47/d5b/d78/f9b [0,4194304] 0 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:1/659: mkdir d9/d9e/dc9 0 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:8/713: fdatasync d1/da/dd/d79/fb9 0 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:5/714: rmdir d0/d9/d74/d75/dbd 39 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:1/660: mknod d9/d9e/dc0/d37/d3f/cca 0 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:8/714: mknod d1/da/d23/d71/dde/ce1 0 2026-03-09T17:30:32.536 INFO:tasks.workunit.client.1.vm09.stdout:7/783: getdents da/d11/d77/de5 0 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: 
dispatch 2026-03-09T17:30:32.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:32 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:32.544 INFO:tasks.workunit.client.1.vm09.stdout:9/678: sync 2026-03-09T17:30:32.546 INFO:tasks.workunit.client.1.vm09.stdout:5/715: dread d0/d2/d76/d86/f6b [0,4194304] 0 2026-03-09T17:30:32.552 INFO:tasks.workunit.client.1.vm09.stdout:5/716: write d0/dc/d21/d33/fe9 [924416,60498] 0 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: 
from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:32.553 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:32 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:32.558 INFO:tasks.workunit.client.1.vm09.stdout:8/715: dread d1/da/dd/d79/fb9 [0,4194304] 0 2026-03-09T17:30:32.564 INFO:tasks.workunit.client.1.vm09.stdout:0/688: write d6/d64/f7e [849931,61350] 0 2026-03-09T17:30:32.565 INFO:tasks.workunit.client.1.vm09.stdout:6/675: write d3/d21/d76/d5c/d61/f53 [5084852,76743] 0 2026-03-09T17:30:32.567 INFO:tasks.workunit.client.1.vm09.stdout:6/676: truncate d3/d21/d76/d81/fc9 658186 0 2026-03-09T17:30:32.575 INFO:tasks.workunit.client.1.vm09.stdout:1/661: dread d9/dc/dd/d9f/f8a [0,4194304] 0 2026-03-09T17:30:32.583 INFO:tasks.workunit.client.1.vm09.stdout:9/679: dread d5/de/d4e/dca/d84/d97/fad [0,4194304] 0 2026-03-09T17:30:32.601 INFO:tasks.workunit.client.1.vm09.stdout:4/663: dwrite d11/f12 [0,4194304] 0 2026-03-09T17:30:32.603 INFO:tasks.workunit.client.1.vm09.stdout:4/664: write d11/d1e/d45/d60/f64 [1467225,36468] 0 2026-03-09T17:30:32.610 INFO:tasks.workunit.client.1.vm09.stdout:9/680: read d5/d21/f38 [1352775,92587] 0 
2026-03-09T17:30:32.615 INFO:tasks.workunit.client.1.vm09.stdout:8/716: sync 2026-03-09T17:30:32.638 INFO:tasks.workunit.client.1.vm09.stdout:0/689: mknod d6/d64/d97/dc9/ce7 0 2026-03-09T17:30:32.640 INFO:tasks.workunit.client.1.vm09.stdout:6/677: creat d3/d21/d76/d5c/d61/d95/fe4 x:0 0 0 2026-03-09T17:30:32.642 INFO:tasks.workunit.client.1.vm09.stdout:0/690: dwrite d6/fa6 [0,4194304] 0 2026-03-09T17:30:32.647 INFO:tasks.workunit.client.1.vm09.stdout:1/662: symlink d9/dc/dd/d40/d21/d35/lcb 0 2026-03-09T17:30:32.649 INFO:tasks.workunit.client.1.vm09.stdout:1/663: chown c4 157792878 1 2026-03-09T17:30:32.649 INFO:tasks.workunit.client.1.vm09.stdout:1/664: chown d9/dc/dd/d40/d1d/fab 3595 1 2026-03-09T17:30:32.664 INFO:tasks.workunit.client.1.vm09.stdout:4/665: truncate d11/f6e 3725628 0 2026-03-09T17:30:32.665 INFO:tasks.workunit.client.1.vm09.stdout:9/681: unlink d5/lcb 0 2026-03-09T17:30:32.667 INFO:tasks.workunit.client.1.vm09.stdout:8/717: fdatasync d1/da/d23/d6c/d32/f56 0 2026-03-09T17:30:32.670 INFO:tasks.workunit.client.1.vm09.stdout:3/629: write d5/d16/d31/d37/d58/d64/f7c [1895318,80536] 0 2026-03-09T17:30:32.670 INFO:tasks.workunit.client.1.vm09.stdout:2/660: dwrite d13/f89 [0,4194304] 0 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:5/717: rename d0/dc/d21/de2/f82 to d0/d46/d4b/feb 0 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:8/718: chown d1/d14/d2a/l55 2723 1 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:9/682: dread d5/de/d29/fc0 [0,4194304] 0 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:7/784: write da/d11/d47/d89/dbe/fda [1468654,127540] 0 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:5/718: chown d0/d2/d76/d86 3 1 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:3/630: write d5/d9/d30/d65/f18 [3377555,23392] 0 2026-03-09T17:30:32.687 INFO:tasks.workunit.client.1.vm09.stdout:1/665: mknod d9/dc/dd/d40/d21/d35/ccc 0 2026-03-09T17:30:32.703 
INFO:tasks.workunit.client.1.vm09.stdout:6/678: dread d3/d21/d76/d5c/d61/d95/f7c [0,4194304] 0 2026-03-09T17:30:32.710 INFO:tasks.workunit.client.1.vm09.stdout:6/679: dwrite d3/d21/d25/fdf [0,4194304] 0 2026-03-09T17:30:32.710 INFO:tasks.workunit.client.1.vm09.stdout:4/666: read d11/d1e/f28 [1718530,111534] 0 2026-03-09T17:30:32.755 INFO:tasks.workunit.client.1.vm09.stdout:2/661: truncate fd 4768522 0 2026-03-09T17:30:32.784 INFO:tasks.workunit.client.1.vm09.stdout:5/719: chown d0/d52/l88 63563516 1 2026-03-09T17:30:32.820 INFO:tasks.workunit.client.1.vm09.stdout:1/666: dread d9/dc/dd/fe [0,4194304] 0 2026-03-09T17:30:32.831 INFO:tasks.workunit.client.1.vm09.stdout:6/680: creat d3/d21/d76/d5c/d61/d95/fe5 x:0 0 0 2026-03-09T17:30:32.832 INFO:tasks.workunit.client.1.vm09.stdout:6/681: truncate d3/d21/db1/fc8 1077925 0 2026-03-09T17:30:32.836 INFO:tasks.workunit.client.1.vm09.stdout:4/667: truncate d11/fb0 171030 0 2026-03-09T17:30:32.879 INFO:tasks.workunit.client.1.vm09.stdout:2/662: mknod d13/d15/d36/d72/d94/cd5 0 2026-03-09T17:30:32.884 INFO:tasks.workunit.client.1.vm09.stdout:8/719: link d1/da/f4b d1/d14/d2a/d49/fe2 0 2026-03-09T17:30:32.886 INFO:tasks.workunit.client.1.vm09.stdout:0/691: dwrite d6/d1d/f57 [0,4194304] 0 2026-03-09T17:30:32.888 INFO:tasks.workunit.client.1.vm09.stdout:5/720: dread - d0/dc/d21/d26/d5e/d68/d6d/f9e zero size 2026-03-09T17:30:32.889 INFO:tasks.workunit.client.1.vm09.stdout:3/631: mknod d5/d9c/cc0 0 2026-03-09T17:30:32.902 INFO:tasks.workunit.client.1.vm09.stdout:3/632: read d5/d9/d30/d65/f15 [2998644,111779] 0 2026-03-09T17:30:32.909 INFO:tasks.workunit.client.1.vm09.stdout:2/663: read d13/d15/d36/f59 [3548089,87349] 0 2026-03-09T17:30:32.917 INFO:tasks.workunit.client.1.vm09.stdout:8/720: rmdir d1/da/d23/d6c/ddd/dcb 39 2026-03-09T17:30:32.918 INFO:tasks.workunit.client.1.vm09.stdout:9/683: creat d5/de/d4e/fe6 x:0 0 0 2026-03-09T17:30:32.918 INFO:tasks.workunit.client.1.vm09.stdout:8/721: fdatasync d1/da/d23/d6c/f1c 0 
2026-03-09T17:30:32.923 INFO:tasks.workunit.client.1.vm09.stdout:0/692: creat d6/d1d/d46/fe8 x:0 0 0 2026-03-09T17:30:32.930 INFO:tasks.workunit.client.1.vm09.stdout:5/721: fdatasync d0/d2/d76/d86/f6b 0 2026-03-09T17:30:32.931 INFO:tasks.workunit.client.1.vm09.stdout:5/722: write d0/dc/d21/f7a [454726,71278] 0 2026-03-09T17:30:32.934 INFO:tasks.workunit.client.1.vm09.stdout:4/668: mknod d11/d1e/d45/d60/cd3 0 2026-03-09T17:30:32.937 INFO:tasks.workunit.client.1.vm09.stdout:3/633: truncate d5/f53 1082886 0 2026-03-09T17:30:32.963 INFO:tasks.workunit.client.1.vm09.stdout:1/667: creat d9/d9e/dc0/d37/da4/fcd x:0 0 0 2026-03-09T17:30:32.963 INFO:tasks.workunit.client.1.vm09.stdout:1/668: chown d9/d5a/fbf 2 1 2026-03-09T17:30:32.964 INFO:tasks.workunit.client.1.vm09.stdout:7/785: write da/d11/d47/d5b/d6c/fb3 [50544,5790] 0 2026-03-09T17:30:32.973 INFO:tasks.workunit.client.1.vm09.stdout:5/723: mknod d0/d9/d16/cec 0 2026-03-09T17:30:32.974 INFO:tasks.workunit.client.1.vm09.stdout:4/669: rmdir d11/d1e 39 2026-03-09T17:30:32.979 INFO:tasks.workunit.client.1.vm09.stdout:0/693: getdents d6/d1d/d24/d32/d59/d9c/dac/dd1 0 2026-03-09T17:30:32.984 INFO:tasks.workunit.client.1.vm09.stdout:6/682: dwrite d3/d21/f8c [0,4194304] 0 2026-03-09T17:30:32.988 INFO:tasks.workunit.client.1.vm09.stdout:6/683: write d3/d21/db1/fd1 [497330,34093] 0 2026-03-09T17:30:32.988 INFO:tasks.workunit.client.1.vm09.stdout:8/722: dread d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:32.989 INFO:tasks.workunit.client.1.vm09.stdout:8/723: readlink d1/d14/d2a/d42/d5d/la9 0 2026-03-09T17:30:33.001 INFO:tasks.workunit.client.1.vm09.stdout:7/786: creat da/d11/d64/da7/f10c x:0 0 0 2026-03-09T17:30:33.017 INFO:tasks.workunit.client.1.vm09.stdout:2/664: write d13/d15/d36/d72/f9b [738811,73509] 0 2026-03-09T17:30:33.018 INFO:tasks.workunit.client.1.vm09.stdout:1/669: dread d9/d9e/dc0/d37/f41 [0,4194304] 0 2026-03-09T17:30:33.024 INFO:tasks.workunit.client.1.vm09.stdout:9/684: unlink d5/de/d29/d33/f3b 0 
2026-03-09T17:30:33.029 INFO:tasks.workunit.client.1.vm09.stdout:5/724: write d0/d2/d76/d86/f6b [770441,40738] 0 2026-03-09T17:30:33.035 INFO:tasks.workunit.client.1.vm09.stdout:2/665: sync 2026-03-09T17:30:33.053 INFO:tasks.workunit.client.1.vm09.stdout:6/684: truncate d3/d7/f77 1381307 0 2026-03-09T17:30:33.054 INFO:tasks.workunit.client.1.vm09.stdout:8/724: read - d1/d14/d2a/d49/fa5 zero size 2026-03-09T17:30:33.055 INFO:tasks.workunit.client.1.vm09.stdout:8/725: write d1/da/dd/d79/fb9 [1583659,83186] 0 2026-03-09T17:30:33.056 INFO:tasks.workunit.client.1.vm09.stdout:7/787: symlink da/d11/d47/d5b/d6c/d9e/dc6/ddb/l10d 0 2026-03-09T17:30:33.060 INFO:tasks.workunit.client.1.vm09.stdout:4/670: fdatasync d11/d1e/d45/d60/d71/db7/fa5 0 2026-03-09T17:30:33.070 INFO:tasks.workunit.client.1.vm09.stdout:2/666: chown d13/d15/d60/d90/f9f 220668 1 2026-03-09T17:30:33.072 INFO:tasks.workunit.client.1.vm09.stdout:4/671: dread d11/d1e/d29/f8a [0,4194304] 0 2026-03-09T17:30:33.073 INFO:tasks.workunit.client.1.vm09.stdout:3/634: truncate d5/d16/d31/d37/d58/d64/f7c 782039 0 2026-03-09T17:30:33.075 INFO:tasks.workunit.client.1.vm09.stdout:6/685: symlink d3/d48/le6 0 2026-03-09T17:30:33.080 INFO:tasks.workunit.client.1.vm09.stdout:0/694: dwrite d6/d1d/d24/d32/d59/d81/f82 [0,4194304] 0 2026-03-09T17:30:33.089 INFO:tasks.workunit.client.1.vm09.stdout:0/695: dread d6/d1d/f57 [0,4194304] 0 2026-03-09T17:30:33.090 INFO:tasks.workunit.client.1.vm09.stdout:0/696: write d6/d1d/d24/d32/d59/d81/f82 [2910597,22256] 0 2026-03-09T17:30:33.109 INFO:tasks.workunit.client.1.vm09.stdout:5/725: creat d0/d2/d76/d87/d95/d9b/dc0/dde/fed x:0 0 0 2026-03-09T17:30:33.130 INFO:tasks.workunit.client.1.vm09.stdout:4/672: creat d11/d1e/d29/d36/d57/fd4 x:0 0 0 2026-03-09T17:30:33.131 INFO:tasks.workunit.client.1.vm09.stdout:3/635: creat d5/d16/d31/d3d/db3/fc1 x:0 0 0 2026-03-09T17:30:33.133 INFO:tasks.workunit.client.1.vm09.stdout:4/673: sync 2026-03-09T17:30:33.141 
INFO:tasks.workunit.client.1.vm09.stdout:7/788: symlink da/d11/d2d/l10e 0 2026-03-09T17:30:33.141 INFO:tasks.workunit.client.1.vm09.stdout:8/726: creat d1/dbd/fe3 x:0 0 0 2026-03-09T17:30:33.148 INFO:tasks.workunit.client.1.vm09.stdout:8/727: dwrite d1/d14/d2a/d42/d5d/d8a/f99 [0,4194304] 0 2026-03-09T17:30:33.165 INFO:tasks.workunit.client.1.vm09.stdout:0/697: fdatasync d6/d1d/f70 0 2026-03-09T17:30:33.173 INFO:tasks.workunit.client.1.vm09.stdout:9/685: rename d5/de/d4e/d6e to d5/de/d4e/dca/de7 0 2026-03-09T17:30:33.177 INFO:tasks.workunit.client.1.vm09.stdout:9/686: read d5/de/f20 [4583758,25358] 0 2026-03-09T17:30:33.181 INFO:tasks.workunit.client.1.vm09.stdout:2/667: mkdir d13/d15/d36/d72/d94/da7/db0/dd6 0 2026-03-09T17:30:33.189 INFO:tasks.workunit.client.1.vm09.stdout:4/674: mkdir d11/d1e/d31/dd5 0 2026-03-09T17:30:33.196 INFO:tasks.workunit.client.1.vm09.stdout:4/675: dread d11/d1e/f28 [0,4194304] 0 2026-03-09T17:30:33.198 INFO:tasks.workunit.client.1.vm09.stdout:4/676: chown d11/d1e/d45/d60/d71/db7/d89/d8b/f5f 16 1 2026-03-09T17:30:33.205 INFO:tasks.workunit.client.1.vm09.stdout:4/677: dwrite d11/d1e/d29/d36/f7f [0,4194304] 0 2026-03-09T17:30:33.216 INFO:tasks.workunit.client.1.vm09.stdout:1/670: link d9/d9e/dc0/d37/f41 d9/d9e/dc0/d37/fce 0 2026-03-09T17:30:33.221 INFO:tasks.workunit.client.1.vm09.stdout:9/687: unlink d5/de/d4e/dca/d84/d97/c9c 0 2026-03-09T17:30:33.225 INFO:tasks.workunit.client.1.vm09.stdout:2/668: mkdir d13/d15/d34/d45/d84/dd7 0 2026-03-09T17:30:33.228 INFO:tasks.workunit.client.1.vm09.stdout:3/636: mknod d5/d16/d31/cc2 0 2026-03-09T17:30:33.228 INFO:tasks.workunit.client.1.vm09.stdout:7/789: mknod da/d11/d47/d89/dbe/d106/c10f 0 2026-03-09T17:30:33.231 INFO:tasks.workunit.client.1.vm09.stdout:0/698: creat d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 x:0 0 0 2026-03-09T17:30:33.231 INFO:tasks.workunit.client.1.vm09.stdout:7/790: chown da/d11/d47/d5b/d6c/d9e/d4e/d5f/cbb 37 1 2026-03-09T17:30:33.231 INFO:tasks.workunit.client.1.vm09.stdout:0/699: write 
d6/d93/fb7 [3097918,38542] 0 2026-03-09T17:30:33.233 INFO:tasks.workunit.client.1.vm09.stdout:1/671: mkdir d9/d9e/dc0/d91/d99/dcf 0 2026-03-09T17:30:33.235 INFO:tasks.workunit.client.1.vm09.stdout:1/672: fdatasync d9/dc/dd/d40/d21/d35/d88/f9a 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:5/726: creat d0/d9/d74/d75/fee x:0 0 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:1/673: write d9/dc/dd/d9f/d9c/f9b [283901,67899] 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:5/727: fdatasync d0/d9/d16/fe1 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:9/688: fsync d5/f14 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:0/700: mknod d6/d1d/d24/d5e/dc2/cea 0 2026-03-09T17:30:33.254 INFO:tasks.workunit.client.1.vm09.stdout:1/674: creat d9/d38/d61/fd0 x:0 0 0 2026-03-09T17:30:33.257 INFO:tasks.workunit.client.1.vm09.stdout:2/669: symlink d13/d15/d36/d72/d94/da7/db0/dd6/ld8 0 2026-03-09T17:30:33.259 INFO:tasks.workunit.client.1.vm09.stdout:3/637: symlink d5/d9/lc3 0 2026-03-09T17:30:33.262 INFO:tasks.workunit.client.1.vm09.stdout:7/791: dread da/d11/d2d/f69 [0,4194304] 0 2026-03-09T17:30:33.263 INFO:tasks.workunit.client.1.vm09.stdout:7/792: readlink da/d11/d3e/l54 0 2026-03-09T17:30:33.265 INFO:tasks.workunit.client.1.vm09.stdout:6/686: rename d3/d7/f4c to d3/fe7 0 2026-03-09T17:30:33.266 INFO:tasks.workunit.client.1.vm09.stdout:1/675: creat d9/dc/d63/fd1 x:0 0 0 2026-03-09T17:30:33.289 INFO:tasks.workunit.client.1.vm09.stdout:1/676: dwrite f3 [4194304,4194304] 0 2026-03-09T17:30:33.290 INFO:tasks.workunit.client.1.vm09.stdout:3/638: mkdir d5/d9/d30/dc4 0 2026-03-09T17:30:33.290 INFO:tasks.workunit.client.1.vm09.stdout:4/678: getdents d11/d1e/d29/db5 0 2026-03-09T17:30:33.290 INFO:tasks.workunit.client.1.vm09.stdout:0/701: getdents d6/d64/dbd/dd2 0 2026-03-09T17:30:33.290 INFO:tasks.workunit.client.1.vm09.stdout:3/639: chown d5/d16/d31/d3d/l7b 58 1 2026-03-09T17:30:33.290 
INFO:tasks.workunit.client.1.vm09.stdout:6/687: symlink d3/d7/d59/d73/db0/le8 0 2026-03-09T17:30:33.290 INFO:tasks.workunit.client.1.vm09.stdout:1/677: creat d9/d9e/dc9/fd2 x:0 0 0 2026-03-09T17:30:33.291 INFO:tasks.workunit.client.1.vm09.stdout:9/689: link d5/de/d4e/l5b d5/de/d4e/dca/le8 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:3/640: unlink d5/d16/d31/d37/d58/d64/f70 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:1/678: unlink d9/f34 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:1/679: creat d9/dc/d63/dba/fd3 x:0 0 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:9/690: symlink d5/d91/le9 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:9/691: write d5/de/f2d [324692,61028] 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:9/692: fsync d5/de/d29/fe1 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:9/693: dwrite d5/f11 [4194304,4194304] 0 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:9/694: chown d5/de/d4e/dca/de7/d93/fb0 309848 1 2026-03-09T17:30:33.313 INFO:tasks.workunit.client.1.vm09.stdout:3/641: read d5/d9/d30/d65/f43 [895216,112077] 0 2026-03-09T17:30:33.314 INFO:tasks.workunit.client.1.vm09.stdout:3/642: chown d5/d16 2845294 1 2026-03-09T17:30:33.325 INFO:tasks.workunit.client.1.vm09.stdout:3/643: mkdir d5/d16/dc5 0 2026-03-09T17:30:33.329 INFO:tasks.workunit.client.1.vm09.stdout:3/644: dwrite d5/d9/d30/d65/f5e [0,4194304] 0 2026-03-09T17:30:33.341 INFO:tasks.workunit.client.1.vm09.stdout:3/645: creat d5/d9/d30/fc6 x:0 0 0 2026-03-09T17:30:33.445 INFO:tasks.workunit.client.1.vm09.stdout:0/702: sync 2026-03-09T17:30:33.445 INFO:tasks.workunit.client.1.vm09.stdout:0/703: chown d6/d1d/d24/d32/d59/d9c/fa2 97738146 1 2026-03-09T17:30:33.447 INFO:tasks.workunit.client.1.vm09.stdout:0/704: symlink d6/d1d/d24/d5e/d6c/leb 0 2026-03-09T17:30:33.458 INFO:tasks.workunit.client.1.vm09.stdout:8/728: write 
d1/da/dd/d47/d4c/f67 [588507,97656] 0 2026-03-09T17:30:33.470 INFO:tasks.workunit.client.1.vm09.stdout:8/729: unlink d1/da/d23/d6c/ddd/fd0 0 2026-03-09T17:30:33.470 INFO:tasks.workunit.client.1.vm09.stdout:8/730: getdents d1/da/d23/d6c/d32/dc8 0 2026-03-09T17:30:33.470 INFO:tasks.workunit.client.1.vm09.stdout:8/731: symlink d1/da/dd/d47/d4c/le4 0 2026-03-09T17:30:33.470 INFO:tasks.workunit.client.1.vm09.stdout:8/732: creat d1/d14/d2a/d42/d43/d44/fe5 x:0 0 0 2026-03-09T17:30:33.473 INFO:tasks.workunit.client.1.vm09.stdout:5/728: dwrite d0/d2/d76/d87/da4/fa6 [0,4194304] 0 2026-03-09T17:30:33.475 INFO:tasks.workunit.client.1.vm09.stdout:8/733: link d1/da/d23/dc2/l9b d1/d14/d2a/d49/le6 0 2026-03-09T17:30:33.475 INFO:tasks.workunit.client.1.vm09.stdout:6/688: sync 2026-03-09T17:30:33.479 INFO:tasks.workunit.client.1.vm09.stdout:7/793: write da/d11/f6a [114962,79018] 0 2026-03-09T17:30:33.479 INFO:tasks.workunit.client.1.vm09.stdout:2/670: write d13/fa3 [1809692,38117] 0 2026-03-09T17:30:33.481 INFO:tasks.workunit.client.1.vm09.stdout:6/689: chown d3/d21/d76/d5c/d61/d95/l47 8 1 2026-03-09T17:30:33.481 INFO:tasks.workunit.client.1.vm09.stdout:2/671: chown d13/d15/d60/fcd 112 1 2026-03-09T17:30:33.481 INFO:tasks.workunit.client.1.vm09.stdout:5/729: dread - d0/dc/d21/d33/f65 zero size 2026-03-09T17:30:33.484 INFO:tasks.workunit.client.1.vm09.stdout:6/690: symlink d3/d7/d59/d73/db0/le9 0 2026-03-09T17:30:33.484 INFO:tasks.workunit.client.1.vm09.stdout:2/672: read d13/d15/d34/f3a [2916738,17203] 0 2026-03-09T17:30:33.488 INFO:tasks.workunit.client.1.vm09.stdout:5/730: stat d0/dc/d21/d26/d5e/d68 0 2026-03-09T17:30:33.496 INFO:tasks.workunit.client.1.vm09.stdout:4/679: dwrite d11/d1e/f28 [0,4194304] 0 2026-03-09T17:30:33.496 INFO:tasks.workunit.client.1.vm09.stdout:7/794: dread da/f36 [0,4194304] 0 2026-03-09T17:30:33.496 INFO:tasks.workunit.client.1.vm09.stdout:4/680: chown d11/d1e/d45/c85 467 1 2026-03-09T17:30:33.496 INFO:tasks.workunit.client.1.vm09.stdout:6/691: write 
d3/d21/d25/fdb [459603,123586] 0 2026-03-09T17:30:33.496 INFO:tasks.workunit.client.1.vm09.stdout:1/680: dwrite f2 [0,4194304] 0 2026-03-09T17:30:33.502 INFO:tasks.workunit.client.1.vm09.stdout:4/681: creat d11/d1e/d29/d36/d57/d78/fd6 x:0 0 0 2026-03-09T17:30:33.505 INFO:tasks.workunit.client.1.vm09.stdout:7/795: link da/d11/d77/ffb da/d11/d47/d5b/d6c/d9e/d4e/d5f/f110 0 2026-03-09T17:30:33.508 INFO:tasks.workunit.client.1.vm09.stdout:1/681: rename d9/d9e/dc0/d37/d3f/l83 to d9/d9e/dc0/d91/ld4 0 2026-03-09T17:30:33.508 INFO:tasks.workunit.client.1.vm09.stdout:5/731: dread d0/dc/d21/d33/f35 [0,4194304] 0 2026-03-09T17:30:33.509 INFO:tasks.workunit.client.1.vm09.stdout:1/682: fdatasync d9/dc/d63/fd1 0 2026-03-09T17:30:33.517 INFO:tasks.workunit.client.1.vm09.stdout:2/673: dread d13/d15/f2a [4194304,4194304] 0 2026-03-09T17:30:33.521 INFO:tasks.workunit.client.1.vm09.stdout:8/734: sync 2026-03-09T17:30:33.532 INFO:tasks.workunit.client.1.vm09.stdout:6/692: dread d3/d7/f23 [4194304,4194304] 0 2026-03-09T17:30:33.533 INFO:tasks.workunit.client.1.vm09.stdout:5/732: mknod d0/d2/d76/d87/d95/d9b/dc0/cef 0 2026-03-09T17:30:33.542 INFO:tasks.workunit.client.1.vm09.stdout:7/796: link da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 da/d11/d3e/dd8/f111 0 2026-03-09T17:30:33.542 INFO:tasks.workunit.client.1.vm09.stdout:2/674: dread - d13/d15/d21/f7f zero size 2026-03-09T17:30:33.554 INFO:tasks.workunit.client.1.vm09.stdout:8/735: creat d1/fe7 x:0 0 0 2026-03-09T17:30:33.554 INFO:tasks.workunit.client.1.vm09.stdout:2/675: symlink d13/d15/d60/d85/ld9 0 2026-03-09T17:30:33.555 INFO:tasks.workunit.client.1.vm09.stdout:7/797: write da/d11/d47/d5b/d6c/f7b [3317324,95038] 0 2026-03-09T17:30:33.556 INFO:tasks.workunit.client.1.vm09.stdout:3/646: write d5/d16/d46/f47 [1641966,38152] 0 2026-03-09T17:30:33.566 INFO:tasks.workunit.client.1.vm09.stdout:0/705: dwrite d6/d1d/d24/d32/d59/d9c/fce [0,4194304] 0 2026-03-09T17:30:33.566 INFO:tasks.workunit.client.1.vm09.stdout:9/695: dwrite d5/d21/f9d [0,4194304] 
0 2026-03-09T17:30:33.567 INFO:tasks.workunit.client.1.vm09.stdout:0/706: chown d6/c15 6040260 1 2026-03-09T17:30:33.592 INFO:tasks.workunit.client.1.vm09.stdout:4/682: getdents d11/d1e/d29/d36/d57/d78 0 2026-03-09T17:30:33.615 INFO:tasks.workunit.client.1.vm09.stdout:1/683: dwrite d9/d9e/dc0/f50 [0,4194304] 0 2026-03-09T17:30:33.648 INFO:tasks.workunit.client.1.vm09.stdout:3/647: mknod d5/d16/d31/d37/dae/db4/cc7 0 2026-03-09T17:30:33.650 INFO:tasks.workunit.client.1.vm09.stdout:3/648: dread d5/d9/d30/d65/f5e [0,4194304] 0 2026-03-09T17:30:33.661 INFO:tasks.workunit.client.1.vm09.stdout:9/696: mknod d5/d91/d99/dc9/dde/cea 0 2026-03-09T17:30:33.669 INFO:tasks.workunit.client.1.vm09.stdout:1/684: creat d9/d5a/fd5 x:0 0 0 2026-03-09T17:30:33.674 INFO:tasks.workunit.client.1.vm09.stdout:6/693: rename d3/d7/fe to d3/d21/d25/fea 0 2026-03-09T17:30:33.674 INFO:tasks.workunit.client.1.vm09.stdout:4/683: dread d11/d1e/d45/d60/d71/db7/d89/f94 [0,4194304] 0 2026-03-09T17:30:33.675 INFO:tasks.workunit.client.1.vm09.stdout:8/736: chown d1/da/d23/d6c/ddd 3166590 1 2026-03-09T17:30:33.677 INFO:tasks.workunit.client.1.vm09.stdout:5/733: creat d0/ff0 x:0 0 0 2026-03-09T17:30:33.684 INFO:tasks.workunit.client.1.vm09.stdout:7/798: creat da/d11/d3e/dd8/f112 x:0 0 0 2026-03-09T17:30:33.685 INFO:tasks.workunit.client.1.vm09.stdout:2/676: write d13/d15/d21/f7f [707821,48876] 0 2026-03-09T17:30:33.686 INFO:tasks.workunit.client.1.vm09.stdout:0/707: truncate d6/d1d/d24/d32/d59/d81/fc1 715944 0 2026-03-09T17:30:33.689 INFO:tasks.workunit.client.1.vm09.stdout:3/649: dread d5/d9/d30/d65/d59/d84/f86 [0,4194304] 0 2026-03-09T17:30:33.691 INFO:tasks.workunit.client.1.vm09.stdout:3/650: write d5/d9/d30/d65/f18 [4490320,130275] 0 2026-03-09T17:30:33.713 INFO:tasks.workunit.client.1.vm09.stdout:9/697: rename d5/de/d4e/fe6 to d5/de/d4e/dca/d84/feb 0 2026-03-09T17:30:33.713 INFO:tasks.workunit.client.1.vm09.stdout:4/684: mkdir d11/d1e/d29/d36/dd7 0 2026-03-09T17:30:33.714 
INFO:tasks.workunit.client.1.vm09.stdout:7/799: stat da/d11/d47/d5b/d6c/f73 0 2026-03-09T17:30:33.714 INFO:tasks.workunit.client.1.vm09.stdout:8/737: stat d1/da/d23/d6c/ddd/dcb/ca0 0 2026-03-09T17:30:33.717 INFO:tasks.workunit.client.1.vm09.stdout:5/734: chown d0/dc/d21/d33/f69 347 1 2026-03-09T17:30:33.718 INFO:tasks.workunit.client.1.vm09.stdout:4/685: read d11/d1e/f28 [25249,67192] 0 2026-03-09T17:30:33.719 INFO:tasks.workunit.client.1.vm09.stdout:1/685: truncate d9/d9e/dc0/d37/d3f/f68 1521185 0 2026-03-09T17:30:33.719 INFO:tasks.workunit.client.1.vm09.stdout:9/698: fdatasync d5/de/d4e/dca/de7/d93/fb0 0 2026-03-09T17:30:33.745 INFO:tasks.workunit.client.1.vm09.stdout:2/677: rename d13/d15/d60 to d13/d15/d36/d72/d94/dda 0 2026-03-09T17:30:33.746 INFO:tasks.workunit.client.1.vm09.stdout:7/800: unlink da/fb 0 2026-03-09T17:30:33.749 INFO:tasks.workunit.client.1.vm09.stdout:8/738: symlink d1/da/dd/d77/le8 0 2026-03-09T17:30:33.763 INFO:tasks.workunit.client.1.vm09.stdout:4/686: chown d11/d1e/c21 133 1 2026-03-09T17:30:33.764 INFO:tasks.workunit.client.1.vm09.stdout:4/687: write d11/d1e/d29/db5/fca [594076,104610] 0 2026-03-09T17:30:33.772 INFO:tasks.workunit.client.1.vm09.stdout:9/699: dread - d5/de/d88/f8f zero size 2026-03-09T17:30:33.773 INFO:tasks.workunit.client.1.vm09.stdout:9/700: dread - d5/de/d29/da7/fdc zero size 2026-03-09T17:30:33.778 INFO:tasks.workunit.client.1.vm09.stdout:7/801: mknod da/d11/d47/d5b/d6c/d9e/c113 0 2026-03-09T17:30:33.785 INFO:tasks.workunit.client.1.vm09.stdout:2/678: rmdir d13/d15/d36/d72/d94 39 2026-03-09T17:30:33.785 INFO:tasks.workunit.client.1.vm09.stdout:0/708: creat d6/d1d/d24/d32/fec x:0 0 0 2026-03-09T17:30:33.788 INFO:tasks.workunit.client.1.vm09.stdout:3/651: write d5/d16/d31/d3d/fe [7357549,54542] 0 2026-03-09T17:30:33.789 INFO:tasks.workunit.client.1.vm09.stdout:8/739: truncate d1/da/d23/fb3 1001671 0 2026-03-09T17:30:33.791 INFO:tasks.workunit.client.1.vm09.stdout:6/694: getdents d3/d7/d99 0 2026-03-09T17:30:33.791 
INFO:tasks.workunit.client.1.vm09.stdout:6/695: chown d3/d21/l2d 0 1 2026-03-09T17:30:33.794 INFO:tasks.workunit.client.1.vm09.stdout:2/679: sync 2026-03-09T17:30:33.796 INFO:tasks.workunit.client.1.vm09.stdout:4/688: mkdir d11/d1e/d45/d60/d71/db7/d89/d8b/dd8 0 2026-03-09T17:30:33.802 INFO:tasks.workunit.client.1.vm09.stdout:9/701: creat d5/d91/d99/dc9/dde/fec x:0 0 0 2026-03-09T17:30:33.804 INFO:tasks.workunit.client.1.vm09.stdout:0/709: mkdir d6/d1d/d24/d5e/d6c/ded 0 2026-03-09T17:30:33.806 INFO:tasks.workunit.client.1.vm09.stdout:6/696: mknod d3/d21/d25/d26/d6b/ceb 0 2026-03-09T17:30:33.808 INFO:tasks.workunit.client.1.vm09.stdout:7/802: dread da/d11/d77/f79 [0,4194304] 0 2026-03-09T17:30:33.816 INFO:tasks.workunit.client.1.vm09.stdout:4/689: dread d11/d1e/d29/d36/d57/f79 [0,4194304] 0 2026-03-09T17:30:33.816 INFO:tasks.workunit.client.1.vm09.stdout:1/686: creat d9/dc/dd/d40/d21/d6f/fd6 x:0 0 0 2026-03-09T17:30:33.817 INFO:tasks.workunit.client.1.vm09.stdout:1/687: read d9/d9e/dc0/d37/fce [540345,96097] 0 2026-03-09T17:30:33.817 INFO:tasks.workunit.client.1.vm09.stdout:1/688: readlink l7 0 2026-03-09T17:30:33.817 INFO:tasks.workunit.client.1.vm09.stdout:9/702: mknod d5/d7e/ced 0 2026-03-09T17:30:33.818 INFO:tasks.workunit.client.1.vm09.stdout:1/689: stat d9/d9e/dc0/d91/d99 0 2026-03-09T17:30:33.819 INFO:tasks.workunit.client.1.vm09.stdout:9/703: sync 2026-03-09T17:30:33.821 INFO:tasks.workunit.client.1.vm09.stdout:1/690: dwrite d9/dc/d63/dba/fb4 [0,4194304] 0 2026-03-09T17:30:33.831 INFO:tasks.workunit.client.1.vm09.stdout:3/652: mknod d5/cc8 0 2026-03-09T17:30:33.833 INFO:tasks.workunit.client.1.vm09.stdout:1/691: stat d9/dc/dd/l3c 0 2026-03-09T17:30:33.835 INFO:tasks.workunit.client.1.vm09.stdout:1/692: write d9/dc/dd/d40/d21/d35/d88/f9a [471554,116354] 0 2026-03-09T17:30:33.846 INFO:tasks.workunit.client.1.vm09.stdout:3/653: sync 2026-03-09T17:30:33.850 INFO:tasks.workunit.client.1.vm09.stdout:5/735: getdents d0/d2/d76/d87/da4/dbf 0 2026-03-09T17:30:33.851 
INFO:tasks.workunit.client.1.vm09.stdout:9/704: creat d5/de/d4e/dca/d84/fee x:0 0 0 2026-03-09T17:30:33.854 INFO:tasks.workunit.client.1.vm09.stdout:0/710: symlink d6/d1d/d24/d5e/lee 0 2026-03-09T17:30:33.855 INFO:tasks.workunit.client.1.vm09.stdout:8/740: dwrite d1/da/d3a/fa3 [0,4194304] 0 2026-03-09T17:30:33.855 INFO:tasks.workunit.client.1.vm09.stdout:0/711: write d6/f63 [1909610,100842] 0 2026-03-09T17:30:33.859 INFO:tasks.workunit.client.1.vm09.stdout:0/712: readlink d6/le1 0 2026-03-09T17:30:33.859 INFO:tasks.workunit.client.1.vm09.stdout:0/713: truncate d6/f27 787812 0 2026-03-09T17:30:33.864 INFO:tasks.workunit.client.1.vm09.stdout:1/693: creat d9/dc/d63/dba/fd7 x:0 0 0 2026-03-09T17:30:33.869 INFO:tasks.workunit.client.1.vm09.stdout:2/680: rename d13/d15/d21/f31 to d13/d15/fdb 0 2026-03-09T17:30:33.875 INFO:tasks.workunit.client.1.vm09.stdout:6/697: dwrite d3/d7/d59/d73/fa3 [0,4194304] 0 2026-03-09T17:30:33.879 INFO:tasks.workunit.client.1.vm09.stdout:6/698: chown d3/d21/d76/d5c/d61/d95 23 1 2026-03-09T17:30:33.882 INFO:tasks.workunit.client.1.vm09.stdout:1/694: dread d9/dc/dd/d40/d21/d35/d88/f9a [0,4194304] 0 2026-03-09T17:30:33.882 INFO:tasks.workunit.client.1.vm09.stdout:4/690: symlink d11/d1e/d45/d60/d71/db7/d89/ld9 0 2026-03-09T17:30:33.889 INFO:tasks.workunit.client.1.vm09.stdout:5/736: creat d0/d2/ff1 x:0 0 0 2026-03-09T17:30:33.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: mgrmap e28: vm06.pbgzei(active, since 1.57284s) 2026-03-09T17:30:33.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: pgmap v3: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:33.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: from='mgr.? 
192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:33.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:33.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:33.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:33.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:33.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:33 vm06.local ceph-mon[57307]: pgmap v4: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:33.895 INFO:tasks.workunit.client.1.vm09.stdout:7/803: creat da/d11/d47/d5b/f114 x:0 0 0 2026-03-09T17:30:33.895 INFO:tasks.workunit.client.1.vm09.stdout:2/681: symlink d13/d15/d3b/ldc 0 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: mgrmap e28: vm06.pbgzei(active, since 1.57284s) 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: pgmap v3: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: from='mgr.? 
192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/3529164605' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:33.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:33 vm09.local ceph-mon[62061]: pgmap v4: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:33.899 INFO:tasks.workunit.client.1.vm09.stdout:0/714: rename d6/d1d/f91 to d6/d1d/d24/d5e/d6c/fef 0 2026-03-09T17:30:33.900 INFO:tasks.workunit.client.1.vm09.stdout:7/804: truncate da/d11/d47/d5b/f114 343558 0 2026-03-09T17:30:33.900 INFO:tasks.workunit.client.1.vm09.stdout:5/737: dread - d0/d52/fe3 zero size 2026-03-09T17:30:33.901 INFO:tasks.workunit.client.1.vm09.stdout:1/695: dread d9/dc/fa9 [0,4194304] 0 2026-03-09T17:30:33.902 INFO:tasks.workunit.client.1.vm09.stdout:9/705: creat d5/d2e/fef x:0 0 0 2026-03-09T17:30:33.907 INFO:tasks.workunit.client.1.vm09.stdout:6/699: rename d3/d7/f77 to d3/d21/d25/d96/fec 0 2026-03-09T17:30:33.907 INFO:tasks.workunit.client.1.vm09.stdout:7/805: creat da/d11/d77/d101/f115 x:0 0 0 
2026-03-09T17:30:33.913 INFO:tasks.workunit.client.1.vm09.stdout:7/806: write da/d11/d47/d5b/d78/fdd [1334180,102192] 0 2026-03-09T17:30:33.913 INFO:tasks.workunit.client.1.vm09.stdout:3/654: truncate d5/d16/d31/d37/fa5 3669864 0 2026-03-09T17:30:33.917 INFO:tasks.workunit.client.1.vm09.stdout:4/691: rmdir d11/d1e/d31/dd5 0 2026-03-09T17:30:33.918 INFO:tasks.workunit.client.1.vm09.stdout:1/696: mkdir d9/dc/dd/d40/d21/d35/db9/dd8 0 2026-03-09T17:30:33.919 INFO:tasks.workunit.client.1.vm09.stdout:5/738: write d0/d2/d76/d87/d95/f9d [3201063,107336] 0 2026-03-09T17:30:33.919 INFO:tasks.workunit.client.1.vm09.stdout:1/697: dread - d9/d38/fbd zero size 2026-03-09T17:30:33.920 INFO:tasks.workunit.client.1.vm09.stdout:1/698: chown d9/dc 65 1 2026-03-09T17:30:33.920 INFO:tasks.workunit.client.1.vm09.stdout:8/741: rename d1/d14/f3d to d1/d14/d2a/fe9 0 2026-03-09T17:30:33.921 INFO:tasks.workunit.client.1.vm09.stdout:6/700: truncate d3/d21/d76/d5c/f78 3003677 0 2026-03-09T17:30:33.921 INFO:tasks.workunit.client.1.vm09.stdout:1/699: truncate f2 4609976 0 2026-03-09T17:30:33.923 INFO:tasks.workunit.client.1.vm09.stdout:4/692: readlink d11/d1e/d29/l4f 0 2026-03-09T17:30:33.930 INFO:tasks.workunit.client.1.vm09.stdout:7/807: write da/d11/d47/d5b/d78/fab [1962132,95153] 0 2026-03-09T17:30:33.934 INFO:tasks.workunit.client.1.vm09.stdout:0/715: dread d6/d1d/d24/d5e/f8a [0,4194304] 0 2026-03-09T17:30:33.942 INFO:tasks.workunit.client.1.vm09.stdout:5/739: creat d0/d2/d76/d86/ff2 x:0 0 0 2026-03-09T17:30:33.942 INFO:tasks.workunit.client.1.vm09.stdout:6/701: chown d3/d21/d76/d5c/f78 1 1 2026-03-09T17:30:33.945 INFO:tasks.workunit.client.1.vm09.stdout:1/700: dwrite d9/dc/dd/d40/d21/fb6 [0,4194304] 0 2026-03-09T17:30:33.949 INFO:tasks.workunit.client.1.vm09.stdout:2/682: dread d13/d15/f2b [0,4194304] 0 2026-03-09T17:30:33.949 INFO:tasks.workunit.client.1.vm09.stdout:7/808: mknod da/d11/d47/d5b/d6c/c116 0 2026-03-09T17:30:33.951 INFO:tasks.workunit.client.1.vm09.stdout:0/716: dread d6/f63 
[0,4194304] 0 2026-03-09T17:30:33.957 INFO:tasks.workunit.client.1.vm09.stdout:9/706: dread d5/de/f3c [0,4194304] 0 2026-03-09T17:30:33.959 INFO:tasks.workunit.client.1.vm09.stdout:7/809: dwrite da/d11/d47/d5b/d78/f80 [4194304,4194304] 0 2026-03-09T17:30:33.975 INFO:tasks.workunit.client.1.vm09.stdout:6/702: read d3/d21/d76/d5c/d7e/dc5/d9a/fb4 [505644,130878] 0 2026-03-09T17:30:33.981 INFO:tasks.workunit.client.1.vm09.stdout:6/703: chown d3/d21/d25 648867 1 2026-03-09T17:30:33.981 INFO:tasks.workunit.client.1.vm09.stdout:8/742: rename d1/da/d23/d6c/d32/ld6 to d1/d14/d2a/d42/d5d/d8a/lea 0 2026-03-09T17:30:33.981 INFO:tasks.workunit.client.1.vm09.stdout:3/655: dwrite d5/d9/d30/d65/d59/d84/f6e [0,4194304] 0 2026-03-09T17:30:33.986 INFO:tasks.workunit.client.1.vm09.stdout:0/717: unlink d6/d1d/d46/c6a 0 2026-03-09T17:30:33.987 INFO:tasks.workunit.client.1.vm09.stdout:7/810: chown da/d11/d77/de5/dec/lfd 232 1 2026-03-09T17:30:33.991 INFO:tasks.workunit.client.1.vm09.stdout:4/693: truncate d11/d1e/d29/d36/f6a 693290 0 2026-03-09T17:30:33.994 INFO:tasks.workunit.client.1.vm09.stdout:6/704: creat d3/d7/d59/d5a/fed x:0 0 0 2026-03-09T17:30:33.994 INFO:tasks.workunit.client.1.vm09.stdout:9/707: dread d5/d21/f2b [0,4194304] 0 2026-03-09T17:30:34.000 INFO:tasks.workunit.client.1.vm09.stdout:5/740: link d0/d52/l67 d0/d2/d76/d86/lf3 0 2026-03-09T17:30:34.001 INFO:tasks.workunit.client.1.vm09.stdout:5/741: write d0/d52/fe3 [286288,58594] 0 2026-03-09T17:30:34.004 INFO:tasks.workunit.client.1.vm09.stdout:9/708: fdatasync d5/d21/f2b 0 2026-03-09T17:30:34.008 INFO:tasks.workunit.client.1.vm09.stdout:6/705: rmdir d3/d21/d25/d26/d6b 39 2026-03-09T17:30:34.008 INFO:tasks.workunit.client.1.vm09.stdout:2/683: unlink d13/d15/d3b/f9c 0 2026-03-09T17:30:34.010 INFO:tasks.workunit.client.1.vm09.stdout:4/694: mknod d11/dc8/cda 0 2026-03-09T17:30:34.010 INFO:tasks.workunit.client.1.vm09.stdout:3/656: rename d5/d16/d31/d37/d58/d64/f7c to d5/d9/da9/fc9 0 2026-03-09T17:30:34.012 
INFO:tasks.workunit.client.1.vm09.stdout:5/742: mkdir d0/d2/d76/d87/d95/d9b/dc0/df4 0 2026-03-09T17:30:34.015 INFO:tasks.workunit.client.1.vm09.stdout:6/706: fsync d3/d21/d76/d5c/d9f/fa9 0 2026-03-09T17:30:34.015 INFO:tasks.workunit.client.1.vm09.stdout:2/684: mknod d13/d15/d34/d45/d84/db5/cdd 0 2026-03-09T17:30:34.017 INFO:tasks.workunit.client.1.vm09.stdout:7/811: sync 2026-03-09T17:30:34.027 INFO:tasks.workunit.client.1.vm09.stdout:1/701: write d9/f59 [357840,94859] 0 2026-03-09T17:30:34.028 INFO:tasks.workunit.client.1.vm09.stdout:8/743: truncate d1/d14/d2a/d42/d43/f95 2246886 0 2026-03-09T17:30:34.032 INFO:tasks.workunit.client.1.vm09.stdout:1/702: read d9/dc/f47 [3681278,13937] 0 2026-03-09T17:30:34.034 INFO:tasks.workunit.client.1.vm09.stdout:1/703: truncate d9/d5a/fbf 861635 0 2026-03-09T17:30:34.034 INFO:tasks.workunit.client.1.vm09.stdout:5/743: symlink d0/dc/d21/d6f/lf5 0 2026-03-09T17:30:34.035 INFO:tasks.workunit.client.1.vm09.stdout:5/744: chown d0/d2/fcf 21510570 1 2026-03-09T17:30:34.036 INFO:tasks.workunit.client.1.vm09.stdout:9/709: dwrite d5/de/d4e/dca/de7/d93/faa [0,4194304] 0 2026-03-09T17:30:34.040 INFO:tasks.workunit.client.1.vm09.stdout:0/718: getdents d6/d64/d97/dc9 0 2026-03-09T17:30:34.040 INFO:tasks.workunit.client.1.vm09.stdout:9/710: chown d5/d21/f38 1916930 1 2026-03-09T17:30:34.042 INFO:tasks.workunit.client.1.vm09.stdout:4/695: dwrite d11/d1e/d29/f2e [0,4194304] 0 2026-03-09T17:30:34.047 INFO:tasks.workunit.client.1.vm09.stdout:6/707: creat d3/d21/d76/d5c/d7e/dc5/d98/fee x:0 0 0 2026-03-09T17:30:34.047 INFO:tasks.workunit.client.1.vm09.stdout:1/704: dwrite d9/dc/d63/fc8 [0,4194304] 0 2026-03-09T17:30:34.051 INFO:tasks.workunit.client.1.vm09.stdout:4/696: read d11/d1e/d45/d60/d71/f76 [2784015,110207] 0 2026-03-09T17:30:34.051 INFO:tasks.workunit.client.1.vm09.stdout:7/812: symlink da/d11/d3e/l117 0 2026-03-09T17:30:34.064 INFO:tasks.workunit.client.1.vm09.stdout:3/657: rename d5/d9/d30/f41 to d5/d16/d31/d37/d58/d64/fca 0 
2026-03-09T17:30:34.065 INFO:tasks.workunit.client.1.vm09.stdout:3/658: chown d5/d9/d90/fb9 121877 1 2026-03-09T17:30:34.069 INFO:tasks.workunit.client.1.vm09.stdout:9/711: write d5/d21/f2f [3921399,28851] 0 2026-03-09T17:30:34.075 INFO:tasks.workunit.client.1.vm09.stdout:6/708: readlink d3/d21/d25/d26/d6b/dbf/l3a 0 2026-03-09T17:30:34.075 INFO:tasks.workunit.client.1.vm09.stdout:1/705: mknod d9/dc/d63/cd9 0 2026-03-09T17:30:34.076 INFO:tasks.workunit.client.1.vm09.stdout:3/659: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 2026-03-09T17:30:34.076 INFO:tasks.workunit.client.1.vm09.stdout:8/744: mknod d1/da/d23/d6c/ddd/dcb/d97/dc5/ceb 0 2026-03-09T17:30:34.089 INFO:tasks.workunit.client.1.vm09.stdout:2/685: write d13/f8b [3232491,124380] 0 2026-03-09T17:30:34.089 INFO:tasks.workunit.client.1.vm09.stdout:2/686: read d13/d15/d34/f48 [680426,123970] 0 2026-03-09T17:30:34.092 INFO:tasks.workunit.client.1.vm09.stdout:0/719: mkdir d6/d1d/df0 0 2026-03-09T17:30:34.093 INFO:tasks.workunit.client.1.vm09.stdout:5/745: dread d0/d9/f77 [4194304,4194304] 0 2026-03-09T17:30:34.098 INFO:tasks.workunit.client.1.vm09.stdout:0/720: dwrite d6/d1d/d39/fdc [0,4194304] 0 2026-03-09T17:30:34.100 INFO:tasks.workunit.client.1.vm09.stdout:0/721: fdatasync d6/d1d/d24/fda 0 2026-03-09T17:30:34.104 INFO:tasks.workunit.client.1.vm09.stdout:1/706: creat d9/d38/d61/fda x:0 0 0 2026-03-09T17:30:34.112 INFO:tasks.workunit.client.1.vm09.stdout:1/707: write d9/d9e/dc0/d91/d99/fbc [827315,96993] 0 2026-03-09T17:30:34.112 INFO:tasks.workunit.client.1.vm09.stdout:9/712: rename d5/d7e/d81 to d5/de/d29/dd4/df0 0 2026-03-09T17:30:34.114 INFO:tasks.workunit.client.1.vm09.stdout:1/708: chown d9/d38/l5c 600648 1 2026-03-09T17:30:34.117 INFO:tasks.workunit.client.1.vm09.stdout:3/660: mknod d5/d16/d31/d3d/db3/ccb 0 2026-03-09T17:30:34.118 INFO:tasks.workunit.client.1.vm09.stdout:2/687: mkdir d13/d15/d34/d37/d6f/dde 0 2026-03-09T17:30:34.121 INFO:tasks.workunit.client.1.vm09.stdout:0/722: mknod d6/d1d/d24/d5e/db2/cf1 0 
2026-03-09T17:30:34.122 INFO:tasks.workunit.client.1.vm09.stdout:7/813: creat da/d11/d47/d5b/d6c/f118 x:0 0 0 2026-03-09T17:30:34.126 INFO:tasks.workunit.client.1.vm09.stdout:8/745: rename d1/da/dd/f9a to d1/d14/d2a/d42/d5d/d8a/fec 0 2026-03-09T17:30:34.128 INFO:tasks.workunit.client.1.vm09.stdout:4/697: creat d11/d1e/d45/d60/d71/db7/d89/d8b/dd8/fdb x:0 0 0 2026-03-09T17:30:34.136 INFO:tasks.workunit.client.1.vm09.stdout:6/709: link d3/d21/d25/f54 d3/d7/d59/d73/fef 0 2026-03-09T17:30:34.142 INFO:tasks.workunit.client.1.vm09.stdout:0/723: creat d6/d1d/d24/d32/d59/d9c/dac/dcc/ff2 x:0 0 0 2026-03-09T17:30:34.142 INFO:tasks.workunit.client.1.vm09.stdout:7/814: fsync da/d11/d47/d5b/d6c/d9e/d4e/f7d 0 2026-03-09T17:30:34.144 INFO:tasks.workunit.client.1.vm09.stdout:7/815: readlink da/d11/d77/de5/dec/lfd 0 2026-03-09T17:30:34.146 INFO:tasks.workunit.client.1.vm09.stdout:5/746: rename d0/d2/d76/d87/da4/dbe/fe4 to d0/d2/ff6 0 2026-03-09T17:30:34.153 INFO:tasks.workunit.client.1.vm09.stdout:2/688: dread d13/f8b [0,4194304] 0 2026-03-09T17:30:34.162 INFO:tasks.workunit.client.1.vm09.stdout:9/713: mkdir d5/d2e/d8b/de0/df1 0 2026-03-09T17:30:34.162 INFO:tasks.workunit.client.1.vm09.stdout:4/698: dread - d11/d1e/d45/fb4 zero size 2026-03-09T17:30:34.177 INFO:tasks.workunit.client.1.vm09.stdout:1/709: write d9/d9e/dc0/d37/d3f/d42/f95 [644699,3940] 0 2026-03-09T17:30:34.180 INFO:tasks.workunit.client.1.vm09.stdout:1/710: write d9/f59 [373141,59979] 0 2026-03-09T17:30:34.186 INFO:tasks.workunit.client.1.vm09.stdout:1/711: dwrite d9/f8d [0,4194304] 0 2026-03-09T17:30:34.189 INFO:tasks.workunit.client.1.vm09.stdout:2/689: write d13/d4d/f81 [10275,121414] 0 2026-03-09T17:30:34.192 INFO:tasks.workunit.client.1.vm09.stdout:8/746: mkdir d1/da/dd/ded 0 2026-03-09T17:30:34.193 INFO:tasks.workunit.client.1.vm09.stdout:4/699: mknod d11/d1e/d29/d36/cdc 0 2026-03-09T17:30:34.198 INFO:tasks.workunit.client.1.vm09.stdout:4/700: stat d11/d1e/d45/d60/d71/db7/d89/ld9 0 2026-03-09T17:30:34.204 
INFO:tasks.workunit.client.1.vm09.stdout:2/690: dread d13/d15/f74 [0,4194304] 0 2026-03-09T17:30:34.210 INFO:tasks.workunit.client.1.vm09.stdout:3/661: creat d5/d16/fcc x:0 0 0 2026-03-09T17:30:34.220 INFO:tasks.workunit.client.1.vm09.stdout:6/710: fsync d3/d21/d25/fea 0 2026-03-09T17:30:34.221 INFO:tasks.workunit.client.1.vm09.stdout:7/816: creat da/d11/d47/dfa/f119 x:0 0 0 2026-03-09T17:30:34.231 INFO:tasks.workunit.client.1.vm09.stdout:9/714: symlink d5/de/d4e/dca/d84/db7/lf2 0 2026-03-09T17:30:34.231 INFO:tasks.workunit.client.1.vm09.stdout:1/712: write d9/dc/dd/d40/d1d/f17 [565722,44567] 0 2026-03-09T17:30:34.231 INFO:tasks.workunit.client.1.vm09.stdout:7/817: sync 2026-03-09T17:30:34.234 INFO:tasks.workunit.client.1.vm09.stdout:9/715: dread - d5/de/d29/da7/fdc zero size 2026-03-09T17:30:34.235 INFO:tasks.workunit.client.1.vm09.stdout:0/724: truncate d6/d1d/d24/d5e/d6c/fa5 196823 0 2026-03-09T17:30:34.236 INFO:tasks.workunit.client.1.vm09.stdout:0/725: stat d6/d1d/d24/d32/d59/c85 0 2026-03-09T17:30:34.239 INFO:tasks.workunit.client.1.vm09.stdout:4/701: unlink d11/d1e/d29/d36/c84 0 2026-03-09T17:30:34.246 INFO:tasks.workunit.client.1.vm09.stdout:5/747: dwrite d0/dc/f37 [0,4194304] 0 2026-03-09T17:30:34.248 INFO:tasks.workunit.client.1.vm09.stdout:8/747: dread d1/da/d23/d6c/d32/f6d [0,4194304] 0 2026-03-09T17:30:34.258 INFO:tasks.workunit.client.1.vm09.stdout:7/818: mkdir da/d11/d3e/da2/d11a 0 2026-03-09T17:30:34.258 INFO:tasks.workunit.client.1.vm09.stdout:2/691: write d13/d15/d21/f5d [1067192,56135] 0 2026-03-09T17:30:34.264 INFO:tasks.workunit.client.1.vm09.stdout:9/716: creat d5/d2e/d8b/de0/ff3 x:0 0 0 2026-03-09T17:30:34.272 INFO:tasks.workunit.client.1.vm09.stdout:4/702: readlink d11/d1e/d45/d60/d71/db7/d89/d8b/l4b 0 2026-03-09T17:30:34.276 INFO:tasks.workunit.client.1.vm09.stdout:5/748: creat d0/dc/dc3/ff7 x:0 0 0 2026-03-09T17:30:34.278 INFO:tasks.workunit.client.1.vm09.stdout:8/748: unlink d1/d14/d2a/l87 0 2026-03-09T17:30:34.284 
INFO:tasks.workunit.client.1.vm09.stdout:6/711: link d3/d7/d59/d73/f93 d3/d21/db1/ff0 0 2026-03-09T17:30:34.286 INFO:tasks.workunit.client.1.vm09.stdout:2/692: chown c12 728116785 1 2026-03-09T17:30:34.289 INFO:tasks.workunit.client.1.vm09.stdout:1/713: mkdir d9/dc/dd/d40/ddb 0 2026-03-09T17:30:34.294 INFO:tasks.workunit.client.1.vm09.stdout:9/717: rename d5/de/d4e/dca/l87 to d5/de/d4e/dca/d84/db7/lf4 0 2026-03-09T17:30:34.295 INFO:tasks.workunit.client.1.vm09.stdout:9/718: chown d5/d2e 26182 1 2026-03-09T17:30:34.296 INFO:tasks.workunit.client.1.vm09.stdout:4/703: chown d11/d1e/d45/d60/d71/db7/d89/d8b/cc3 196083 1 2026-03-09T17:30:34.298 INFO:tasks.workunit.client.1.vm09.stdout:9/719: sync 2026-03-09T17:30:34.298 INFO:tasks.workunit.client.1.vm09.stdout:7/819: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/fe0 [0,4194304] 0 2026-03-09T17:30:34.304 INFO:tasks.workunit.client.1.vm09.stdout:9/720: dwrite d5/de/d4e/dca/de7/d93/faa [0,4194304] 0 2026-03-09T17:30:34.310 INFO:tasks.workunit.client.1.vm09.stdout:9/721: fsync d5/f8e 0 2026-03-09T17:30:34.313 INFO:tasks.workunit.client.1.vm09.stdout:5/749: rmdir d0 39 2026-03-09T17:30:34.314 INFO:tasks.workunit.client.1.vm09.stdout:9/722: fsync d5/de/d29/da7/fb3 0 2026-03-09T17:30:34.329 INFO:tasks.workunit.client.1.vm09.stdout:8/749: unlink d1/da/dd/d47/l90 0 2026-03-09T17:30:34.330 INFO:tasks.workunit.client.1.vm09.stdout:3/662: write d5/d9/d30/d65/d59/d84/d8c/f99 [335247,89505] 0 2026-03-09T17:30:34.331 INFO:tasks.workunit.client.1.vm09.stdout:6/712: mknod d3/d21/d76/d5c/d61/cf1 0 2026-03-09T17:30:34.334 INFO:tasks.workunit.client.1.vm09.stdout:1/714: mknod d9/d9e/dc0/cdc 0 2026-03-09T17:30:34.335 INFO:tasks.workunit.client.1.vm09.stdout:1/715: fsync d9/d9e/dc0/d37/da4/fcd 0 2026-03-09T17:30:34.337 INFO:tasks.workunit.client.1.vm09.stdout:7/820: truncate da/d11/d64/da7/fa8 6279937 0 2026-03-09T17:30:34.342 INFO:tasks.workunit.client.1.vm09.stdout:7/821: write da/d11/d47/d5b/d78/f9b [2963467,80193] 0 2026-03-09T17:30:34.348 
INFO:tasks.workunit.client.1.vm09.stdout:8/750: dread - d1/d14/d2a/d42/d43/fa4 zero size 2026-03-09T17:30:34.353 INFO:tasks.workunit.client.1.vm09.stdout:0/726: getdents d6/d1d/d24/d5e/d6c 0 2026-03-09T17:30:34.353 INFO:tasks.workunit.client.1.vm09.stdout:8/751: chown d1/da/dd/fc0 1 1 2026-03-09T17:30:34.353 INFO:tasks.workunit.client.1.vm09.stdout:4/704: creat d11/d1e/d29/d36/dd7/fdd x:0 0 0 2026-03-09T17:30:34.353 INFO:tasks.workunit.client.1.vm09.stdout:6/713: truncate d3/d21/d76/d5c/d7e/dc5/d9a/fb4 1454955 0 2026-03-09T17:30:34.354 INFO:tasks.workunit.client.1.vm09.stdout:0/727: fsync d6/d1d/d24/d32/fec 0 2026-03-09T17:30:34.354 INFO:tasks.workunit.client.1.vm09.stdout:8/752: stat d1/d14/d2a/d42/d5d/la9 0 2026-03-09T17:30:34.357 INFO:tasks.workunit.client.1.vm09.stdout:5/750: creat d0/d2/d76/d87/d95/d9b/dc0/dce/ff8 x:0 0 0 2026-03-09T17:30:34.364 INFO:tasks.workunit.client.1.vm09.stdout:2/693: rename d13/d15/d36/d72/d94/dda to d13/d15/d3b/ddf 0 2026-03-09T17:30:34.365 INFO:tasks.workunit.client.1.vm09.stdout:1/716: mkdir d9/ddd 0 2026-03-09T17:30:34.367 INFO:tasks.workunit.client.1.vm09.stdout:8/753: chown d1/da/d23/d6c/d32/l7a 4817464 1 2026-03-09T17:30:34.368 INFO:tasks.workunit.client.1.vm09.stdout:2/694: write d13/d15/d3b/ddf/f97 [55174,79207] 0 2026-03-09T17:30:34.370 INFO:tasks.workunit.client.1.vm09.stdout:0/728: creat d6/d1d/d24/d32/d59/d81/ff3 x:0 0 0 2026-03-09T17:30:34.374 INFO:tasks.workunit.client.1.vm09.stdout:3/663: dread f3 [0,4194304] 0 2026-03-09T17:30:34.395 INFO:tasks.workunit.client.1.vm09.stdout:3/664: creat d5/d9/d90/db0/fcd x:0 0 0 2026-03-09T17:30:34.397 INFO:tasks.workunit.client.1.vm09.stdout:9/723: write d5/f14 [6901778,115681] 0 2026-03-09T17:30:34.406 INFO:tasks.workunit.client.1.vm09.stdout:4/705: write d11/d1e/d45/d60/d71/db7/f96 [4710441,80690] 0 2026-03-09T17:30:34.407 INFO:tasks.workunit.client.1.vm09.stdout:4/706: write d11/d1e/d31/f65 [890456,38484] 0 2026-03-09T17:30:34.408 INFO:tasks.workunit.client.1.vm09.stdout:5/751: 
write d0/dc/d21/d26/d5e/d68/d6d/f9e [234534,26769] 0 2026-03-09T17:30:34.409 INFO:tasks.workunit.client.1.vm09.stdout:2/695: write d13/d15/d34/d45/f82 [737201,24997] 0 2026-03-09T17:30:34.412 INFO:tasks.workunit.client.1.vm09.stdout:7/822: dwrite da/d11/d47/d5b/d6c/d9e/d4e/f2b [0,4194304] 0 2026-03-09T17:30:34.416 INFO:tasks.workunit.client.1.vm09.stdout:1/717: dwrite d9/dc/d63/f67 [0,4194304] 0 2026-03-09T17:30:34.416 INFO:tasks.workunit.client.1.vm09.stdout:8/754: dwrite d1/d14/d2a/d42/d5d/d8a/fb8 [0,4194304] 0 2026-03-09T17:30:34.420 INFO:tasks.workunit.client.1.vm09.stdout:8/755: readlink d1/d14/d2a/d42/d5d/lba 0 2026-03-09T17:30:34.421 INFO:tasks.workunit.client.1.vm09.stdout:1/718: chown d9/dc/dd/d40/d21/d35/db9/lc6 324287 1 2026-03-09T17:30:34.426 INFO:tasks.workunit.client.1.vm09.stdout:3/665: rename d5/d9/l1b to d5/d16/d31/d37/d58/d64/lce 0 2026-03-09T17:30:34.426 INFO:tasks.workunit.client.1.vm09.stdout:4/707: fdatasync d11/d1e/d29/d36/fad 0 2026-03-09T17:30:34.428 INFO:tasks.workunit.client.1.vm09.stdout:2/696: dread - d13/d4d/f7d zero size 2026-03-09T17:30:34.428 INFO:tasks.workunit.client.1.vm09.stdout:0/729: creat d6/d1d/d24/d32/d59/d81/d8c/ff4 x:0 0 0 2026-03-09T17:30:34.431 INFO:tasks.workunit.client.1.vm09.stdout:0/730: chown d6/d1d/d24/d5e/d6c/c74 5 1 2026-03-09T17:30:34.432 INFO:tasks.workunit.client.1.vm09.stdout:7/823: unlink da/d11/d47/d5b/d78/fdd 0 2026-03-09T17:30:34.432 INFO:tasks.workunit.client.1.vm09.stdout:0/731: chown d6/d1d/d46/l35 219496 1 2026-03-09T17:30:34.441 INFO:tasks.workunit.client.1.vm09.stdout:3/666: dwrite d5/d9/fa6 [0,4194304] 0 2026-03-09T17:30:34.441 INFO:tasks.workunit.client.1.vm09.stdout:1/719: unlink d9/dc/dd/d40/d21/fb8 0 2026-03-09T17:30:34.445 INFO:tasks.workunit.client.1.vm09.stdout:5/752: dread d0/d2/d76/d86/f6b [0,4194304] 0 2026-03-09T17:30:34.446 INFO:tasks.workunit.client.1.vm09.stdout:0/732: truncate d6/d1d/d24/d32/d59/d9c/dac/dcc/ff2 336758 0 2026-03-09T17:30:34.446 
INFO:tasks.workunit.client.1.vm09.stdout:1/720: chown d9/dc/dd/d40/d21/fb6 28 1 2026-03-09T17:30:34.446 INFO:tasks.workunit.client.1.vm09.stdout:6/714: mkdir d3/d21/d76/d5c/d61/d6a/df2 0 2026-03-09T17:30:34.447 INFO:tasks.workunit.client.1.vm09.stdout:8/756: dread d1/da/d23/d6c/d32/fb5 [0,4194304] 0 2026-03-09T17:30:34.450 INFO:tasks.workunit.client.1.vm09.stdout:9/724: link d5/de/d29/da7/fdc d5/de/d29/d33/db8/ff5 0 2026-03-09T17:30:34.459 INFO:tasks.workunit.client.1.vm09.stdout:4/708: mknod d11/d1e/d31/cde 0 2026-03-09T17:30:34.468 INFO:tasks.workunit.client.1.vm09.stdout:1/721: dread - d9/d9e/dc0/d37/d3f/f56 zero size 2026-03-09T17:30:34.468 INFO:tasks.workunit.client.1.vm09.stdout:1/722: dread - d9/f97 zero size 2026-03-09T17:30:34.469 INFO:tasks.workunit.client.1.vm09.stdout:1/723: stat d9/d9e/dc0/d37/f41 0 2026-03-09T17:30:34.473 INFO:tasks.workunit.client.1.vm09.stdout:3/667: dread d5/d16/d46/f63 [0,4194304] 0 2026-03-09T17:30:34.484 INFO:tasks.workunit.client.1.vm09.stdout:6/715: fdatasync d3/d21/d76/d5c/d61/f60 0 2026-03-09T17:30:34.484 INFO:tasks.workunit.client.1.vm09.stdout:9/725: creat d5/de/d4e/dca/d84/d97/ff6 x:0 0 0 2026-03-09T17:30:34.485 INFO:tasks.workunit.client.1.vm09.stdout:6/716: chown d3/d7/d59/d9c/caa 2124083 1 2026-03-09T17:30:34.485 INFO:tasks.workunit.client.1.vm09.stdout:4/709: read d11/d1e/d45/d60/d71/db7/d89/d8b/f53 [4064200,100109] 0 2026-03-09T17:30:34.486 INFO:tasks.workunit.client.1.vm09.stdout:6/717: chown d3/d7/d59/d5a/fed 379 1 2026-03-09T17:30:34.488 INFO:tasks.workunit.client.1.vm09.stdout:8/757: symlink d1/da/dd/lee 0 2026-03-09T17:30:34.489 INFO:tasks.workunit.client.1.vm09.stdout:9/726: readlink d5/de/l6d 0 2026-03-09T17:30:34.500 INFO:tasks.workunit.client.1.vm09.stdout:0/733: creat d6/d1d/d24/d32/d59/d81/d8c/ff5 x:0 0 0 2026-03-09T17:30:34.503 INFO:tasks.workunit.client.1.vm09.stdout:0/734: dread d6/d1d/d24/d32/d59/d9c/dac/fe6 [0,4194304] 0 2026-03-09T17:30:34.511 INFO:tasks.workunit.client.1.vm09.stdout:3/668: getdents 
d5/d16/dc5 0 2026-03-09T17:30:34.513 INFO:tasks.workunit.client.1.vm09.stdout:3/669: chown d5/d9/d90/db0/c7a 63320976 1 2026-03-09T17:30:34.514 INFO:tasks.workunit.client.1.vm09.stdout:7/824: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [4194304,4194304] 0 2026-03-09T17:30:34.517 INFO:tasks.workunit.client.1.vm09.stdout:3/670: dread d5/d16/d31/d37/dae/db4/f98 [0,4194304] 0 2026-03-09T17:30:34.528 INFO:tasks.workunit.client.1.vm09.stdout:8/758: rename d1/d14/d2a/c84 to d1/da/d23/d71/cef 0 2026-03-09T17:30:34.528 INFO:tasks.workunit.client.1.vm09.stdout:8/759: chown d1/da/d23/d6c/d32/fb5 21733470 1 2026-03-09T17:30:34.532 INFO:tasks.workunit.client.1.vm09.stdout:8/760: dwrite d1/fe7 [0,4194304] 0 2026-03-09T17:30:34.540 INFO:tasks.workunit.client.1.vm09.stdout:5/753: write d0/dc/d21/d33/f35 [1622619,7893] 0 2026-03-09T17:30:34.543 INFO:tasks.workunit.client.1.vm09.stdout:9/727: dwrite d5/de/d29/dd4/df0/fab [4194304,4194304] 0 2026-03-09T17:30:34.558 INFO:tasks.workunit.client.1.vm09.stdout:9/728: dwrite d5/d91/fdb [0,4194304] 0 2026-03-09T17:30:34.560 INFO:tasks.workunit.client.1.vm09.stdout:2/697: write d13/d15/d3b/d43/f46 [37142,5584] 0 2026-03-09T17:30:34.565 INFO:tasks.workunit.client.1.vm09.stdout:9/729: dwrite d5/d21/f2f [4194304,4194304] 0 2026-03-09T17:30:34.585 INFO:tasks.workunit.client.1.vm09.stdout:9/730: dread d5/f8e [4194304,4194304] 0 2026-03-09T17:30:34.598 INFO:tasks.workunit.client.1.vm09.stdout:4/710: creat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/fdf x:0 0 0 2026-03-09T17:30:34.601 INFO:tasks.workunit.client.1.vm09.stdout:4/711: dread d11/d1e/d45/d60/d71/db7/d89/f94 [0,4194304] 0 2026-03-09T17:30:34.612 INFO:tasks.workunit.client.1.vm09.stdout:7/825: mknod da/d11/d47/d5b/d6c/d9e/d4e/c11b 0 2026-03-09T17:30:34.617 INFO:tasks.workunit.client.1.vm09.stdout:0/735: write d6/d1d/d24/f5d [7074611,31505] 0 2026-03-09T17:30:34.633 INFO:tasks.workunit.client.1.vm09.stdout:5/754: creat d0/d52/ff9 x:0 0 0 2026-03-09T17:30:34.672 
INFO:tasks.workunit.client.1.vm09.stdout:5/755: dread d0/d9/f77 [0,4194304] 0 2026-03-09T17:30:34.680 INFO:tasks.workunit.client.1.vm09.stdout:5/756: dwrite d0/d2/f5d [0,4194304] 0 2026-03-09T17:30:34.735 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:34 vm06.local ceph-mon[57307]: mgrmap e29: vm06.pbgzei(active, since 2s), standbys: vm09.lqzvkh 2026-03-09T17:30:34.735 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:34 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:34.755 INFO:tasks.workunit.client.1.vm09.stdout:8/761: write d1/da/dd/d47/f64 [4277939,18267] 0 2026-03-09T17:30:34.772 INFO:tasks.workunit.client.1.vm09.stdout:1/724: link d9/d9e/dc0/c2d d9/dc/dd/d40/cde 0 2026-03-09T17:30:34.776 INFO:tasks.workunit.client.1.vm09.stdout:6/718: creat d3/d7/d59/ff3 x:0 0 0 2026-03-09T17:30:34.776 INFO:tasks.workunit.client.1.vm09.stdout:4/712: stat d11/d1e/d45/c8e 0 2026-03-09T17:30:34.779 INFO:tasks.workunit.client.1.vm09.stdout:7/826: unlink da/d11/d64/da7/ccb 0 2026-03-09T17:30:34.779 INFO:tasks.workunit.client.1.vm09.stdout:3/671: symlink d5/d16/d31/lcf 0 2026-03-09T17:30:34.787 INFO:tasks.workunit.client.1.vm09.stdout:5/757: chown d0/dc/d21/d26/d5e/d68/d79/lb5 6 1 2026-03-09T17:30:34.789 INFO:tasks.workunit.client.1.vm09.stdout:8/762: unlink d1/da/d23/d6c/d32/f56 0 2026-03-09T17:30:34.791 INFO:tasks.workunit.client.1.vm09.stdout:1/725: truncate f8 4291528 0 2026-03-09T17:30:34.792 INFO:tasks.workunit.client.1.vm09.stdout:9/731: mkdir d5/de/df7 0 2026-03-09T17:30:34.792 INFO:tasks.workunit.client.1.vm09.stdout:9/732: chown d5/de/d29/f35 0 1 2026-03-09T17:30:34.793 INFO:tasks.workunit.client.1.vm09.stdout:9/733: read - d5/de/d4e/dca/d84/feb zero size 2026-03-09T17:30:34.797 INFO:tasks.workunit.client.1.vm09.stdout:3/672: mknod d5/d9/d30/d65/d59/cd0 0 2026-03-09T17:30:34.798 
INFO:tasks.workunit.client.1.vm09.stdout:5/758: mknod d0/dc/d21/d26/d5e/dd4/cfa 0 2026-03-09T17:30:34.800 INFO:tasks.workunit.client.1.vm09.stdout:8/763: symlink d1/da/d23/d71/lf0 0 2026-03-09T17:30:34.801 INFO:tasks.workunit.client.1.vm09.stdout:8/764: chown d1/d14/f2f 2092 1 2026-03-09T17:30:34.801 INFO:tasks.workunit.client.1.vm09.stdout:8/765: stat d1/da/dd/d79 0 2026-03-09T17:30:34.803 INFO:tasks.workunit.client.1.vm09.stdout:6/719: mknod d3/d21/d25/cf4 0 2026-03-09T17:30:34.804 INFO:tasks.workunit.client.1.vm09.stdout:2/698: dwrite d13/d15/f20 [0,4194304] 0 2026-03-09T17:30:34.805 INFO:tasks.workunit.client.1.vm09.stdout:4/713: write d11/d1e/d31/f9b [583270,26187] 0 2026-03-09T17:30:34.808 INFO:tasks.workunit.client.1.vm09.stdout:0/736: write d6/d1d/f57 [941961,89297] 0 2026-03-09T17:30:34.810 INFO:tasks.workunit.client.1.vm09.stdout:7/827: creat da/d11/d47/d5b/df2/f11c x:0 0 0 2026-03-09T17:30:34.811 INFO:tasks.workunit.client.1.vm09.stdout:3/673: fdatasync d5/d16/d25/f2c 0 2026-03-09T17:30:34.813 INFO:tasks.workunit.client.1.vm09.stdout:3/674: readlink d5/d16/d31/d37/d58/d8a/la4 0 2026-03-09T17:30:34.815 INFO:tasks.workunit.client.1.vm09.stdout:9/734: read d5/f1e [1081922,59865] 0 2026-03-09T17:30:34.821 INFO:tasks.workunit.client.1.vm09.stdout:8/766: dread d1/da/d23/d6c/ddd/dcb/fcf [0,4194304] 0 2026-03-09T17:30:34.821 INFO:tasks.workunit.client.1.vm09.stdout:2/699: dread d13/d15/d34/f44 [0,4194304] 0 2026-03-09T17:30:34.822 INFO:tasks.workunit.client.1.vm09.stdout:1/726: unlink d9/d9e/dc0/d37/lc1 0 2026-03-09T17:30:34.823 INFO:tasks.workunit.client.1.vm09.stdout:4/714: mknod d11/d1e/d29/ce0 0 2026-03-09T17:30:34.824 INFO:tasks.workunit.client.1.vm09.stdout:2/700: truncate d13/d15/d34/d45/f57 2253731 0 2026-03-09T17:30:34.828 INFO:tasks.workunit.client.1.vm09.stdout:7/828: mkdir da/d11/d47/d89/dbe/d11d 0 2026-03-09T17:30:34.832 INFO:tasks.workunit.client.1.vm09.stdout:5/759: write d0/d46/d4b/feb [3190794,86654] 0 2026-03-09T17:30:34.832 
INFO:tasks.workunit.client.1.vm09.stdout:8/767: dwrite d1/da/dd/faf [0,4194304] 0 2026-03-09T17:30:34.832 INFO:tasks.workunit.client.1.vm09.stdout:5/760: write d0/d2/f5d [1618654,62935] 0 2026-03-09T17:30:34.832 INFO:tasks.workunit.client.1.vm09.stdout:8/768: chown d1/da/dd/d63 451 1 2026-03-09T17:30:34.839 INFO:tasks.workunit.client.1.vm09.stdout:1/727: creat d9/d9e/fdf x:0 0 0 2026-03-09T17:30:34.841 INFO:tasks.workunit.client.1.vm09.stdout:4/715: symlink d11/d1e/d29/d36/le1 0 2026-03-09T17:30:34.841 INFO:tasks.workunit.client.1.vm09.stdout:2/701: rename d13/d15/d34/d37/l77 to d13/d15/d34/d37/d66/le0 0 2026-03-09T17:30:34.843 INFO:tasks.workunit.client.1.vm09.stdout:7/829: creat da/d11/d47/d89/f11e x:0 0 0 2026-03-09T17:30:34.843 INFO:tasks.workunit.client.1.vm09.stdout:7/830: stat da/d11/d47/d5b/d6c/d9e/ff3 0 2026-03-09T17:30:34.857 INFO:tasks.workunit.client.1.vm09.stdout:3/675: dread d5/d16/d25/f2b [0,4194304] 0 2026-03-09T17:30:34.864 INFO:tasks.workunit.client.1.vm09.stdout:0/737: dread d6/d1d/f1e [0,4194304] 0 2026-03-09T17:30:34.864 INFO:tasks.workunit.client.1.vm09.stdout:0/738: chown d6/d1d/d24/d32/d59/d81/ff3 6 1 2026-03-09T17:30:34.870 INFO:tasks.workunit.client.1.vm09.stdout:9/735: unlink d5/de/d29/d90/dc7/cbb 0 2026-03-09T17:30:34.871 INFO:tasks.workunit.client.1.vm09.stdout:9/736: dread - d5/de/d4e/dca/d84/fee zero size 2026-03-09T17:30:34.880 INFO:tasks.workunit.client.1.vm09.stdout:6/720: write d3/d7/d59/d73/f75 [1045980,34196] 0 2026-03-09T17:30:34.880 INFO:tasks.workunit.client.1.vm09.stdout:6/721: chown d3/d21/d25/d26/d86 1180 1 2026-03-09T17:30:34.886 INFO:tasks.workunit.client.1.vm09.stdout:1/728: mkdir d9/d9e/dc0/d37/d3f/d42/d55/de0 0 2026-03-09T17:30:34.903 INFO:tasks.workunit.client.1.vm09.stdout:0/739: fdatasync d6/d1d/f1e 0 2026-03-09T17:30:34.904 INFO:tasks.workunit.client.1.vm09.stdout:9/737: dread - d5/de/d29/d90/fb9 zero size 2026-03-09T17:30:34.906 INFO:tasks.workunit.client.1.vm09.stdout:8/769: mkdir d1/da/d23/d71/db6/df1 0 
2026-03-09T17:30:34.907 INFO:tasks.workunit.client.1.vm09.stdout:4/716: write f3 [1066041,42809] 0 2026-03-09T17:30:34.915 INFO:tasks.workunit.client.1.vm09.stdout:2/702: write d13/f40 [2710616,76478] 0 2026-03-09T17:30:34.917 INFO:tasks.workunit.client.1.vm09.stdout:5/761: dwrite d0/d9/d74/f99 [0,4194304] 0 2026-03-09T17:30:34.919 INFO:tasks.workunit.client.1.vm09.stdout:5/762: stat d0/d9/l93 0 2026-03-09T17:30:34.922 INFO:tasks.workunit.client.1.vm09.stdout:0/740: dread d6/d1d/d46/fd0 [0,4194304] 0 2026-03-09T17:30:34.924 INFO:tasks.workunit.client.1.vm09.stdout:7/831: dwrite da/d11/f3f [8388608,4194304] 0 2026-03-09T17:30:34.924 INFO:tasks.workunit.client.1.vm09.stdout:2/703: read d13/d4d/f81 [222045,12323] 0 2026-03-09T17:30:34.924 INFO:tasks.workunit.client.1.vm09.stdout:6/722: creat d3/d21/d25/d26/d86/dbe/ff5 x:0 0 0 2026-03-09T17:30:34.925 INFO:tasks.workunit.client.1.vm09.stdout:2/704: chown d13/fa3 4 1 2026-03-09T17:30:34.929 INFO:tasks.workunit.client.1.vm09.stdout:3/676: creat d5/d16/dc5/fd1 x:0 0 0 2026-03-09T17:30:34.933 INFO:tasks.workunit.client.1.vm09.stdout:3/677: write d5/d9/d30/d65/d59/d84/fab [41041,103929] 0 2026-03-09T17:30:34.941 INFO:tasks.workunit.client.1.vm09.stdout:5/763: dread d0/d2/d76/d87/d95/d9b/fab [0,4194304] 0 2026-03-09T17:30:34.941 INFO:tasks.workunit.client.1.vm09.stdout:9/738: rename d5/d21/f2f to d5/d2e/d8b/db4/ff8 0 2026-03-09T17:30:34.948 INFO:tasks.workunit.client.1.vm09.stdout:1/729: dwrite d9/dc/f90 [0,4194304] 0 2026-03-09T17:30:34.950 INFO:tasks.workunit.client.1.vm09.stdout:2/705: dread d13/f89 [0,4194304] 0 2026-03-09T17:30:34.959 INFO:tasks.workunit.client.1.vm09.stdout:0/741: sync 2026-03-09T17:30:34.965 INFO:tasks.workunit.client.1.vm09.stdout:0/742: chown d6/d64/d97 62 1 2026-03-09T17:30:34.966 INFO:tasks.workunit.client.1.vm09.stdout:0/743: chown d6/d1d/d24/d5e/dc2/cc4 91 1 2026-03-09T17:30:34.971 INFO:tasks.workunit.client.1.vm09.stdout:0/744: dwrite d6/d1d/d24/d32/d59/fb0 [0,4194304] 0 2026-03-09T17:30:34.980 
INFO:tasks.workunit.client.1.vm09.stdout:5/764: unlink d0/d2/d76/d87/fa5 0 2026-03-09T17:30:34.983 INFO:tasks.workunit.client.1.vm09.stdout:6/723: chown d3/d21/d76/d5c/d61/c5e 1104927 1 2026-03-09T17:30:34.983 INFO:tasks.workunit.client.1.vm09.stdout:1/730: creat d9/d9e/dc0/d8b/fe1 x:0 0 0 2026-03-09T17:30:34.983 INFO:tasks.workunit.client.1.vm09.stdout:0/745: creat d6/d1d/d24/d5e/d6c/ff6 x:0 0 0 2026-03-09T17:30:34.988 INFO:tasks.workunit.client.1.vm09.stdout:9/739: symlink d5/de/d4e/lf9 0 2026-03-09T17:30:34.988 INFO:tasks.workunit.client.1.vm09.stdout:3/678: rename d5/d16/d31/d3d/c72 to d5/d9/d30/d65/d59/d84/d8c/cd2 0 2026-03-09T17:30:34.988 INFO:tasks.workunit.client.1.vm09.stdout:9/740: chown d5/de/d29/d33/f4a 118 1 2026-03-09T17:30:34.989 INFO:tasks.workunit.client.1.vm09.stdout:3/679: readlink d5/d9/d30/d65/d59/l97 0 2026-03-09T17:30:34.990 INFO:tasks.workunit.client.1.vm09.stdout:5/765: rmdir d0/dc/d21/d26/d5e/dd4 39 2026-03-09T17:30:34.993 INFO:tasks.workunit.client.1.vm09.stdout:6/724: mkdir d3/d21/d76/d5c/df6 0 2026-03-09T17:30:34.994 INFO:tasks.workunit.client.1.vm09.stdout:3/680: creat d5/d9/d90/fd3 x:0 0 0 2026-03-09T17:30:34.996 INFO:tasks.workunit.client.1.vm09.stdout:3/681: readlink d5/d16/d31/d37/d58/d8a/la4 0 2026-03-09T17:30:34.997 INFO:tasks.workunit.client.1.vm09.stdout:6/725: mknod d3/d7/cf7 0 2026-03-09T17:30:34.998 INFO:tasks.workunit.client.1.vm09.stdout:9/741: dwrite d5/d2e/fef [0,4194304] 0 2026-03-09T17:30:34.999 INFO:tasks.workunit.client.1.vm09.stdout:5/766: dwrite d0/d9/d74/f99 [0,4194304] 0 2026-03-09T17:30:35.009 INFO:tasks.workunit.client.1.vm09.stdout:5/767: write d0/d2/d76/d86/ff2 [204576,129687] 0 2026-03-09T17:30:35.035 INFO:tasks.workunit.client.1.vm09.stdout:8/770: dwrite d1/da/f4b [4194304,4194304] 0 2026-03-09T17:30:35.037 INFO:tasks.workunit.client.1.vm09.stdout:3/682: rename d5/d9/d30/d65/f19 to d5/d9/da9/fd4 0 2026-03-09T17:30:35.037 INFO:tasks.workunit.client.1.vm09.stdout:4/717: dwrite d11/d1e/d29/d36/d57/f8f 
[0,4194304] 0 2026-03-09T17:30:35.039 INFO:tasks.workunit.client.1.vm09.stdout:7/832: dwrite da/d11/d47/d5b/d6c/d9e/d4e/ff7 [0,4194304] 0 2026-03-09T17:30:35.056 INFO:tasks.workunit.client.1.vm09.stdout:6/726: truncate d3/d7/d59/d73/fef 645947 0 2026-03-09T17:30:35.057 INFO:tasks.workunit.client.1.vm09.stdout:2/706: truncate d13/d15/d21/f5d 357254 0 2026-03-09T17:30:35.059 INFO:tasks.workunit.client.1.vm09.stdout:0/746: write d6/d1d/d24/d32/d59/d81/d8c/fb1 [2583012,128809] 0 2026-03-09T17:30:35.062 INFO:tasks.workunit.client.1.vm09.stdout:7/833: mkdir da/d11/d64/d11f 0 2026-03-09T17:30:35.062 INFO:tasks.workunit.client.1.vm09.stdout:0/747: chown d6/d1d/d24/d5e/dc2/cc4 6 1 2026-03-09T17:30:35.063 INFO:tasks.workunit.client.1.vm09.stdout:2/707: symlink d13/d15/d36/d72/dc3/le1 0 2026-03-09T17:30:35.064 INFO:tasks.workunit.client.1.vm09.stdout:4/718: mkdir d11/d1e/de2 0 2026-03-09T17:30:35.066 INFO:tasks.workunit.client.1.vm09.stdout:9/742: dread d5/de/f20 [0,4194304] 0 2026-03-09T17:30:35.068 INFO:tasks.workunit.client.1.vm09.stdout:7/834: fsync da/d11/d64/d84/feb 0 2026-03-09T17:30:35.068 INFO:tasks.workunit.client.1.vm09.stdout:4/719: mkdir d11/d1e/d29/d36/de3 0 2026-03-09T17:30:35.068 INFO:tasks.workunit.client.1.vm09.stdout:4/720: fdatasync f10 0 2026-03-09T17:30:35.068 INFO:tasks.workunit.client.1.vm09.stdout:2/708: truncate d13/d15/d34/d45/d84/dcb/f2d 2331501 0 2026-03-09T17:30:35.069 INFO:tasks.workunit.client.1.vm09.stdout:4/721: unlink d11/d1e/f3c 0 2026-03-09T17:30:35.070 INFO:tasks.workunit.client.1.vm09.stdout:0/748: getdents d6/d1d/d39 0 2026-03-09T17:30:35.077 INFO:tasks.workunit.client.1.vm09.stdout:4/722: dread d11/d1e/d29/f93 [0,4194304] 0 2026-03-09T17:30:35.078 INFO:tasks.workunit.client.1.vm09.stdout:0/749: dwrite d6/d1d/f57 [4194304,4194304] 0 2026-03-09T17:30:35.086 INFO:tasks.workunit.client.1.vm09.stdout:6/727: sync 2026-03-09T17:30:35.086 INFO:tasks.workunit.client.1.vm09.stdout:0/750: stat d6/d1d/d24 0 2026-03-09T17:30:35.088 
INFO:tasks.workunit.client.1.vm09.stdout:6/728: write d3/d21/d25/d26/d6b/dbf/f66 [4048838,87024] 0 2026-03-09T17:30:35.088 INFO:tasks.workunit.client.1.vm09.stdout:0/751: dread - d6/d1d/d24/d5e/db2/fb9 zero size 2026-03-09T17:30:35.090 INFO:tasks.workunit.client.1.vm09.stdout:0/752: truncate d6/faf 728660 0 2026-03-09T17:30:35.092 INFO:tasks.workunit.client.1.vm09.stdout:0/753: unlink d6/d1d/d24/d32/d59/d81/fc1 0 2026-03-09T17:30:35.104 INFO:tasks.workunit.client.1.vm09.stdout:1/731: dwrite d9/d9e/dc0/d37/d3f/f62 [0,4194304] 0 2026-03-09T17:30:35.104 INFO:tasks.workunit.client.1.vm09.stdout:0/754: mkdir d6/d1d/d24/d5e/dc2/df7 0 2026-03-09T17:30:35.114 INFO:tasks.workunit.client.1.vm09.stdout:1/732: fdatasync d9/dc/dd/f7b 0 2026-03-09T17:30:35.116 INFO:tasks.workunit.client.1.vm09.stdout:1/733: dwrite d9/dc/dd/d40/d1d/fab [0,4194304] 0 2026-03-09T17:30:35.121 INFO:tasks.workunit.client.1.vm09.stdout:1/734: fsync d9/dc/dd/fe 0 2026-03-09T17:30:35.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:34 vm09.local ceph-mon[62061]: mgrmap e29: vm06.pbgzei(active, since 2s), standbys: vm09.lqzvkh 2026-03-09T17:30:35.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:34 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:35.157 INFO:tasks.workunit.client.1.vm09.stdout:3/683: write d5/d9/d30/f61 [1607798,94710] 0 2026-03-09T17:30:35.163 INFO:tasks.workunit.client.1.vm09.stdout:5/768: dwrite d0/dc/d21/d26/d5e/fbc [0,4194304] 0 2026-03-09T17:30:35.168 INFO:tasks.workunit.client.1.vm09.stdout:5/769: write d0/d2/d76/d87/d95/d9b/dc0/dde/fed [516398,12542] 0 2026-03-09T17:30:35.168 INFO:tasks.workunit.client.1.vm09.stdout:8/771: write d1/f16 [1508811,106557] 0 2026-03-09T17:30:35.171 INFO:tasks.workunit.client.1.vm09.stdout:5/770: rmdir d0/d46 39 2026-03-09T17:30:35.171 INFO:tasks.workunit.client.1.vm09.stdout:3/684: 
rmdir d5/d16/d31/d3d/db3 39 2026-03-09T17:30:35.172 INFO:tasks.workunit.client.1.vm09.stdout:5/771: readlink d0/dc/d21/d33/lba 0 2026-03-09T17:30:35.179 INFO:tasks.workunit.client.1.vm09.stdout:2/709: fsync d13/d15/d34/d45/d84/dcb/f2d 0 2026-03-09T17:30:35.180 INFO:tasks.workunit.client.1.vm09.stdout:6/729: write d3/d7/d59/d9c/fd5 [3226180,65783] 0 2026-03-09T17:30:35.180 INFO:tasks.workunit.client.1.vm09.stdout:4/723: write d11/d1e/d29/d36/d57/f79 [1799822,130839] 0 2026-03-09T17:30:35.181 INFO:tasks.workunit.client.1.vm09.stdout:9/743: write d5/d21/f38 [1103108,124509] 0 2026-03-09T17:30:35.182 INFO:tasks.workunit.client.1.vm09.stdout:3/685: fdatasync d5/d9/d90/db0/f33 0 2026-03-09T17:30:35.184 INFO:tasks.workunit.client.1.vm09.stdout:5/772: mknod d0/d46/cfb 0 2026-03-09T17:30:35.186 INFO:tasks.workunit.client.1.vm09.stdout:2/710: chown lf 6201105 1 2026-03-09T17:30:35.191 INFO:tasks.workunit.client.1.vm09.stdout:9/744: symlink d5/de/d29/d33/lfa 0 2026-03-09T17:30:35.193 INFO:tasks.workunit.client.1.vm09.stdout:7/835: dwrite da/d11/d47/d5b/d78/fea [0,4194304] 0 2026-03-09T17:30:35.196 INFO:tasks.workunit.client.1.vm09.stdout:7/836: dread - da/d11/d3e/da2/fb7 zero size 2026-03-09T17:30:35.206 INFO:tasks.workunit.client.1.vm09.stdout:0/755: dwrite d6/d1d/d24/d5e/db2/fb9 [0,4194304] 0 2026-03-09T17:30:35.208 INFO:tasks.workunit.client.1.vm09.stdout:8/772: dwrite d1/d14/d2a/d42/d5d/d8a/fb8 [0,4194304] 0 2026-03-09T17:30:35.216 INFO:tasks.workunit.client.1.vm09.stdout:3/686: dwrite d5/d9/d90/db0/dbb/fbe [4194304,4194304] 0 2026-03-09T17:30:35.220 INFO:tasks.workunit.client.1.vm09.stdout:4/724: dwrite d11/d1e/d31/f65 [0,4194304] 0 2026-03-09T17:30:35.220 INFO:tasks.workunit.client.1.vm09.stdout:9/745: readlink d5/de/l24 0 2026-03-09T17:30:35.220 INFO:tasks.workunit.client.1.vm09.stdout:7/837: symlink da/d11/d64/da7/db1/l120 0 2026-03-09T17:30:35.221 INFO:tasks.workunit.client.1.vm09.stdout:7/838: stat da/d11/d2d/d56/lc0 0 2026-03-09T17:30:35.221 
INFO:tasks.workunit.client.1.vm09.stdout:4/725: write d11/d1e/d31/f65 [1203398,82863] 0 2026-03-09T17:30:35.224 INFO:tasks.workunit.client.1.vm09.stdout:3/687: chown d5/d16/d31/d37/d58/f73 475038 1 2026-03-09T17:30:35.227 INFO:tasks.workunit.client.1.vm09.stdout:4/726: readlink d11/d1e/d29/d36/d57/lc4 0 2026-03-09T17:30:35.228 INFO:tasks.workunit.client.1.vm09.stdout:6/730: truncate d3/d21/d25/d26/f2a 5652455 0 2026-03-09T17:30:35.228 INFO:tasks.workunit.client.1.vm09.stdout:5/773: dread d0/f22 [0,4194304] 0 2026-03-09T17:30:35.229 INFO:tasks.workunit.client.1.vm09.stdout:5/774: fdatasync d0/d2/d76/d87/d95/f9a 0 2026-03-09T17:30:35.241 INFO:tasks.workunit.client.1.vm09.stdout:2/711: creat d13/d15/d36/d72/d94/fe2 x:0 0 0 2026-03-09T17:30:35.245 INFO:tasks.workunit.client.1.vm09.stdout:8/773: mknod d1/da/dd/d79/cf2 0 2026-03-09T17:30:35.262 INFO:tasks.workunit.client.1.vm09.stdout:1/735: dwrite d9/d9e/dc0/d37/d3f/f56 [0,4194304] 0 2026-03-09T17:30:35.268 INFO:tasks.workunit.client.1.vm09.stdout:7/839: read - da/d11/d2d/fee zero size 2026-03-09T17:30:35.284 INFO:tasks.workunit.client.1.vm09.stdout:3/688: creat d5/d9/d30/d65/d59/d84/d8c/fd5 x:0 0 0 2026-03-09T17:30:35.284 INFO:tasks.workunit.client.1.vm09.stdout:4/727: truncate d11/d1e/d29/d36/d57/fbc 803765 0 2026-03-09T17:30:35.284 INFO:tasks.workunit.client.1.vm09.stdout:5/775: truncate d0/dc/d21/d26/d5e/d68/d79/fc7 113182 0 2026-03-09T17:30:35.286 INFO:tasks.workunit.client.1.vm09.stdout:5/776: fdatasync d0/dc/d21/d26/d5e/fbc 0 2026-03-09T17:30:35.290 INFO:tasks.workunit.client.1.vm09.stdout:5/777: dwrite d0/d9/d74/d75/fee [0,4194304] 0 2026-03-09T17:30:35.291 INFO:tasks.workunit.client.1.vm09.stdout:5/778: fsync d0/ff0 0 2026-03-09T17:30:35.292 INFO:tasks.workunit.client.1.vm09.stdout:5/779: chown d0/d9/d74/d75/fee 8459504 1 2026-03-09T17:30:35.302 INFO:tasks.workunit.client.1.vm09.stdout:1/736: creat d9/d9e/fe2 x:0 0 0 2026-03-09T17:30:35.306 INFO:tasks.workunit.client.1.vm09.stdout:1/737: read 
d9/dc/dd/d40/d1d/f4d [2021122,4788] 0 2026-03-09T17:30:35.309 INFO:tasks.workunit.client.1.vm09.stdout:7/840: symlink da/d11/d2d/d56/d68/l121 0 2026-03-09T17:30:35.319 INFO:tasks.workunit.client.1.vm09.stdout:3/689: rmdir d5/d9/d90/db0 39 2026-03-09T17:30:35.319 INFO:tasks.workunit.client.1.vm09.stdout:3/690: readlink d5/d9/d90/l92 0 2026-03-09T17:30:35.319 INFO:tasks.workunit.client.1.vm09.stdout:3/691: write d5/d9/da9/fc9 [1784721,43268] 0 2026-03-09T17:30:35.319 INFO:tasks.workunit.client.1.vm09.stdout:4/728: dread - d11/d1e/d45/d60/d71/db7/d89/fba zero size 2026-03-09T17:30:35.320 INFO:tasks.workunit.client.1.vm09.stdout:5/780: sync 2026-03-09T17:30:35.327 INFO:tasks.workunit.client.1.vm09.stdout:0/756: creat d6/d1d/ff8 x:0 0 0 2026-03-09T17:30:35.330 INFO:tasks.workunit.client.1.vm09.stdout:1/738: creat d9/dc/dd/d40/d21/d35/fe3 x:0 0 0 2026-03-09T17:30:35.335 INFO:tasks.workunit.client.1.vm09.stdout:7/841: creat da/d11/d47/d5b/d6c/d9e/dc6/ddb/f122 x:0 0 0 2026-03-09T17:30:35.335 INFO:tasks.workunit.client.1.vm09.stdout:4/729: dread d11/d1e/d29/d36/f86 [0,4194304] 0 2026-03-09T17:30:35.336 INFO:tasks.workunit.client.1.vm09.stdout:3/692: write d5/d9/d30/d65/f43 [240523,116659] 0 2026-03-09T17:30:35.339 INFO:tasks.workunit.client.1.vm09.stdout:2/712: link d13/d15/d3b/ldc d13/d4d/le3 0 2026-03-09T17:30:35.339 INFO:tasks.workunit.client.1.vm09.stdout:4/730: write d11/d1e/d45/d60/d71/db7/fc9 [391764,122356] 0 2026-03-09T17:30:35.346 INFO:tasks.workunit.client.1.vm09.stdout:0/757: fsync d6/f21 0 2026-03-09T17:30:35.359 INFO:tasks.workunit.client.1.vm09.stdout:1/739: rmdir d9/dc/dd/d9f/d9c 39 2026-03-09T17:30:35.369 INFO:tasks.workunit.client.1.vm09.stdout:7/842: creat da/d11/d2d/d56/da1/f123 x:0 0 0 2026-03-09T17:30:35.370 INFO:tasks.workunit.client.1.vm09.stdout:3/693: write d5/d9/d90/db0/fa0 [1923156,68519] 0 2026-03-09T17:30:35.374 INFO:tasks.workunit.client.1.vm09.stdout:4/731: unlink d11/d1e/d45/d60/d71/db7/fc9 0 2026-03-09T17:30:35.374 
INFO:tasks.workunit.client.1.vm09.stdout:4/732: read - d11/d1e/d45/f70 zero size 2026-03-09T17:30:35.379 INFO:tasks.workunit.client.1.vm09.stdout:5/781: creat d0/d9/d74/d75/dbd/ffc x:0 0 0 2026-03-09T17:30:35.380 INFO:tasks.workunit.client.1.vm09.stdout:4/733: dread d11/d1e/d29/db5/fcf [0,4194304] 0 2026-03-09T17:30:35.380 INFO:tasks.workunit.client.1.vm09.stdout:8/774: write d1/da/d23/fc4 [2499147,34248] 0 2026-03-09T17:30:35.384 INFO:tasks.workunit.client.1.vm09.stdout:9/746: truncate d5/f11 2898319 0 2026-03-09T17:30:35.386 INFO:tasks.workunit.client.1.vm09.stdout:0/758: creat d6/d1d/d24/d5e/dc2/ff9 x:0 0 0 2026-03-09T17:30:35.387 INFO:tasks.workunit.client.1.vm09.stdout:9/747: sync 2026-03-09T17:30:35.388 INFO:tasks.workunit.client.1.vm09.stdout:6/731: dwrite d3/d7/d59/d73/f7d [0,4194304] 0 2026-03-09T17:30:35.390 INFO:tasks.workunit.client.1.vm09.stdout:7/843: chown da/d11/d47/d5b/d6c/l104 107817 1 2026-03-09T17:30:35.390 INFO:tasks.workunit.client.1.vm09.stdout:7/844: readlink da/d11/l3c 0 2026-03-09T17:30:35.395 INFO:tasks.workunit.client.1.vm09.stdout:3/694: chown d5/d16/d31/d3d/db3/ccb 4617 1 2026-03-09T17:30:35.397 INFO:tasks.workunit.client.1.vm09.stdout:7/845: dwrite da/d11/d47/d5b/d6c/d9e/f105 [0,4194304] 0 2026-03-09T17:30:35.408 INFO:tasks.workunit.client.1.vm09.stdout:6/732: dwrite d3/d21/d76/d5c/d61/f60 [0,4194304] 0 2026-03-09T17:30:35.411 INFO:tasks.workunit.client.1.vm09.stdout:7/846: dread da/d11/d47/d5b/d6c/fb3 [0,4194304] 0 2026-03-09T17:30:35.413 INFO:tasks.workunit.client.1.vm09.stdout:3/695: dread d5/d16/d31/d37/d58/f91 [0,4194304] 0 2026-03-09T17:30:35.422 INFO:tasks.workunit.client.1.vm09.stdout:8/775: chown d1/d14/d2a/d42/d5d/d8a/cae 9316270 1 2026-03-09T17:30:35.428 INFO:tasks.workunit.client.1.vm09.stdout:0/759: mkdir d6/d64/dbd/dfa 0 2026-03-09T17:30:35.429 INFO:tasks.workunit.client.1.vm09.stdout:0/760: dread - d6/d1d/d24/d32/fde zero size 2026-03-09T17:30:35.451 INFO:tasks.workunit.client.1.vm09.stdout:2/713: link 
d13/d15/d34/d37/c70 d13/d4d/daa/ce4 0 2026-03-09T17:30:35.459 INFO:tasks.workunit.client.1.vm09.stdout:4/734: write d11/f25 [6243765,58065] 0 2026-03-09T17:30:35.466 INFO:tasks.workunit.client.1.vm09.stdout:5/782: dwrite d0/f22 [0,4194304] 0 2026-03-09T17:30:35.475 INFO:tasks.workunit.client.1.vm09.stdout:6/733: chown d3/d7/d59/d5a/ccc 431444065 1 2026-03-09T17:30:35.476 INFO:tasks.workunit.client.1.vm09.stdout:7/847: chown da/d11/d77/de5/c108 718 1 2026-03-09T17:30:35.479 INFO:tasks.workunit.client.1.vm09.stdout:7/848: chown da/d11/d2d/d56/cce 1 1 2026-03-09T17:30:35.481 INFO:tasks.workunit.client.1.vm09.stdout:3/696: creat d5/d9/d30/d65/d59/fd6 x:0 0 0 2026-03-09T17:30:35.482 INFO:tasks.workunit.client.1.vm09.stdout:3/697: chown d5/d9/d30/d65/d59/f87 21415500 1 2026-03-09T17:30:35.484 INFO:tasks.workunit.client.1.vm09.stdout:7/849: dwrite da/d11/d3e/da2/fff [0,4194304] 0 2026-03-09T17:30:35.494 INFO:tasks.workunit.client.1.vm09.stdout:8/776: mknod d1/d14/d2a/cf3 0 2026-03-09T17:30:35.494 INFO:tasks.workunit.client.1.vm09.stdout:8/777: dread - d1/da/dd/d79/fca zero size 2026-03-09T17:30:35.502 INFO:tasks.workunit.client.1.vm09.stdout:1/740: rename d9/dc/d63 to d9/dc/dd/d9f/de4 0 2026-03-09T17:30:35.503 INFO:tasks.workunit.client.1.vm09.stdout:1/741: truncate d9/d9e/dc0/d37/d3f/f56 4718432 0 2026-03-09T17:30:35.511 INFO:tasks.workunit.client.1.vm09.stdout:1/742: dread d9/dc/dd/d40/d21/fb6 [0,4194304] 0 2026-03-09T17:30:35.520 INFO:tasks.workunit.client.1.vm09.stdout:4/735: unlink d11/d1e/d45/lb9 0 2026-03-09T17:30:35.541 INFO:tasks.workunit.client.1.vm09.stdout:5/783: fdatasync d0/d52/d20/f7c 0 2026-03-09T17:30:35.547 INFO:tasks.workunit.client.1.vm09.stdout:6/734: dread d3/d7/d59/d73/fa3 [0,4194304] 0 2026-03-09T17:30:35.573 INFO:tasks.workunit.client.1.vm09.stdout:0/761: write d6/f6d [814538,99762] 0 2026-03-09T17:30:35.595 INFO:tasks.workunit.client.1.vm09.stdout:9/748: write d5/d2e/f6f [3583343,14378] 0 2026-03-09T17:30:35.614 
INFO:tasks.workunit.client.1.vm09.stdout:2/714: mkdir d13/d15/d21/d88/db8/dd1/de5 0 2026-03-09T17:30:35.623 INFO:tasks.workunit.client.1.vm09.stdout:5/784: rmdir d0/dc/d21/d26/d5e/d68 39 2026-03-09T17:30:35.632 INFO:tasks.workunit.client.1.vm09.stdout:3/698: dwrite d5/d9/d30/d65/f1d [0,4194304] 0 2026-03-09T17:30:35.634 INFO:tasks.workunit.client.1.vm09.stdout:2/715: dwrite d13/d15/d34/d45/f57 [0,4194304] 0 2026-03-09T17:30:35.638 INFO:tasks.workunit.client.1.vm09.stdout:7/850: dwrite da/d11/d2d/f70 [0,4194304] 0 2026-03-09T17:30:35.641 INFO:tasks.workunit.client.1.vm09.stdout:5/785: dwrite d0/dc/dc3/ff7 [0,4194304] 0 2026-03-09T17:30:35.643 INFO:tasks.workunit.client.1.vm09.stdout:7/851: write da/d11/d3e/dd8/f112 [1046539,32170] 0 2026-03-09T17:30:35.649 INFO:tasks.workunit.client.1.vm09.stdout:7/852: readlink da/d11/d47/d5b/d6c/d9e/d4e/l46 0 2026-03-09T17:30:35.719 INFO:tasks.workunit.client.1.vm09.stdout:1/743: mkdir d9/de5 0 2026-03-09T17:30:35.719 INFO:tasks.workunit.client.1.vm09.stdout:2/716: fsync d13/f8b 0 2026-03-09T17:30:35.721 INFO:tasks.workunit.client.1.vm09.stdout:3/699: creat d5/d9c/fd7 x:0 0 0 2026-03-09T17:30:35.722 INFO:tasks.workunit.client.1.vm09.stdout:7/853: creat da/d11/d47/d5b/d6c/d9e/d4e/d5f/f124 x:0 0 0 2026-03-09T17:30:35.722 INFO:tasks.workunit.client.1.vm09.stdout:4/736: link d11/d1e/d29/f8a d11/d1e/fe4 0 2026-03-09T17:30:35.723 INFO:tasks.workunit.client.1.vm09.stdout:4/737: stat d11/d1e/d45/daf/cc2 0 2026-03-09T17:30:35.724 INFO:tasks.workunit.client.1.vm09.stdout:2/717: creat d13/d15/d21/d88/db8/fe6 x:0 0 0 2026-03-09T17:30:35.724 INFO:tasks.workunit.client.1.vm09.stdout:1/744: truncate d9/d9e/dc0/d91/f93 260749 0 2026-03-09T17:30:35.729 INFO:tasks.workunit.client.1.vm09.stdout:3/700: rename d5/d16/d46/l9d to d5/d9/d30/ld8 0 2026-03-09T17:30:35.730 INFO:tasks.workunit.client.1.vm09.stdout:4/738: symlink d11/d1e/d45/daf/le5 0 2026-03-09T17:30:35.732 INFO:tasks.workunit.client.1.vm09.stdout:0/762: link d6/l9d d6/d1d/d24/d5e/lfb 0 
2026-03-09T17:30:35.734 INFO:tasks.workunit.client.1.vm09.stdout:3/701: rmdir d5/d9/d90/db0 39 2026-03-09T17:30:35.734 INFO:tasks.workunit.client.1.vm09.stdout:1/745: mknod d9/d9e/dc0/d37/ce6 0 2026-03-09T17:30:35.737 INFO:tasks.workunit.client.1.vm09.stdout:0/763: dwrite d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 [0,4194304] 0 2026-03-09T17:30:35.741 INFO:tasks.workunit.client.1.vm09.stdout:3/702: creat d5/d9/d30/d65/d59/fd9 x:0 0 0 2026-03-09T17:30:35.755 INFO:tasks.workunit.client.1.vm09.stdout:7/854: rename da/d11/l28 to da/d11/d47/d5b/l125 0 2026-03-09T17:30:35.758 INFO:tasks.workunit.client.1.vm09.stdout:0/764: dwrite d6/f6d [0,4194304] 0 2026-03-09T17:30:35.761 INFO:tasks.workunit.client.1.vm09.stdout:0/765: stat d6/d1d/f70 0 2026-03-09T17:30:35.763 INFO:tasks.workunit.client.1.vm09.stdout:7/855: chown da/d11/f25 0 1 2026-03-09T17:30:35.778 INFO:tasks.workunit.client.1.vm09.stdout:2/718: rename d13/d15/d21/f3e to d13/d15/d34/d45/d84/dcb/fe7 0 2026-03-09T17:30:35.779 INFO:tasks.workunit.client.1.vm09.stdout:1/746: getdents d9/d9e/dc0/d37/da4 0 2026-03-09T17:30:35.781 INFO:tasks.workunit.client.1.vm09.stdout:2/719: rename d13/d15/d36/d72/d94/da7/db0/dd6/ld8 to d13/dc8/le8 0 2026-03-09T17:30:35.783 INFO:tasks.workunit.client.1.vm09.stdout:2/720: symlink d13/d15/d34/d45/d84/dcb/le9 0 2026-03-09T17:30:35.784 INFO:tasks.workunit.client.1.vm09.stdout:2/721: chown d13/d15/d36/d72/d94/da7/c8c 119 1 2026-03-09T17:30:35.786 INFO:tasks.workunit.client.1.vm09.stdout:2/722: dread d13/d15/fdb [0,4194304] 0 2026-03-09T17:30:35.787 INFO:tasks.workunit.client.1.vm09.stdout:7/856: sync 2026-03-09T17:30:35.787 INFO:tasks.workunit.client.1.vm09.stdout:1/747: sync 2026-03-09T17:30:35.789 INFO:tasks.workunit.client.1.vm09.stdout:2/723: creat d13/da4/fea x:0 0 0 2026-03-09T17:30:35.790 INFO:tasks.workunit.client.1.vm09.stdout:7/857: stat da/d11/d47/d5b/d6c/ca0 0 2026-03-09T17:30:35.792 INFO:tasks.workunit.client.1.vm09.stdout:2/724: symlink d13/da4/leb 0 2026-03-09T17:30:35.793 
INFO:tasks.workunit.client.1.vm09.stdout:1/748: creat d9/ddd/fe7 x:0 0 0 2026-03-09T17:30:35.799 INFO:tasks.workunit.client.1.vm09.stdout:7/858: creat da/d11/d47/d5b/d6c/d9e/f126 x:0 0 0 2026-03-09T17:30:35.799 INFO:tasks.workunit.client.1.vm09.stdout:2/725: write d13/d15/d34/f5b [9308103,113936] 0 2026-03-09T17:30:35.804 INFO:tasks.workunit.client.1.vm09.stdout:1/749: mknod d9/d9e/dc0/d37/da4/ce8 0 2026-03-09T17:30:35.808 INFO:tasks.workunit.client.1.vm09.stdout:6/735: write d3/d48/f6c [1472569,72099] 0 2026-03-09T17:30:35.811 INFO:tasks.workunit.client.1.vm09.stdout:6/736: write d3/d21/d25/d26/d6b/dbf/f66 [5171385,4410] 0 2026-03-09T17:30:35.828 INFO:tasks.workunit.client.1.vm09.stdout:2/726: dread d13/d15/d36/d72/f87 [0,4194304] 0 2026-03-09T17:30:35.830 INFO:tasks.workunit.client.1.vm09.stdout:2/727: stat d13/d15/d34/d45/d84/dcb/c63 0 2026-03-09T17:30:35.836 INFO:tasks.workunit.client.1.vm09.stdout:1/750: write d9/dc/dd/d9f/de4/dba/fd7 [848022,87123] 0 2026-03-09T17:30:35.841 INFO:tasks.workunit.client.1.vm09.stdout:1/751: dwrite d9/d38/fc3 [0,4194304] 0 2026-03-09T17:30:35.847 INFO:tasks.workunit.client.1.vm09.stdout:2/728: readlink d13/d15/d3b/l58 0 2026-03-09T17:30:35.850 INFO:tasks.workunit.client.1.vm09.stdout:1/752: dread d9/dc/fa9 [0,4194304] 0 2026-03-09T17:30:35.851 INFO:tasks.workunit.client.1.vm09.stdout:1/753: fsync d9/d9e/dc0/d91/d99/fbc 0 2026-03-09T17:30:35.852 INFO:tasks.workunit.client.1.vm09.stdout:1/754: write d9/dc/dd/d9f/de4/fc8 [2312877,61746] 0 2026-03-09T17:30:35.865 INFO:tasks.workunit.client.1.vm09.stdout:9/749: write d5/de/f76 [69879,110397] 0 2026-03-09T17:30:35.865 INFO:tasks.workunit.client.1.vm09.stdout:8/778: write d1/da/dd/fc6 [3658910,39412] 0 2026-03-09T17:30:35.869 INFO:tasks.workunit.client.1.vm09.stdout:5/786: write d0/dc/d21/d26/f36 [1488279,79894] 0 2026-03-09T17:30:35.876 INFO:tasks.workunit.client.1.vm09.stdout:9/750: dwrite d5/de/d4e/dca/de7/d93/fb0 [4194304,4194304] 0 2026-03-09T17:30:35.887 
INFO:tasks.workunit.client.1.vm09.stdout:4/739: write d11/d1e/d45/d60/d71/db7/fa5 [652640,58293] 0 2026-03-09T17:30:35.889 INFO:tasks.workunit.client.1.vm09.stdout:2/729: symlink d13/d15/d3b/ddf/lec 0 2026-03-09T17:30:35.892 INFO:tasks.workunit.client.1.vm09.stdout:7/859: stat da/d11/d47/d5b/l125 0 2026-03-09T17:30:35.892 INFO:tasks.workunit.client.1.vm09.stdout:0/766: truncate d6/d1d/d24/d32/d59/d9c/fce 1271830 0 2026-03-09T17:30:35.892 INFO:tasks.workunit.client.1.vm09.stdout:5/787: readlink d0/dc/d21/d26/d5e/d68/d79/lb1 0 2026-03-09T17:30:35.893 INFO:tasks.workunit.client.1.vm09.stdout:7/860: chown da/f1c 348588 1 2026-03-09T17:30:35.894 INFO:tasks.workunit.client.1.vm09.stdout:0/767: fsync d6/d1d/d24/d32/d59/d81/ff3 0 2026-03-09T17:30:35.896 INFO:tasks.workunit.client.1.vm09.stdout:5/788: readlink d0/d2/d76/d87/lb9 0 2026-03-09T17:30:35.896 INFO:tasks.workunit.client.1.vm09.stdout:1/755: dwrite d9/d5a/fbf [0,4194304] 0 2026-03-09T17:30:35.897 INFO:tasks.workunit.client.1.vm09.stdout:9/751: mkdir d5/de/d29/d33/db8/dfb 0 2026-03-09T17:30:35.897 INFO:tasks.workunit.client.1.vm09.stdout:3/703: dwrite d5/d9/d30/d65/d59/d84/f86 [0,4194304] 0 2026-03-09T17:30:35.903 INFO:tasks.workunit.client.1.vm09.stdout:0/768: read d6/d1d/d24/d5e/db2/fb9 [1694571,28552] 0 2026-03-09T17:30:35.903 INFO:tasks.workunit.client.1.vm09.stdout:2/730: sync 2026-03-09T17:30:35.905 INFO:tasks.workunit.client.1.vm09.stdout:2/731: sync 2026-03-09T17:30:35.905 INFO:tasks.workunit.client.1.vm09.stdout:3/704: chown d5/d16/d31/d3d/d9f 48599001 1 2026-03-09T17:30:35.907 INFO:tasks.workunit.client.1.vm09.stdout:9/752: chown d5/de/d4e/dca/d84/lbf 631 1 2026-03-09T17:30:35.908 INFO:tasks.workunit.client.1.vm09.stdout:9/753: fsync d5/d2e/f6f 0 2026-03-09T17:30:35.914 INFO:tasks.workunit.client.1.vm09.stdout:4/740: dread d11/f13 [0,4194304] 0 2026-03-09T17:30:35.919 INFO:tasks.workunit.client.1.vm09.stdout:6/737: dwrite d3/d21/d76/d5c/d61/d95/fa5 [4194304,4194304] 0 2026-03-09T17:30:35.921 
INFO:tasks.workunit.client.1.vm09.stdout:6/738: write d3/d7/d59/d5a/f83 [2405811,65371] 0 2026-03-09T17:30:35.922 INFO:tasks.workunit.client.1.vm09.stdout:6/739: write d3/d7/d59/d73/f82 [674411,79559] 0 2026-03-09T17:30:35.927 INFO:tasks.workunit.client.1.vm09.stdout:8/779: dread d1/d14/d2a/d42/d43/f58 [0,4194304] 0 2026-03-09T17:30:35.930 INFO:tasks.workunit.client.1.vm09.stdout:7/861: symlink da/d11/d47/l127 0 2026-03-09T17:30:35.931 INFO:tasks.workunit.client.1.vm09.stdout:0/769: dread d6/d1d/d24/d32/d59/d81/d8c/fb1 [0,4194304] 0 2026-03-09T17:30:35.934 INFO:tasks.workunit.client.1.vm09.stdout:0/770: dwrite d6/d1d/d24/d32/d59/d9c/dac/dcc/ff2 [0,4194304] 0 2026-03-09T17:30:35.954 INFO:tasks.workunit.client.1.vm09.stdout:1/756: mknod d9/dc/dd/d40/d21/d35/d88/ce9 0 2026-03-09T17:30:35.955 INFO:tasks.workunit.client.1.vm09.stdout:3/705: write d5/d9/d90/db0/dbb/fbe [4240456,128165] 0 2026-03-09T17:30:35.956 INFO:tasks.workunit.client.1.vm09.stdout:3/706: fsync d5/d9/d90/db0/fcd 0 2026-03-09T17:30:35.959 INFO:tasks.workunit.client.1.vm09.stdout:3/707: dwrite d5/d9/d30/d65/d59/d84/f86 [4194304,4194304] 0 2026-03-09T17:30:35.960 INFO:tasks.workunit.client.1.vm09.stdout:2/732: mknod d13/d15/d3b/d43/ced 0 2026-03-09T17:30:35.967 INFO:tasks.workunit.client.1.vm09.stdout:9/754: unlink d5/de/f76 0 2026-03-09T17:30:35.975 INFO:tasks.workunit.client.1.vm09.stdout:3/708: sync 2026-03-09T17:30:35.985 INFO:tasks.workunit.client.1.vm09.stdout:7/862: stat da/d11/d47/d5b/d6c/df8/l10a 0 2026-03-09T17:30:35.991 INFO:tasks.workunit.client.1.vm09.stdout:8/780: fsync d1/dbd/fe3 0 2026-03-09T17:30:35.996 INFO:tasks.workunit.client.1.vm09.stdout:0/771: mkdir d6/d64/d97/dc9/dfc 0 2026-03-09T17:30:36.006 INFO:tasks.workunit.client.1.vm09.stdout:5/789: dwrite d0/d52/d20/f25 [0,4194304] 0 2026-03-09T17:30:36.009 INFO:tasks.workunit.client.1.vm09.stdout:5/790: write d0/dc/d21/d6f/f5f [2880426,73633] 0 2026-03-09T17:30:36.011 INFO:tasks.workunit.client.1.vm09.stdout:4/741: dwrite d11/d1e/d31/f74 
[0,4194304] 0 2026-03-09T17:30:36.018 INFO:tasks.workunit.client.1.vm09.stdout:4/742: dread d11/d1e/d29/f8a [0,4194304] 0 2026-03-09T17:30:36.044 INFO:tasks.workunit.client.1.vm09.stdout:9/755: rename d5/d2e/f6f to d5/de/d29/da7/ffc 0 2026-03-09T17:30:36.054 INFO:tasks.workunit.client.1.vm09.stdout:3/709: rmdir d5/d16/d31/d37/d58/d8a/da8 39 2026-03-09T17:30:36.181 INFO:tasks.workunit.client.1.vm09.stdout:6/740: symlink d3/d21/d76/d3f/d8f/lf8 0 2026-03-09T17:30:36.182 INFO:tasks.workunit.client.1.vm09.stdout:6/741: write d3/d21/d76/d5c/f92 [851689,93069] 0 2026-03-09T17:30:36.188 INFO:tasks.workunit.client.1.vm09.stdout:7/863: creat da/d11/d47/dfa/f128 x:0 0 0 2026-03-09T17:30:36.192 INFO:tasks.workunit.client.1.vm09.stdout:0/772: creat d6/d64/d97/dd6/ffd x:0 0 0 2026-03-09T17:30:36.194 INFO:tasks.workunit.client.1.vm09.stdout:1/757: mkdir d9/de5/dea 0 2026-03-09T17:30:36.196 INFO:tasks.workunit.client.1.vm09.stdout:4/743: readlink d11/d1e/d31/l47 0 2026-03-09T17:30:36.201 INFO:tasks.workunit.client.1.vm09.stdout:0/773: dread d6/d1d/d24/f5d [0,4194304] 0 2026-03-09T17:30:36.201 INFO:tasks.workunit.client.1.vm09.stdout:0/774: fsync d6/d1d/ff8 0 2026-03-09T17:30:36.234 INFO:tasks.workunit.client.1.vm09.stdout:7/864: creat da/d11/d2d/d56/d68/f129 x:0 0 0 2026-03-09T17:30:36.234 INFO:tasks.workunit.client.1.vm09.stdout:8/781: symlink d1/da/d23/dc2/da2/ddf/lf4 0 2026-03-09T17:30:36.238 INFO:tasks.workunit.client.1.vm09.stdout:5/791: symlink d0/dc/d21/d26/d5e/d68/lfd 0 2026-03-09T17:30:36.239 INFO:tasks.workunit.client.1.vm09.stdout:5/792: write d0/dc/d21/d33/f35 [583198,50448] 0 2026-03-09T17:30:36.243 INFO:tasks.workunit.client.1.vm09.stdout:5/793: dread d0/dc/dc3/ff7 [0,4194304] 0 2026-03-09T17:30:36.246 INFO:tasks.workunit.client.1.vm09.stdout:5/794: dread d0/d52/d20/f25 [0,4194304] 0 2026-03-09T17:30:36.247 INFO:tasks.workunit.client.1.vm09.stdout:0/775: unlink d6/lb4 0 2026-03-09T17:30:36.253 INFO:tasks.workunit.client.1.vm09.stdout:1/758: dread d9/dc/dd/d40/f86 
[0,4194304] 0 2026-03-09T17:30:36.255 INFO:tasks.workunit.client.1.vm09.stdout:0/776: dwrite d6/d1d/d24/d32/d59/d81/d8c/fe2 [0,4194304] 0 2026-03-09T17:30:36.262 INFO:tasks.workunit.client.1.vm09.stdout:8/782: mkdir d1/d14/d2a/d42/d43/d44/df5 0 2026-03-09T17:30:36.262 INFO:tasks.workunit.client.1.vm09.stdout:8/783: write d1/d14/fd5 [782819,54310] 0 2026-03-09T17:30:36.271 INFO:tasks.workunit.client.1.vm09.stdout:2/733: getdents d13/d15/d3b 0 2026-03-09T17:30:36.289 INFO:tasks.workunit.client.1.vm09.stdout:9/756: fsync d5/d2e/d8b/fcc 0 2026-03-09T17:30:36.303 INFO:tasks.workunit.client.1.vm09.stdout:1/759: creat d9/d38/d61/feb x:0 0 0 2026-03-09T17:30:36.313 INFO:tasks.workunit.client.1.vm09.stdout:1/760: dread d9/dc/f90 [0,4194304] 0 2026-03-09T17:30:36.314 INFO:tasks.workunit.client.1.vm09.stdout:1/761: write d9/d9e/dc0/f50 [2302805,105117] 0 2026-03-09T17:30:36.337 INFO:tasks.workunit.client.1.vm09.stdout:8/784: fsync d1/da/d23/d6c/ddd/dcb/d97/fab 0 2026-03-09T17:30:36.342 INFO:tasks.workunit.client.1.vm09.stdout:2/734: creat d13/dc8/fee x:0 0 0 2026-03-09T17:30:36.352 INFO:tasks.workunit.client.1.vm09.stdout:9/757: truncate d5/de/d4e/dca/f7d 196105 0 2026-03-09T17:30:36.355 INFO:tasks.workunit.client.1.vm09.stdout:6/742: getdents d3/d7/d59/d5a 0 2026-03-09T17:30:36.363 INFO:tasks.workunit.client.1.vm09.stdout:6/743: dread d3/d48/f6c [0,4194304] 0 2026-03-09T17:30:36.364 INFO:tasks.workunit.client.1.vm09.stdout:6/744: readlink d3/d48/le6 0 2026-03-09T17:30:36.364 INFO:tasks.workunit.client.1.vm09.stdout:0/777: fsync d6/d64/d97/fbb 0 2026-03-09T17:30:36.366 INFO:tasks.workunit.client.1.vm09.stdout:8/785: symlink d1/d14/d2a/d42/lf6 0 2026-03-09T17:30:36.367 INFO:tasks.workunit.client.1.vm09.stdout:8/786: chown d1/da/dd/d77/fad 9896666 1 2026-03-09T17:30:36.367 INFO:tasks.workunit.client.1.vm09.stdout:2/735: unlink d13/db3/fb7 0 2026-03-09T17:30:36.368 INFO:tasks.workunit.client.1.vm09.stdout:4/744: getdents d11/d1e/d31/db6 0 2026-03-09T17:30:36.369 
INFO:tasks.workunit.client.1.vm09.stdout:2/736: truncate d13/d15/d36/d72/dc3/fcc 470357 0 2026-03-09T17:30:36.372 INFO:tasks.workunit.client.1.vm09.stdout:9/758: dread d5/de/d29/d90/dc7/fbe [0,4194304] 0 2026-03-09T17:30:36.379 INFO:tasks.workunit.client.1.vm09.stdout:9/759: dread d5/d2e/f5a [0,4194304] 0 2026-03-09T17:30:36.381 INFO:tasks.workunit.client.1.vm09.stdout:8/787: mknod d1/da/d23/d6c/ddd/dcb/d97/dc5/cf7 0 2026-03-09T17:30:36.389 INFO:tasks.workunit.client.1.vm09.stdout:2/737: mknod d13/d15/d36/d72/d94/da7/db0/cef 0 2026-03-09T17:30:36.389 INFO:tasks.workunit.client.1.vm09.stdout:2/738: read - d13/d4d/f7d zero size 2026-03-09T17:30:36.390 INFO:tasks.workunit.client.1.vm09.stdout:9/760: sync 2026-03-09T17:30:36.391 INFO:tasks.workunit.client.1.vm09.stdout:2/739: dread - d13/da4/fea zero size 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:34] ENGINE Bus STARTING 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:34] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:34] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:34] ENGINE Bus STARTED 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: [09/Mar/2026:17:30:34] ENGINE Client ('192.168.123.106', 43026) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: pgmap v5: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:36 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:34] ENGINE Bus STARTING 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:34] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:34] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:30:36.394 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:34] ENGINE Bus STARTED 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: [09/Mar/2026:17:30:34] ENGINE Client ('192.168.123.106', 43026) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: pgmap v5: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:36 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config 
rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:36.402 INFO:tasks.workunit.client.1.vm09.stdout:4/745: dread d11/d1e/d45/d60/d71/db7/d89/d8b/f5f [0,4194304] 0 2026-03-09T17:30:36.403 INFO:tasks.workunit.client.1.vm09.stdout:4/746: fdatasync d11/f25 0 2026-03-09T17:30:36.405 INFO:tasks.workunit.client.1.vm09.stdout:4/747: read d11/d1e/d45/d60/d71/db7/f96 [238415,107474] 0 2026-03-09T17:30:36.414 INFO:tasks.workunit.client.1.vm09.stdout:9/761: rename d5/de/d29/d33/f9a to d5/de/d29/da7/ffd 0 2026-03-09T17:30:36.415 INFO:tasks.workunit.client.1.vm09.stdout:1/762: getdents d9/ddd 0 2026-03-09T17:30:36.416 INFO:tasks.workunit.client.1.vm09.stdout:1/763: write d9/d9e/dc0/d37/d3f/d42/d55/db1/fc5 [768730,113188] 0 2026-03-09T17:30:36.425 INFO:tasks.workunit.client.1.vm09.stdout:8/788: dread d1/fe7 [0,4194304] 0 2026-03-09T17:30:36.428 INFO:tasks.workunit.client.1.vm09.stdout:4/748: truncate d11/d1e/d31/f7c 285681 0 2026-03-09T17:30:36.435 INFO:tasks.workunit.client.1.vm09.stdout:2/740: fdatasync d13/d15/d3b/ddf/d90/f92 0 2026-03-09T17:30:36.470 INFO:tasks.workunit.client.1.vm09.stdout:9/762: creat d5/d2e/d8b/de0/ffe x:0 0 0 2026-03-09T17:30:36.471 INFO:tasks.workunit.client.1.vm09.stdout:1/764: symlink d9/dc/dd/d40/d21/d6f/lec 0 2026-03-09T17:30:36.475 INFO:tasks.workunit.client.1.vm09.stdout:1/765: dwrite d9/d9e/dc0/d37/d3f/d42/f95 [0,4194304] 0 2026-03-09T17:30:36.487 INFO:tasks.workunit.client.1.vm09.stdout:9/763: link d5/de/d29/d33/c3e d5/d2e/d8b/de0/df1/cff 0 2026-03-09T17:30:36.491 INFO:tasks.workunit.client.1.vm09.stdout:9/764: dwrite d5/de/f2d [0,4194304] 0 2026-03-09T17:30:36.495 INFO:tasks.workunit.client.1.vm09.stdout:9/765: creat d5/d7e/f100 x:0 0 0 2026-03-09T17:30:36.503 INFO:tasks.workunit.client.1.vm09.stdout:9/766: sync 2026-03-09T17:30:36.504 INFO:tasks.workunit.client.1.vm09.stdout:9/767: chown d5/d21/l69 13734 1 2026-03-09T17:30:36.505 INFO:tasks.workunit.client.1.vm09.stdout:9/768: write d5/f11 [1817254,32082] 0 
2026-03-09T17:30:36.510 INFO:tasks.workunit.client.1.vm09.stdout:9/769: mkdir d5/de/d29/d90/dc7/d101 0 2026-03-09T17:30:36.523 INFO:tasks.workunit.client.1.vm09.stdout:9/770: link d5/de/f20 d5/de/d29/d33/db8/f102 0 2026-03-09T17:30:36.527 INFO:tasks.workunit.client.1.vm09.stdout:9/771: creat d5/de/d4e/dca/d84/d97/f103 x:0 0 0 2026-03-09T17:30:36.533 INFO:tasks.workunit.client.1.vm09.stdout:9/772: mkdir d5/de/d29/d90/dc7/da9/d104 0 2026-03-09T17:30:36.545 INFO:tasks.workunit.client.1.vm09.stdout:7/865: dwrite da/fcd [0,4194304] 0 2026-03-09T17:30:36.546 INFO:tasks.workunit.client.1.vm09.stdout:7/866: stat da/d11/d47/d5b/d78/f80 0 2026-03-09T17:30:36.558 INFO:tasks.workunit.client.1.vm09.stdout:9/773: creat d5/de/d4e/dca/d84/f105 x:0 0 0 2026-03-09T17:30:36.561 INFO:tasks.workunit.client.1.vm09.stdout:7/867: read - da/d11/d64/da7/ff6 zero size 2026-03-09T17:30:36.561 INFO:tasks.workunit.client.1.vm09.stdout:9/774: chown d5/d91/d99/dc9/dde 212063 1 2026-03-09T17:30:36.610 INFO:tasks.workunit.client.1.vm09.stdout:5/795: write d0/dc/d21/d26/f3d [1192626,126676] 0 2026-03-09T17:30:36.638 INFO:tasks.workunit.client.1.vm09.stdout:3/710: symlink d5/d9/d30/lda 0 2026-03-09T17:30:36.657 INFO:tasks.workunit.client.1.vm09.stdout:6/745: symlink d3/d21/d25/d26/lf9 0 2026-03-09T17:30:36.665 INFO:tasks.workunit.client.1.vm09.stdout:6/746: dread d3/d7/d59/d9c/fd5 [0,4194304] 0 2026-03-09T17:30:36.681 INFO:tasks.workunit.client.1.vm09.stdout:0/778: creat d6/d64/ffe x:0 0 0 2026-03-09T17:30:36.683 INFO:tasks.workunit.client.1.vm09.stdout:5/796: rmdir d0/d9/d16/d5c 39 2026-03-09T17:30:36.685 INFO:tasks.workunit.client.1.vm09.stdout:3/711: creat d5/d9/d30/d65/fdb x:0 0 0 2026-03-09T17:30:36.687 INFO:tasks.workunit.client.1.vm09.stdout:9/775: dread d5/de/d29/fc0 [0,4194304] 0 2026-03-09T17:30:36.688 INFO:tasks.workunit.client.1.vm09.stdout:6/747: mkdir d3/d21/d25/d26/d86/dbc/dfa 0 2026-03-09T17:30:36.691 INFO:tasks.workunit.client.1.vm09.stdout:6/748: dwrite d3/d21/d25/d26/f50 
[0,4194304] 0 2026-03-09T17:30:36.693 INFO:tasks.workunit.client.1.vm09.stdout:7/868: mkdir da/d11/d12a 0 2026-03-09T17:30:36.697 INFO:tasks.workunit.client.1.vm09.stdout:7/869: dread - da/d11/d47/d5b/d6c/d9e/d4e/f92 zero size 2026-03-09T17:30:36.697 INFO:tasks.workunit.client.1.vm09.stdout:5/797: creat d0/dc/d21/d26/d5e/d68/d6d/ffe x:0 0 0 2026-03-09T17:30:36.698 INFO:tasks.workunit.client.1.vm09.stdout:3/712: stat d5/d16/l75 0 2026-03-09T17:30:36.700 INFO:tasks.workunit.client.1.vm09.stdout:6/749: sync 2026-03-09T17:30:36.704 INFO:tasks.workunit.client.1.vm09.stdout:0/779: dread d6/d1d/d24/f4e [0,4194304] 0 2026-03-09T17:30:36.705 INFO:tasks.workunit.client.1.vm09.stdout:5/798: mkdir d0/d2/d76/d86/dff 0 2026-03-09T17:30:36.705 INFO:tasks.workunit.client.1.vm09.stdout:0/780: chown d6/d64/d97/dc9/ce7 260317 1 2026-03-09T17:30:36.707 INFO:tasks.workunit.client.1.vm09.stdout:9/776: creat d5/d2e/d8b/de0/df1/f106 x:0 0 0 2026-03-09T17:30:36.714 INFO:tasks.workunit.client.1.vm09.stdout:0/781: fsync d6/d1d/d24/d32/d59/d81/fc0 0 2026-03-09T17:30:36.719 INFO:tasks.workunit.client.1.vm09.stdout:6/750: dread d3/d21/d25/d96/fec [0,4194304] 0 2026-03-09T17:30:36.724 INFO:tasks.workunit.client.1.vm09.stdout:3/713: dread d5/d9/d90/db0/f69 [0,4194304] 0 2026-03-09T17:30:36.728 INFO:tasks.workunit.client.1.vm09.stdout:3/714: dread d5/d9/d90/db0/f69 [0,4194304] 0 2026-03-09T17:30:36.729 INFO:tasks.workunit.client.1.vm09.stdout:0/782: dread d6/d1d/d39/f53 [0,4194304] 0 2026-03-09T17:30:36.729 INFO:tasks.workunit.client.1.vm09.stdout:6/751: getdents d3/d7 0 2026-03-09T17:30:36.730 INFO:tasks.workunit.client.1.vm09.stdout:6/752: read - d3/d7/f58 zero size 2026-03-09T17:30:36.735 INFO:tasks.workunit.client.1.vm09.stdout:0/783: unlink d6/d1d/d39/c10 0 2026-03-09T17:30:36.735 INFO:tasks.workunit.client.1.vm09.stdout:0/784: write d6/d93/fb7 [3067904,73420] 0 2026-03-09T17:30:36.741 INFO:tasks.workunit.client.1.vm09.stdout:0/785: symlink d6/d1d/d46/lff 0 2026-03-09T17:30:36.741 
INFO:tasks.workunit.client.1.vm09.stdout:0/786: fsync d6/d1d/d24/d5e/f9e 0 2026-03-09T17:30:36.763 INFO:tasks.workunit.client.1.vm09.stdout:0/787: rename d6/d1d/d39/fdc to d6/d1d/d39/f100 0 2026-03-09T17:30:36.769 INFO:tasks.workunit.client.1.vm09.stdout:8/789: dwrite d1/da/d23/fb3 [0,4194304] 0 2026-03-09T17:30:36.772 INFO:tasks.workunit.client.1.vm09.stdout:4/749: dwrite fd [4194304,4194304] 0 2026-03-09T17:30:36.772 INFO:tasks.workunit.client.1.vm09.stdout:2/741: dwrite d13/f73 [0,4194304] 0 2026-03-09T17:30:36.777 INFO:tasks.workunit.client.1.vm09.stdout:0/788: read d6/d1d/d24/d32/d59/d81/f90 [2436798,39082] 0 2026-03-09T17:30:36.795 INFO:tasks.workunit.client.1.vm09.stdout:1/766: dwrite d9/dc/dd/d40/f92 [0,4194304] 0 2026-03-09T17:30:36.800 INFO:tasks.workunit.client.1.vm09.stdout:1/767: dwrite f2 [0,4194304] 0 2026-03-09T17:30:36.803 INFO:tasks.workunit.client.1.vm09.stdout:1/768: fsync d9/d38/d61/fda 0 2026-03-09T17:30:36.803 INFO:tasks.workunit.client.1.vm09.stdout:1/769: fdatasync d9/dc/dd/d40/d1d/f17 0 2026-03-09T17:30:36.810 INFO:tasks.workunit.client.1.vm09.stdout:4/750: dread - d11/fae zero size 2026-03-09T17:30:36.811 INFO:tasks.workunit.client.1.vm09.stdout:2/742: rename d13/f8b to d13/d4d/daa/ff0 0 2026-03-09T17:30:36.814 INFO:tasks.workunit.client.1.vm09.stdout:0/789: stat d6/d1d/d24/d5e/dc8/ld7 0 2026-03-09T17:30:36.818 INFO:tasks.workunit.client.1.vm09.stdout:1/770: creat d9/d9e/dc9/fed x:0 0 0 2026-03-09T17:30:36.819 INFO:tasks.workunit.client.1.vm09.stdout:1/771: chown d9/dc/dd/d9f/c49 767746179 1 2026-03-09T17:30:36.834 INFO:tasks.workunit.client.1.vm09.stdout:4/751: truncate d11/d1e/d31/f3a 931140 0 2026-03-09T17:30:36.841 INFO:tasks.workunit.client.1.vm09.stdout:8/790: link d1/da/d23/d6c/ddd/dcb/c41 d1/d14/d2a/d49/cf8 0 2026-03-09T17:30:36.841 INFO:tasks.workunit.client.1.vm09.stdout:7/870: write da/d11/d47/d5b/d6c/d9e/ff3 [300165,96673] 0 2026-03-09T17:30:36.842 INFO:tasks.workunit.client.1.vm09.stdout:4/752: truncate d11/d1e/d31/f5a 635720 
0 2026-03-09T17:30:36.843 INFO:tasks.workunit.client.1.vm09.stdout:8/791: chown d1/d14/d2a/d42/d43/ldc 7801 1 2026-03-09T17:30:36.843 INFO:tasks.workunit.client.1.vm09.stdout:9/777: truncate d5/f34 2124772 0 2026-03-09T17:30:36.845 INFO:tasks.workunit.client.1.vm09.stdout:8/792: dread - d1/d14/d96/fe0 zero size 2026-03-09T17:30:36.846 INFO:tasks.workunit.client.1.vm09.stdout:5/799: dwrite d0/dc/d21/d33/fa7 [4194304,4194304] 0 2026-03-09T17:30:36.847 INFO:tasks.workunit.client.1.vm09.stdout:3/715: write d5/d16/d31/d37/d58/d64/f9a [514233,16777] 0 2026-03-09T17:30:36.848 INFO:tasks.workunit.client.1.vm09.stdout:1/772: dread d9/dc/dd/d40/d21/fb6 [0,4194304] 0 2026-03-09T17:30:36.853 INFO:tasks.workunit.client.1.vm09.stdout:7/871: mkdir da/d11/d3e/d12b 0 2026-03-09T17:30:36.854 INFO:tasks.workunit.client.1.vm09.stdout:6/753: write d3/d21/d76/d81/fa2 [14784,12979] 0 2026-03-09T17:30:36.854 INFO:tasks.workunit.client.1.vm09.stdout:9/778: rmdir d5/de/d29/d90/dc7 39 2026-03-09T17:30:36.856 INFO:tasks.workunit.client.1.vm09.stdout:5/800: write d0/d9/fd2 [1378688,78419] 0 2026-03-09T17:30:36.857 INFO:tasks.workunit.client.1.vm09.stdout:8/793: mknod d1/da/d23/d6c/ddd/dcb/d97/cf9 0 2026-03-09T17:30:36.859 INFO:tasks.workunit.client.1.vm09.stdout:3/716: unlink d5/d16/d25/f28 0 2026-03-09T17:30:36.861 INFO:tasks.workunit.client.1.vm09.stdout:3/717: chown d5/d16/l83 145770981 1 2026-03-09T17:30:36.861 INFO:tasks.workunit.client.1.vm09.stdout:0/790: getdents d6/d1d/d24/d5e 0 2026-03-09T17:30:36.871 INFO:tasks.workunit.client.1.vm09.stdout:5/801: rename d0/dc/dc3/lca to d0/dc/l100 0 2026-03-09T17:30:36.872 INFO:tasks.workunit.client.1.vm09.stdout:8/794: creat d1/d14/ffa x:0 0 0 2026-03-09T17:30:36.873 INFO:tasks.workunit.client.1.vm09.stdout:3/718: unlink d5/d9/d30/d65/f5e 0 2026-03-09T17:30:36.873 INFO:tasks.workunit.client.1.vm09.stdout:8/795: write d1/da/d23/d6c/f1c [1670067,15050] 0 2026-03-09T17:30:36.874 INFO:tasks.workunit.client.1.vm09.stdout:3/719: chown 
d5/d9/d30/d65/d59/c82 106808 1 2026-03-09T17:30:36.876 INFO:tasks.workunit.client.1.vm09.stdout:8/796: read d1/d14/d2a/f54 [486952,9507] 0 2026-03-09T17:30:36.879 INFO:tasks.workunit.client.1.vm09.stdout:9/779: dread d5/de/d4e/dca/de7/d93/faa [0,4194304] 0 2026-03-09T17:30:36.883 INFO:tasks.workunit.client.1.vm09.stdout:4/753: dread d11/d1e/d45/d60/d71/f76 [0,4194304] 0 2026-03-09T17:30:36.886 INFO:tasks.workunit.client.1.vm09.stdout:2/743: dwrite d13/d15/d34/d45/d84/dcb/f2d [0,4194304] 0 2026-03-09T17:30:36.888 INFO:tasks.workunit.client.1.vm09.stdout:4/754: dwrite f3 [0,4194304] 0 2026-03-09T17:30:36.894 INFO:tasks.workunit.client.1.vm09.stdout:3/720: dread d5/d9/d30/f61 [0,4194304] 0 2026-03-09T17:30:36.894 INFO:tasks.workunit.client.1.vm09.stdout:0/791: dread d6/d1d/d24/d32/d59/f99 [0,4194304] 0 2026-03-09T17:30:36.896 INFO:tasks.workunit.client.1.vm09.stdout:0/792: truncate d6/d1d/d24/d5e/dc2/ff9 412693 0 2026-03-09T17:30:36.911 INFO:tasks.workunit.client.1.vm09.stdout:7/872: creat da/d11/d64/d11f/f12c x:0 0 0 2026-03-09T17:30:36.918 INFO:tasks.workunit.client.1.vm09.stdout:6/754: dread d3/d21/d76/d5c/f6d [0,4194304] 0 2026-03-09T17:30:36.921 INFO:tasks.workunit.client.1.vm09.stdout:5/802: dwrite d0/d2/d76/d86/f6b [0,4194304] 0 2026-03-09T17:30:36.924 INFO:tasks.workunit.client.1.vm09.stdout:5/803: chown d0/d52/d20/l7d 64563 1 2026-03-09T17:30:36.930 INFO:tasks.workunit.client.1.vm09.stdout:6/755: sync 2026-03-09T17:30:36.936 INFO:tasks.workunit.client.1.vm09.stdout:9/780: truncate d5/de/d4e/dca/d84/d97/fad 559494 0 2026-03-09T17:30:36.949 INFO:tasks.workunit.client.1.vm09.stdout:2/744: mkdir d13/db3/df1 0 2026-03-09T17:30:36.949 INFO:tasks.workunit.client.1.vm09.stdout:1/773: creat d9/dc/dd/fee x:0 0 0 2026-03-09T17:30:36.960 INFO:tasks.workunit.client.1.vm09.stdout:1/774: creat d9/d9e/dc0/d8b/fef x:0 0 0 2026-03-09T17:30:36.960 INFO:tasks.workunit.client.1.vm09.stdout:0/793: creat d6/d64/dbd/dfa/f101 x:0 0 0 2026-03-09T17:30:36.960 
INFO:tasks.workunit.client.1.vm09.stdout:7/873: creat da/d11/d3e/da2/d11a/f12d x:0 0 0 2026-03-09T17:30:36.962 INFO:tasks.workunit.client.1.vm09.stdout:5/804: mkdir d0/d9/d74/d75/d9f/db6/d101 0 2026-03-09T17:30:36.964 INFO:tasks.workunit.client.1.vm09.stdout:9/781: creat d5/de/df7/f107 x:0 0 0 2026-03-09T17:30:36.966 INFO:tasks.workunit.client.1.vm09.stdout:8/797: link d1/f28 d1/da/dd/d79/ffb 0 2026-03-09T17:30:36.967 INFO:tasks.workunit.client.1.vm09.stdout:0/794: creat d6/d64/db5/f102 x:0 0 0 2026-03-09T17:30:36.968 INFO:tasks.workunit.client.1.vm09.stdout:4/755: link d11/d1e/d45/d60/d71/db7/d89/fa8 d11/d1e/fe6 0 2026-03-09T17:30:36.969 INFO:tasks.workunit.client.1.vm09.stdout:5/805: chown d0/dc/l100 16032238 1 2026-03-09T17:30:36.975 INFO:tasks.workunit.client.1.vm09.stdout:9/782: rename d5/de/d29/l40 to d5/d2e/d8b/de0/l108 0 2026-03-09T17:30:36.978 INFO:tasks.workunit.client.1.vm09.stdout:5/806: dwrite d0/f91 [0,4194304] 0 2026-03-09T17:30:36.989 INFO:tasks.workunit.client.1.vm09.stdout:4/756: read d11/d1e/d29/d36/d57/fbc [598223,126449] 0 2026-03-09T17:30:36.990 INFO:tasks.workunit.client.1.vm09.stdout:4/757: write d11/d1e/d29/d36/f7f [3436465,9092] 0 2026-03-09T17:30:36.996 INFO:tasks.workunit.client.1.vm09.stdout:3/721: write d5/d16/d25/f2c [1257345,125604] 0 2026-03-09T17:30:36.997 INFO:tasks.workunit.client.1.vm09.stdout:3/722: write d5/d16/d25/f2c [380301,111594] 0 2026-03-09T17:30:37.000 INFO:tasks.workunit.client.1.vm09.stdout:6/756: truncate d3/d21/d76/d5c/d61/f60 3929105 0 2026-03-09T17:30:37.002 INFO:tasks.workunit.client.1.vm09.stdout:2/745: write fd [3229972,88461] 0 2026-03-09T17:30:37.004 INFO:tasks.workunit.client.1.vm09.stdout:2/746: dread d13/f73 [0,4194304] 0 2026-03-09T17:30:37.009 INFO:tasks.workunit.client.1.vm09.stdout:9/783: rmdir d5/de/d4e/dca/d84 39 2026-03-09T17:30:37.009 INFO:tasks.workunit.client.1.vm09.stdout:1/775: dwrite d9/dc/dd/d9f/f8a [0,4194304] 0 2026-03-09T17:30:37.014 INFO:tasks.workunit.client.1.vm09.stdout:9/784: sync 
2026-03-09T17:30:37.027 INFO:tasks.workunit.client.1.vm09.stdout:7/874: write da/d11/d47/d5b/d6c/d9e/d4e/fe7 [1476183,79826] 0 2026-03-09T17:30:37.027 INFO:tasks.workunit.client.1.vm09.stdout:7/875: chown da/d11/d2d/f69 27279858 1 2026-03-09T17:30:37.029 INFO:tasks.workunit.client.1.vm09.stdout:7/876: write da/d11/d47/d89/dbe/fda [367807,31274] 0 2026-03-09T17:30:37.037 INFO:tasks.workunit.client.1.vm09.stdout:8/798: truncate d1/d14/d2a/f54 3973104 0 2026-03-09T17:30:37.038 INFO:tasks.workunit.client.1.vm09.stdout:5/807: mknod d0/d2/d76/d87/da4/dbf/c102 0 2026-03-09T17:30:37.051 INFO:tasks.workunit.client.1.vm09.stdout:6/757: mknod d3/d21/d76/d5c/d7e/dc5/d9a/cfb 0 2026-03-09T17:30:37.051 INFO:tasks.workunit.client.1.vm09.stdout:1/776: mkdir d9/d9e/dc0/d37/d3f/d42/d55/df0 0 2026-03-09T17:30:37.052 INFO:tasks.workunit.client.1.vm09.stdout:9/785: rename d5/de/l6d to d5/d2e/d8b/de0/l109 0 2026-03-09T17:30:37.052 INFO:tasks.workunit.client.1.vm09.stdout:1/777: read d9/dc/dd/f4f [3502123,97767] 0 2026-03-09T17:30:37.058 INFO:tasks.workunit.client.1.vm09.stdout:0/795: link d6/d1d/d24/d32/d59/d81/f90 d6/d64/d97/dc9/f103 0 2026-03-09T17:30:37.058 INFO:tasks.workunit.client.1.vm09.stdout:0/796: chown d6/d1d/ca0 2049328 1 2026-03-09T17:30:37.072 INFO:tasks.workunit.client.1.vm09.stdout:4/758: write d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f75 [4987356,29408] 0 2026-03-09T17:30:37.072 INFO:tasks.workunit.client.1.vm09.stdout:3/723: write d5/d16/d25/f60 [231017,56859] 0 2026-03-09T17:30:37.078 INFO:tasks.workunit.client.1.vm09.stdout:5/808: symlink d0/d52/l103 0 2026-03-09T17:30:37.082 INFO:tasks.workunit.client.1.vm09.stdout:6/758: creat d3/d21/d76/d5c/d61/d6a/ffc x:0 0 0 2026-03-09T17:30:37.083 INFO:tasks.workunit.client.1.vm09.stdout:6/759: fsync d3/d21/d76/d5c/d61/d95/fe5 0 2026-03-09T17:30:37.084 INFO:tasks.workunit.client.1.vm09.stdout:6/760: chown d3/d21/d25/d26/db7/lba 11380 1 2026-03-09T17:30:37.084 INFO:tasks.workunit.client.1.vm09.stdout:5/809: dread d0/d2/d76/d86/ff2 
[0,4194304] 0 2026-03-09T17:30:37.084 INFO:tasks.workunit.client.1.vm09.stdout:2/747: symlink d13/lf2 0 2026-03-09T17:30:37.086 INFO:tasks.workunit.client.1.vm09.stdout:8/799: dread d1/d14/d2a/d42/d43/f98 [0,4194304] 0 2026-03-09T17:30:37.089 INFO:tasks.workunit.client.1.vm09.stdout:1/778: creat d9/dc/dd/d40/d21/d35/d88/ff1 x:0 0 0 2026-03-09T17:30:37.095 INFO:tasks.workunit.client.1.vm09.stdout:0/797: rename d6/d1d/d39/c3f to d6/d1d/d24/d32/d59/d9c/dac/dd1/c104 0 2026-03-09T17:30:37.096 INFO:tasks.workunit.client.1.vm09.stdout:1/779: dread f3 [4194304,4194304] 0 2026-03-09T17:30:37.102 INFO:tasks.workunit.client.1.vm09.stdout:3/724: read d5/d9/f79 [3360223,52385] 0 2026-03-09T17:30:37.102 INFO:tasks.workunit.client.1.vm09.stdout:3/725: chown d5/d16/d31/d37/f76 2087486 1 2026-03-09T17:30:37.114 INFO:tasks.workunit.client.1.vm09.stdout:9/786: symlink d5/de/d29/d90/dc7/da9/l10a 0 2026-03-09T17:30:37.116 INFO:tasks.workunit.client.1.vm09.stdout:5/810: mkdir d0/d9/d74/d104 0 2026-03-09T17:30:37.124 INFO:tasks.workunit.client.1.vm09.stdout:5/811: dread d0/d46/d4b/feb [0,4194304] 0 2026-03-09T17:30:37.127 INFO:tasks.workunit.client.1.vm09.stdout:5/812: dwrite d0/dc/d21/d26/d5e/fbc [0,4194304] 0 2026-03-09T17:30:37.137 INFO:tasks.workunit.client.1.vm09.stdout:2/748: dwrite d13/d4d/f7d [0,4194304] 0 2026-03-09T17:30:37.138 INFO:tasks.workunit.client.1.vm09.stdout:7/877: creat da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/f12e x:0 0 0 2026-03-09T17:30:37.138 INFO:tasks.workunit.client.1.vm09.stdout:0/798: creat d6/d64/f105 x:0 0 0 2026-03-09T17:30:37.139 INFO:tasks.workunit.client.1.vm09.stdout:2/749: read d13/f40 [3519150,82988] 0 2026-03-09T17:30:37.140 INFO:tasks.workunit.client.1.vm09.stdout:0/799: write d6/d1d/d24/d32/d59/d81/ff3 [352471,34969] 0 2026-03-09T17:30:37.141 INFO:tasks.workunit.client.1.vm09.stdout:0/800: fdatasync d6/d1d/d24/fda 0 2026-03-09T17:30:37.163 INFO:tasks.workunit.client.1.vm09.stdout:8/800: write d1/da/dd/d47/f66 [4921260,995] 0 2026-03-09T17:30:37.165 
INFO:tasks.workunit.client.1.vm09.stdout:4/759: symlink d11/d1e/d45/d60/d71/db7/d89/le7 0 2026-03-09T17:30:37.168 INFO:tasks.workunit.client.1.vm09.stdout:6/761: symlink d3/d7/d59/lfd 0 2026-03-09T17:30:37.169 INFO:tasks.workunit.client.1.vm09.stdout:9/787: dread - d5/de/fb1 zero size 2026-03-09T17:30:37.170 INFO:tasks.workunit.client.1.vm09.stdout:5/813: creat d0/d9/d74/d75/dbd/f105 x:0 0 0 2026-03-09T17:30:37.170 INFO:tasks.workunit.client.1.vm09.stdout:5/814: stat d0/d9/d16/fe1 0 2026-03-09T17:30:37.171 INFO:tasks.workunit.client.1.vm09.stdout:7/878: unlink da/d11/d47/d5b/d6c/d9e/d4e/f42 0 2026-03-09T17:30:37.178 INFO:tasks.workunit.client.1.vm09.stdout:1/780: mknod d9/dc/cf2 0 2026-03-09T17:30:37.179 INFO:tasks.workunit.client.1.vm09.stdout:8/801: dread - d1/da/d23/f7d zero size 2026-03-09T17:30:37.179 INFO:tasks.workunit.client.1.vm09.stdout:3/726: mknod d5/d16/d31/d3d/d9f/cdc 0 2026-03-09T17:30:37.180 INFO:tasks.workunit.client.1.vm09.stdout:4/760: symlink d11/d1e/d45/d60/d71/db7/d89/d8b/d58/le8 0 2026-03-09T17:30:37.180 INFO:tasks.workunit.client.1.vm09.stdout:6/762: creat d3/d21/d25/d26/d6b/ffe x:0 0 0 2026-03-09T17:30:37.182 INFO:tasks.workunit.client.1.vm09.stdout:5/815: symlink d0/d9/l106 0 2026-03-09T17:30:37.183 INFO:tasks.workunit.client.1.vm09.stdout:7/879: creat da/d11/d3e/da2/db2/f12f x:0 0 0 2026-03-09T17:30:37.187 INFO:tasks.workunit.client.1.vm09.stdout:2/750: write d13/d15/fdb [559159,42607] 0 2026-03-09T17:30:37.192 INFO:tasks.workunit.client.1.vm09.stdout:6/763: dwrite d3/d21/d76/d5c/d61/d95/fa5 [4194304,4194304] 0 2026-03-09T17:30:37.192 INFO:tasks.workunit.client.1.vm09.stdout:3/727: creat d5/d9/d30/d65/fdd x:0 0 0 2026-03-09T17:30:37.192 INFO:tasks.workunit.client.1.vm09.stdout:4/761: fdatasync d11/d1e/d45/fb4 0 2026-03-09T17:30:37.193 INFO:tasks.workunit.client.1.vm09.stdout:6/764: stat d3/d21/d25 0 2026-03-09T17:30:37.193 INFO:tasks.workunit.client.1.vm09.stdout:8/802: dread d1/d14/d2a/d42/d43/fbb [0,4194304] 0 2026-03-09T17:30:37.194 
INFO:tasks.workunit.client.1.vm09.stdout:3/728: dread - d5/d9/d30/d65/d59/fd9 zero size 2026-03-09T17:30:37.198 INFO:tasks.workunit.client.1.vm09.stdout:3/729: dwrite d5/d9c/fd7 [0,4194304] 0 2026-03-09T17:30:37.206 INFO:tasks.workunit.client.1.vm09.stdout:2/751: dread d13/d15/f2a [0,4194304] 0 2026-03-09T17:30:37.208 INFO:tasks.workunit.client.1.vm09.stdout:9/788: rename d5/de/d29/d33/f8a to d5/de/d4e/dca/de7/d93/f10b 0 2026-03-09T17:30:37.213 INFO:tasks.workunit.client.1.vm09.stdout:3/730: dread d5/d16/d31/d37/d58/d64/f9a [0,4194304] 0 2026-03-09T17:30:37.220 INFO:tasks.workunit.client.1.vm09.stdout:5/816: chown d0/d46/f4c 515707516 1 2026-03-09T17:30:37.220 INFO:tasks.workunit.client.1.vm09.stdout:5/817: chown d0/dc/d21/d26 255601 1 2026-03-09T17:30:37.221 INFO:tasks.workunit.client.1.vm09.stdout:5/818: fdatasync d0/dc/d21/d26/fcd 0 2026-03-09T17:30:37.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:37 vm06.local ceph-mon[57307]: mgrmap e30: vm06.pbgzei(active, since 4s), standbys: vm09.lqzvkh 2026-03-09T17:30:37.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:37 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:37.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:37 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:37.248 INFO:tasks.workunit.client.1.vm09.stdout:7/880: dread da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/fd7 [0,4194304] 0 2026-03-09T17:30:37.249 INFO:tasks.workunit.client.1.vm09.stdout:4/762: creat d11/d1e/d29/d36/dd7/fe9 x:0 0 0 2026-03-09T17:30:37.250 INFO:tasks.workunit.client.1.vm09.stdout:6/765: stat d3/d21/d76/d3f/f42 0 2026-03-09T17:30:37.250 INFO:tasks.workunit.client.1.vm09.stdout:6/766: stat d3/d7/l16 0 2026-03-09T17:30:37.251 INFO:tasks.workunit.client.1.vm09.stdout:2/752: creat d13/d15/d34/d37/ff3 x:0 0 0 2026-03-09T17:30:37.251 INFO:tasks.workunit.client.1.vm09.stdout:4/763: write d11/d1e/d29/d36/f3d [819706,62580] 0 
2026-03-09T17:30:37.253 INFO:tasks.workunit.client.1.vm09.stdout:3/731: symlink d5/d16/d31/d37/dae/lde 0 2026-03-09T17:30:37.255 INFO:tasks.workunit.client.1.vm09.stdout:0/801: link d6/d1d/d24/d32/l9b d6/d1d/d24/d5e/d6c/l106 0 2026-03-09T17:30:37.256 INFO:tasks.workunit.client.1.vm09.stdout:7/881: read da/d11/d47/d5b/d6c/d9e/d4e/f74 [361945,115004] 0 2026-03-09T17:30:37.259 INFO:tasks.workunit.client.1.vm09.stdout:8/803: fdatasync d1/da/dd/d79/ffb 0 2026-03-09T17:30:37.260 INFO:tasks.workunit.client.1.vm09.stdout:9/789: creat d5/de/d29/d90/dc7/d101/f10c x:0 0 0 2026-03-09T17:30:37.260 INFO:tasks.workunit.client.1.vm09.stdout:9/790: stat d5/de/d29/fc0 0 2026-03-09T17:30:37.261 INFO:tasks.workunit.client.1.vm09.stdout:9/791: dread d5/f4f [0,4194304] 0 2026-03-09T17:30:37.261 INFO:tasks.workunit.client.1.vm09.stdout:9/792: chown d5/c48 295560379 1 2026-03-09T17:30:37.262 INFO:tasks.workunit.client.1.vm09.stdout:9/793: chown d5/de/d4e/dca/de7/d93/f74 1389067971 1 2026-03-09T17:30:37.263 INFO:tasks.workunit.client.1.vm09.stdout:4/764: rmdir d11/d1e/d45 39 2026-03-09T17:30:37.263 INFO:tasks.workunit.client.1.vm09.stdout:5/819: mknod d0/dc/d21/d26/d5e/dd4/c107 0 2026-03-09T17:30:37.266 INFO:tasks.workunit.client.1.vm09.stdout:2/753: rename d13/d15/d34/f48 to d13/d15/d34/d45/d84/dcb/db1/ff4 0 2026-03-09T17:30:37.266 INFO:tasks.workunit.client.1.vm09.stdout:1/781: getdents d9/d9e/dc0/d37/d3f/d42/d55/db1 0 2026-03-09T17:30:37.268 INFO:tasks.workunit.client.1.vm09.stdout:7/882: creat da/d11/d47/d5b/d6c/d9e/f130 x:0 0 0 2026-03-09T17:30:37.270 INFO:tasks.workunit.client.1.vm09.stdout:0/802: dread d6/d1d/d24/d5e/d6c/fef [0,4194304] 0 2026-03-09T17:30:37.285 INFO:tasks.workunit.client.1.vm09.stdout:6/767: write d3/d21/d76/d3f/f51 [5001331,108517] 0 2026-03-09T17:30:37.289 INFO:tasks.workunit.client.1.vm09.stdout:9/794: symlink d5/de/d29/dd4/df0/l10d 0 2026-03-09T17:30:37.289 INFO:tasks.workunit.client.1.vm09.stdout:3/732: mkdir d5/d16/d31/d37/d58/d8a/da8/ddf 0 
2026-03-09T17:30:37.290 INFO:tasks.workunit.client.1.vm09.stdout:5/820: creat d0/d52/f108 x:0 0 0 2026-03-09T17:30:37.294 INFO:tasks.workunit.client.1.vm09.stdout:1/782: dwrite d9/dc/dd/d9f/de4/dba/fd3 [0,4194304] 0 2026-03-09T17:30:37.312 INFO:tasks.workunit.client.1.vm09.stdout:4/765: symlink d11/d1e/d45/d60/d71/lea 0 2026-03-09T17:30:37.315 INFO:tasks.workunit.client.1.vm09.stdout:6/768: truncate d3/d7/f40 837854 0 2026-03-09T17:30:37.316 INFO:tasks.workunit.client.1.vm09.stdout:4/766: dwrite d11/d1e/d29/d36/f3d [0,4194304] 0 2026-03-09T17:30:37.329 INFO:tasks.workunit.client.1.vm09.stdout:3/733: symlink d5/d9/d90/db0/le0 0 2026-03-09T17:30:37.331 INFO:tasks.workunit.client.1.vm09.stdout:3/734: dread - d5/d9/d30/d65/fdd zero size 2026-03-09T17:30:37.332 INFO:tasks.workunit.client.1.vm09.stdout:7/883: dwrite da/d11/d3e/da2/db2/fa6 [0,4194304] 0 2026-03-09T17:30:37.334 INFO:tasks.workunit.client.1.vm09.stdout:7/884: write da/d11/d64/da7/f10c [740772,14013] 0 2026-03-09T17:30:37.335 INFO:tasks.workunit.client.1.vm09.stdout:7/885: chown da/d11/d47/d89/faf 229 1 2026-03-09T17:30:37.339 INFO:tasks.workunit.client.1.vm09.stdout:6/769: readlink d3/d21/d76/d5c/d61/l85 0 2026-03-09T17:30:37.341 INFO:tasks.workunit.client.1.vm09.stdout:4/767: mkdir d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb 0 2026-03-09T17:30:37.341 INFO:tasks.workunit.client.1.vm09.stdout:4/768: stat d11/f12 0 2026-03-09T17:30:37.343 INFO:tasks.workunit.client.1.vm09.stdout:9/795: creat d5/d2e/d8b/db4/f10e x:0 0 0 2026-03-09T17:30:37.346 INFO:tasks.workunit.client.1.vm09.stdout:4/769: dwrite d11/d1e/d29/d36/f40 [0,4194304] 0 2026-03-09T17:30:37.357 INFO:tasks.workunit.client.1.vm09.stdout:8/804: rename d1/d14/f2f to d1/da/d23/d6c/ddd/ffc 0 2026-03-09T17:30:37.359 INFO:tasks.workunit.client.1.vm09.stdout:3/735: mkdir d5/d9/d30/d65/d59/d84/d8c/de1 0 2026-03-09T17:30:37.363 INFO:tasks.workunit.client.1.vm09.stdout:7/886: creat da/d11/d47/d89/dbe/d106/f131 x:0 0 0 2026-03-09T17:30:37.365 
INFO:tasks.workunit.client.1.vm09.stdout:6/770: fsync d3/f19 0 2026-03-09T17:30:37.367 INFO:tasks.workunit.client.1.vm09.stdout:9/796: symlink d5/de/d88/l10f 0 2026-03-09T17:30:37.375 INFO:tasks.workunit.client.1.vm09.stdout:0/803: rename d6/d1d/ff8 to d6/d93/f107 0 2026-03-09T17:30:37.382 INFO:tasks.workunit.client.1.vm09.stdout:3/736: read d5/d9/d30/d65/f3e [4918156,107925] 0 2026-03-09T17:30:37.387 INFO:tasks.workunit.client.1.vm09.stdout:2/754: truncate d13/d4d/f7d 1233692 0 2026-03-09T17:30:37.390 INFO:tasks.workunit.client.1.vm09.stdout:5/821: dwrite d0/d2/d76/d87/faf [0,4194304] 0 2026-03-09T17:30:37.393 INFO:tasks.workunit.client.1.vm09.stdout:6/771: symlink d3/d7/d59/d73/lff 0 2026-03-09T17:30:37.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:37 vm09.local ceph-mon[62061]: mgrmap e30: vm06.pbgzei(active, since 4s), standbys: vm09.lqzvkh 2026-03-09T17:30:37.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:37 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:37.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:37 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:37.395 INFO:tasks.workunit.client.1.vm09.stdout:0/804: rename d6/d1d/d24/f4e to d6/d1d/d24/d32/d59/d9c/dac/dcc/f108 0 2026-03-09T17:30:37.395 INFO:tasks.workunit.client.1.vm09.stdout:1/783: dwrite d9/d9e/dc0/d37/d3f/d42/d55/f69 [0,4194304] 0 2026-03-09T17:30:37.396 INFO:tasks.workunit.client.1.vm09.stdout:8/805: mkdir d1/d14/d2a/d42/d43/dfd 0 2026-03-09T17:30:37.397 INFO:tasks.workunit.client.1.vm09.stdout:9/797: fsync d5/de/d29/d90/dc7/fdd 0 2026-03-09T17:30:37.402 INFO:tasks.workunit.client.1.vm09.stdout:2/755: read d13/d15/d34/d37/fa0 [2265996,29866] 0 2026-03-09T17:30:37.419 INFO:tasks.workunit.client.1.vm09.stdout:5/822: mkdir d0/d46/d4b/db7/d109 0 2026-03-09T17:30:37.420 INFO:tasks.workunit.client.1.vm09.stdout:5/823: chown d0/dc/l100 13632 1 2026-03-09T17:30:37.422 
INFO:tasks.workunit.client.1.vm09.stdout:4/770: creat d11/d1e/d45/d60/d71/db7/d89/fec x:0 0 0 2026-03-09T17:30:37.424 INFO:tasks.workunit.client.1.vm09.stdout:6/772: mkdir d3/d21/d25/d26/d6b/d100 0 2026-03-09T17:30:37.425 INFO:tasks.workunit.client.1.vm09.stdout:1/784: creat d9/dc/dd/d40/d21/d35/db9/ff3 x:0 0 0 2026-03-09T17:30:37.425 INFO:tasks.workunit.client.1.vm09.stdout:8/806: creat d1/d14/ffe x:0 0 0 2026-03-09T17:30:37.429 INFO:tasks.workunit.client.1.vm09.stdout:7/887: dread da/d11/d2d/f45 [0,4194304] 0 2026-03-09T17:30:37.429 INFO:tasks.workunit.client.1.vm09.stdout:9/798: dread - d5/de/d4e/dca/d84/fee zero size 2026-03-09T17:30:37.431 INFO:tasks.workunit.client.1.vm09.stdout:2/756: fdatasync d13/d15/d3b/d43/fab 0 2026-03-09T17:30:37.433 INFO:tasks.workunit.client.1.vm09.stdout:5/824: creat d0/dc/d21/d6f/f10a x:0 0 0 2026-03-09T17:30:37.441 INFO:tasks.workunit.client.1.vm09.stdout:6/773: mkdir d3/d21/d76/d88/d101 0 2026-03-09T17:30:37.442 INFO:tasks.workunit.client.1.vm09.stdout:6/774: read d3/d7/f23 [7995912,57787] 0 2026-03-09T17:30:37.447 INFO:tasks.workunit.client.1.vm09.stdout:3/737: write d5/d16/d31/d37/d58/f73 [229393,99435] 0 2026-03-09T17:30:37.451 INFO:tasks.workunit.client.1.vm09.stdout:4/771: write d11/d1e/d45/d60/d71/db7/f96 [1725508,114827] 0 2026-03-09T17:30:37.451 INFO:tasks.workunit.client.1.vm09.stdout:0/805: truncate d6/f9 3105002 0 2026-03-09T17:30:37.452 INFO:tasks.workunit.client.1.vm09.stdout:7/888: mknod da/d11/d77/de5/dec/c132 0 2026-03-09T17:30:37.454 INFO:tasks.workunit.client.1.vm09.stdout:8/807: dread d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:37.455 INFO:tasks.workunit.client.1.vm09.stdout:8/808: chown d1/d14/d2a/c4e 4482680 1 2026-03-09T17:30:37.456 INFO:tasks.workunit.client.1.vm09.stdout:8/809: chown d1/da/d3a/cc1 0 1 2026-03-09T17:30:37.458 INFO:tasks.workunit.client.1.vm09.stdout:2/757: mkdir d13/d15/d21/df5 0 2026-03-09T17:30:37.458 INFO:tasks.workunit.client.1.vm09.stdout:9/799: dwrite d5/de/d4e/dca/de7/d93/faa 
[0,4194304] 0 2026-03-09T17:30:37.458 INFO:tasks.workunit.client.1.vm09.stdout:6/775: creat d3/d7/f102 x:0 0 0 2026-03-09T17:30:37.459 INFO:tasks.workunit.client.1.vm09.stdout:9/800: stat d5/l19 0 2026-03-09T17:30:37.460 INFO:tasks.workunit.client.1.vm09.stdout:5/825: write d0/d2/ff6 [820910,106655] 0 2026-03-09T17:30:37.460 INFO:tasks.workunit.client.1.vm09.stdout:3/738: write d5/d9/d30/d65/f3e [2716803,119680] 0 2026-03-09T17:30:37.461 INFO:tasks.workunit.client.1.vm09.stdout:3/739: fdatasync d5/d9/d30/d65/d59/d84/d8c/f99 0 2026-03-09T17:30:37.461 INFO:tasks.workunit.client.1.vm09.stdout:4/772: creat d11/d1e/d45/fed x:0 0 0 2026-03-09T17:30:37.469 INFO:tasks.workunit.client.1.vm09.stdout:0/806: mkdir d6/d1d/d24/d32/d59/d9c/dac/d109 0 2026-03-09T17:30:37.471 INFO:tasks.workunit.client.1.vm09.stdout:7/889: symlink da/d11/d77/de5/dec/l133 0 2026-03-09T17:30:37.474 INFO:tasks.workunit.client.1.vm09.stdout:2/758: creat d13/d15/d34/d45/ff6 x:0 0 0 2026-03-09T17:30:37.481 INFO:tasks.workunit.client.1.vm09.stdout:1/785: rmdir d9/dc/dd/d40/d21/d35/db9/dd8 0 2026-03-09T17:30:37.481 INFO:tasks.workunit.client.1.vm09.stdout:3/740: symlink d5/d9/d90/db0/le2 0 2026-03-09T17:30:37.481 INFO:tasks.workunit.client.1.vm09.stdout:9/801: rename d5/de/d29/d90/dc7/d101/f10c to d5/de/d88/f110 0 2026-03-09T17:30:37.482 INFO:tasks.workunit.client.1.vm09.stdout:4/773: mknod d11/d1e/d29/d36/cee 0 2026-03-09T17:30:37.482 INFO:tasks.workunit.client.1.vm09.stdout:3/741: fsync d5/d16/d31/d37/f94 0 2026-03-09T17:30:37.483 INFO:tasks.workunit.client.1.vm09.stdout:9/802: chown d5/de/d4e/dca/d84/d97/ld9 211491012 1 2026-03-09T17:30:37.484 INFO:tasks.workunit.client.1.vm09.stdout:0/807: fsync d6/d1d/d24/d32/d59/f99 0 2026-03-09T17:30:37.485 INFO:tasks.workunit.client.1.vm09.stdout:7/890: rmdir da/d11/d77 39 2026-03-09T17:30:37.487 INFO:tasks.workunit.client.1.vm09.stdout:6/776: symlink d3/d21/d25/d96/de0/l103 0 2026-03-09T17:30:37.489 INFO:tasks.workunit.client.1.vm09.stdout:5/826: creat 
d0/d2/d76/d87/d95/d9b/dc0/de6/f10b x:0 0 0 2026-03-09T17:30:37.489 INFO:tasks.workunit.client.1.vm09.stdout:2/759: dread - d13/d15/d34/d45/f61 zero size 2026-03-09T17:30:37.493 INFO:tasks.workunit.client.1.vm09.stdout:6/777: chown d3/d21/d76/d5c/d61/d95/l47 18 1 2026-03-09T17:30:37.493 INFO:tasks.workunit.client.1.vm09.stdout:0/808: dwrite d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 [0,4194304] 0 2026-03-09T17:30:37.494 INFO:tasks.workunit.client.1.vm09.stdout:9/803: creat d5/de/d29/d90/dc7/f111 x:0 0 0 2026-03-09T17:30:37.504 INFO:tasks.workunit.client.1.vm09.stdout:3/742: dread d5/d9/d30/d65/f3e [0,4194304] 0 2026-03-09T17:30:37.505 INFO:tasks.workunit.client.1.vm09.stdout:8/810: write d1/da/d23/d6c/ddd/ffc [2631628,48412] 0 2026-03-09T17:30:37.507 INFO:tasks.workunit.client.1.vm09.stdout:4/774: mkdir d11/d1e/def 0 2026-03-09T17:30:37.533 INFO:tasks.workunit.client.1.vm09.stdout:9/804: sync 2026-03-09T17:30:37.546 INFO:tasks.workunit.client.1.vm09.stdout:4/775: rmdir d11/d1e/d45/d60/d71/db7/d89/d8b 39 2026-03-09T17:30:37.564 INFO:tasks.workunit.client.1.vm09.stdout:5/827: write d0/d9/d74/d75/d9f/f92 [1617389,112383] 0 2026-03-09T17:30:37.572 INFO:tasks.workunit.client.1.vm09.stdout:0/809: write d6/d1d/f41 [2166090,60778] 0 2026-03-09T17:30:37.575 INFO:tasks.workunit.client.1.vm09.stdout:8/811: dwrite d1/d14/d2a/f54 [0,4194304] 0 2026-03-09T17:30:37.576 INFO:tasks.workunit.client.1.vm09.stdout:8/812: stat d1/da/dd/d79/c91 0 2026-03-09T17:30:37.581 INFO:tasks.workunit.client.1.vm09.stdout:9/805: write d5/de/d29/d90/dc7/fdd [197236,29676] 0 2026-03-09T17:30:37.586 INFO:tasks.workunit.client.1.vm09.stdout:7/891: creat da/d11/d47/d5b/d6c/d9e/d4e/d4c/f134 x:0 0 0 2026-03-09T17:30:37.587 INFO:tasks.workunit.client.1.vm09.stdout:1/786: link d9/d9e/dc0/d37/d3f/cca d9/d9e/dc0/d91/cf4 0 2026-03-09T17:30:37.587 INFO:tasks.workunit.client.1.vm09.stdout:1/787: dread - d9/d9e/dc9/fd2 zero size 2026-03-09T17:30:37.589 INFO:tasks.workunit.client.1.vm09.stdout:2/760: link d13/d15/l19 
d13/d15/d21/df5/lf7 0 2026-03-09T17:30:37.589 INFO:tasks.workunit.client.1.vm09.stdout:2/761: chown d13/d15/d36/fa1 612442 1 2026-03-09T17:30:37.590 INFO:tasks.workunit.client.1.vm09.stdout:2/762: fdatasync d13/d15/d34/d45/ff6 0 2026-03-09T17:30:37.597 INFO:tasks.workunit.client.1.vm09.stdout:3/743: creat d5/d16/fe3 x:0 0 0 2026-03-09T17:30:37.611 INFO:tasks.workunit.client.1.vm09.stdout:6/778: write d3/d7/d59/d9c/fe3 [4606248,1342] 0 2026-03-09T17:30:37.619 INFO:tasks.workunit.client.1.vm09.stdout:6/779: dread d3/d21/db1/ff0 [0,4194304] 0 2026-03-09T17:30:37.650 INFO:tasks.workunit.client.1.vm09.stdout:5/828: creat d0/d2/d76/d87/d95/d9b/dc0/dce/f10c x:0 0 0 2026-03-09T17:30:37.650 INFO:tasks.workunit.client.1.vm09.stdout:5/829: fsync d0/d52/f108 0 2026-03-09T17:30:37.653 INFO:tasks.workunit.client.1.vm09.stdout:8/813: stat d1/f6e 0 2026-03-09T17:30:37.655 INFO:tasks.workunit.client.1.vm09.stdout:5/830: dwrite d0/d2/d76/d87/d95/f9d [0,4194304] 0 2026-03-09T17:30:37.659 INFO:tasks.workunit.client.1.vm09.stdout:9/806: stat d5/de/d29/d33/l79 0 2026-03-09T17:30:37.665 INFO:tasks.workunit.client.1.vm09.stdout:1/788: fdatasync d9/dc/dd/d40/d1d/f98 0 2026-03-09T17:30:37.666 INFO:tasks.workunit.client.1.vm09.stdout:1/789: truncate d9/d38/d61/feb 929438 0 2026-03-09T17:30:37.668 INFO:tasks.workunit.client.1.vm09.stdout:8/814: dread d1/da/d23/d6c/d32/f6d [0,4194304] 0 2026-03-09T17:30:37.670 INFO:tasks.workunit.client.1.vm09.stdout:3/744: unlink d5/d9/d90/l92 0 2026-03-09T17:30:37.689 INFO:tasks.workunit.client.1.vm09.stdout:2/763: symlink d13/d15/d34/d45/d84/db5/dcf/lf8 0 2026-03-09T17:30:37.690 INFO:tasks.workunit.client.1.vm09.stdout:8/815: stat d1/d14/d2a/d42/d5d/d8a/fec 0 2026-03-09T17:30:37.691 INFO:tasks.workunit.client.1.vm09.stdout:6/780: mknod d3/d7/d59/c104 0 2026-03-09T17:30:37.693 INFO:tasks.workunit.client.1.vm09.stdout:0/810: creat d6/d1d/d24/d32/f10a x:0 0 0 2026-03-09T17:30:37.694 INFO:tasks.workunit.client.1.vm09.stdout:5/831: mkdir d0/d46/d4b/db7/d109/d10d 
0 2026-03-09T17:30:37.695 INFO:tasks.workunit.client.1.vm09.stdout:7/892: creat da/f135 x:0 0 0 2026-03-09T17:30:37.695 INFO:tasks.workunit.client.1.vm09.stdout:5/832: chown d0/d9/d8b/fb3 1 1 2026-03-09T17:30:37.700 INFO:tasks.workunit.client.1.vm09.stdout:1/790: creat d9/de5/dea/ff5 x:0 0 0 2026-03-09T17:30:37.702 INFO:tasks.workunit.client.1.vm09.stdout:3/745: dread d5/d16/d31/d37/d58/f91 [0,4194304] 0 2026-03-09T17:30:37.705 INFO:tasks.workunit.client.1.vm09.stdout:2/764: truncate d13/d15/d34/d45/f6a 1032159 0 2026-03-09T17:30:37.706 INFO:tasks.workunit.client.1.vm09.stdout:8/816: rename d1/d14/d2a/d49/lb0 to d1/da/d23/dc2/lff 0 2026-03-09T17:30:37.706 INFO:tasks.workunit.client.1.vm09.stdout:9/807: dwrite d5/d2e/d8b/fb6 [0,4194304] 0 2026-03-09T17:30:37.708 INFO:tasks.workunit.client.1.vm09.stdout:6/781: creat d3/d21/d76/d5c/d9f/f105 x:0 0 0 2026-03-09T17:30:37.715 INFO:tasks.workunit.client.1.vm09.stdout:3/746: dread d5/d16/d31/d3d/fe [0,4194304] 0 2026-03-09T17:30:37.727 INFO:tasks.workunit.client.1.vm09.stdout:5/833: mkdir d0/d2/d76/d87/da4/d10e 0 2026-03-09T17:30:37.727 INFO:tasks.workunit.client.1.vm09.stdout:7/893: mknod da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/c136 0 2026-03-09T17:30:37.728 INFO:tasks.workunit.client.1.vm09.stdout:1/791: rmdir d9/d9e/dc0/d37 39 2026-03-09T17:30:37.734 INFO:tasks.workunit.client.1.vm09.stdout:1/792: dread d9/dc/dd/d40/f86 [0,4194304] 0 2026-03-09T17:30:37.736 INFO:tasks.workunit.client.1.vm09.stdout:1/793: chown d9/dc/dd/d40/l23 39921 1 2026-03-09T17:30:37.747 INFO:tasks.workunit.client.1.vm09.stdout:6/782: unlink d3/d21/db1/lcb 0 2026-03-09T17:30:37.747 INFO:tasks.workunit.client.1.vm09.stdout:6/783: fdatasync d3/d7/d59/d73/f75 0 2026-03-09T17:30:37.748 INFO:tasks.workunit.client.1.vm09.stdout:0/811: write d6/d1d/d24/d32/d59/d81/fc0 [758051,15724] 0 2026-03-09T17:30:37.754 INFO:tasks.workunit.client.1.vm09.stdout:2/765: write d13/d15/d21/d88/fad [4373977,71602] 0 2026-03-09T17:30:37.757 
INFO:tasks.workunit.client.1.vm09.stdout:8/817: truncate d1/d14/d2a/d42/d5d/d8a/f94 1799447 0 2026-03-09T17:30:37.759 INFO:tasks.workunit.client.1.vm09.stdout:3/747: symlink d5/d16/d31/d37/dae/le4 0 2026-03-09T17:30:37.761 INFO:tasks.workunit.client.1.vm09.stdout:9/808: dwrite d5/de/d29/fc0 [0,4194304] 0 2026-03-09T17:30:37.762 INFO:tasks.workunit.client.1.vm09.stdout:9/809: read - d5/d2e/d8b/de0/df1/f106 zero size 2026-03-09T17:30:37.773 INFO:tasks.workunit.client.1.vm09.stdout:4/776: link d11/d1e/d29/fcc d11/d1e/d45/d60/d71/db7/d89/d8b/ff0 0 2026-03-09T17:30:37.777 INFO:tasks.workunit.client.1.vm09.stdout:7/894: rename da/d11/d47/d89 to da/d11/d64/da7/d137 0 2026-03-09T17:30:37.777 INFO:tasks.workunit.client.1.vm09.stdout:7/895: fsync da/d11/d47/d5b/d78/fea 0 2026-03-09T17:30:37.778 INFO:tasks.workunit.client.1.vm09.stdout:7/896: write da/d11/d47/d5b/d6c/d9e/d4e/f2b [2334007,54095] 0 2026-03-09T17:30:37.783 INFO:tasks.workunit.client.1.vm09.stdout:5/834: truncate d0/d52/d20/f7c 2854699 0 2026-03-09T17:30:37.785 INFO:tasks.workunit.client.1.vm09.stdout:5/835: dwrite d0/d2/d76/d87/d95/f9d [0,4194304] 0 2026-03-09T17:30:37.789 INFO:tasks.workunit.client.1.vm09.stdout:1/794: chown d9/d9e/dc0/d37/d3f/d42/d55/df0 343 1 2026-03-09T17:30:37.807 INFO:tasks.workunit.client.1.vm09.stdout:0/812: mkdir d6/d64/d94/d10b 0 2026-03-09T17:30:37.811 INFO:tasks.workunit.client.1.vm09.stdout:6/784: mknod d3/d21/d76/d5c/d61/d6a/c106 0 2026-03-09T17:30:37.813 INFO:tasks.workunit.client.1.vm09.stdout:8/818: symlink d1/d14/d2a/d42/d43/d44/l100 0 2026-03-09T17:30:37.818 INFO:tasks.workunit.client.1.vm09.stdout:2/766: rmdir d13/d15/d3b/ddf/d85 39 2026-03-09T17:30:37.822 INFO:tasks.workunit.client.1.vm09.stdout:9/810: creat d5/de/d4e/dca/d84/d97/f112 x:0 0 0 2026-03-09T17:30:37.824 INFO:tasks.workunit.client.1.vm09.stdout:6/785: creat d3/d21/d25/d26/d86/dbc/f107 x:0 0 0 2026-03-09T17:30:37.826 INFO:tasks.workunit.client.1.vm09.stdout:6/786: mknod d3/d21/d76/d81/c108 0 
2026-03-09T17:30:37.827 INFO:tasks.workunit.client.1.vm09.stdout:4/777: rename d11/d1e/d29/d36/d57 to d11/d1e/d45/d60/df1 0 2026-03-09T17:30:37.829 INFO:tasks.workunit.client.1.vm09.stdout:2/767: creat d13/d15/d36/d72/ff9 x:0 0 0 2026-03-09T17:30:37.832 INFO:tasks.workunit.client.1.vm09.stdout:5/836: rename d0/d2/d76/d87/da4/dbf to d0/d9/d74/d10f 0 2026-03-09T17:30:37.832 INFO:tasks.workunit.client.1.vm09.stdout:0/813: rename d6/d1d/d24/d32/d59/d9c to d6/d1d/d24/d32/d59/d9c/dac/d10c 22 2026-03-09T17:30:37.833 INFO:tasks.workunit.client.1.vm09.stdout:2/768: chown d13/d15/d34/d37/c70 56 1 2026-03-09T17:30:37.833 INFO:tasks.workunit.client.1.vm09.stdout:2/769: chown d13/d15/d34 20088922 1 2026-03-09T17:30:37.839 INFO:tasks.workunit.client.1.vm09.stdout:5/837: dread d0/dc/d21/d26/f3d [0,4194304] 0 2026-03-09T17:30:37.850 INFO:tasks.workunit.client.1.vm09.stdout:0/814: creat d6/d1d/df0/f10d x:0 0 0 2026-03-09T17:30:37.864 INFO:tasks.workunit.client.1.vm09.stdout:7/897: dwrite da/d11/d47/d5b/d6c/f73 [0,4194304] 0 2026-03-09T17:30:37.864 INFO:tasks.workunit.client.1.vm09.stdout:7/898: dread - da/d11/d3e/da2/fb7 zero size 2026-03-09T17:30:37.866 INFO:tasks.workunit.client.1.vm09.stdout:1/795: dwrite f6 [0,4194304] 0 2026-03-09T17:30:37.868 INFO:tasks.workunit.client.1.vm09.stdout:8/819: dwrite d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:37.879 INFO:tasks.workunit.client.1.vm09.stdout:9/811: write d5/f1e [3303119,73213] 0 2026-03-09T17:30:37.882 INFO:tasks.workunit.client.1.vm09.stdout:9/812: fdatasync d5/de/d29/fc0 0 2026-03-09T17:30:37.882 INFO:tasks.workunit.client.1.vm09.stdout:0/815: readlink d6/d1d/l22 0 2026-03-09T17:30:37.883 INFO:tasks.workunit.client.1.vm09.stdout:3/748: dwrite d5/f22 [0,4194304] 0 2026-03-09T17:30:37.889 INFO:tasks.workunit.client.1.vm09.stdout:1/796: symlink d9/dc/dd/d40/d21/d35/db9/lf6 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:0/816: mknod d6/d64/d97/dd6/c10e 0 2026-03-09T17:30:37.901 
INFO:tasks.workunit.client.1.vm09.stdout:0/817: write d6/d1d/d24/d32/fde [215313,40677] 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:8/820: unlink d1/d14/d2a/d49/c4d 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:9/813: rmdir d5/d91/d99/dc9/dde 39 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:3/749: symlink d5/d9/d30/d65/d59/d84/d8c/de1/le5 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:1/797: creat d9/dc/dd/d9f/de4/ff7 x:0 0 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:8/821: rename d1/da/dd/d63 to d1/da/d23/d71/d101 0 2026-03-09T17:30:37.901 INFO:tasks.workunit.client.1.vm09.stdout:8/822: fdatasync d1/da/dd/d79/fca 0 2026-03-09T17:30:37.904 INFO:tasks.workunit.client.1.vm09.stdout:7/899: dread da/f36 [0,4194304] 0 2026-03-09T17:30:37.905 INFO:tasks.workunit.client.1.vm09.stdout:1/798: mknod d9/d9e/dc0/d37/d3f/d42/d55/cf8 0 2026-03-09T17:30:37.906 INFO:tasks.workunit.client.1.vm09.stdout:1/799: chown d9/dc/dd/d9f/l7f 4473 1 2026-03-09T17:30:37.911 INFO:tasks.workunit.client.1.vm09.stdout:1/800: fdatasync d9/d9e/dc0/d37/d3f/d42/d55/f69 0 2026-03-09T17:30:37.911 INFO:tasks.workunit.client.1.vm09.stdout:8/823: creat d1/d14/d2a/f102 x:0 0 0 2026-03-09T17:30:37.911 INFO:tasks.workunit.client.1.vm09.stdout:9/814: mkdir d5/de/d29/d113 0 2026-03-09T17:30:37.911 INFO:tasks.workunit.client.1.vm09.stdout:4/778: dread d11/f19 [0,4194304] 0 2026-03-09T17:30:37.911 INFO:tasks.workunit.client.1.vm09.stdout:1/801: truncate d9/dc/dd/d40/d21/d6f/fd6 783374 0 2026-03-09T17:30:37.913 INFO:tasks.workunit.client.1.vm09.stdout:8/824: readlink d1/d14/d2a/d42/d5d/d8a/lea 0 2026-03-09T17:30:37.913 INFO:tasks.workunit.client.1.vm09.stdout:4/779: symlink d11/d1e/d29/db5/lf2 0 2026-03-09T17:30:37.917 INFO:tasks.workunit.client.1.vm09.stdout:7/900: truncate da/d11/d47/d5b/d6c/d9e/d4e/d4c/fed 1190029 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:8/825: mkdir d1/da/d3a/d103 
0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:1/802: truncate d9/dc/dd/d40/d1d/f1e 919549 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:1/803: mknod d9/d9e/dc0/d37/da4/cf9 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:8/826: creat d1/da/d23/dc2/da2/ddf/f104 x:0 0 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:4/780: rename d11/fb0 to d11/d1e/d45/daf/ff3 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:4/781: dread - d11/fae zero size 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:9/815: link d5/d21/cd0 d5/de/d29/dd4/df0/c114 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:1/804: truncate d9/dc/dd/d40/d1d/f4d 1362927 0 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:8/827: chown d1/da/d23/d71/d101/cbe 2874485 1 2026-03-09T17:30:37.924 INFO:tasks.workunit.client.1.vm09.stdout:9/816: creat d5/d2e/d8b/db4/f115 x:0 0 0 2026-03-09T17:30:37.928 INFO:tasks.workunit.client.1.vm09.stdout:8/828: rename d1/d14/d2a/d42/d43/f9e to d1/da/d23/dc2/da2/ddf/f105 0 2026-03-09T17:30:37.933 INFO:tasks.workunit.client.1.vm09.stdout:7/901: dread da/d11/d64/fa9 [0,4194304] 0 2026-03-09T17:30:37.933 INFO:tasks.workunit.client.1.vm09.stdout:9/817: mkdir d5/d2e/d8b/d116 0 2026-03-09T17:30:37.944 INFO:tasks.workunit.client.1.vm09.stdout:1/805: getdents d9/d9e/dc0/d91 0 2026-03-09T17:30:37.948 INFO:tasks.workunit.client.1.vm09.stdout:1/806: dwrite d9/dc/dd/fee [0,4194304] 0 2026-03-09T17:30:37.949 INFO:tasks.workunit.client.1.vm09.stdout:7/902: fdatasync da/d11/d2d/d56/da1/f123 0 2026-03-09T17:30:37.968 INFO:tasks.workunit.client.1.vm09.stdout:6/787: dwrite d3/d7/d59/d73/fef [0,4194304] 0 2026-03-09T17:30:37.969 INFO:tasks.workunit.client.1.vm09.stdout:8/829: dread d1/d14/d2a/f2b [0,4194304] 0 2026-03-09T17:30:37.970 INFO:tasks.workunit.client.1.vm09.stdout:1/807: creat d9/d9e/dc0/ffa x:0 0 0 2026-03-09T17:30:37.972 
INFO:tasks.workunit.client.1.vm09.stdout:2/770: write d13/d15/d34/d37/fa0 [487002,15473] 0 2026-03-09T17:30:37.972 INFO:tasks.workunit.client.1.vm09.stdout:7/903: fdatasync da/d11/d2d/f69 0 2026-03-09T17:30:37.974 INFO:tasks.workunit.client.1.vm09.stdout:8/830: creat d1/da/dd/d77/f106 x:0 0 0 2026-03-09T17:30:37.977 INFO:tasks.workunit.client.1.vm09.stdout:6/788: truncate d3/d21/d76/d5c/d7e/dc5/d98/fa6 1634008 0 2026-03-09T17:30:37.977 INFO:tasks.workunit.client.1.vm09.stdout:8/831: unlink d1/da/dd/d79/c9f 0 2026-03-09T17:30:37.978 INFO:tasks.workunit.client.1.vm09.stdout:8/832: write d1/da/dd/faf [3642296,68190] 0 2026-03-09T17:30:37.979 INFO:tasks.workunit.client.1.vm09.stdout:8/833: chown d1/d14/d2a/d42/d5d/d8a/fb8 1561256243 1 2026-03-09T17:30:37.980 INFO:tasks.workunit.client.1.vm09.stdout:8/834: write d1/da/dd/d47/f64 [3063331,85967] 0 2026-03-09T17:30:37.981 INFO:tasks.workunit.client.1.vm09.stdout:6/789: fdatasync d3/d21/d76/d5c/d7e/dc5/d9a/fb4 0 2026-03-09T17:30:37.982 INFO:tasks.workunit.client.1.vm09.stdout:6/790: fdatasync d3/d21/d76/d5c/f92 0 2026-03-09T17:30:37.984 INFO:tasks.workunit.client.1.vm09.stdout:6/791: fdatasync d3/d21/d76/d5c/d9f/f105 0 2026-03-09T17:30:37.984 INFO:tasks.workunit.client.1.vm09.stdout:6/792: chown d3/d7/d59/d73/db0 99 1 2026-03-09T17:30:37.985 INFO:tasks.workunit.client.1.vm09.stdout:6/793: stat d3/d21/d76/d5c/d61/d6a/c106 0 2026-03-09T17:30:37.994 INFO:tasks.workunit.client.1.vm09.stdout:7/904: unlink da/d11/d77/de5/dec/f95 0 2026-03-09T17:30:37.997 INFO:tasks.workunit.client.1.vm09.stdout:5/838: write d0/d46/d4b/feb [1195905,46505] 0 2026-03-09T17:30:38.007 INFO:tasks.workunit.client.1.vm09.stdout:3/750: write d5/d9/d30/d65/d59/d84/fa7 [882996,82354] 0 2026-03-09T17:30:38.016 INFO:tasks.workunit.client.1.vm09.stdout:3/751: dread - d5/d16/fe3 zero size 2026-03-09T17:30:38.016 INFO:tasks.workunit.client.1.vm09.stdout:0/818: dwrite d6/d1d/d39/f53 [0,4194304] 0 2026-03-09T17:30:38.016 
INFO:tasks.workunit.client.1.vm09.stdout:4/782: write d11/d1e/d29/f81 [424640,17173] 0 2026-03-09T17:30:38.019 INFO:tasks.workunit.client.1.vm09.stdout:8/835: sync 2026-03-09T17:30:38.019 INFO:tasks.workunit.client.1.vm09.stdout:3/752: sync 2026-03-09T17:30:38.020 INFO:tasks.workunit.client.1.vm09.stdout:9/818: dwrite d5/de/f65 [0,4194304] 0 2026-03-09T17:30:38.032 INFO:tasks.workunit.client.1.vm09.stdout:4/783: creat d11/d1e/d31/db6/ff4 x:0 0 0 2026-03-09T17:30:38.035 INFO:tasks.workunit.client.1.vm09.stdout:9/819: mknod d5/d2e/d8b/db4/c117 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:3/753: mknod d5/d16/d31/d37/dae/ce6 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:4/784: symlink d11/d1e/d45/d60/df1/dce/lf5 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:1/808: write d9/dc/dd/d40/f86 [2948141,33913] 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:2/771: write d13/d15/d36/d72/d94/fc0 [655957,130264] 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:6/794: write d3/f4f [814078,109520] 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:2/772: rmdir d13/d15/d36/d72/d94/da7/db0 39 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:6/795: dread - d3/d48/f68 zero size 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:6/796: write d3/d21/d76/d5c/d9f/f105 [533730,29423] 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:2/773: dwrite d13/d15/d34/d37/d66/f80 [0,4194304] 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:6/797: chown d3/d7/d99/fcf 282605494 1 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:3/754: getdents d5/d16/d31/d3d/d9f 0 2026-03-09T17:30:38.054 INFO:tasks.workunit.client.1.vm09.stdout:2/774: truncate d13/db3/fbf 558936 0 2026-03-09T17:30:38.057 INFO:tasks.workunit.client.1.vm09.stdout:4/785: dread d11/d1e/d45/d60/d71/db7/f90 [0,4194304] 0 2026-03-09T17:30:38.058 
INFO:tasks.workunit.client.1.vm09.stdout:3/755: dread - d5/d9/d30/d65/d59/fa2 zero size 2026-03-09T17:30:38.060 INFO:tasks.workunit.client.1.vm09.stdout:1/809: sync 2026-03-09T17:30:38.063 INFO:tasks.workunit.client.1.vm09.stdout:7/905: write da/d11/d47/d5b/fe8 [521306,35465] 0 2026-03-09T17:30:38.063 INFO:tasks.workunit.client.1.vm09.stdout:1/810: mkdir d9/de5/dfb 0 2026-03-09T17:30:38.067 INFO:tasks.workunit.client.1.vm09.stdout:4/786: creat d11/d1e/def/ff6 x:0 0 0 2026-03-09T17:30:38.068 INFO:tasks.workunit.client.1.vm09.stdout:5/839: write d0/d9/f7f [8255954,106761] 0 2026-03-09T17:30:38.068 INFO:tasks.workunit.client.1.vm09.stdout:1/811: creat d9/dc/dd/d9f/de4/ffc x:0 0 0 2026-03-09T17:30:38.069 INFO:tasks.workunit.client.1.vm09.stdout:0/819: write d6/d1d/d24/d5e/d6c/fa5 [216275,18430] 0 2026-03-09T17:30:38.069 INFO:tasks.workunit.client.1.vm09.stdout:7/906: write da/d11/d2d/d56/f9f [1993363,42409] 0 2026-03-09T17:30:38.070 INFO:tasks.workunit.client.1.vm09.stdout:8/836: write d1/da/dd/d77/fad [2038350,81870] 0 2026-03-09T17:30:38.070 INFO:tasks.workunit.client.1.vm09.stdout:2/775: link d13/d15/d21/df5/lf7 d13/d15/d34/d45/d84/dcb/da2/lfa 0 2026-03-09T17:30:38.074 INFO:tasks.workunit.client.1.vm09.stdout:5/840: stat d0/c10 0 2026-03-09T17:30:38.075 INFO:tasks.workunit.client.1.vm09.stdout:5/841: chown d0/dc/d21/d33/fa2 1243405783 1 2026-03-09T17:30:38.079 INFO:tasks.workunit.client.1.vm09.stdout:1/812: mkdir d9/d9e/dc0/d37/da4/dfd 0 2026-03-09T17:30:38.082 INFO:tasks.workunit.client.1.vm09.stdout:1/813: dread - d9/dc/dd/d40/d21/d35/fe3 zero size 2026-03-09T17:30:38.082 INFO:tasks.workunit.client.1.vm09.stdout:1/814: readlink d9/dc/dd/d40/l23 0 2026-03-09T17:30:38.083 INFO:tasks.workunit.client.1.vm09.stdout:0/820: mknod d6/d64/db5/c10f 0 2026-03-09T17:30:38.084 INFO:tasks.workunit.client.1.vm09.stdout:0/821: read - d6/d1d/d24/d32/d59/d81/d8c/ff4 zero size 2026-03-09T17:30:38.084 INFO:tasks.workunit.client.1.vm09.stdout:0/822: fdatasync d6/d64/d97/dd6/ffd 0 
2026-03-09T17:30:38.085 INFO:tasks.workunit.client.1.vm09.stdout:8/837: symlink d1/da/d23/d71/l107 0 2026-03-09T17:30:38.086 INFO:tasks.workunit.client.1.vm09.stdout:7/907: rmdir da/d11/d3e/da2/db2 39 2026-03-09T17:30:38.087 INFO:tasks.workunit.client.1.vm09.stdout:7/908: write da/d11/d77/d101/f115 [519273,76180] 0 2026-03-09T17:30:38.090 INFO:tasks.workunit.client.1.vm09.stdout:0/823: stat d6/d1d/d24/d32/l9b 0 2026-03-09T17:30:38.091 INFO:tasks.workunit.client.1.vm09.stdout:0/824: chown d6/d1d/d24/d32/fec 6 1 2026-03-09T17:30:38.092 INFO:tasks.workunit.client.1.vm09.stdout:0/825: read - d6/d1d/d24/d32/d59/d81/d8c/ff4 zero size 2026-03-09T17:30:38.093 INFO:tasks.workunit.client.1.vm09.stdout:0/826: chown d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 15639 1 2026-03-09T17:30:38.103 INFO:tasks.workunit.client.1.vm09.stdout:8/838: dread - d1/d14/fcd zero size 2026-03-09T17:30:38.103 INFO:tasks.workunit.client.1.vm09.stdout:9/820: write d5/de/d29/d90/dc7/fc6 [561482,31530] 0 2026-03-09T17:30:38.110 INFO:tasks.workunit.client.1.vm09.stdout:6/798: write d3/d21/d25/d96/fec [710137,36514] 0 2026-03-09T17:30:38.111 INFO:tasks.workunit.client.1.vm09.stdout:1/815: dread d9/d9e/dc0/d37/d3f/d42/f95 [0,4194304] 0 2026-03-09T17:30:38.113 INFO:tasks.workunit.client.1.vm09.stdout:0/827: readlink d6/d1d/d39/l3d 0 2026-03-09T17:30:38.113 INFO:tasks.workunit.client.1.vm09.stdout:0/828: fdatasync d6/d64/d97/dd6/ffd 0 2026-03-09T17:30:38.115 INFO:tasks.workunit.client.1.vm09.stdout:0/829: write d6/d1d/d24/d32/d59/fb0 [275504,126260] 0 2026-03-09T17:30:38.116 INFO:tasks.workunit.client.1.vm09.stdout:1/816: sync 2026-03-09T17:30:38.119 INFO:tasks.workunit.client.1.vm09.stdout:3/756: dwrite d5/d9/d30/f6a [0,4194304] 0 2026-03-09T17:30:38.125 INFO:tasks.workunit.client.1.vm09.stdout:0/830: dwrite d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 [0,4194304] 0 2026-03-09T17:30:38.128 INFO:tasks.workunit.client.1.vm09.stdout:4/787: dread d11/d1e/d29/d36/f86 [4194304,4194304] 0 2026-03-09T17:30:38.133 
INFO:tasks.workunit.client.1.vm09.stdout:5/842: link d0/dc/l2e d0/dc/d21/d26/d5e/dd4/l110 0 2026-03-09T17:30:38.136 INFO:tasks.workunit.client.1.vm09.stdout:2/776: dwrite d13/f26 [0,4194304] 0 2026-03-09T17:30:38.160 INFO:tasks.workunit.client.1.vm09.stdout:4/788: dread d11/d1e/d45/d60/d71/db7/fa5 [0,4194304] 0 2026-03-09T17:30:38.161 INFO:tasks.workunit.client.1.vm09.stdout:8/839: rmdir d1/d14/d2a/d42/d5d 39 2026-03-09T17:30:38.176 INFO:tasks.workunit.client.1.vm09.stdout:0/831: rmdir d6/d93 39 2026-03-09T17:30:38.189 INFO:tasks.workunit.client.1.vm09.stdout:5/843: mkdir d0/d9/d74/d75/d9f/db6/d111 0 2026-03-09T17:30:38.198 INFO:tasks.workunit.client.1.vm09.stdout:3/757: dread d5/d16/d25/f60 [0,4194304] 0 2026-03-09T17:30:38.200 INFO:tasks.workunit.client.1.vm09.stdout:8/840: rmdir d1/d14/d96 39 2026-03-09T17:30:38.202 INFO:tasks.workunit.client.1.vm09.stdout:7/909: creat da/f138 x:0 0 0 2026-03-09T17:30:38.203 INFO:tasks.workunit.client.1.vm09.stdout:7/910: fsync da/d11/d3e/da2/d11a/f12d 0 2026-03-09T17:30:38.205 INFO:tasks.workunit.client.1.vm09.stdout:9/821: creat d5/de/d29/d33/db8/dfb/f118 x:0 0 0 2026-03-09T17:30:38.207 INFO:tasks.workunit.client.1.vm09.stdout:8/841: dwrite d1/d14/d2a/d49/fe2 [4194304,4194304] 0 2026-03-09T17:30:38.208 INFO:tasks.workunit.client.1.vm09.stdout:8/842: chown d1/da/dd/lee 6 1 2026-03-09T17:30:38.216 INFO:tasks.workunit.client.1.vm09.stdout:6/799: write d3/faf [1838297,125880] 0 2026-03-09T17:30:38.221 INFO:tasks.workunit.client.1.vm09.stdout:1/817: dwrite d9/dc/dd/d40/d21/d6f/f85 [8388608,4194304] 0 2026-03-09T17:30:38.231 INFO:tasks.workunit.client.1.vm09.stdout:1/818: chown d9/dc/laf 270242415 1 2026-03-09T17:30:38.236 INFO:tasks.workunit.client.1.vm09.stdout:5/844: fdatasync d0/d52/f97 0 2026-03-09T17:30:38.237 INFO:tasks.workunit.client.1.vm09.stdout:3/758: rename d5/d9/d30/d65/d59/d84/d8c to d5/d9c/de7 0 2026-03-09T17:30:38.245 INFO:tasks.workunit.client.1.vm09.stdout:0/832: mknod d6/d1d/d24/d5e/dc2/df7/c110 0 
2026-03-09T17:30:38.246 INFO:tasks.workunit.client.1.vm09.stdout:4/789: rmdir d11/d1e/de2 0 2026-03-09T17:30:38.259 INFO:tasks.workunit.client.1.vm09.stdout:8/843: dwrite d1/d14/d2a/f81 [0,4194304] 0 2026-03-09T17:30:38.261 INFO:tasks.workunit.client.1.vm09.stdout:1/819: dwrite d9/f29 [4194304,4194304] 0 2026-03-09T17:30:38.267 INFO:tasks.workunit.client.1.vm09.stdout:9/822: dwrite d5/d21/f46 [0,4194304] 0 2026-03-09T17:30:38.269 INFO:tasks.workunit.client.1.vm09.stdout:2/777: getdents d13/d15/d34/d37/d66 0 2026-03-09T17:30:38.271 INFO:tasks.workunit.client.1.vm09.stdout:7/911: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/f52 [0,4194304] 0 2026-03-09T17:30:38.280 INFO:tasks.workunit.client.1.vm09.stdout:9/823: dwrite d5/f14 [4194304,4194304] 0 2026-03-09T17:30:38.281 INFO:tasks.workunit.client.1.vm09.stdout:7/912: dread - da/d11/d2d/d56/d68/f129 zero size 2026-03-09T17:30:38.282 INFO:tasks.workunit.client.1.vm09.stdout:9/824: dread - d5/d2e/d8b/db4/f115 zero size 2026-03-09T17:30:38.289 INFO:tasks.workunit.client.1.vm09.stdout:4/790: mkdir d11/d1e/d29/d36/df7 0 2026-03-09T17:30:38.289 INFO:tasks.workunit.client.1.vm09.stdout:0/833: mkdir d6/d1d/d24/d32/d59/d9c/dac/dcc/d111 0 2026-03-09T17:30:38.294 INFO:tasks.workunit.client.1.vm09.stdout:5/845: creat d0/d2/d76/d87/d95/d9b/dc0/df4/f112 x:0 0 0 2026-03-09T17:30:38.298 INFO:tasks.workunit.client.1.vm09.stdout:1/820: mknod d9/dc/dd/d9f/de4/cfe 0 2026-03-09T17:30:38.301 INFO:tasks.workunit.client.1.vm09.stdout:1/821: dwrite d9/dc/dd/d9f/de4/fd1 [0,4194304] 0 2026-03-09T17:30:38.312 INFO:tasks.workunit.client.1.vm09.stdout:3/759: creat d5/d9/fe8 x:0 0 0 2026-03-09T17:30:38.317 INFO:tasks.workunit.client.1.vm09.stdout:7/913: fdatasync da/d11/d77/fba 0 2026-03-09T17:30:38.318 INFO:tasks.workunit.client.1.vm09.stdout:9/825: creat d5/de/d88/f119 x:0 0 0 2026-03-09T17:30:38.321 INFO:tasks.workunit.client.1.vm09.stdout:5/846: rmdir d0/d2 39 2026-03-09T17:30:38.321 INFO:tasks.workunit.client.1.vm09.stdout:5/847: chown 
d0/d52/d20/l27 70048484 1 2026-03-09T17:30:38.322 INFO:tasks.workunit.client.1.vm09.stdout:7/914: dread da/f36 [0,4194304] 0 2026-03-09T17:30:38.327 INFO:tasks.workunit.client.1.vm09.stdout:0/834: unlink d6/d1d/d24/d32/d59/d9c/fce 0 2026-03-09T17:30:38.328 INFO:tasks.workunit.client.1.vm09.stdout:6/800: link d3/d21/d25/l29 d3/d21/d76/d5c/d61/d6a/l109 0 2026-03-09T17:30:38.329 INFO:tasks.workunit.client.1.vm09.stdout:6/801: write d3/d21/d25/d26/d86/dbc/f107 [281719,88781] 0 2026-03-09T17:30:38.338 INFO:tasks.workunit.client.1.vm09.stdout:1/822: mkdir d9/d38/d61/dff 0 2026-03-09T17:30:38.340 INFO:tasks.workunit.client.1.vm09.stdout:0/835: dread d6/d1d/d24/d32/d59/d9c/dac/dcc/ff2 [0,4194304] 0 2026-03-09T17:30:38.341 INFO:tasks.workunit.client.1.vm09.stdout:0/836: chown d6/d1d/d24/d32/d59/d81/d8c 333596 1 2026-03-09T17:30:38.342 INFO:tasks.workunit.client.1.vm09.stdout:0/837: fdatasync d6/d1d/d24/d32/d59/d81/d8c/ff5 0 2026-03-09T17:30:38.343 INFO:tasks.workunit.client.1.vm09.stdout:7/915: read da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [7812000,41840] 0 2026-03-09T17:30:38.345 INFO:tasks.workunit.client.1.vm09.stdout:1/823: mknod d9/d38/d61/c100 0 2026-03-09T17:30:38.346 INFO:tasks.workunit.client.1.vm09.stdout:5/848: dwrite d0/d2/d76/d87/da4/fa6 [0,4194304] 0 2026-03-09T17:30:38.352 INFO:tasks.workunit.client.1.vm09.stdout:3/760: sync 2026-03-09T17:30:38.353 INFO:tasks.workunit.client.1.vm09.stdout:8/844: getdents d1/da/dd/d47/d4c 0 2026-03-09T17:30:38.353 INFO:tasks.workunit.client.1.vm09.stdout:1/824: sync 2026-03-09T17:30:38.360 INFO:tasks.workunit.client.1.vm09.stdout:6/802: rename d3/l17 to d3/d21/d76/d5c/d7e/d94/l10a 0 2026-03-09T17:30:38.361 INFO:tasks.workunit.client.1.vm09.stdout:6/803: chown d3/f19 333 1 2026-03-09T17:30:38.361 INFO:tasks.workunit.client.1.vm09.stdout:5/849: read d0/d9/fd2 [1673610,90600] 0 2026-03-09T17:30:38.361 INFO:tasks.workunit.client.1.vm09.stdout:1/825: read - d9/dc/dd/d9f/de4/ff7 zero size 2026-03-09T17:30:38.363 
INFO:tasks.workunit.client.1.vm09.stdout:1/826: chown d9/d38 157439794 1 2026-03-09T17:30:38.365 INFO:tasks.workunit.client.1.vm09.stdout:2/778: getdents d13/d15 0 2026-03-09T17:30:38.365 INFO:tasks.workunit.client.1.vm09.stdout:3/761: dread d5/d16/d25/f60 [0,4194304] 0 2026-03-09T17:30:38.366 INFO:tasks.workunit.client.1.vm09.stdout:5/850: write d0/d2/d76/d87/d95/d9b/dc0/dde/fed [648864,35905] 0 2026-03-09T17:30:38.367 INFO:tasks.workunit.client.1.vm09.stdout:6/804: creat d3/d21/d76/d3f/d8f/f10b x:0 0 0 2026-03-09T17:30:38.374 INFO:tasks.workunit.client.1.vm09.stdout:8/845: read d1/d14/d2a/d42/d5d/d8a/fb8 [1405190,5882] 0 2026-03-09T17:30:38.376 INFO:tasks.workunit.client.1.vm09.stdout:2/779: stat d13/c1f 0 2026-03-09T17:30:38.379 INFO:tasks.workunit.client.1.vm09.stdout:9/826: getdents d5/d21 0 2026-03-09T17:30:38.386 INFO:tasks.workunit.client.1.vm09.stdout:5/851: fsync d0/d46/f4c 0 2026-03-09T17:30:38.386 INFO:tasks.workunit.client.1.vm09.stdout:6/805: creat d3/d21/d25/d26/db7/f10c x:0 0 0 2026-03-09T17:30:38.387 INFO:tasks.workunit.client.1.vm09.stdout:3/762: truncate d5/d9/d30/d65/d59/f81 584343 0 2026-03-09T17:30:38.387 INFO:tasks.workunit.client.1.vm09.stdout:0/838: truncate d6/d1d/d24/d5e/f9e 1875887 0 2026-03-09T17:30:38.388 INFO:tasks.workunit.client.1.vm09.stdout:0/839: stat d6/c29 0 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: pgmap v6: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:38 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:38.392 INFO:tasks.workunit.client.1.vm09.stdout:8/846: rename d1/d14/d2a/d42/d43/f58 to d1/d14/d2a/d42/d43/d44/f108 0 2026-03-09T17:30:38.393 INFO:tasks.workunit.client.1.vm09.stdout:2/780: stat d13/d15/d36/d72/d94/da7/db0 0 2026-03-09T17:30:38.395 INFO:tasks.workunit.client.1.vm09.stdout:8/847: sync 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: pgmap v6: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:38 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:38.396 INFO:tasks.workunit.client.1.vm09.stdout:1/827: link d9/dc/dd/d40/d1d/f98 d9/d38/d61/dff/f101 0 2026-03-09T17:30:38.398 INFO:tasks.workunit.client.1.vm09.stdout:5/852: symlink d0/d2/l113 0 2026-03-09T17:30:38.405 INFO:tasks.workunit.client.1.vm09.stdout:2/781: fdatasync d13/d15/d21/f27 0 2026-03-09T17:30:38.408 INFO:tasks.workunit.client.1.vm09.stdout:3/763: dread d5/d9/d90/db0/f69 [4194304,4194304] 0 2026-03-09T17:30:38.414 INFO:tasks.workunit.client.1.vm09.stdout:1/828: mkdir d9/de5/dea/d102 0 2026-03-09T17:30:38.419 INFO:tasks.workunit.client.1.vm09.stdout:8/848: dread d1/da/d23/dc2/da2/ddf/f105 [0,4194304] 0 2026-03-09T17:30:38.431 INFO:tasks.workunit.client.1.vm09.stdout:2/782: fdatasync d13/d15/d21/f28 0 2026-03-09T17:30:38.431 INFO:tasks.workunit.client.1.vm09.stdout:4/791: write d11/f13 [3199499,114357] 0 2026-03-09T17:30:38.432 INFO:tasks.workunit.client.1.vm09.stdout:3/764: dread - d5/d9/da9/fb6 zero size 2026-03-09T17:30:38.433 INFO:tasks.workunit.client.1.vm09.stdout:3/765: write d5/d9/f4e [3901785,94669] 0 2026-03-09T17:30:38.443 INFO:tasks.workunit.client.1.vm09.stdout:3/766: dwrite d5/d16/d31/d37/f76 [0,4194304] 0 2026-03-09T17:30:38.445 INFO:tasks.workunit.client.1.vm09.stdout:1/829: rmdir d9/d9e 39 2026-03-09T17:30:38.446 INFO:tasks.workunit.client.1.vm09.stdout:1/830: dread - 
d9/d38/d61/fda zero size 2026-03-09T17:30:38.451 INFO:tasks.workunit.client.1.vm09.stdout:7/916: write da/d11/d3e/f60 [5742127,120155] 0 2026-03-09T17:30:38.456 INFO:tasks.workunit.client.1.vm09.stdout:2/783: chown d13/d15/d34/d45/f6a 0 1 2026-03-09T17:30:38.456 INFO:tasks.workunit.client.1.vm09.stdout:7/917: truncate da/d11/d47/dfa/f119 7059 0 2026-03-09T17:30:38.458 INFO:tasks.workunit.client.1.vm09.stdout:4/792: symlink d11/d1e/def/lf8 0 2026-03-09T17:30:38.459 INFO:tasks.workunit.client.1.vm09.stdout:5/853: link d0/d9/cc9 d0/d2/d76/d86/c114 0 2026-03-09T17:30:38.465 INFO:tasks.workunit.client.1.vm09.stdout:4/793: truncate d11/d1e/d45/d60/df1/fbc 625828 0 2026-03-09T17:30:38.465 INFO:tasks.workunit.client.1.vm09.stdout:5/854: read d0/d52/d20/f63 [4542392,44013] 0 2026-03-09T17:30:38.465 INFO:tasks.workunit.client.1.vm09.stdout:7/918: stat da/d11/d3e/da2/db2/fa6 0 2026-03-09T17:30:38.465 INFO:tasks.workunit.client.1.vm09.stdout:2/784: dread d13/d15/d3b/ddf/f97 [0,4194304] 0 2026-03-09T17:30:38.466 INFO:tasks.workunit.client.1.vm09.stdout:5/855: readlink d0/d46/l47 0 2026-03-09T17:30:38.468 INFO:tasks.workunit.client.1.vm09.stdout:3/767: dwrite d5/d9/d30/d65/f3e [0,4194304] 0 2026-03-09T17:30:38.469 INFO:tasks.workunit.client.1.vm09.stdout:2/785: write d13/f89 [4921550,55703] 0 2026-03-09T17:30:38.474 INFO:tasks.workunit.client.1.vm09.stdout:3/768: write d5/d9/d90/db0/fa0 [2946472,128465] 0 2026-03-09T17:30:38.480 INFO:tasks.workunit.client.1.vm09.stdout:5/856: mkdir d0/d115 0 2026-03-09T17:30:38.481 INFO:tasks.workunit.client.1.vm09.stdout:3/769: truncate d5/d9/da9/fb6 860702 0 2026-03-09T17:30:38.482 INFO:tasks.workunit.client.1.vm09.stdout:4/794: getdents d11/d1e/d45 0 2026-03-09T17:30:38.488 INFO:tasks.workunit.client.1.vm09.stdout:5/857: creat d0/d9/d74/d75/d9f/db6/d101/f116 x:0 0 0 2026-03-09T17:30:38.489 INFO:tasks.workunit.client.1.vm09.stdout:7/919: dread da/d11/d77/f79 [0,4194304] 0 2026-03-09T17:30:38.491 INFO:tasks.workunit.client.1.vm09.stdout:4/795: 
mkdir d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9 0 2026-03-09T17:30:38.491 INFO:tasks.workunit.client.1.vm09.stdout:3/770: dwrite d5/d9/d30/f6a [0,4194304] 0 2026-03-09T17:30:38.493 INFO:tasks.workunit.client.1.vm09.stdout:7/920: symlink da/d11/d47/d5b/d6c/d9e/d4e/l139 0 2026-03-09T17:30:38.494 INFO:tasks.workunit.client.1.vm09.stdout:5/858: symlink d0/d9/d74/d104/l117 0 2026-03-09T17:30:38.496 INFO:tasks.workunit.client.1.vm09.stdout:5/859: fsync d0/d9/d74/d75/dbd/ffc 0 2026-03-09T17:30:38.499 INFO:tasks.workunit.client.1.vm09.stdout:4/796: fsync d11/f16 0 2026-03-09T17:30:38.516 INFO:tasks.workunit.client.1.vm09.stdout:7/921: unlink da/d11/c34 0 2026-03-09T17:30:38.520 INFO:tasks.workunit.client.1.vm09.stdout:3/771: creat d5/d9/d30/d65/fe9 x:0 0 0 2026-03-09T17:30:38.527 INFO:tasks.workunit.client.1.vm09.stdout:2/786: dread d13/d15/f20 [0,4194304] 0 2026-03-09T17:30:38.535 INFO:tasks.workunit.client.1.vm09.stdout:4/797: fdatasync d11/d1e/d45/d60/d71/db7/d89/f94 0 2026-03-09T17:30:38.539 INFO:tasks.workunit.client.1.vm09.stdout:5/860: dread d0/dc/dc3/ff7 [0,4194304] 0 2026-03-09T17:30:38.546 INFO:tasks.workunit.client.1.vm09.stdout:5/861: dwrite d0/dc/d21/d26/d5e/d68/d6d/f9e [0,4194304] 0 2026-03-09T17:30:38.557 INFO:tasks.workunit.client.1.vm09.stdout:6/806: dwrite d3/d21/d25/d26/d6b/f79 [0,4194304] 0 2026-03-09T17:30:38.558 INFO:tasks.workunit.client.1.vm09.stdout:9/827: dwrite d5/de/d4e/dca/de7/d93/f10b [0,4194304] 0 2026-03-09T17:30:38.569 INFO:tasks.workunit.client.1.vm09.stdout:0/840: truncate d6/d1d/d24/d32/d59/d81/f82 3154876 0 2026-03-09T17:30:38.569 INFO:tasks.workunit.client.1.vm09.stdout:2/787: mknod d13/d15/d3b/d43/cfb 0 2026-03-09T17:30:38.571 INFO:tasks.workunit.client.1.vm09.stdout:3/772: getdents d5/d16/d31/d37/d58/d8a/da8/ddf 0 2026-03-09T17:30:38.579 INFO:tasks.workunit.client.1.vm09.stdout:8/849: dwrite d1/da/dd/fc0 [0,4194304] 0 2026-03-09T17:30:38.586 INFO:tasks.workunit.client.1.vm09.stdout:1/831: dread f3 [0,4194304] 0 
2026-03-09T17:30:38.587 INFO:tasks.workunit.client.1.vm09.stdout:8/850: chown d1/da/d23/d71/d101/l29 6599283 1 2026-03-09T17:30:38.587 INFO:tasks.workunit.client.1.vm09.stdout:9/828: dwrite d5/d91/d99/dc9/dde/fec [0,4194304] 0 2026-03-09T17:30:38.604 INFO:tasks.workunit.client.1.vm09.stdout:9/829: dwrite d5/d7e/f100 [0,4194304] 0 2026-03-09T17:30:38.632 INFO:tasks.workunit.client.1.vm09.stdout:6/807: dread d3/d21/d76/d3f/f51 [0,4194304] 0 2026-03-09T17:30:38.638 INFO:tasks.workunit.client.1.vm09.stdout:0/841: creat d6/d1d/d24/d32/d59/d9c/dac/f112 x:0 0 0 2026-03-09T17:30:38.638 INFO:tasks.workunit.client.1.vm09.stdout:2/788: symlink d13/d15/d3b/ddf/d90/lfc 0 2026-03-09T17:30:38.639 INFO:tasks.workunit.client.1.vm09.stdout:4/798: truncate d11/d1e/d29/d36/f6a 1723246 0 2026-03-09T17:30:38.640 INFO:tasks.workunit.client.1.vm09.stdout:8/851: dread - d1/d14/fa8 zero size 2026-03-09T17:30:38.646 INFO:tasks.workunit.client.1.vm09.stdout:9/830: chown d5/c98 268809 1 2026-03-09T17:30:38.647 INFO:tasks.workunit.client.1.vm09.stdout:6/808: unlink d3/d21/db1/cb6 0 2026-03-09T17:30:38.647 INFO:tasks.workunit.client.1.vm09.stdout:3/773: mknod d5/d16/d31/d3d/cea 0 2026-03-09T17:30:38.649 INFO:tasks.workunit.client.1.vm09.stdout:1/832: dwrite d9/dc/dd/d9f/de4/dba/fb4 [0,4194304] 0 2026-03-09T17:30:38.651 INFO:tasks.workunit.client.1.vm09.stdout:6/809: dread - d3/d21/d76/d5c/d7e/dc5/d98/fee zero size 2026-03-09T17:30:38.652 INFO:tasks.workunit.client.1.vm09.stdout:6/810: write d3/faf [534313,86926] 0 2026-03-09T17:30:38.653 INFO:tasks.workunit.client.1.vm09.stdout:8/852: unlink d1/da/dd/fc0 0 2026-03-09T17:30:38.653 INFO:tasks.workunit.client.1.vm09.stdout:9/831: truncate d5/f4f 149369 0 2026-03-09T17:30:38.670 INFO:tasks.workunit.client.1.vm09.stdout:3/774: unlink d5/d9/d30/d65/d59/d84/l9b 0 2026-03-09T17:30:38.676 INFO:tasks.workunit.client.1.vm09.stdout:3/775: truncate d5/d9/f88 519195 0 2026-03-09T17:30:38.680 INFO:tasks.workunit.client.1.vm09.stdout:8/853: symlink 
d1/d14/d2a/d42/d5d/l109 0 2026-03-09T17:30:38.680 INFO:tasks.workunit.client.1.vm09.stdout:2/789: link d13/d15/d3b/ddf/d85/ld9 d13/d15/lfd 0 2026-03-09T17:30:38.686 INFO:tasks.workunit.client.1.vm09.stdout:9/832: link d5/de/d4e/dca/d84/db7/lf2 d5/d2e/d8b/l11a 0 2026-03-09T17:30:38.690 INFO:tasks.workunit.client.1.vm09.stdout:8/854: creat d1/da/d23/d6c/d32/dc8/f10a x:0 0 0 2026-03-09T17:30:38.693 INFO:tasks.workunit.client.1.vm09.stdout:2/790: unlink d13/dc8/le8 0 2026-03-09T17:30:38.694 INFO:tasks.workunit.client.1.vm09.stdout:1/833: sync 2026-03-09T17:30:38.700 INFO:tasks.workunit.client.1.vm09.stdout:1/834: symlink d9/d9e/l103 0 2026-03-09T17:30:38.700 INFO:tasks.workunit.client.1.vm09.stdout:9/833: getdents d5/de/df7 0 2026-03-09T17:30:38.701 INFO:tasks.workunit.client.1.vm09.stdout:9/834: chown d5/de/d29/d90/dc7/da9/lc2 109 1 2026-03-09T17:30:38.710 INFO:tasks.workunit.client.1.vm09.stdout:1/835: rename l7 to d9/d9e/l104 0 2026-03-09T17:30:38.722 INFO:tasks.workunit.client.1.vm09.stdout:9/835: rename d5/de/d4e/dca/de7/l77 to d5/de/d29/d33/l11b 0 2026-03-09T17:30:38.724 INFO:tasks.workunit.client.1.vm09.stdout:1/836: fdatasync d9/d9e/dc0/d91/d99/fa5 0 2026-03-09T17:30:38.737 INFO:tasks.workunit.client.1.vm09.stdout:7/922: dwrite da/d11/d64/da7/db1/fb6 [0,4194304] 0 2026-03-09T17:30:38.739 INFO:tasks.workunit.client.1.vm09.stdout:1/837: mkdir d9/dc/dd/d40/ddb/d105 0 2026-03-09T17:30:38.772 INFO:tasks.workunit.client.1.vm09.stdout:4/799: truncate d11/d1e/d45/d60/df1/f67 2210820 0 2026-03-09T17:30:38.773 INFO:tasks.workunit.client.1.vm09.stdout:4/800: stat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/c8d 0 2026-03-09T17:30:38.773 INFO:tasks.workunit.client.1.vm09.stdout:4/801: chown d11/d1e/d29/f2f 79194875 1 2026-03-09T17:30:38.784 INFO:tasks.workunit.client.1.vm09.stdout:5/862: dread d0/d46/f4c [0,4194304] 0 2026-03-09T17:30:38.789 INFO:tasks.workunit.client.1.vm09.stdout:5/863: write d0/dc/d21/d26/f3d [2055181,8522] 0 2026-03-09T17:30:38.822 
INFO:tasks.workunit.client.1.vm09.stdout:5/864: readlink d0/d2/l113 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:4/802: dread d11/f15 [0,4194304] 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:4/803: stat d11/d1e/d45/d60/d71/c92 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:0/842: dwrite d6/d1d/d24/f5d [0,4194304] 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:4/804: stat d11/d1e/d29/f6d 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:0/843: chown d6/d1d/d24/d5e/f67 14822 1 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:0/844: dread - d6/d64/d97/dd6/ffd zero size 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:0/845: read - d6/d1d/d24/d32/d59/d81/d8c/ff4 zero size 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:3/776: write d5/d16/d46/f63 [1528749,8016] 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:3/777: readlink d5/d9/d30/d65/l49 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:6/811: write d3/d7/f40 [1453084,67531] 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:6/812: dread - d3/d7/d59/d5a/fed zero size 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:4/805: truncate d11/fa4 3383968 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:8/855: getdents d1/da/d23/d6c/d32/dc8 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:0/846: symlink d6/d1d/d24/d5e/dc2/l113 0 2026-03-09T17:30:38.822 INFO:tasks.workunit.client.1.vm09.stdout:5/865: link d0/dc/d21/d26/d5e/d68/f85 d0/d2/d76/f118 0 2026-03-09T17:30:38.826 INFO:tasks.workunit.client.1.vm09.stdout:4/806: fdatasync d11/d1e/fe4 0 2026-03-09T17:30:38.834 INFO:tasks.workunit.client.1.vm09.stdout:8/856: mknod d1/d14/d2a/c10b 0 2026-03-09T17:30:38.847 INFO:tasks.workunit.client.1.vm09.stdout:9/836: write d5/d91/d99/fa4 [533070,74671] 0 2026-03-09T17:30:38.847 
INFO:tasks.workunit.client.1.vm09.stdout:9/837: chown d5/d2e/f82 2000398690 1 2026-03-09T17:30:38.848 INFO:tasks.workunit.client.1.vm09.stdout:1/838: write d9/dc/dd/d40/d1d/f1e [895444,38543] 0 2026-03-09T17:30:38.848 INFO:tasks.workunit.client.1.vm09.stdout:7/923: write da/d11/d47/d5b/d6c/d9e/d4e/fae [471645,91602] 0 2026-03-09T17:30:38.852 INFO:tasks.workunit.client.1.vm09.stdout:1/839: chown d9/d9e/dc0/d37/d3f/d42/d55/l51 7072728 1 2026-03-09T17:30:38.852 INFO:tasks.workunit.client.1.vm09.stdout:1/840: write d9/ddd/fe7 [810764,85504] 0 2026-03-09T17:30:38.852 INFO:tasks.workunit.client.1.vm09.stdout:1/841: truncate d9/dc/dd/d40/d21/d35/d88/ff1 502249 0 2026-03-09T17:30:38.867 INFO:tasks.workunit.client.1.vm09.stdout:6/813: write d3/d21/d76/d3f/fb8 [3304043,119253] 0 2026-03-09T17:30:38.870 INFO:tasks.workunit.client.1.vm09.stdout:0/847: dread d6/d1d/d24/d32/f45 [0,4194304] 0 2026-03-09T17:30:38.882 INFO:tasks.workunit.client.1.vm09.stdout:2/791: creat d13/d15/d21/d88/db8/dd1/de5/ffe x:0 0 0 2026-03-09T17:30:38.890 INFO:tasks.workunit.client.1.vm09.stdout:1/842: mknod d9/dc/dd/d40/ddb/c106 0 2026-03-09T17:30:38.892 INFO:tasks.workunit.client.1.vm09.stdout:7/924: read da/f16 [1067257,78773] 0 2026-03-09T17:30:38.892 INFO:tasks.workunit.client.1.vm09.stdout:7/925: chown da/d11/d47/dfa 33 1 2026-03-09T17:30:38.897 INFO:tasks.workunit.client.1.vm09.stdout:3/778: truncate d5/d16/d25/f2c 1357291 0 2026-03-09T17:30:38.897 INFO:tasks.workunit.client.1.vm09.stdout:4/807: write d11/d1e/d29/f2f [4130859,22663] 0 2026-03-09T17:30:38.903 INFO:tasks.workunit.client.1.vm09.stdout:8/857: rename d1/f33 to d1/da/d23/dc2/f10c 0 2026-03-09T17:30:38.904 INFO:tasks.workunit.client.1.vm09.stdout:9/838: mkdir d5/de/d29/d90/dc7/da9/d104/d11c 0 2026-03-09T17:30:38.904 INFO:tasks.workunit.client.1.vm09.stdout:6/814: write d3/d7/d59/d9c/fdc [301823,99838] 0 2026-03-09T17:30:38.909 INFO:tasks.workunit.client.1.vm09.stdout:5/866: dwrite d0/dc/d21/d26/d5e/d68/d79/fc7 [0,4194304] 0 
2026-03-09T17:30:38.909 INFO:tasks.workunit.client.1.vm09.stdout:4/808: rename d11/f1c to d11/d1e/d45/d60/df1/ffa 0 2026-03-09T17:30:38.912 INFO:tasks.workunit.client.1.vm09.stdout:6/815: fdatasync d3/d21/d76/d3f/f9d 0 2026-03-09T17:30:38.915 INFO:tasks.workunit.client.1.vm09.stdout:1/843: mkdir d9/dc/dd/d9f/d107 0 2026-03-09T17:30:38.916 INFO:tasks.workunit.client.1.vm09.stdout:0/848: link d6/d64/fa7 d6/d64/db5/f114 0 2026-03-09T17:30:38.917 INFO:tasks.workunit.client.1.vm09.stdout:5/867: write d0/d2/d76/d87/d95/d9b/dc0/df4/f112 [539603,102580] 0 2026-03-09T17:30:38.919 INFO:tasks.workunit.client.1.vm09.stdout:5/868: fdatasync d0/d9/d74/d75/fee 0 2026-03-09T17:30:38.919 INFO:tasks.workunit.client.1.vm09.stdout:2/792: rename d13/d15/d34/d45/d84/dd7 to d13/d4d/daa/dff 0 2026-03-09T17:30:38.920 INFO:tasks.workunit.client.1.vm09.stdout:2/793: dread - d13/dc8/fee zero size 2026-03-09T17:30:38.921 INFO:tasks.workunit.client.1.vm09.stdout:6/816: truncate d3/d7/d59/d73/fa7 957791 0 2026-03-09T17:30:38.923 INFO:tasks.workunit.client.1.vm09.stdout:3/779: creat d5/d16/d31/d3d/feb x:0 0 0 2026-03-09T17:30:38.925 INFO:tasks.workunit.client.1.vm09.stdout:5/869: symlink d0/d2/d76/d87/d95/d9b/l119 0 2026-03-09T17:30:38.928 INFO:tasks.workunit.client.1.vm09.stdout:2/794: read d13/d15/d3b/d43/f46 [121256,90130] 0 2026-03-09T17:30:38.929 INFO:tasks.workunit.client.1.vm09.stdout:9/839: getdents d5/de/d29/dd4/df0 0 2026-03-09T17:30:38.930 INFO:tasks.workunit.client.1.vm09.stdout:4/809: rmdir d11/d1e/d45/d60/df1/d78/dc1 0 2026-03-09T17:30:38.931 INFO:tasks.workunit.client.1.vm09.stdout:0/849: link d6/d1d/d24/d32/d59/l71 d6/d64/d97/dd6/l115 0 2026-03-09T17:30:38.933 INFO:tasks.workunit.client.1.vm09.stdout:3/780: dwrite d5/d9/d30/d65/d59/d84/fab [0,4194304] 0 2026-03-09T17:30:38.938 INFO:tasks.workunit.client.1.vm09.stdout:6/817: mkdir d3/d7/d10d 0 2026-03-09T17:30:38.938 INFO:tasks.workunit.client.1.vm09.stdout:9/840: mknod d5/d91/d99/dc9/dde/c11d 0 2026-03-09T17:30:38.942 
INFO:tasks.workunit.client.1.vm09.stdout:0/850: readlink d6/d1d/d24/d32/d59/l71 0 2026-03-09T17:30:38.942 INFO:tasks.workunit.client.1.vm09.stdout:9/841: dwrite d5/d2e/fef [0,4194304] 0 2026-03-09T17:30:38.950 INFO:tasks.workunit.client.1.vm09.stdout:2/795: creat d13/f100 x:0 0 0 2026-03-09T17:30:38.953 INFO:tasks.workunit.client.1.vm09.stdout:7/926: write da/d11/d2d/f59 [9717,3597] 0 2026-03-09T17:30:38.953 INFO:tasks.workunit.client.1.vm09.stdout:1/844: write d9/f97 [382057,84196] 0 2026-03-09T17:30:38.957 INFO:tasks.workunit.client.1.vm09.stdout:0/851: mkdir d6/d116 0 2026-03-09T17:30:38.962 INFO:tasks.workunit.client.1.vm09.stdout:0/852: chown d6/d1d 1300 1 2026-03-09T17:30:38.962 INFO:tasks.workunit.client.1.vm09.stdout:9/842: mknod d5/d91/c11e 0 2026-03-09T17:30:38.964 INFO:tasks.workunit.client.1.vm09.stdout:7/927: fdatasync da/d11/d64/da7/d137/fb4 0 2026-03-09T17:30:38.968 INFO:tasks.workunit.client.1.vm09.stdout:3/781: dread d5/d9/da9/fc9 [0,4194304] 0 2026-03-09T17:30:38.973 INFO:tasks.workunit.client.1.vm09.stdout:6/818: link d3/d21/d76/d5c/f78 d3/d21/d76/d5c/d7e/d94/f10e 0 2026-03-09T17:30:38.973 INFO:tasks.workunit.client.1.vm09.stdout:6/819: stat d3/d21/d76/d3f/f9d 0 2026-03-09T17:30:38.976 INFO:tasks.workunit.client.1.vm09.stdout:1/845: creat d9/de5/dfb/f108 x:0 0 0 2026-03-09T17:30:38.978 INFO:tasks.workunit.client.1.vm09.stdout:0/853: truncate d6/d1d/d24/d32/d59/d9c/dac/dcc/f108 224513 0 2026-03-09T17:30:38.978 INFO:tasks.workunit.client.1.vm09.stdout:5/870: dread d0/d9/d16/d5c/f9c [0,4194304] 0 2026-03-09T17:30:38.979 INFO:tasks.workunit.client.1.vm09.stdout:6/820: mkdir d3/d48/d10f 0 2026-03-09T17:30:38.984 INFO:tasks.workunit.client.1.vm09.stdout:7/928: symlink da/d11/d64/da7/d137/dbe/d11d/l13a 0 2026-03-09T17:30:38.985 INFO:tasks.workunit.client.1.vm09.stdout:7/929: fdatasync da/d11/d64/d11f/f12c 0 2026-03-09T17:30:38.985 INFO:tasks.workunit.client.1.vm09.stdout:1/846: truncate d9/d9e/dc0/d37/f41 3180863 0 2026-03-09T17:30:38.986 
INFO:tasks.workunit.client.1.vm09.stdout:3/782: mknod d5/d16/d31/d37/d58/cec 0 2026-03-09T17:30:38.987 INFO:tasks.workunit.client.1.vm09.stdout:3/783: fsync d5/d9/d30/d65/f1d 0 2026-03-09T17:30:38.988 INFO:tasks.workunit.client.1.vm09.stdout:5/871: unlink d0/d2/d76/d86/f50 0 2026-03-09T17:30:38.990 INFO:tasks.workunit.client.1.vm09.stdout:6/821: unlink d3/d21/d25/d26/d86/dbe/cd0 0 2026-03-09T17:30:38.991 INFO:tasks.workunit.client.1.vm09.stdout:5/872: dwrite d0/d2/d76/d86/f6b [0,4194304] 0 2026-03-09T17:30:39.004 INFO:tasks.workunit.client.1.vm09.stdout:5/873: fsync d0/dc/d21/d6f/f10a 0 2026-03-09T17:30:39.010 INFO:tasks.workunit.client.1.vm09.stdout:9/843: truncate d5/de/d29/da7/fb3 2457250 0 2026-03-09T17:30:39.017 INFO:tasks.workunit.client.1.vm09.stdout:8/858: dwrite d1/da/d23/dc2/da2/ddf/f105 [0,4194304] 0 2026-03-09T17:30:39.017 INFO:tasks.workunit.client.1.vm09.stdout:2/796: dwrite d13/d15/d3b/ddf/d85/faf [0,4194304] 0 2026-03-09T17:30:39.021 INFO:tasks.workunit.client.1.vm09.stdout:5/874: dwrite d0/d9/d74/d75/dbd/f105 [0,4194304] 0 2026-03-09T17:30:39.025 INFO:tasks.workunit.client.1.vm09.stdout:5/875: dread d0/d9/d16/d5c/f9c [0,4194304] 0 2026-03-09T17:30:39.028 INFO:tasks.workunit.client.1.vm09.stdout:0/854: mkdir d6/d116/d117 0 2026-03-09T17:30:39.034 INFO:tasks.workunit.client.1.vm09.stdout:1/847: unlink d9/d9e/dc0/d37/d3f/d42/c72 0 2026-03-09T17:30:39.035 INFO:tasks.workunit.client.1.vm09.stdout:1/848: chown d9/d5a/fa0 1 1 2026-03-09T17:30:39.035 INFO:tasks.workunit.client.1.vm09.stdout:4/810: dwrite d11/f15 [4194304,4194304] 0 2026-03-09T17:30:39.037 INFO:tasks.workunit.client.1.vm09.stdout:2/797: symlink d13/da4/l101 0 2026-03-09T17:30:39.038 INFO:tasks.workunit.client.1.vm09.stdout:0/855: read d6/d1d/d39/f53 [3311044,52533] 0 2026-03-09T17:30:39.038 INFO:tasks.workunit.client.1.vm09.stdout:4/811: creat d11/d1e/d45/d60/d71/ffb x:0 0 0 2026-03-09T17:30:39.042 INFO:tasks.workunit.client.1.vm09.stdout:1/849: mkdir d9/d38/d61/dff/d109 0 
2026-03-09T17:30:39.042 INFO:tasks.workunit.client.1.vm09.stdout:8/859: rename d1/da/d23/dc2/lff to d1/da/d3a/l10d 0 2026-03-09T17:30:39.043 INFO:tasks.workunit.client.1.vm09.stdout:0/856: mknod d6/d1d/d24/d5e/d6c/c118 0 2026-03-09T17:30:39.043 INFO:tasks.workunit.client.1.vm09.stdout:7/930: link da/d11/d47/d5b/d6c/d9e/d4e/d5f/f110 da/d11/d77/f13b 0 2026-03-09T17:30:39.043 INFO:tasks.workunit.client.1.vm09.stdout:1/850: creat d9/de5/dea/f10a x:0 0 0 2026-03-09T17:30:39.047 INFO:tasks.workunit.client.1.vm09.stdout:8/860: write d1/d14/d2a/fe9 [3549060,67835] 0 2026-03-09T17:30:39.049 INFO:tasks.workunit.client.1.vm09.stdout:6/822: rename d3/d7/d59/d5a/fed to d3/d21/db1/f110 0 2026-03-09T17:30:39.049 INFO:tasks.workunit.client.1.vm09.stdout:4/812: getdents d11/d1e/d45/d60/df1/d78 0 2026-03-09T17:30:39.051 INFO:tasks.workunit.client.1.vm09.stdout:7/931: symlink da/d11/d47/d5b/d6c/l13c 0 2026-03-09T17:30:39.053 INFO:tasks.workunit.client.1.vm09.stdout:4/813: fsync d11/d1e/d45/d60/d71/db7/f90 0 2026-03-09T17:30:39.054 INFO:tasks.workunit.client.1.vm09.stdout:6/823: chown d3/d21/d25/d26/db7/f10c 224 1 2026-03-09T17:30:39.055 INFO:tasks.workunit.client.1.vm09.stdout:8/861: fdatasync d1/d14/d2a/f8b 0 2026-03-09T17:30:39.056 INFO:tasks.workunit.client.1.vm09.stdout:8/862: fsync d1/d14/d2a/d42/d5d/d8a/fec 0 2026-03-09T17:30:39.061 INFO:tasks.workunit.client.1.vm09.stdout:4/814: fdatasync d11/f25 0 2026-03-09T17:30:39.064 INFO:tasks.workunit.client.1.vm09.stdout:1/851: dread d9/f59 [0,4194304] 0 2026-03-09T17:30:39.068 INFO:tasks.workunit.client.1.vm09.stdout:7/932: dwrite da/d11/f3f [12582912,4194304] 0 2026-03-09T17:30:39.069 INFO:tasks.workunit.client.1.vm09.stdout:4/815: mknod d11/dc8/cfc 0 2026-03-09T17:30:39.071 INFO:tasks.workunit.client.1.vm09.stdout:0/857: dread d6/d1d/d24/d32/d59/d9c/dac/fe6 [0,4194304] 0 2026-03-09T17:30:39.071 INFO:tasks.workunit.client.1.vm09.stdout:6/824: read d3/d7/d59/d73/f82 [372659,26830] 0 2026-03-09T17:30:39.074 
INFO:tasks.workunit.client.1.vm09.stdout:4/816: creat d11/d1e/d45/ffd x:0 0 0 2026-03-09T17:30:39.074 INFO:tasks.workunit.client.1.vm09.stdout:7/933: mkdir da/d11/d2d/d56/da1/d13d 0 2026-03-09T17:30:39.075 INFO:tasks.workunit.client.1.vm09.stdout:6/825: dread - d3/d21/d76/d88/fc1 zero size 2026-03-09T17:30:39.076 INFO:tasks.workunit.client.1.vm09.stdout:4/817: creat d11/d1e/d31/db6/ffe x:0 0 0 2026-03-09T17:30:39.079 INFO:tasks.workunit.client.1.vm09.stdout:7/934: mkdir da/d11/d47/d5b/d6c/d9e/d4e/d4c/d13e 0 2026-03-09T17:30:39.082 INFO:tasks.workunit.client.1.vm09.stdout:7/935: stat da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/fe0 0 2026-03-09T17:30:39.091 INFO:tasks.workunit.client.1.vm09.stdout:1/852: sync 2026-03-09T17:30:39.095 INFO:tasks.workunit.client.1.vm09.stdout:1/853: stat d9/dc/f47 0 2026-03-09T17:30:39.095 INFO:tasks.workunit.client.1.vm09.stdout:0/858: dread d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:39.097 INFO:tasks.workunit.client.1.vm09.stdout:6/826: dwrite d3/d21/d25/d26/d86/dbc/f107 [0,4194304] 0 2026-03-09T17:30:39.099 INFO:tasks.workunit.client.1.vm09.stdout:0/859: mknod d6/d1d/df0/c119 0 2026-03-09T17:30:39.101 INFO:tasks.workunit.client.1.vm09.stdout:1/854: fsync d9/dc/dd/d40/d1d/f98 0 2026-03-09T17:30:39.101 INFO:tasks.workunit.client.1.vm09.stdout:1/855: stat d9/dc/dd/d40/d1d/c4b 0 2026-03-09T17:30:39.109 INFO:tasks.workunit.client.1.vm09.stdout:0/860: dread d6/d1d/d24/d32/d59/d9c/dac/fe6 [0,4194304] 0 2026-03-09T17:30:39.113 INFO:tasks.workunit.client.1.vm09.stdout:0/861: mkdir d6/d1d/d24/d5e/dc2/d11a 0 2026-03-09T17:30:39.121 INFO:tasks.workunit.client.1.vm09.stdout:0/862: dwrite d6/d1d/d24/d5e/db2/fb9 [0,4194304] 0 2026-03-09T17:30:39.129 INFO:tasks.workunit.client.1.vm09.stdout:0/863: symlink d6/d1d/d24/d32/d59/d81/d8c/l11b 0 2026-03-09T17:30:39.132 INFO:tasks.workunit.client.1.vm09.stdout:0/864: truncate d6/d1d/d24/f50 1235939 0 2026-03-09T17:30:39.138 INFO:tasks.workunit.client.1.vm09.stdout:0/865: dwrite d6/d1d/d39/f100 [0,4194304] 0 
2026-03-09T17:30:39.157 INFO:tasks.workunit.client.1.vm09.stdout:3/784: truncate d5/d9/d30/f6a 1194571 0 2026-03-09T17:30:39.158 INFO:tasks.workunit.client.1.vm09.stdout:5/876: write d0/dc/d21/d33/f65 [743643,71758] 0 2026-03-09T17:30:39.163 INFO:tasks.workunit.client.1.vm09.stdout:3/785: creat d5/d9c/de7/fed x:0 0 0 2026-03-09T17:30:39.165 INFO:tasks.workunit.client.1.vm09.stdout:9/844: dwrite d5/d2e/f5a [0,4194304] 0 2026-03-09T17:30:39.170 INFO:tasks.workunit.client.1.vm09.stdout:9/845: chown d5/f11 0 1 2026-03-09T17:30:39.179 INFO:tasks.workunit.client.1.vm09.stdout:3/786: dwrite d5/d16/d25/f2b [0,4194304] 0 2026-03-09T17:30:39.182 INFO:tasks.workunit.client.1.vm09.stdout:9/846: dwrite d5/d2e/d8b/fb6 [0,4194304] 0 2026-03-09T17:30:39.189 INFO:tasks.workunit.client.1.vm09.stdout:3/787: rename d5/d9/d30/fc6 to d5/d16/d31/d37/d58/d8a/da8/ddf/fee 0 2026-03-09T17:30:39.194 INFO:tasks.workunit.client.1.vm09.stdout:2/798: write d13/d15/d3b/d43/f46 [471937,32647] 0 2026-03-09T17:30:39.195 INFO:tasks.workunit.client.1.vm09.stdout:3/788: creat d5/d16/d31/d37/fef x:0 0 0 2026-03-09T17:30:39.195 INFO:tasks.workunit.client.1.vm09.stdout:5/877: dread d0/d9/d8b/fc2 [0,4194304] 0 2026-03-09T17:30:39.197 INFO:tasks.workunit.client.1.vm09.stdout:5/878: dread - d0/dc/d21/d6f/f10a zero size 2026-03-09T17:30:39.197 INFO:tasks.workunit.client.1.vm09.stdout:3/789: fsync d5/d9/d30/d65/d59/d84/f6e 0 2026-03-09T17:30:39.198 INFO:tasks.workunit.client.1.vm09.stdout:3/790: dread - d5/d9/d30/d65/fdd zero size 2026-03-09T17:30:39.209 INFO:tasks.workunit.client.1.vm09.stdout:5/879: dwrite d0/dc/d21/d6f/f5f [0,4194304] 0 2026-03-09T17:30:39.212 INFO:tasks.workunit.client.1.vm09.stdout:2/799: dwrite d13/d15/d21/d88/db8/dd1/de5/ffe [0,4194304] 0 2026-03-09T17:30:39.223 INFO:tasks.workunit.client.1.vm09.stdout:5/880: read d0/d2/f5d [2748051,122302] 0 2026-03-09T17:30:39.227 INFO:tasks.workunit.client.1.vm09.stdout:4/818: write d11/d1e/fe6 [235196,13225] 0 2026-03-09T17:30:39.230 
INFO:tasks.workunit.client.1.vm09.stdout:5/881: rename d0/d52/l67 to d0/dc/d21/d26/d5e/d68/d79/l11a 0 2026-03-09T17:30:39.230 INFO:tasks.workunit.client.1.vm09.stdout:5/882: readlink d0/d9/d74/d104/l117 0 2026-03-09T17:30:39.240 INFO:tasks.workunit.client.1.vm09.stdout:8/863: dwrite d1/d14/d2a/f8b [0,4194304] 0 2026-03-09T17:30:39.240 INFO:tasks.workunit.client.1.vm09.stdout:8/864: creat d1/da/d23/d71/d101/f10e x:0 0 0 2026-03-09T17:30:39.245 INFO:tasks.workunit.client.1.vm09.stdout:8/865: getdents d1/da/d23/dc2/da2 0 2026-03-09T17:30:39.248 INFO:tasks.workunit.client.1.vm09.stdout:7/936: write da/d11/d47/d5b/d6c/d9e/d4e/d5f/f110 [456674,43196] 0 2026-03-09T17:30:39.260 INFO:tasks.workunit.client.1.vm09.stdout:7/937: rmdir da/d11/d2d/d56/da1 39 2026-03-09T17:30:39.263 INFO:tasks.workunit.client.1.vm09.stdout:4/819: dread fd [4194304,4194304] 0 2026-03-09T17:30:39.265 INFO:tasks.workunit.client.1.vm09.stdout:2/800: dread d13/d15/d21/d88/fad [0,4194304] 0 2026-03-09T17:30:39.266 INFO:tasks.workunit.client.1.vm09.stdout:0/866: dwrite d6/d1d/f3c [0,4194304] 0 2026-03-09T17:30:39.266 INFO:tasks.workunit.client.1.vm09.stdout:6/827: dwrite d3/d48/f68 [0,4194304] 0 2026-03-09T17:30:39.269 INFO:tasks.workunit.client.1.vm09.stdout:1/856: dwrite d9/d38/d61/fd0 [0,4194304] 0 2026-03-09T17:30:39.271 INFO:tasks.workunit.client.1.vm09.stdout:4/820: mkdir d11/d1e/d45/daf/dff 0 2026-03-09T17:30:39.272 INFO:tasks.workunit.client.1.vm09.stdout:3/791: dwrite d5/d16/d31/d37/f5b [0,4194304] 0 2026-03-09T17:30:39.275 INFO:tasks.workunit.client.1.vm09.stdout:8/866: sync 2026-03-09T17:30:39.275 INFO:tasks.workunit.client.1.vm09.stdout:0/867: mkdir d6/d1d/d24/d5e/dc2/d11c 0 2026-03-09T17:30:39.275 INFO:tasks.workunit.client.1.vm09.stdout:7/938: creat da/d11/d64/da7/d137/f13f x:0 0 0 2026-03-09T17:30:39.275 INFO:tasks.workunit.client.1.vm09.stdout:2/801: read - d13/d15/d36/fa1 zero size 2026-03-09T17:30:39.276 INFO:tasks.workunit.client.1.vm09.stdout:8/867: readlink d1/da/d23/d6c/l40 0 
2026-03-09T17:30:39.280 INFO:tasks.workunit.client.1.vm09.stdout:3/792: stat d5/d9c/de7/de1/le5 0 2026-03-09T17:30:39.283 INFO:tasks.workunit.client.1.vm09.stdout:0/868: chown d6/d1d/l76 518977766 1 2026-03-09T17:30:39.287 INFO:tasks.workunit.client.1.vm09.stdout:8/868: creat d1/da/d23/d6c/f10f x:0 0 0 2026-03-09T17:30:39.289 INFO:tasks.workunit.client.1.vm09.stdout:2/802: dread - d13/d15/d3b/ddf/fc9 zero size 2026-03-09T17:30:39.292 INFO:tasks.workunit.client.1.vm09.stdout:3/793: unlink d5/d9/d90/db0/le2 0 2026-03-09T17:30:39.294 INFO:tasks.workunit.client.1.vm09.stdout:3/794: fdatasync d5/d9/d30/d65/f3e 0 2026-03-09T17:30:39.301 INFO:tasks.workunit.client.1.vm09.stdout:8/869: symlink d1/da/d23/dc2/l110 0 2026-03-09T17:30:39.304 INFO:tasks.workunit.client.1.vm09.stdout:4/821: getdents d11/d1e/d45/daf/dff 0 2026-03-09T17:30:39.304 INFO:tasks.workunit.client.1.vm09.stdout:0/869: read d6/f9 [2609033,6522] 0 2026-03-09T17:30:39.307 INFO:tasks.workunit.client.1.vm09.stdout:1/857: symlink d9/d9e/dc0/d37/d3f/d42/d55/de0/l10b 0 2026-03-09T17:30:39.309 INFO:tasks.workunit.client.1.vm09.stdout:3/795: mknod d5/d9c/de7/cf0 0 2026-03-09T17:30:39.313 INFO:tasks.workunit.client.1.vm09.stdout:1/858: fdatasync d9/dc/dd/d40/d21/d6f/fd6 0 2026-03-09T17:30:39.314 INFO:tasks.workunit.client.1.vm09.stdout:4/822: creat d11/d1e/d45/d60/d71/db7/d89/d8b/dd8/f100 x:0 0 0 2026-03-09T17:30:39.314 INFO:tasks.workunit.client.1.vm09.stdout:3/796: readlink d5/d9/d30/d65/l49 0 2026-03-09T17:30:39.315 INFO:tasks.workunit.client.1.vm09.stdout:8/870: unlink d1/d14/d2a/d49/cf8 0 2026-03-09T17:30:39.317 INFO:tasks.workunit.client.1.vm09.stdout:0/870: mkdir d6/d64/d97/dc9/dfc/d11d 0 2026-03-09T17:30:39.318 INFO:tasks.workunit.client.1.vm09.stdout:8/871: fsync d1/da/dd/d47/f64 0 2026-03-09T17:30:39.319 INFO:tasks.workunit.client.1.vm09.stdout:2/803: getdents d13/d15/d3b/d43 0 2026-03-09T17:30:39.319 INFO:tasks.workunit.client.1.vm09.stdout:9/847: dread d5/d2e/d8b/db4/ff8 [0,4194304] 0 
2026-03-09T17:30:39.320 INFO:tasks.workunit.client.1.vm09.stdout:1/859: mknod d9/d38/c10c 0 2026-03-09T17:30:39.324 INFO:tasks.workunit.client.1.vm09.stdout:9/848: truncate d5/d91/d99/dc9/dde/fdf 19146 0 2026-03-09T17:30:39.325 INFO:tasks.workunit.client.1.vm09.stdout:2/804: chown d13/d15/f7e 520156255 1 2026-03-09T17:30:39.326 INFO:tasks.workunit.client.1.vm09.stdout:2/805: rename d13/d15/d34 to d13/d15/d34/d37/d6f/d102 22 2026-03-09T17:30:39.327 INFO:tasks.workunit.client.1.vm09.stdout:9/849: creat d5/d2e/f11f x:0 0 0 2026-03-09T17:30:39.328 INFO:tasks.workunit.client.1.vm09.stdout:2/806: creat d13/d15/d34/d37/d6f/f103 x:0 0 0 2026-03-09T17:30:39.338 INFO:tasks.workunit.client.1.vm09.stdout:0/871: dread d6/f63 [0,4194304] 0 2026-03-09T17:30:39.338 INFO:tasks.workunit.client.1.vm09.stdout:0/872: readlink d6/d1d/d24/d32/l43 0 2026-03-09T17:30:39.338 INFO:tasks.workunit.client.1.vm09.stdout:4/823: sync 2026-03-09T17:30:39.340 INFO:tasks.workunit.client.1.vm09.stdout:0/873: creat d6/d64/d97/dc9/dfc/f11e x:0 0 0 2026-03-09T17:30:39.341 INFO:tasks.workunit.client.1.vm09.stdout:2/807: dread d13/d4d/f5c [0,4194304] 0 2026-03-09T17:30:39.342 INFO:tasks.workunit.client.1.vm09.stdout:2/808: chown d13/d15/d34/f5e 0 1 2026-03-09T17:30:39.342 INFO:tasks.workunit.client.1.vm09.stdout:0/874: creat d6/d1d/d24/d5e/dc2/df7/f11f x:0 0 0 2026-03-09T17:30:39.344 INFO:tasks.workunit.client.1.vm09.stdout:0/875: stat d6/l9d 0 2026-03-09T17:30:39.346 INFO:tasks.workunit.client.1.vm09.stdout:4/824: chown d11/d1e/d29/d36/c42 232403481 1 2026-03-09T17:30:39.347 INFO:tasks.workunit.client.1.vm09.stdout:2/809: creat d13/d15/d36/d72/f104 x:0 0 0 2026-03-09T17:30:39.347 INFO:tasks.workunit.client.1.vm09.stdout:2/810: chown d13/d15/d3b/ddf/fcd 1683 1 2026-03-09T17:30:39.348 INFO:tasks.workunit.client.1.vm09.stdout:5/883: write d0/d46/f4c [1448109,115321] 0 2026-03-09T17:30:39.351 INFO:tasks.workunit.client.1.vm09.stdout:1/860: chown d9/d9e/dc0/d37/fce 82 1 2026-03-09T17:30:39.353 
INFO:tasks.workunit.client.1.vm09.stdout:2/811: chown d13/d15/d34/d45/d84/dcb/fe7 138534440 1 2026-03-09T17:30:39.353 INFO:tasks.workunit.client.1.vm09.stdout:2/812: chown d13/f14 1916190 1 2026-03-09T17:30:39.355 INFO:tasks.workunit.client.1.vm09.stdout:5/884: dread d0/d2/d76/d87/d95/d9b/dc0/dde/fed [0,4194304] 0 2026-03-09T17:30:39.355 INFO:tasks.workunit.client.1.vm09.stdout:4/825: chown d11/d1e/d45/d60/df1/fbc 5 1 2026-03-09T17:30:39.360 INFO:tasks.workunit.client.1.vm09.stdout:1/861: mknod d9/dc/dd/d40/ddb/d105/c10d 0 2026-03-09T17:30:39.363 INFO:tasks.workunit.client.1.vm09.stdout:1/862: chown d9/f29 14 1 2026-03-09T17:30:39.364 INFO:tasks.workunit.client.1.vm09.stdout:4/826: mkdir d11/d1e/d29/d36/de3/d101 0 2026-03-09T17:30:39.367 INFO:tasks.workunit.client.1.vm09.stdout:5/885: sync 2026-03-09T17:30:39.370 INFO:tasks.workunit.client.1.vm09.stdout:5/886: mkdir d0/dc/d21/d33/d11b 0 2026-03-09T17:30:39.380 INFO:tasks.workunit.client.1.vm09.stdout:6/828: write d3/d21/d76/d3f/f9d [305539,80213] 0 2026-03-09T17:30:39.380 INFO:tasks.workunit.client.1.vm09.stdout:2/813: dread d13/f26 [0,4194304] 0 2026-03-09T17:30:39.382 INFO:tasks.workunit.client.1.vm09.stdout:7/939: write da/f16 [9387867,94563] 0 2026-03-09T17:30:39.383 INFO:tasks.workunit.client.1.vm09.stdout:7/940: write da/d11/d47/d5b/d6c/d9e/d4e/fe7 [932947,129440] 0 2026-03-09T17:30:39.386 INFO:tasks.workunit.client.1.vm09.stdout:7/941: sync 2026-03-09T17:30:39.386 INFO:tasks.workunit.client.1.vm09.stdout:6/829: rename d3/d7/d59/d5a/f64 to d3/d21/d76/d88/f111 0 2026-03-09T17:30:39.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:39 vm06.local ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:30:39.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:39 vm06.local ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:30:39.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:39 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 
2026-03-09T17:30:39.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:39 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:39.392 INFO:tasks.workunit.client.1.vm09.stdout:2/814: dread - d13/d15/f9a zero size 2026-03-09T17:30:39.392 INFO:tasks.workunit.client.1.vm09.stdout:5/887: dread d0/dc/d21/d33/fa2 [0,4194304] 0 2026-03-09T17:30:39.393 INFO:tasks.workunit.client.1.vm09.stdout:7/942: rmdir da/d11/d77/d101 39 2026-03-09T17:30:39.393 INFO:tasks.workunit.client.1.vm09.stdout:2/815: chown d13/d15/d34/f5e 5 1 2026-03-09T17:30:39.395 INFO:tasks.workunit.client.1.vm09.stdout:2/816: creat d13/d15/d3b/f105 x:0 0 0 2026-03-09T17:30:39.400 INFO:tasks.workunit.client.1.vm09.stdout:6/830: creat d3/d7/f112 x:0 0 0 2026-03-09T17:30:39.401 INFO:tasks.workunit.client.1.vm09.stdout:2/817: chown d13/d15/d21/f28 15 1 2026-03-09T17:30:39.401 INFO:tasks.workunit.client.1.vm09.stdout:9/850: write d5/d2e/d8b/fcc [756422,122064] 0 2026-03-09T17:30:39.412 INFO:tasks.workunit.client.1.vm09.stdout:2/818: creat d13/d15/d34/d37/d6f/dde/f106 x:0 0 0 2026-03-09T17:30:39.414 INFO:tasks.workunit.client.1.vm09.stdout:3/797: dwrite d5/d16/d25/f2c [0,4194304] 0 2026-03-09T17:30:39.418 INFO:tasks.workunit.client.1.vm09.stdout:9/851: mkdir d5/de/d29/d90/dc7/da9/d104/d120 0 2026-03-09T17:30:39.419 INFO:tasks.workunit.client.1.vm09.stdout:3/798: write d5/d16/d31/d37/fbf [8628627,62215] 0 2026-03-09T17:30:39.422 INFO:tasks.workunit.client.1.vm09.stdout:0/876: dwrite d6/f9 [0,4194304] 0 2026-03-09T17:30:39.423 INFO:tasks.workunit.client.1.vm09.stdout:8/872: dwrite d1/da/d23/f8f [0,4194304] 0 2026-03-09T17:30:39.425 INFO:tasks.workunit.client.1.vm09.stdout:6/831: dwrite d3/d21/d76/d5c/d9f/f105 [0,4194304] 0 2026-03-09T17:30:39.426 INFO:tasks.workunit.client.1.vm09.stdout:5/888: dwrite d0/d9/d74/d75/d9f/f92 [0,4194304] 0 2026-03-09T17:30:39.426 INFO:tasks.workunit.client.1.vm09.stdout:9/852: chown d5/c1f 0 1 2026-03-09T17:30:39.437 
INFO:tasks.workunit.client.1.vm09.stdout:5/889: mknod d0/d52/d20/c11c 0 2026-03-09T17:30:39.437 INFO:tasks.workunit.client.1.vm09.stdout:6/832: dread - d3/d7/f58 zero size 2026-03-09T17:30:39.437 INFO:tasks.workunit.client.1.vm09.stdout:9/853: fsync d5/de/d4e/dca/de7/d93/f74 0 2026-03-09T17:30:39.438 INFO:tasks.workunit.client.1.vm09.stdout:0/877: symlink d6/d1d/d24/l120 0 2026-03-09T17:30:39.438 INFO:tasks.workunit.client.1.vm09.stdout:9/854: readlink d5/de/d29/d33/lfa 0 2026-03-09T17:30:39.438 INFO:tasks.workunit.client.1.vm09.stdout:2/819: creat d13/d15/d34/dd3/f107 x:0 0 0 2026-03-09T17:30:39.439 INFO:tasks.workunit.client.1.vm09.stdout:6/833: fsync d3/d21/db1/f110 0 2026-03-09T17:30:39.439 INFO:tasks.workunit.client.1.vm09.stdout:5/890: readlink d0/dc/l2e 0 2026-03-09T17:30:39.440 INFO:tasks.workunit.client.1.vm09.stdout:8/873: mkdir d1/d14/d2a/d42/d43/dfd/d111 0 2026-03-09T17:30:39.442 INFO:tasks.workunit.client.1.vm09.stdout:5/891: dread - d0/ff0 zero size 2026-03-09T17:30:39.443 INFO:tasks.workunit.client.1.vm09.stdout:9/855: creat d5/de/d29/d90/dc7/f121 x:0 0 0 2026-03-09T17:30:39.446 INFO:tasks.workunit.client.1.vm09.stdout:9/856: write d5/de/f65 [4361586,42606] 0 2026-03-09T17:30:39.446 INFO:tasks.workunit.client.1.vm09.stdout:2/820: rename d13/d15/d21/f27 to d13/d15/d34/d45/f108 0 2026-03-09T17:30:39.446 INFO:tasks.workunit.client.1.vm09.stdout:0/878: dread d6/d1d/d39/f53 [0,4194304] 0 2026-03-09T17:30:39.453 INFO:tasks.workunit.client.1.vm09.stdout:6/834: read d3/d21/d25/fea [5624599,38803] 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:8/874: mkdir d1/da/d23/d6c/ddd/dcb/d97/dc5/d112 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:0/879: readlink d6/d1d/d24/d32/d59/ld4 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:3/799: dread d5/d16/d31/d37/d58/f91 [0,4194304] 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:3/800: chown d5/d9 885762593 1 2026-03-09T17:30:39.462 
INFO:tasks.workunit.client.1.vm09.stdout:8/875: truncate d1/d14/ffa 36430 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:5/892: link d0/d52/l88 d0/d46/d4b/db7/l11d 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:3/801: mknod d5/d9/cf1 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:8/876: stat d1/f16 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:6/835: dread d3/d7/d59/d73/fa7 [0,4194304] 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:5/893: creat d0/d46/d4b/db7/d109/d10d/f11e x:0 0 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:0/880: dread d6/d1d/d24/d5e/f8a [0,4194304] 0 2026-03-09T17:30:39.462 INFO:tasks.workunit.client.1.vm09.stdout:6/836: mkdir d3/d21/d25/d26/d86/d113 0 2026-03-09T17:30:39.463 INFO:tasks.workunit.client.1.vm09.stdout:8/877: creat d1/da/d23/d71/f113 x:0 0 0 2026-03-09T17:30:39.463 INFO:tasks.workunit.client.1.vm09.stdout:3/802: symlink d5/d9/d30/dc4/lf2 0 2026-03-09T17:30:39.468 INFO:tasks.workunit.client.1.vm09.stdout:0/881: creat d6/d64/dbd/dd2/f121 x:0 0 0 2026-03-09T17:30:39.470 INFO:tasks.workunit.client.1.vm09.stdout:0/882: creat d6/d64/d97/dc9/f122 x:0 0 0 2026-03-09T17:30:39.475 INFO:tasks.workunit.client.1.vm09.stdout:0/883: symlink d6/d64/dd9/l123 0 2026-03-09T17:30:39.482 INFO:tasks.workunit.client.1.vm09.stdout:0/884: read d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 [2402995,70020] 0 2026-03-09T17:30:39.482 INFO:tasks.workunit.client.1.vm09.stdout:5/894: dread d0/d9/d74/f99 [0,4194304] 0 2026-03-09T17:30:39.482 INFO:tasks.workunit.client.1.vm09.stdout:4/827: truncate fd 2315429 0 2026-03-09T17:30:39.483 INFO:tasks.workunit.client.1.vm09.stdout:3/803: dread d5/d16/d31/d37/f94 [0,4194304] 0 2026-03-09T17:30:39.487 INFO:tasks.workunit.client.1.vm09.stdout:4/828: unlink d11/d1e/d29/d36/fc7 0 2026-03-09T17:30:39.487 INFO:tasks.workunit.client.1.vm09.stdout:3/804: fdatasync d5/d9/d30/d65/f18 0 2026-03-09T17:30:39.488 
INFO:tasks.workunit.client.1.vm09.stdout:1/863: dwrite d9/d9e/dc0/d37/d3f/f80 [0,4194304] 0 2026-03-09T17:30:39.492 INFO:tasks.workunit.client.1.vm09.stdout:7/943: dwrite da/d11/d77/fd5 [4194304,4194304] 0 2026-03-09T17:30:39.493 INFO:tasks.workunit.client.1.vm09.stdout:5/895: truncate d0/d9/d16/d5c/f9c 4811728 0 2026-03-09T17:30:39.498 INFO:tasks.workunit.client.1.vm09.stdout:4/829: unlink d11/d1e/la3 0 2026-03-09T17:30:39.499 INFO:tasks.workunit.client.1.vm09.stdout:3/805: mkdir d5/d16/d31/d3d/db3/df3 0 2026-03-09T17:30:39.509 INFO:tasks.workunit.client.1.vm09.stdout:1/864: stat d9/dc/dd/d9f/de4/dba/lc4 0 2026-03-09T17:30:39.509 INFO:tasks.workunit.client.1.vm09.stdout:5/896: dread d0/d2/d76/d87/d95/f9a [0,4194304] 0 2026-03-09T17:30:39.511 INFO:tasks.workunit.client.1.vm09.stdout:7/944: symlink da/d11/d47/d5b/d6c/d9e/d4e/l140 0 2026-03-09T17:30:39.511 INFO:tasks.workunit.client.1.vm09.stdout:5/897: truncate d0/dc/f37 4348188 0 2026-03-09T17:30:39.511 INFO:tasks.workunit.client.1.vm09.stdout:4/830: dread d11/d1e/d45/d60/d71/db7/d89/d8b/ff0 [0,4194304] 0 2026-03-09T17:30:39.512 INFO:tasks.workunit.client.1.vm09.stdout:3/806: symlink d5/d16/d31/d37/d58/d8a/da8/lf4 0 2026-03-09T17:30:39.515 INFO:tasks.workunit.client.1.vm09.stdout:7/945: stat da/d11/d47/d5b/d6c/df8/l10a 0 2026-03-09T17:30:39.518 INFO:tasks.workunit.client.1.vm09.stdout:4/831: unlink d11/d1e/d29/db5/fca 0 2026-03-09T17:30:39.518 INFO:tasks.workunit.client.1.vm09.stdout:9/857: dwrite d5/de/d29/d90/dc7/fbe [0,4194304] 0 2026-03-09T17:30:39.523 INFO:tasks.workunit.client.1.vm09.stdout:9/858: write d5/f8e [5943744,56889] 0 2026-03-09T17:30:39.523 INFO:tasks.workunit.client.1.vm09.stdout:4/832: fdatasync d11/f3f 0 2026-03-09T17:30:39.524 INFO:tasks.workunit.client.1.vm09.stdout:3/807: rmdir d5/d16/d31/d37/dae/db4 39 2026-03-09T17:30:39.524 INFO:tasks.workunit.client.1.vm09.stdout:7/946: creat da/d11/d47/d5b/d6c/d9e/d4e/d5f/f141 x:0 0 0 2026-03-09T17:30:39.525 
INFO:tasks.workunit.client.1.vm09.stdout:7/947: readlink da/d11/d64/da7/d137/dbe/ldf 0 2026-03-09T17:30:39.525 INFO:tasks.workunit.client.1.vm09.stdout:7/948: readlink da/d11/d77/lb0 0 2026-03-09T17:30:39.531 INFO:tasks.workunit.client.1.vm09.stdout:9/859: creat d5/de/d88/f122 x:0 0 0 2026-03-09T17:30:39.531 INFO:tasks.workunit.client.1.vm09.stdout:7/949: unlink da/fcd 0 2026-03-09T17:30:39.532 INFO:tasks.workunit.client.1.vm09.stdout:3/808: mknod d5/d9/d30/d65/cf5 0 2026-03-09T17:30:39.533 INFO:tasks.workunit.client.1.vm09.stdout:3/809: write d5/d9c/de7/f99 [1694613,106641] 0 2026-03-09T17:30:39.536 INFO:tasks.workunit.client.1.vm09.stdout:4/833: creat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/f102 x:0 0 0 2026-03-09T17:30:39.536 INFO:tasks.workunit.client.1.vm09.stdout:7/950: rmdir da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4 39 2026-03-09T17:30:39.537 INFO:tasks.workunit.client.1.vm09.stdout:3/810: creat d5/d16/d31/d37/d58/d8a/da8/ff6 x:0 0 0 2026-03-09T17:30:39.537 INFO:tasks.workunit.client.1.vm09.stdout:7/951: readlink da/d11/l24 0 2026-03-09T17:30:39.538 INFO:tasks.workunit.client.1.vm09.stdout:4/834: creat d11/d1e/d45/d60/df1/f103 x:0 0 0 2026-03-09T17:30:39.545 INFO:tasks.workunit.client.1.vm09.stdout:3/811: rename d5/d16/d31/d3d/d9f/cdc to d5/d16/d31/cf7 0 2026-03-09T17:30:39.547 INFO:tasks.workunit.client.1.vm09.stdout:7/952: rename da/l97 to da/d11/d47/d5b/d6c/d9e/dc6/ddb/l142 0 2026-03-09T17:30:39.548 INFO:tasks.workunit.client.1.vm09.stdout:7/953: chown da/d11/d64/da7/db1/l120 1709480 1 2026-03-09T17:30:39.548 INFO:tasks.workunit.client.1.vm09.stdout:9/860: dread d5/de/d4e/dca/f75 [0,4194304] 0 2026-03-09T17:30:39.551 INFO:tasks.workunit.client.1.vm09.stdout:7/954: creat da/d11/d47/d5b/d6c/d9e/d4e/d4c/d13e/f143 x:0 0 0 2026-03-09T17:30:39.560 INFO:tasks.workunit.client.1.vm09.stdout:2/821: truncate d13/d15/d34/d37/d66/f80 1121750 0 2026-03-09T17:30:39.572 INFO:tasks.workunit.client.1.vm09.stdout:9/861: sync 2026-03-09T17:30:39.572 
INFO:tasks.workunit.client.1.vm09.stdout:7/955: sync 2026-03-09T17:30:39.573 INFO:tasks.workunit.client.1.vm09.stdout:2/822: truncate d13/d15/d36/d72/d94/da7/f7c 1907994 0 2026-03-09T17:30:39.610 INFO:tasks.workunit.client.1.vm09.stdout:7/956: dwrite da/d11/d47/d5b/d6c/f118 [0,4194304] 0 2026-03-09T17:30:39.617 INFO:tasks.workunit.client.1.vm09.stdout:8/878: truncate d1/da/dd/f22 3276010 0 2026-03-09T17:30:39.618 INFO:tasks.workunit.client.1.vm09.stdout:7/957: truncate da/d11/d3e/f88 802852 0 2026-03-09T17:30:39.623 INFO:tasks.workunit.client.1.vm09.stdout:6/837: dwrite d3/d21/d76/d5c/f6d [0,4194304] 0 2026-03-09T17:30:39.629 INFO:tasks.workunit.client.1.vm09.stdout:8/879: rename d1/d14/d2a/d42/d43/dfd/d111 to d1/da/d23/d114 0 2026-03-09T17:30:39.629 INFO:tasks.workunit.client.1.vm09.stdout:7/958: symlink da/d11/d64/l144 0 2026-03-09T17:30:39.630 INFO:tasks.workunit.client.1.vm09.stdout:8/880: fsync d1/da/dd/d47/f66 0 2026-03-09T17:30:39.638 INFO:tasks.workunit.client.1.vm09.stdout:8/881: mkdir d1/da/d23/d71/db6/d115 0 2026-03-09T17:30:39.641 INFO:tasks.workunit.client.1.vm09.stdout:8/882: stat d1/da/d23/dc2/l9b 0 2026-03-09T17:30:39.645 INFO:tasks.workunit.client.1.vm09.stdout:8/883: readlink d1/da/d23/d71/l107 0 2026-03-09T17:30:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:39 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:30:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:39 vm09.local ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:30:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:39 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:39 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:39.647 INFO:tasks.workunit.client.1.vm09.stdout:7/959: dwrite da/d11/d64/da7/d137/dbe/d106/f131 [0,4194304] 0 2026-03-09T17:30:39.656 
INFO:tasks.workunit.client.1.vm09.stdout:7/960: write da/d11/d47/d5b/d6c/d9e/f130 [540218,97876] 0 2026-03-09T17:30:39.663 INFO:tasks.workunit.client.1.vm09.stdout:6/838: dread d3/d21/d25/d26/d6b/f79 [0,4194304] 0 2026-03-09T17:30:39.670 INFO:tasks.workunit.client.1.vm09.stdout:0/885: write d6/d1d/d24/d32/d59/d9c/dac/dcc/f108 [889436,96199] 0 2026-03-09T17:30:39.672 INFO:tasks.workunit.client.1.vm09.stdout:0/886: unlink d6/d1d/d24/d5e/d6c/l106 0 2026-03-09T17:30:39.672 INFO:tasks.workunit.client.1.vm09.stdout:0/887: stat d6/d64/dd9 0 2026-03-09T17:30:39.675 INFO:tasks.workunit.client.1.vm09.stdout:0/888: getdents d6/d1d/d24/d5e/d6c/ded 0 2026-03-09T17:30:39.679 INFO:tasks.workunit.client.1.vm09.stdout:6/839: dwrite d3/d21/d76/d5c/d7e/dc5/d98/fee [0,4194304] 0 2026-03-09T17:30:39.680 INFO:tasks.workunit.client.1.vm09.stdout:6/840: stat d3/d21/d25/d26/f50 0 2026-03-09T17:30:39.680 INFO:tasks.workunit.client.1.vm09.stdout:6/841: stat d3/d21/d25/cf4 0 2026-03-09T17:30:39.687 INFO:tasks.workunit.client.1.vm09.stdout:6/842: symlink d3/d21/db1/l114 0 2026-03-09T17:30:39.693 INFO:tasks.workunit.client.1.vm09.stdout:6/843: chown d3/d21/d76/d5c/d61/d95/c38 3510801 1 2026-03-09T17:30:39.696 INFO:tasks.workunit.client.1.vm09.stdout:6/844: fdatasync d3/d7/f112 0 2026-03-09T17:30:39.707 INFO:tasks.workunit.client.1.vm09.stdout:8/884: dread d1/da/dd/fc6 [0,4194304] 0 2026-03-09T17:30:39.708 INFO:tasks.workunit.client.1.vm09.stdout:0/889: dread d6/d1d/d24/f5d [4194304,4194304] 0 2026-03-09T17:30:39.712 INFO:tasks.workunit.client.1.vm09.stdout:0/890: read d6/fa6 [507120,42456] 0 2026-03-09T17:30:39.714 INFO:tasks.workunit.client.1.vm09.stdout:8/885: write d1/da/d23/d6c/d32/fb5 [4766567,102272] 0 2026-03-09T17:30:39.720 INFO:tasks.workunit.client.1.vm09.stdout:0/891: creat d6/d1d/d24/d5e/dc2/d11c/f124 x:0 0 0 2026-03-09T17:30:39.722 INFO:tasks.workunit.client.1.vm09.stdout:8/886: creat d1/f116 x:0 0 0 2026-03-09T17:30:39.725 INFO:tasks.workunit.client.1.vm09.stdout:8/887: rmdir 
d1/da/dd/d47/d4c 39 2026-03-09T17:30:39.731 INFO:tasks.workunit.client.1.vm09.stdout:1/865: dwrite d9/f11 [0,4194304] 0 2026-03-09T17:30:39.739 INFO:tasks.workunit.client.1.vm09.stdout:8/888: dwrite d1/da/dd/d77/f106 [0,4194304] 0 2026-03-09T17:30:39.742 INFO:tasks.workunit.client.1.vm09.stdout:5/898: dwrite d0/d9/f34 [0,4194304] 0 2026-03-09T17:30:39.752 INFO:tasks.workunit.client.1.vm09.stdout:5/899: sync 2026-03-09T17:30:39.752 INFO:tasks.workunit.client.1.vm09.stdout:8/889: sync 2026-03-09T17:30:39.754 INFO:tasks.workunit.client.1.vm09.stdout:5/900: readlink d0/d46/l47 0 2026-03-09T17:30:39.766 INFO:tasks.workunit.client.1.vm09.stdout:3/812: write d5/d16/d31/d37/f6d [1440810,80503] 0 2026-03-09T17:30:39.766 INFO:tasks.workunit.client.1.vm09.stdout:4/835: dwrite d11/f3f [0,4194304] 0 2026-03-09T17:30:39.767 INFO:tasks.workunit.client.1.vm09.stdout:5/901: mkdir d0/d46/d11f 0 2026-03-09T17:30:39.768 INFO:tasks.workunit.client.1.vm09.stdout:8/890: write d1/da/d3a/fa3 [2593963,87611] 0 2026-03-09T17:30:39.768 INFO:tasks.workunit.client.1.vm09.stdout:5/902: fdatasync d0/d9/d74/d75/d9f/f92 0 2026-03-09T17:30:39.774 INFO:tasks.workunit.client.1.vm09.stdout:4/836: fsync d11/d1e/d29/d36/fad 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:5/903: fdatasync d0/f91 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:8/891: truncate d1/da/d23/d71/d101/f36 1611098 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:4/837: fsync d11/d1e/d45/d60/df1/d78/fd6 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:4/838: write d11/d1e/def/ff6 [102821,119639] 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:3/813: rename d5/d9/l23 to d5/d16/d31/d37/d58/d8a/da8/lf8 0 2026-03-09T17:30:39.784 INFO:tasks.workunit.client.1.vm09.stdout:5/904: mkdir d0/d46/d120 0 2026-03-09T17:30:39.786 INFO:tasks.workunit.client.1.vm09.stdout:8/892: dread d1/da/dd/d77/f106 [0,4194304] 0 2026-03-09T17:30:39.788 
INFO:tasks.workunit.client.1.vm09.stdout:3/814: dread d5/d16/d31/d37/f5b [0,4194304] 0 2026-03-09T17:30:39.793 INFO:tasks.workunit.client.1.vm09.stdout:4/839: mknod d11/d1e/d29/c104 0 2026-03-09T17:30:39.795 INFO:tasks.workunit.client.1.vm09.stdout:8/893: mkdir d1/da/d23/d71/db6/d117 0 2026-03-09T17:30:39.808 INFO:tasks.workunit.client.1.vm09.stdout:8/894: chown d1/da/d23/d71/dde 0 1 2026-03-09T17:30:39.808 INFO:tasks.workunit.client.1.vm09.stdout:8/895: chown d1 2020646176 1 2026-03-09T17:30:39.808 INFO:tasks.workunit.client.1.vm09.stdout:3/815: dwrite d5/d9/da9/fd4 [0,4194304] 0 2026-03-09T17:30:39.808 INFO:tasks.workunit.client.1.vm09.stdout:5/905: mknod d0/dc/d21/d33/d11b/c121 0 2026-03-09T17:30:39.808 INFO:tasks.workunit.client.1.vm09.stdout:5/906: rename d0/d9/d16/c3f to d0/dc/d21/d26/c122 0 2026-03-09T17:30:39.811 INFO:tasks.workunit.client.1.vm09.stdout:9/862: write d5/de/d29/fe1 [444685,32082] 0 2026-03-09T17:30:39.811 INFO:tasks.workunit.client.1.vm09.stdout:9/863: chown d5/d7e/ced 370582957 1 2026-03-09T17:30:39.812 INFO:tasks.workunit.client.1.vm09.stdout:9/864: chown d5/d2e/ld1 1255307 1 2026-03-09T17:30:39.815 INFO:tasks.workunit.client.1.vm09.stdout:5/907: mknod d0/d46/d4b/db7/d109/d10d/c123 0 2026-03-09T17:30:39.818 INFO:tasks.workunit.client.1.vm09.stdout:5/908: truncate d0/d52/d20/f7c 3643306 0 2026-03-09T17:30:39.832 INFO:tasks.workunit.client.1.vm09.stdout:8/896: sync 2026-03-09T17:30:39.837 INFO:tasks.workunit.client.1.vm09.stdout:8/897: dwrite d1/dbd/fe3 [0,4194304] 0 2026-03-09T17:30:39.853 INFO:tasks.workunit.client.1.vm09.stdout:2/823: dwrite d13/d15/d21/f30 [8388608,4194304] 0 2026-03-09T17:30:39.856 INFO:tasks.workunit.client.1.vm09.stdout:8/898: dwrite d1/d14/d96/fe0 [0,4194304] 0 2026-03-09T17:30:39.858 INFO:tasks.workunit.client.1.vm09.stdout:8/899: readlink d1/d14/laa 0 2026-03-09T17:30:39.864 INFO:tasks.workunit.client.1.vm09.stdout:8/900: write d1/d14/fd5 [1769523,64098] 0 2026-03-09T17:30:39.874 
INFO:tasks.workunit.client.1.vm09.stdout:2/824: dread d13/d15/d3b/d43/f46 [0,4194304] 0 2026-03-09T17:30:39.879 INFO:tasks.workunit.client.1.vm09.stdout:8/901: rename d1/da/dd/l15 to d1/da/d23/d114/l118 0 2026-03-09T17:30:39.884 INFO:tasks.workunit.client.1.vm09.stdout:8/902: creat d1/da/d23/d114/f119 x:0 0 0 2026-03-09T17:30:39.884 INFO:tasks.workunit.client.1.vm09.stdout:8/903: mkdir d1/da/dd/d47/d4c/d11a 0 2026-03-09T17:30:39.913 INFO:tasks.workunit.client.1.vm09.stdout:7/961: dwrite da/d11/d2d/fee [0,4194304] 0 2026-03-09T17:30:39.925 INFO:tasks.workunit.client.1.vm09.stdout:6/845: dwrite d3/d21/d76/d5c/fbd [0,4194304] 0 2026-03-09T17:30:39.925 INFO:tasks.workunit.client.1.vm09.stdout:7/962: mknod da/d11/d3e/c145 0 2026-03-09T17:30:39.933 INFO:tasks.workunit.client.1.vm09.stdout:6/846: unlink d3/d21/d25/fea 0 2026-03-09T17:30:39.933 INFO:tasks.workunit.client.1.vm09.stdout:7/963: creat da/d11/d3e/dd8/f146 x:0 0 0 2026-03-09T17:30:39.934 INFO:tasks.workunit.client.1.vm09.stdout:7/964: symlink da/d11/d64/d84/l147 0 2026-03-09T17:30:39.935 INFO:tasks.workunit.client.1.vm09.stdout:7/965: read - da/f138 zero size 2026-03-09T17:30:39.938 INFO:tasks.workunit.client.1.vm09.stdout:6/847: getdents d3/d21/d76/d5c/d61/d6a 0 2026-03-09T17:30:39.938 INFO:tasks.workunit.client.1.vm09.stdout:6/848: chown d3/d21/d76/d5c/d7e/l8e 73973 1 2026-03-09T17:30:39.940 INFO:tasks.workunit.client.1.vm09.stdout:7/966: rename da/d11/d2d/d56/lc0 to da/d11/d77/de5/dec/l148 0 2026-03-09T17:30:39.942 INFO:tasks.workunit.client.1.vm09.stdout:6/849: mknod d3/d21/d76/d5c/d7e/dc5/d98/c115 0 2026-03-09T17:30:39.947 INFO:tasks.workunit.client.1.vm09.stdout:7/967: creat da/d11/d64/da7/f149 x:0 0 0 2026-03-09T17:30:39.947 INFO:tasks.workunit.client.1.vm09.stdout:6/850: rename d3/d21/d25/d26/d86/dbc/f107 to d3/d21/d76/d5c/d7e/dc5/f116 0 2026-03-09T17:30:39.947 INFO:tasks.workunit.client.1.vm09.stdout:6/851: creat d3/d48/d10f/f117 x:0 0 0 2026-03-09T17:30:39.950 
INFO:tasks.workunit.client.1.vm09.stdout:6/852: dwrite d3/d7/d59/d5a/f83 [8388608,4194304] 0 2026-03-09T17:30:39.954 INFO:tasks.workunit.client.1.vm09.stdout:6/853: dread - d3/d21/d25/d26/d86/dbc/fd8 zero size 2026-03-09T17:30:39.965 INFO:tasks.workunit.client.1.vm09.stdout:0/892: dwrite d6/fa6 [0,4194304] 0 2026-03-09T17:30:39.976 INFO:tasks.workunit.client.1.vm09.stdout:1/866: dwrite d9/d9e/dc0/f4a [0,4194304] 0 2026-03-09T17:30:39.980 INFO:tasks.workunit.client.1.vm09.stdout:6/854: rename d3/d21/d25/f54 to d3/d21/d76/d88/f118 0 2026-03-09T17:30:39.981 INFO:tasks.workunit.client.1.vm09.stdout:0/893: truncate d6/d1d/f1e 1171819 0 2026-03-09T17:30:39.987 INFO:tasks.workunit.client.1.vm09.stdout:0/894: unlink d6/d64/d94/lca 0 2026-03-09T17:30:39.994 INFO:tasks.workunit.client.1.vm09.stdout:1/867: link d9/d9e/dc0/d37/ce6 d9/dc/dd/d9f/de4/dba/c10e 0 2026-03-09T17:30:39.994 INFO:tasks.workunit.client.1.vm09.stdout:1/868: mkdir d9/d38/d61/dff/d109/d10f 0 2026-03-09T17:30:39.994 INFO:tasks.workunit.client.1.vm09.stdout:1/869: read - d9/dc/dd/d9f/de4/ff7 zero size 2026-03-09T17:30:39.996 INFO:tasks.workunit.client.1.vm09.stdout:1/870: mkdir d9/dc/dd/d40/ddb/d105/d110 0 2026-03-09T17:30:39.998 INFO:tasks.workunit.client.1.vm09.stdout:0/895: dread d6/d64/fa7 [0,4194304] 0 2026-03-09T17:30:39.999 INFO:tasks.workunit.client.1.vm09.stdout:0/896: mkdir d6/d64/d97/dc9/d125 0 2026-03-09T17:30:40.007 INFO:tasks.workunit.client.1.vm09.stdout:1/871: link d9/dc/l18 d9/dc/dd/d9f/de4/l111 0 2026-03-09T17:30:40.007 INFO:tasks.workunit.client.1.vm09.stdout:1/872: link d9/d38/d61/c100 d9/dc/dd/d9f/d9c/c112 0 2026-03-09T17:30:40.007 INFO:tasks.workunit.client.1.vm09.stdout:1/873: chown d9/dc/dd/d9f/d9c/lc2 1257818 1 2026-03-09T17:30:40.021 INFO:tasks.workunit.client.1.vm09.stdout:0/897: sync 2026-03-09T17:30:40.021 INFO:tasks.workunit.client.1.vm09.stdout:1/874: sync 2026-03-09T17:30:40.027 INFO:tasks.workunit.client.1.vm09.stdout:1/875: truncate d9/dc/f76 780608 0 2026-03-09T17:30:40.040 
INFO:tasks.workunit.client.1.vm09.stdout:4/840: dwrite d11/d1e/d45/d60/d71/f76 [0,4194304] 0 2026-03-09T17:30:40.042 INFO:tasks.workunit.client.1.vm09.stdout:5/909: write d0/d2/d76/d86/fa8 [3791751,27457] 0 2026-03-09T17:30:40.042 INFO:tasks.workunit.client.1.vm09.stdout:3/816: write d5/d9/d30/d65/d59/fa2 [481194,96465] 0 2026-03-09T17:30:40.045 INFO:tasks.workunit.client.1.vm09.stdout:4/841: sync 2026-03-09T17:30:40.045 INFO:tasks.workunit.client.1.vm09.stdout:1/876: write d9/d9e/dc0/d37/f2e [3044477,86794] 0 2026-03-09T17:30:40.048 INFO:tasks.workunit.client.1.vm09.stdout:5/910: unlink d0/dc/d21/d26/d5e/dd4/cdd 0 2026-03-09T17:30:40.051 INFO:tasks.workunit.client.1.vm09.stdout:9/865: dwrite d5/de/d29/f36 [8388608,4194304] 0 2026-03-09T17:30:40.052 INFO:tasks.workunit.client.1.vm09.stdout:3/817: read d5/d9/d30/f61 [513661,96963] 0 2026-03-09T17:30:40.053 INFO:tasks.workunit.client.1.vm09.stdout:9/866: fsync d5/de/d29/d33/db8/dfb/f118 0 2026-03-09T17:30:40.054 INFO:tasks.workunit.client.1.vm09.stdout:9/867: write d5/d91/d99/dc9/dde/fec [3742670,106714] 0 2026-03-09T17:30:40.059 INFO:tasks.workunit.client.1.vm09.stdout:4/842: creat d11/d1e/d45/d60/d71/db7/d89/f105 x:0 0 0 2026-03-09T17:30:40.061 INFO:tasks.workunit.client.1.vm09.stdout:3/818: mkdir d5/d16/d31/d37/d58/d8a/da8/ddf/df9 0 2026-03-09T17:30:40.065 INFO:tasks.workunit.client.1.vm09.stdout:5/911: link d0/ff d0/d2/d76/d87/da4/d10e/f124 0 2026-03-09T17:30:40.065 INFO:tasks.workunit.client.1.vm09.stdout:3/819: read - d5/d9/d90/fb9 zero size 2026-03-09T17:30:40.065 INFO:tasks.workunit.client.1.vm09.stdout:4/843: sync 2026-03-09T17:30:40.066 INFO:tasks.workunit.client.1.vm09.stdout:4/844: write d11/d1e/d45/d60/d71/db7/f96 [1223248,67088] 0 2026-03-09T17:30:40.067 INFO:tasks.workunit.client.1.vm09.stdout:5/912: creat d0/d52/d20/f125 x:0 0 0 2026-03-09T17:30:40.070 INFO:tasks.workunit.client.1.vm09.stdout:9/868: dread d5/de/f3c [0,4194304] 0 2026-03-09T17:30:40.070 INFO:tasks.workunit.client.1.vm09.stdout:0/898: 
read d6/d1d/d24/d32/d59/d81/f82 [2152108,110098] 0 2026-03-09T17:30:40.072 INFO:tasks.workunit.client.1.vm09.stdout:9/869: write d5/de/d29/fc0 [3535533,63617] 0 2026-03-09T17:30:40.075 INFO:tasks.workunit.client.1.vm09.stdout:3/820: rename d5/d16/d31/d3d/db3/fc1 to d5/d9/d90/ffa 0 2026-03-09T17:30:40.076 INFO:tasks.workunit.client.1.vm09.stdout:5/913: dwrite d0/d52/d20/f125 [0,4194304] 0 2026-03-09T17:30:40.089 INFO:tasks.workunit.client.1.vm09.stdout:0/899: read d6/d1d/d24/d32/d59/d9c/dac/dcc/ff2 [1349933,116763] 0 2026-03-09T17:30:40.089 INFO:tasks.workunit.client.1.vm09.stdout:9/870: mknod d5/de/d29/d90/dc7/c123 0 2026-03-09T17:30:40.096 INFO:tasks.workunit.client.1.vm09.stdout:3/821: truncate d5/d16/d31/d3d/fe 4946932 0 2026-03-09T17:30:40.099 INFO:tasks.workunit.client.1.vm09.stdout:9/871: link d5/d21/c71 d5/d2e/d8b/d116/c124 0 2026-03-09T17:30:40.100 INFO:tasks.workunit.client.1.vm09.stdout:3/822: dwrite d5/d9/d30/d65/fdb [0,4194304] 0 2026-03-09T17:30:40.106 INFO:tasks.workunit.client.1.vm09.stdout:9/872: mkdir d5/d2e/d8b/de0/d125 0 2026-03-09T17:30:40.106 INFO:tasks.workunit.client.1.vm09.stdout:3/823: dread - d5/d16/f54 zero size 2026-03-09T17:30:40.107 INFO:tasks.workunit.client.1.vm09.stdout:3/824: write d5/d9c/fd7 [2531182,100960] 0 2026-03-09T17:30:40.108 INFO:tasks.workunit.client.1.vm09.stdout:9/873: readlink d5/de/d4e/dca/d84/d97/ld9 0 2026-03-09T17:30:40.110 INFO:tasks.workunit.client.1.vm09.stdout:9/874: chown d5/cc3 3 1 2026-03-09T17:30:40.111 INFO:tasks.workunit.client.1.vm09.stdout:0/900: dread d6/d1d/d24/f75 [0,4194304] 0 2026-03-09T17:30:40.126 INFO:tasks.workunit.client.1.vm09.stdout:9/875: mkdir d5/de/d88/d126 0 2026-03-09T17:30:40.126 INFO:tasks.workunit.client.1.vm09.stdout:9/876: fdatasync d5/f14 0 2026-03-09T17:30:40.126 INFO:tasks.workunit.client.1.vm09.stdout:9/877: readlink d5/de/d29/la3 0 2026-03-09T17:30:40.126 INFO:tasks.workunit.client.1.vm09.stdout:9/878: rename d5/de/d29/d90/dc7/cd7 to d5/d2e/d8b/db4/c127 0 
2026-03-09T17:30:40.133 INFO:tasks.workunit.client.1.vm09.stdout:9/879: fsync d5/de/d29/f36 0 2026-03-09T17:30:40.134 INFO:tasks.workunit.client.1.vm09.stdout:9/880: stat d5/d2e/d8b/db4/ff8 0 2026-03-09T17:30:40.140 INFO:tasks.workunit.client.1.vm09.stdout:2/825: truncate d13/d15/d3b/ddf/d85/faf 3917071 0 2026-03-09T17:30:40.140 INFO:tasks.workunit.client.1.vm09.stdout:2/826: chown d13/d15/f7e 535697 1 2026-03-09T17:30:40.141 INFO:tasks.workunit.client.1.vm09.stdout:2/827: truncate d13/da4/fea 217543 0 2026-03-09T17:30:40.149 INFO:tasks.workunit.client.1.vm09.stdout:2/828: symlink d13/d15/d3b/ddf/l109 0 2026-03-09T17:30:40.157 INFO:tasks.workunit.client.1.vm09.stdout:7/968: write da/d11/d77/fba [889281,129275] 0 2026-03-09T17:30:40.162 INFO:tasks.workunit.client.1.vm09.stdout:7/969: fdatasync da/d11/d47/d5b/fc9 0 2026-03-09T17:30:40.170 INFO:tasks.workunit.client.1.vm09.stdout:6/855: dwrite d3/d21/f80 [0,4194304] 0 2026-03-09T17:30:40.177 INFO:tasks.workunit.client.1.vm09.stdout:7/970: truncate da/d11/d47/d5b/d6c/d9e/d4e/d5f/fca 189927 0 2026-03-09T17:30:40.178 INFO:tasks.workunit.client.1.vm09.stdout:6/856: fsync d3/d21/d76/d3f/f51 0 2026-03-09T17:30:40.182 INFO:tasks.workunit.client.1.vm09.stdout:1/877: dwrite d9/dc/dd/f7b [4194304,4194304] 0 2026-03-09T17:30:40.183 INFO:tasks.workunit.client.1.vm09.stdout:1/878: fsync d9/dc/dd/d9f/de4/dba/fb4 0 2026-03-09T17:30:40.193 INFO:tasks.workunit.client.1.vm09.stdout:7/971: creat da/d11/d2d/d56/da1/d13d/f14a x:0 0 0 2026-03-09T17:30:40.197 INFO:tasks.workunit.client.1.vm09.stdout:6/857: dread d3/d21/d76/d5c/d7e/dc5/d9a/fb4 [0,4194304] 0 2026-03-09T17:30:40.197 INFO:tasks.workunit.client.1.vm09.stdout:6/858: stat d3/d48/f68 0 2026-03-09T17:30:40.198 INFO:tasks.workunit.client.1.vm09.stdout:6/859: readlink d3/d21/d76/d3f/d8f/lf8 0 2026-03-09T17:30:40.199 INFO:tasks.workunit.client.1.vm09.stdout:7/972: unlink da/d11/d2d/fee 0 2026-03-09T17:30:40.200 INFO:tasks.workunit.client.1.vm09.stdout:7/973: chown 
da/d11/d47/d5b/d6c/l13c 67381 1 2026-03-09T17:30:40.200 INFO:tasks.workunit.client.1.vm09.stdout:7/974: dread - da/d11/d2d/d56/da1/d13d/f14a zero size 2026-03-09T17:30:40.201 INFO:tasks.workunit.client.1.vm09.stdout:1/879: mknod d9/dc/dd/d40/c113 0 2026-03-09T17:30:40.201 INFO:tasks.workunit.client.1.vm09.stdout:7/975: dread - da/d11/d64/da7/db1/fc5 zero size 2026-03-09T17:30:40.202 INFO:tasks.workunit.client.1.vm09.stdout:6/860: rmdir d3 39 2026-03-09T17:30:40.207 INFO:tasks.workunit.client.1.vm09.stdout:1/880: rmdir d9/d38/d61 39 2026-03-09T17:30:40.217 INFO:tasks.workunit.client.1.vm09.stdout:7/976: rename da/d11/l20 to da/d11/d77/de5/dec/l14b 0 2026-03-09T17:30:40.221 INFO:tasks.workunit.client.1.vm09.stdout:1/881: creat d9/dc/dd/d40/d1d/f114 x:0 0 0 2026-03-09T17:30:40.224 INFO:tasks.workunit.client.1.vm09.stdout:4/845: dwrite d11/d1e/d31/f3a [0,4194304] 0 2026-03-09T17:30:40.232 INFO:tasks.workunit.client.1.vm09.stdout:7/977: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/f66 [0,4194304] 0 2026-03-09T17:30:40.232 INFO:tasks.workunit.client.1.vm09.stdout:7/978: chown da/d11/f1a 44 1 2026-03-09T17:30:40.232 INFO:tasks.workunit.client.1.vm09.stdout:7/979: fdatasync da/d11/d64/d11f/f12c 0 2026-03-09T17:30:40.246 INFO:tasks.workunit.client.1.vm09.stdout:1/882: dread f6 [0,4194304] 0 2026-03-09T17:30:40.252 INFO:tasks.workunit.client.1.vm09.stdout:7/980: rename da/d11/d64/da7/db1/l120 to da/d11/d2d/l14c 0 2026-03-09T17:30:40.258 INFO:tasks.workunit.client.1.vm09.stdout:7/981: rename da/f135 to da/d11/d47/d5b/df2/f14d 0 2026-03-09T17:30:40.263 INFO:tasks.workunit.client.1.vm09.stdout:1/883: dread d9/dc/dd/d40/d21/d35/d88/f9a [0,4194304] 0 2026-03-09T17:30:40.279 INFO:tasks.workunit.client.1.vm09.stdout:5/914: dwrite d0/d9/fd2 [0,4194304] 0 2026-03-09T17:30:40.280 INFO:tasks.workunit.client.1.vm09.stdout:1/884: creat d9/dc/dd/d9f/d9c/f115 x:0 0 0 2026-03-09T17:30:40.287 INFO:tasks.workunit.client.1.vm09.stdout:1/885: dwrite d9/d9e/fe2 [0,4194304] 0 2026-03-09T17:30:40.290 
INFO:tasks.workunit.client.1.vm09.stdout:1/886: truncate d9/dc/dd/d40/d1d/f17 1105257 0 2026-03-09T17:30:40.295 INFO:tasks.workunit.client.1.vm09.stdout:3/825: write d5/d16/fcc [108607,129203] 0 2026-03-09T17:30:40.326 INFO:tasks.workunit.client.1.vm09.stdout:1/887: fsync d9/d38/fbd 0 2026-03-09T17:30:40.327 INFO:tasks.workunit.client.1.vm09.stdout:1/888: write d9/dc/dd/d40/d1d/f1e [1431113,23211] 0 2026-03-09T17:30:40.338 INFO:tasks.workunit.client.1.vm09.stdout:0/901: write d6/d1d/d24/d32/f45 [7658069,56443] 0 2026-03-09T17:30:40.340 INFO:tasks.workunit.client.1.vm09.stdout:9/881: write d5/de/d29/f35 [1990709,5005] 0 2026-03-09T17:30:40.340 INFO:tasks.workunit.client.1.vm09.stdout:8/904: write d1/da/dd/f22 [812571,28106] 0 2026-03-09T17:30:40.340 INFO:tasks.workunit.client.1.vm09.stdout:5/915: link d0/d2/lae d0/d9/d74/d104/l126 0 2026-03-09T17:30:40.342 INFO:tasks.workunit.client.1.vm09.stdout:5/916: chown d0/dc/d21/d26/d5e/d68/d6d/lb2 1364 1 2026-03-09T17:30:40.342 INFO:tasks.workunit.client.1.vm09.stdout:2/829: dwrite d13/d15/d36/d72/d94/da7/f7a [0,4194304] 0 2026-03-09T17:30:40.343 INFO:tasks.workunit.client.1.vm09.stdout:5/917: write d0/dc/f37 [1306756,123] 0 2026-03-09T17:30:40.345 INFO:tasks.workunit.client.1.vm09.stdout:2/830: truncate d13/d15/d36/d72/ff9 832923 0 2026-03-09T17:30:40.347 INFO:tasks.workunit.client.1.vm09.stdout:2/831: fsync d13/d15/d21/d88/db8/dd1/de5/ffe 0 2026-03-09T17:30:40.366 INFO:tasks.workunit.client.1.vm09.stdout:9/882: unlink d5/de/d29/d33/l3d 0 2026-03-09T17:30:40.366 INFO:tasks.workunit.client.1.vm09.stdout:1/889: link d9/d9e/dc0/f50 d9/d9e/dc0/d91/d99/dcf/f116 0 2026-03-09T17:30:40.367 INFO:tasks.workunit.client.1.vm09.stdout:9/883: write d5/de/d88/f119 [787490,95682] 0 2026-03-09T17:30:40.370 INFO:tasks.workunit.client.1.vm09.stdout:1/890: dwrite d9/de5/dea/f10a [0,4194304] 0 2026-03-09T17:30:40.378 INFO:tasks.workunit.client.1.vm09.stdout:5/918: rename d0/d9/d74/f99 to d0/d46/d11f/f127 0 2026-03-09T17:30:40.391 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: pgmap v7: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 73 MiB/s wr, 229 op/s 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:40 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:40.394 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: pgmap v7: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 73 MiB/s wr, 229 op/s 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:40 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:40.395 INFO:tasks.workunit.client.1.vm09.stdout:2/832: unlink d13/d4d/c5f 0 2026-03-09T17:30:40.396 INFO:tasks.workunit.client.1.vm09.stdout:2/833: write d13/d15/d34/d37/ff3 [377045,38731] 0 2026-03-09T17:30:40.396 INFO:tasks.workunit.client.1.vm09.stdout:1/891: dread d9/d9e/dc0/d37/d3f/d42/d55/f69 [0,4194304] 0 2026-03-09T17:30:40.397 INFO:tasks.workunit.client.1.vm09.stdout:2/834: dread - d13/d15/d34/d45/f61 zero size 2026-03-09T17:30:40.397 INFO:tasks.workunit.client.1.vm09.stdout:1/892: chown f6 36329 1 2026-03-09T17:30:40.398 INFO:tasks.workunit.client.1.vm09.stdout:1/893: write d9/de5/dea/ff5 [708099,7296] 0 2026-03-09T17:30:40.411 INFO:tasks.workunit.client.1.vm09.stdout:5/919: symlink d0/d52/d20/l128 0 2026-03-09T17:30:40.415 
INFO:tasks.workunit.client.1.vm09.stdout:2/835: creat d13/d15/d3b/d43/f10a x:0 0 0 2026-03-09T17:30:40.418 INFO:tasks.workunit.client.1.vm09.stdout:1/894: truncate d9/dc/dd/d9f/d9c/fb3 652202 0 2026-03-09T17:30:40.421 INFO:tasks.workunit.client.1.vm09.stdout:8/905: creat d1/d14/d2a/d42/d5d/f11b x:0 0 0 2026-03-09T17:30:40.428 INFO:tasks.workunit.client.1.vm09.stdout:9/884: mkdir d5/de/d4e/d128 0 2026-03-09T17:30:40.429 INFO:tasks.workunit.client.1.vm09.stdout:5/920: mknod d0/d52/c129 0 2026-03-09T17:30:40.430 INFO:tasks.workunit.client.1.vm09.stdout:6/861: dwrite d3/d21/d25/fdf [0,4194304] 0 2026-03-09T17:30:40.433 INFO:tasks.workunit.client.1.vm09.stdout:2/836: creat d13/d4d/f10b x:0 0 0 2026-03-09T17:30:40.433 INFO:tasks.workunit.client.1.vm09.stdout:4/846: write d11/d1e/d45/d60/d71/db7/d89/d8b/f53 [2902862,103126] 0 2026-03-09T17:30:40.434 INFO:tasks.workunit.client.1.vm09.stdout:9/885: symlink d5/de/d4e/dca/d84/l129 0 2026-03-09T17:30:40.435 INFO:tasks.workunit.client.1.vm09.stdout:1/895: symlink d9/dc/dd/l117 0 2026-03-09T17:30:40.435 INFO:tasks.workunit.client.1.vm09.stdout:8/906: creat d1/da/d23/d6c/ddd/dcb/d97/dc5/d112/f11c x:0 0 0 2026-03-09T17:30:40.439 INFO:tasks.workunit.client.1.vm09.stdout:4/847: mknod d11/d1e/d45/daf/c106 0 2026-03-09T17:30:40.439 INFO:tasks.workunit.client.1.vm09.stdout:9/886: truncate d5/de/d29/f73 133861 0 2026-03-09T17:30:40.440 INFO:tasks.workunit.client.1.vm09.stdout:9/887: truncate d5/de/d29/d90/dc7/fbe 5076826 0 2026-03-09T17:30:40.442 INFO:tasks.workunit.client.1.vm09.stdout:8/907: unlink d1/da/d23/fc4 0 2026-03-09T17:30:40.453 INFO:tasks.workunit.client.1.vm09.stdout:7/982: write da/d11/d47/d5b/d6c/d9e/d4e/fd9 [1845568,27289] 0 2026-03-09T17:30:40.454 INFO:tasks.workunit.client.1.vm09.stdout:4/848: rmdir d11/d1e/d29/db5 39 2026-03-09T17:30:40.454 INFO:tasks.workunit.client.1.vm09.stdout:7/983: chown da/d11/d47/d5b/d6c/d9e/d4e/l140 49 1 2026-03-09T17:30:40.455 INFO:tasks.workunit.client.1.vm09.stdout:6/862: mknod 
d3/d21/d25/d26/d6b/c119 0 2026-03-09T17:30:40.455 INFO:tasks.workunit.client.1.vm09.stdout:9/888: creat d5/de/d4e/dca/d84/d97/f12a x:0 0 0 2026-03-09T17:30:40.456 INFO:tasks.workunit.client.1.vm09.stdout:9/889: rename d5/de/d29 to d5/de/d29/d33/d12b 22 2026-03-09T17:30:40.457 INFO:tasks.workunit.client.1.vm09.stdout:4/849: write d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f75 [2321469,114239] 0 2026-03-09T17:30:40.460 INFO:tasks.workunit.client.1.vm09.stdout:2/837: getdents d13/d15/d34/dd3 0 2026-03-09T17:30:40.463 INFO:tasks.workunit.client.1.vm09.stdout:7/984: rename da/d11/c102 to da/d11/d2d/d56/da1/d13d/c14e 0 2026-03-09T17:30:40.463 INFO:tasks.workunit.client.1.vm09.stdout:9/890: symlink d5/d2e/d8b/de0/df1/l12c 0 2026-03-09T17:30:40.463 INFO:tasks.workunit.client.1.vm09.stdout:9/891: chown d5/de/d29/d90 1389617 1 2026-03-09T17:30:40.463 INFO:tasks.workunit.client.1.vm09.stdout:2/838: unlink d13/d15/d34/d37/c70 0 2026-03-09T17:30:40.463 INFO:tasks.workunit.client.1.vm09.stdout:9/892: chown d5/de/d29 1 1 2026-03-09T17:30:40.464 INFO:tasks.workunit.client.1.vm09.stdout:9/893: readlink d5/de/d29/la3 0 2026-03-09T17:30:40.467 INFO:tasks.workunit.client.1.vm09.stdout:6/863: getdents d3/d48/d10f 0 2026-03-09T17:30:40.467 INFO:tasks.workunit.client.1.vm09.stdout:7/985: mkdir da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/d14f 0 2026-03-09T17:30:40.471 INFO:tasks.workunit.client.1.vm09.stdout:9/894: creat d5/de/d29/d90/dc7/da9/d104/d11c/f12d x:0 0 0 2026-03-09T17:30:40.471 INFO:tasks.workunit.client.1.vm09.stdout:6/864: mkdir d3/d21/d25/d26/d86/d11a 0 2026-03-09T17:30:40.471 INFO:tasks.workunit.client.1.vm09.stdout:9/895: readlink d5/de/d4e/l8d 0 2026-03-09T17:30:40.473 INFO:tasks.workunit.client.1.vm09.stdout:6/865: chown d3/d21/d25/d96/de0/l103 3880 1 2026-03-09T17:30:40.480 INFO:tasks.workunit.client.1.vm09.stdout:4/850: dread d11/f6e [0,4194304] 0 2026-03-09T17:30:40.484 INFO:tasks.workunit.client.1.vm09.stdout:3/826: dwrite d5/d9/d90/db0/dbb/fbd [0,4194304] 0 2026-03-09T17:30:40.487 
INFO:tasks.workunit.client.1.vm09.stdout:6/866: write d3/d7/d59/d73/f82 [4735496,30619] 0 2026-03-09T17:30:40.487 INFO:tasks.workunit.client.1.vm09.stdout:4/851: unlink d11/d1e/d45/d60/d71/c98 0 2026-03-09T17:30:40.487 INFO:tasks.workunit.client.1.vm09.stdout:4/852: fdatasync f10 0 2026-03-09T17:30:40.488 INFO:tasks.workunit.client.1.vm09.stdout:4/853: stat d11/d1e/d45/fb4 0 2026-03-09T17:30:40.492 INFO:tasks.workunit.client.1.vm09.stdout:9/896: creat d5/de/d29/f12e x:0 0 0 2026-03-09T17:30:40.493 INFO:tasks.workunit.client.1.vm09.stdout:4/854: dwrite d11/d1e/d45/d60/df1/f103 [0,4194304] 0 2026-03-09T17:30:40.495 INFO:tasks.workunit.client.1.vm09.stdout:6/867: creat d3/d7/d59/d73/de1/f11b x:0 0 0 2026-03-09T17:30:40.500 INFO:tasks.workunit.client.1.vm09.stdout:3/827: creat d5/d16/d31/ffb x:0 0 0 2026-03-09T17:30:40.504 INFO:tasks.workunit.client.1.vm09.stdout:4/855: mkdir d11/dc8/d107 0 2026-03-09T17:30:40.504 INFO:tasks.workunit.client.1.vm09.stdout:9/897: symlink d5/de/l12f 0 2026-03-09T17:30:40.506 INFO:tasks.workunit.client.1.vm09.stdout:4/856: truncate d11/d1e/d45/daf/ff3 964553 0 2026-03-09T17:30:40.507 INFO:tasks.workunit.client.1.vm09.stdout:9/898: dwrite d5/de/d4e/dca/d84/d97/f103 [0,4194304] 0 2026-03-09T17:30:40.517 INFO:tasks.workunit.client.1.vm09.stdout:9/899: readlink d5/de/d4e/l5b 0 2026-03-09T17:30:40.521 INFO:tasks.workunit.client.1.vm09.stdout:9/900: dwrite d5/de/d29/d33/db8/dfb/f118 [0,4194304] 0 2026-03-09T17:30:40.523 INFO:tasks.workunit.client.1.vm09.stdout:9/901: fsync d5/de/d29/f52 0 2026-03-09T17:30:40.527 INFO:tasks.workunit.client.1.vm09.stdout:9/902: fdatasync d5/d21/f2b 0 2026-03-09T17:30:40.530 INFO:tasks.workunit.client.1.vm09.stdout:9/903: symlink d5/de/d88/d126/l130 0 2026-03-09T17:30:40.538 INFO:tasks.workunit.client.1.vm09.stdout:7/986: dread da/d11/d3e/f60 [0,4194304] 0 2026-03-09T17:30:40.539 INFO:tasks.workunit.client.1.vm09.stdout:7/987: dread - da/d11/d47/d5b/d6c/d9e/dc6/ddb/f122 zero size 2026-03-09T17:30:40.541 
INFO:tasks.workunit.client.1.vm09.stdout:7/988: creat da/d11/d3e/dd8/f150 x:0 0 0 2026-03-09T17:30:40.545 INFO:tasks.workunit.client.1.vm09.stdout:7/989: creat da/d11/d77/f151 x:0 0 0 2026-03-09T17:30:40.546 INFO:tasks.workunit.client.1.vm09.stdout:7/990: mknod da/d11/d77/de5/c152 0 2026-03-09T17:30:40.546 INFO:tasks.workunit.client.1.vm09.stdout:7/991: fsync da/d11/d77/fd5 0 2026-03-09T17:30:40.548 INFO:tasks.workunit.client.1.vm09.stdout:7/992: creat da/d11/d47/d5b/d6c/d9e/d4e/f153 x:0 0 0 2026-03-09T17:30:40.555 INFO:tasks.workunit.client.1.vm09.stdout:7/993: dread da/d11/d47/d5b/d6c/d9e/d4e/d4c/f67 [0,4194304] 0 2026-03-09T17:30:40.560 INFO:tasks.workunit.client.1.vm09.stdout:6/868: sync 2026-03-09T17:30:40.561 INFO:tasks.workunit.client.1.vm09.stdout:3/828: sync 2026-03-09T17:30:40.563 INFO:tasks.workunit.client.1.vm09.stdout:6/869: chown d3/d7/l16 353207608 1 2026-03-09T17:30:40.564 INFO:tasks.workunit.client.1.vm09.stdout:3/829: truncate d5/d16/d31/d37/f5b 2733222 0 2026-03-09T17:30:40.574 INFO:tasks.workunit.client.1.vm09.stdout:3/830: write d5/f2f [1038303,11784] 0 2026-03-09T17:30:40.575 INFO:tasks.workunit.client.1.vm09.stdout:3/831: fsync d5/d16/d25/f2b 0 2026-03-09T17:30:40.582 INFO:tasks.workunit.client.1.vm09.stdout:6/870: dread d3/d21/d25/d26/d6b/dbf/f66 [0,4194304] 0 2026-03-09T17:30:40.596 INFO:tasks.workunit.client.1.vm09.stdout:6/871: rename d3/d7/f58 to d3/d21/d25/d26/d86/dbe/f11c 0 2026-03-09T17:30:40.600 INFO:tasks.workunit.client.1.vm09.stdout:0/902: dwrite d6/d93/fcd [0,4194304] 0 2026-03-09T17:30:40.612 INFO:tasks.workunit.client.1.vm09.stdout:0/903: rmdir d6/d1d/d24/d32/d59/d81/d8c 39 2026-03-09T17:30:40.612 INFO:tasks.workunit.client.1.vm09.stdout:0/904: stat d6/f6d 0 2026-03-09T17:30:40.613 INFO:tasks.workunit.client.1.vm09.stdout:6/872: dread - d3/d7/d99/fcf zero size 2026-03-09T17:30:40.614 INFO:tasks.workunit.client.1.vm09.stdout:6/873: stat d3/d7/d59/d73/fa3 0 2026-03-09T17:30:40.617 INFO:tasks.workunit.client.1.vm09.stdout:6/874: 
dwrite d3/d21/d76/d5c/d61/d6a/ffc [0,4194304] 0 2026-03-09T17:30:40.625 INFO:tasks.workunit.client.1.vm09.stdout:0/905: link d6/d1d/d24/d32/d59/d9c/dac/fe6 d6/d1d/d24/d5e/f126 0 2026-03-09T17:30:40.629 INFO:tasks.workunit.client.1.vm09.stdout:5/921: dwrite d0/d52/f97 [0,4194304] 0 2026-03-09T17:30:40.630 INFO:tasks.workunit.client.1.vm09.stdout:5/922: write d0/dc/d21/d33/fa7 [3380936,45333] 0 2026-03-09T17:30:40.632 INFO:tasks.workunit.client.1.vm09.stdout:5/923: write d0/dc/d21/d6f/f5f [661572,49802] 0 2026-03-09T17:30:40.633 INFO:tasks.workunit.client.1.vm09.stdout:5/924: stat d0/d9/d16/d5c 0 2026-03-09T17:30:40.633 INFO:tasks.workunit.client.1.vm09.stdout:5/925: truncate d0/d2/ff6 1115911 0 2026-03-09T17:30:40.655 INFO:tasks.workunit.client.1.vm09.stdout:1/896: dwrite d9/dc/dd/d9f/d9c/f9b [0,4194304] 0 2026-03-09T17:30:40.655 INFO:tasks.workunit.client.1.vm09.stdout:1/897: chown d9/d9e/cc7 223940 1 2026-03-09T17:30:40.662 INFO:tasks.workunit.client.1.vm09.stdout:1/898: dwrite d9/dc/dd/d40/d21/d6f/fd6 [0,4194304] 0 2026-03-09T17:30:40.677 INFO:tasks.workunit.client.1.vm09.stdout:1/899: symlink d9/d9e/dc0/d8b/l118 0 2026-03-09T17:30:40.678 INFO:tasks.workunit.client.1.vm09.stdout:5/926: link d0/ff0 d0/d9/d74/d104/f12a 0 2026-03-09T17:30:40.679 INFO:tasks.workunit.client.1.vm09.stdout:5/927: chown d0/d2/d76/d87/d95/cd7 2 1 2026-03-09T17:30:40.684 INFO:tasks.workunit.client.1.vm09.stdout:1/900: creat d9/d38/d61/dff/f119 x:0 0 0 2026-03-09T17:30:40.684 INFO:tasks.workunit.client.1.vm09.stdout:5/928: symlink d0/d2/d76/d87/da4/dbe/l12b 0 2026-03-09T17:30:40.688 INFO:tasks.workunit.client.1.vm09.stdout:6/875: dread d3/d7/d59/d73/f75 [0,4194304] 0 2026-03-09T17:30:40.689 INFO:tasks.workunit.client.1.vm09.stdout:1/901: read - d9/d9e/dc0/d37/d3f/d42/d55/fb5 zero size 2026-03-09T17:30:40.690 INFO:tasks.workunit.client.1.vm09.stdout:1/902: write d9/dc/dd/d9f/de4/dba/fd7 [861039,68178] 0 2026-03-09T17:30:40.691 INFO:tasks.workunit.client.1.vm09.stdout:1/903: chown 
d9/dc/dd/d40/l27 0 1 2026-03-09T17:30:40.699 INFO:tasks.workunit.client.1.vm09.stdout:8/908: dwrite d1/d14/d2a/f2e [0,4194304] 0 2026-03-09T17:30:40.701 INFO:tasks.workunit.client.1.vm09.stdout:1/904: rmdir d9/d5a 39 2026-03-09T17:30:40.702 INFO:tasks.workunit.client.1.vm09.stdout:1/905: chown d9/f6c 136368 1 2026-03-09T17:30:40.715 INFO:tasks.workunit.client.1.vm09.stdout:5/929: creat d0/d2/d76/d87/f12c x:0 0 0 2026-03-09T17:30:40.715 INFO:tasks.workunit.client.1.vm09.stdout:8/909: rmdir d1/da/d23/d71/db6 39 2026-03-09T17:30:40.716 INFO:tasks.workunit.client.1.vm09.stdout:8/910: write d1/da/d23/d71/d101/f10e [899795,105499] 0 2026-03-09T17:30:40.722 INFO:tasks.workunit.client.1.vm09.stdout:5/930: mknod d0/d2/d76/d87/d95/d9b/dc0/df4/c12d 0 2026-03-09T17:30:40.722 INFO:tasks.workunit.client.1.vm09.stdout:5/931: write d0/d2/d76/d87/da4/fa6 [4093642,70210] 0 2026-03-09T17:30:40.731 INFO:tasks.workunit.client.1.vm09.stdout:1/906: creat d9/d5a/f11a x:0 0 0 2026-03-09T17:30:40.732 INFO:tasks.workunit.client.1.vm09.stdout:2/839: write d13/d15/d21/f24 [7883744,77494] 0 2026-03-09T17:30:40.735 INFO:tasks.workunit.client.1.vm09.stdout:2/840: creat d13/d15/d21/d88/db8/dd1/f10c x:0 0 0 2026-03-09T17:30:40.736 INFO:tasks.workunit.client.1.vm09.stdout:2/841: creat d13/d15/d34/d45/d84/dcb/f10d x:0 0 0 2026-03-09T17:30:40.739 INFO:tasks.workunit.client.1.vm09.stdout:2/842: fdatasync d13/f26 0 2026-03-09T17:30:40.744 INFO:tasks.workunit.client.1.vm09.stdout:1/907: sync 2026-03-09T17:30:40.747 INFO:tasks.workunit.client.1.vm09.stdout:2/843: rename d13/d15/d34/d37/d66/l91 to d13/d15/d34/d37/d66/l10e 0 2026-03-09T17:30:40.749 INFO:tasks.workunit.client.1.vm09.stdout:1/908: read d9/dc/dd/d40/d1d/f98 [3744150,101200] 0 2026-03-09T17:30:40.752 INFO:tasks.workunit.client.1.vm09.stdout:2/844: mknod d13/db3/df1/c10f 0 2026-03-09T17:30:40.754 INFO:tasks.workunit.client.1.vm09.stdout:2/845: mknod d13/d15/d3b/ddf/d90/c110 0 2026-03-09T17:30:40.760 
INFO:tasks.workunit.client.1.vm09.stdout:8/911: read d1/da/d23/d6c/f70 [2411635,127427] 0 2026-03-09T17:30:40.765 INFO:tasks.workunit.client.1.vm09.stdout:8/912: dread d1/dbd/fe3 [0,4194304] 0 2026-03-09T17:30:40.765 INFO:tasks.workunit.client.1.vm09.stdout:8/913: dread - d1/d14/d2a/d42/d43/fa4 zero size 2026-03-09T17:30:40.766 INFO:tasks.workunit.client.1.vm09.stdout:2/846: sync 2026-03-09T17:30:40.771 INFO:tasks.workunit.client.1.vm09.stdout:2/847: unlink d13/d15/d36/d72/d94/fc0 0 2026-03-09T17:30:40.778 INFO:tasks.workunit.client.1.vm09.stdout:8/914: getdents d1 0 2026-03-09T17:30:40.782 INFO:tasks.workunit.client.1.vm09.stdout:2/848: mkdir d13/d15/d3b/ddf/d90/d111 0 2026-03-09T17:30:40.798 INFO:tasks.workunit.client.1.vm09.stdout:4/857: write d11/fa4 [1478368,106604] 0 2026-03-09T17:30:40.817 INFO:tasks.workunit.client.1.vm09.stdout:4/858: write d11/d1e/d45/d60/df1/f8f [900619,15042] 0 2026-03-09T17:30:40.847 INFO:tasks.workunit.client.1.vm09.stdout:9/904: dwrite d5/de/fd6 [0,4194304] 0 2026-03-09T17:30:40.849 INFO:tasks.workunit.client.1.vm09.stdout:9/905: write d5/d2e/d8b/de0/ffe [960165,10752] 0 2026-03-09T17:30:40.849 INFO:tasks.workunit.client.1.vm09.stdout:7/994: dwrite da/d11/d47/d5b/d6c/d9e/d4e/d4c/de4/fd7 [0,4194304] 0 2026-03-09T17:30:40.861 INFO:tasks.workunit.client.1.vm09.stdout:8/915: getdents d1 0 2026-03-09T17:30:40.866 INFO:tasks.workunit.client.1.vm09.stdout:9/906: dread d5/f11 [0,4194304] 0 2026-03-09T17:30:40.877 INFO:tasks.workunit.client.1.vm09.stdout:7/995: creat da/d11/d2d/d56/da1/f154 x:0 0 0 2026-03-09T17:30:40.878 INFO:tasks.workunit.client.1.vm09.stdout:8/916: fdatasync d1/d14/d2a/d42/d43/d44/fe5 0 2026-03-09T17:30:40.878 INFO:tasks.workunit.client.1.vm09.stdout:3/832: write d5/d9/d30/d65/d59/f81 [242483,91705] 0 2026-03-09T17:30:40.878 INFO:tasks.workunit.client.1.vm09.stdout:7/996: dread - da/d11/d64/da7/db1/fc5 zero size 2026-03-09T17:30:40.879 INFO:tasks.workunit.client.1.vm09.stdout:9/907: mkdir d5/de/d29/dd4/d131 0 
2026-03-09T17:30:40.879 INFO:tasks.workunit.client.1.vm09.stdout:8/917: chown d1/d14/d2a/d42/d43/ldc 453527 1 2026-03-09T17:30:40.882 INFO:tasks.workunit.client.1.vm09.stdout:3/833: creat d5/d16/d31/d3d/db3/ffc x:0 0 0 2026-03-09T17:30:40.882 INFO:tasks.workunit.client.1.vm09.stdout:9/908: write d5/de/d4e/dca/d84/fee [205131,70605] 0 2026-03-09T17:30:40.884 INFO:tasks.workunit.client.1.vm09.stdout:9/909: chown d5/de/d4e/dca/de7/d93/cd2 12418 1 2026-03-09T17:30:40.886 INFO:tasks.workunit.client.1.vm09.stdout:7/997: dwrite da/f16 [0,4194304] 0 2026-03-09T17:30:40.888 INFO:tasks.workunit.client.1.vm09.stdout:9/910: rename d5/d91 to d5/d2e/d8b/db4/d132 0 2026-03-09T17:30:40.891 INFO:tasks.workunit.client.1.vm09.stdout:7/998: creat da/d11/d47/d5b/df2/f155 x:0 0 0 2026-03-09T17:30:40.892 INFO:tasks.workunit.client.1.vm09.stdout:7/999: truncate da/d11/d3e/dd8/f112 2032093 0 2026-03-09T17:30:40.897 INFO:tasks.workunit.client.1.vm09.stdout:3/834: fsync d5/d9/d30/d65/d59/f81 0 2026-03-09T17:30:40.898 INFO:tasks.workunit.client.1.vm09.stdout:3/835: fdatasync d5/d9/da9/fc9 0 2026-03-09T17:30:40.900 INFO:tasks.workunit.client.1.vm09.stdout:9/911: mknod d5/de/d4e/c133 0 2026-03-09T17:30:40.900 INFO:tasks.workunit.client.1.vm09.stdout:8/918: link d1/da/d23/d71/d101/cc7 d1/da/d23/d71/dde/c11d 0 2026-03-09T17:30:40.902 INFO:tasks.workunit.client.1.vm09.stdout:3/836: rename d5/d9/d30/d65/d59/f81 to d5/d16/d31/d37/dae/db4/ffd 0 2026-03-09T17:30:40.910 INFO:tasks.workunit.client.1.vm09.stdout:9/912: dwrite d5/de/d88/f110 [0,4194304] 0 2026-03-09T17:30:40.911 INFO:tasks.workunit.client.1.vm09.stdout:3/837: dread d5/d9/d30/d65/f4f [0,4194304] 0 2026-03-09T17:30:40.915 INFO:tasks.workunit.client.1.vm09.stdout:3/838: symlink d5/d16/d46/lfe 0 2026-03-09T17:30:40.918 INFO:tasks.workunit.client.1.vm09.stdout:9/913: dread d5/d2e/d8b/db4/ff8 [0,4194304] 0 2026-03-09T17:30:40.918 INFO:tasks.workunit.client.1.vm09.stdout:3/839: symlink d5/d16/d31/d37/d58/d8a/lff 0 2026-03-09T17:30:40.922 
INFO:tasks.workunit.client.1.vm09.stdout:9/914: truncate d5/de/d29/dd4/df0/f96 38411 0 2026-03-09T17:30:40.924 INFO:tasks.workunit.client.1.vm09.stdout:9/915: rename d5/de/d88/lb2 to d5/de/d4e/dca/d84/db7/l134 0 2026-03-09T17:30:40.926 INFO:tasks.workunit.client.1.vm09.stdout:9/916: read d5/d2e/d8b/db4/d132/fdb [1894589,54777] 0 2026-03-09T17:30:40.927 INFO:tasks.workunit.client.1.vm09.stdout:9/917: fsync d5/de/d4e/dca/d84/f105 0 2026-03-09T17:30:40.934 INFO:tasks.workunit.client.1.vm09.stdout:9/918: getdents d5/de/d29/d90/dc7/d101 0 2026-03-09T17:30:40.935 INFO:tasks.workunit.client.1.vm09.stdout:9/919: chown d5/f1b 28108 1 2026-03-09T17:30:40.935 INFO:tasks.workunit.client.1.vm09.stdout:0/906: stat d6/d1d/d24/d5e/dc2/df7/f11f 0 2026-03-09T17:30:40.936 INFO:tasks.workunit.client.1.vm09.stdout:9/920: chown d5/de/d4e/dca/de7/f8c 2 1 2026-03-09T17:30:40.937 INFO:tasks.workunit.client.1.vm09.stdout:0/907: symlink d6/d1d/d24/d32/d59/d9c/dac/dd1/l127 0 2026-03-09T17:30:40.937 INFO:tasks.workunit.client.1.vm09.stdout:9/921: rmdir d5/de/d4e/dca 39 2026-03-09T17:30:40.939 INFO:tasks.workunit.client.1.vm09.stdout:0/908: dread - d6/d1d/d24/d5e/d6c/ff6 zero size 2026-03-09T17:30:40.941 INFO:tasks.workunit.client.1.vm09.stdout:9/922: dread - d5/d2e/d8b/db4/fe5 zero size 2026-03-09T17:30:40.941 INFO:tasks.workunit.client.1.vm09.stdout:0/909: stat d6/d1d/d24/d32/d59/d9c/dac/f112 0 2026-03-09T17:30:40.941 INFO:tasks.workunit.client.1.vm09.stdout:9/923: creat d5/d2e/d8b/d116/f135 x:0 0 0 2026-03-09T17:30:40.942 INFO:tasks.workunit.client.1.vm09.stdout:0/910: mkdir d6/d1d/d24/d5e/dc2/d11c/d128 0 2026-03-09T17:30:40.943 INFO:tasks.workunit.client.1.vm09.stdout:0/911: chown d6/d1d/f57 4432085 1 2026-03-09T17:30:40.946 INFO:tasks.workunit.client.1.vm09.stdout:0/912: truncate d6/d64/ffe 765515 0 2026-03-09T17:30:40.948 INFO:tasks.workunit.client.1.vm09.stdout:0/913: getdents d6/d64/d94 0 2026-03-09T17:30:40.950 INFO:tasks.workunit.client.1.vm09.stdout:0/914: creat d6/d64/d97/dd6/f129 
x:0 0 0 2026-03-09T17:30:40.952 INFO:tasks.workunit.client.1.vm09.stdout:0/915: link d6/d1d/l47 d6/d1d/d24/d5e/dc2/df7/l12a 0 2026-03-09T17:30:40.953 INFO:tasks.workunit.client.1.vm09.stdout:0/916: write d6/d1d/d24/d32/f10a [1029454,43349] 0 2026-03-09T17:30:40.956 INFO:tasks.workunit.client.1.vm09.stdout:0/917: creat d6/d1d/d24/d5e/dc2/d11c/f12b x:0 0 0 2026-03-09T17:30:40.958 INFO:tasks.workunit.client.1.vm09.stdout:0/918: truncate d6/d1d/d24/d32/fbe 4390155 0 2026-03-09T17:30:40.960 INFO:tasks.workunit.client.1.vm09.stdout:0/919: symlink d6/d1d/d24/d32/d59/d9c/dac/d109/l12c 0 2026-03-09T17:30:40.966 INFO:tasks.workunit.client.1.vm09.stdout:6/876: write d3/d21/d76/d5c/f65 [5036232,21599] 0 2026-03-09T17:30:40.968 INFO:tasks.workunit.client.1.vm09.stdout:6/877: mkdir d3/d21/d76/d3f/d11d 0 2026-03-09T17:30:40.970 INFO:tasks.workunit.client.1.vm09.stdout:6/878: dread d3/d21/d76/d81/fa2 [0,4194304] 0 2026-03-09T17:30:40.972 INFO:tasks.workunit.client.1.vm09.stdout:6/879: symlink d3/d7/d59/d73/l11e 0 2026-03-09T17:30:40.975 INFO:tasks.workunit.client.1.vm09.stdout:6/880: dwrite d3/d21/db1/f110 [0,4194304] 0 2026-03-09T17:30:40.977 INFO:tasks.workunit.client.1.vm09.stdout:5/932: truncate d0/d46/f4c 1546408 0 2026-03-09T17:30:40.987 INFO:tasks.workunit.client.1.vm09.stdout:5/933: symlink d0/d9/d8b/l12e 0 2026-03-09T17:30:40.990 INFO:tasks.workunit.client.1.vm09.stdout:5/934: write d0/dc/d21/d6f/f5f [2634030,113505] 0 2026-03-09T17:30:40.991 INFO:tasks.workunit.client.1.vm09.stdout:1/909: dwrite d9/dc/f76 [0,4194304] 0 2026-03-09T17:30:40.994 INFO:tasks.workunit.client.1.vm09.stdout:1/910: dread d9/d38/d61/feb [0,4194304] 0 2026-03-09T17:30:40.998 INFO:tasks.workunit.client.1.vm09.stdout:1/911: dwrite d9/d38/d61/feb [0,4194304] 0 2026-03-09T17:30:41.012 INFO:tasks.workunit.client.1.vm09.stdout:5/935: dread d0/d9/d8b/fc2 [0,4194304] 0 2026-03-09T17:30:41.015 INFO:tasks.workunit.client.1.vm09.stdout:5/936: dwrite d0/d2/d76/d87/d95/f9d [0,4194304] 0 2026-03-09T17:30:41.016 
INFO:tasks.workunit.client.1.vm09.stdout:5/937: chown d0/d46/d120 12344 1 2026-03-09T17:30:41.022 INFO:tasks.workunit.client.1.vm09.stdout:1/912: dread - d9/d5a/fd5 zero size 2026-03-09T17:30:41.047 INFO:tasks.workunit.client.1.vm09.stdout:1/913: mknod d9/d9e/dc0/d91/c11b 0 2026-03-09T17:30:41.106 INFO:tasks.workunit.client.1.vm09.stdout:2/849: dwrite d13/d15/d34/d45/f82 [0,4194304] 0 2026-03-09T17:30:41.107 INFO:tasks.workunit.client.1.vm09.stdout:4/859: dwrite d11/f16 [0,4194304] 0 2026-03-09T17:30:41.108 INFO:tasks.workunit.client.1.vm09.stdout:4/860: read - d11/d1e/d45/d60/df1/d78/fd6 zero size 2026-03-09T17:30:41.139 INFO:tasks.workunit.client.1.vm09.stdout:4/861: mknod d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/c108 0 2026-03-09T17:30:41.139 INFO:tasks.workunit.client.1.vm09.stdout:4/862: write d11/d1e/fe6 [729952,12421] 0 2026-03-09T17:30:41.143 INFO:tasks.workunit.client.1.vm09.stdout:4/863: dwrite d11/d1e/d45/d60/d71/db7/d89/d8b/f53 [0,4194304] 0 2026-03-09T17:30:41.153 INFO:tasks.workunit.client.1.vm09.stdout:4/864: creat d11/d1e/d45/daf/dff/f109 x:0 0 0 2026-03-09T17:30:41.171 INFO:tasks.workunit.client.1.vm09.stdout:4/865: unlink d11/d1e/d29/f2f 0 2026-03-09T17:30:41.193 INFO:tasks.workunit.client.1.vm09.stdout:8/919: truncate d1/da/d23/d6c/f1c 1862199 0 2026-03-09T17:30:41.194 INFO:tasks.workunit.client.1.vm09.stdout:3/840: truncate d5/d16/d31/d37/f6d 3065463 0 2026-03-09T17:30:41.195 INFO:tasks.workunit.client.1.vm09.stdout:9/924: write d5/de/d29/da7/fb3 [2786216,60831] 0 2026-03-09T17:30:41.197 INFO:tasks.workunit.client.1.vm09.stdout:9/925: stat d5/de/d29/d113 0 2026-03-09T17:30:41.198 INFO:tasks.workunit.client.1.vm09.stdout:3/841: dwrite d5/d16/d31/d37/fbf [4194304,4194304] 0 2026-03-09T17:30:41.201 INFO:tasks.workunit.client.1.vm09.stdout:3/842: chown d5/d9c/de7/cf0 38 1 2026-03-09T17:30:41.212 INFO:tasks.workunit.client.1.vm09.stdout:0/920: dwrite d6/d1d/d24/d32/d59/d81/d8c/fa3 [0,4194304] 0 2026-03-09T17:30:41.214 
INFO:tasks.workunit.client.1.vm09.stdout:9/926: fdatasync d5/d21/f2b 0 2026-03-09T17:30:41.215 INFO:tasks.workunit.client.1.vm09.stdout:8/920: dread d1/d14/d2a/d42/d5d/d8a/fb8 [0,4194304] 0 2026-03-09T17:30:41.233 INFO:tasks.workunit.client.1.vm09.stdout:3/843: fdatasync d5/d16/d31/d37/fbf 0 2026-03-09T17:30:41.239 INFO:tasks.workunit.client.1.vm09.stdout:6/881: dwrite d3/d21/d76/d5c/d7e/dc5/d9a/fc0 [0,4194304] 0 2026-03-09T17:30:41.250 INFO:tasks.workunit.client.1.vm09.stdout:8/921: mknod d1/da/dd/d47/d4c/c11e 0 2026-03-09T17:30:41.251 INFO:tasks.workunit.client.1.vm09.stdout:8/922: write d1/da/f4b [4253954,67450] 0 2026-03-09T17:30:41.251 INFO:tasks.workunit.client.1.vm09.stdout:5/938: write d0/d2/fcf [919705,114163] 0 2026-03-09T17:30:41.252 INFO:tasks.workunit.client.1.vm09.stdout:8/923: write d1/d14/d96/fe0 [1506752,20480] 0 2026-03-09T17:30:41.252 INFO:tasks.workunit.client.1.vm09.stdout:3/844: dread - d5/d16/d31/d37/d58/d8a/da8/ddf/fee zero size 2026-03-09T17:30:41.253 INFO:tasks.workunit.client.1.vm09.stdout:8/924: readlink d1/d14/d2a/d42/d5d/l109 0 2026-03-09T17:30:41.262 INFO:tasks.workunit.client.1.vm09.stdout:1/914: dwrite d9/dc/dd/d40/d21/d35/d88/f9a [0,4194304] 0 2026-03-09T17:30:41.267 INFO:tasks.workunit.client.1.vm09.stdout:1/915: dwrite d9/dc/dd/d9f/de4/ffc [0,4194304] 0 2026-03-09T17:30:41.268 INFO:tasks.workunit.client.1.vm09.stdout:6/882: symlink d3/d21/d76/l11f 0 2026-03-09T17:30:41.269 INFO:tasks.workunit.client.1.vm09.stdout:5/939: fsync d0/d52/d20/f7c 0 2026-03-09T17:30:41.270 INFO:tasks.workunit.client.1.vm09.stdout:5/940: chown d0/dc/cea 89 1 2026-03-09T17:30:41.270 INFO:tasks.workunit.client.1.vm09.stdout:5/941: chown d0/d9 291232091 1 2026-03-09T17:30:41.270 INFO:tasks.workunit.client.1.vm09.stdout:5/942: write d0/d52/d20/f125 [4938205,106376] 0 2026-03-09T17:30:41.272 INFO:tasks.workunit.client.1.vm09.stdout:6/883: fdatasync d3/d21/d25/f2f 0 2026-03-09T17:30:41.274 INFO:tasks.workunit.client.1.vm09.stdout:5/943: symlink 
d0/d46/d4b/db7/l12f 0 2026-03-09T17:30:41.291 INFO:tasks.workunit.client.1.vm09.stdout:6/884: link d3/d7/f23 d3/d21/d76/d5c/d7e/d94/f120 0 2026-03-09T17:30:41.292 INFO:tasks.workunit.client.1.vm09.stdout:1/916: link d9/dc/dd/l14 d9/d9e/dc0/d37/d3f/d42/d55/de0/l11c 0 2026-03-09T17:30:41.294 INFO:tasks.workunit.client.1.vm09.stdout:6/885: write d3/d21/d76/d88/f118 [1445822,112711] 0 2026-03-09T17:30:41.299 INFO:tasks.workunit.client.1.vm09.stdout:6/886: chown d3/d7/d59/d73/fa3 0 1 2026-03-09T17:30:41.299 INFO:tasks.workunit.client.1.vm09.stdout:6/887: dread - d3/d21/d76/d5c/d61/d95/fe5 zero size 2026-03-09T17:30:41.316 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:41 vm06.local ceph-mon[57307]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:41.316 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:41 vm06.local ceph-mon[57307]: Reconfiguring prometheus.vm06 (dependencies changed)... 2026-03-09T17:30:41.316 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:41 vm06.local ceph-mon[57307]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:30:41.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:41 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:30:41.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:41 vm09.local ceph-mon[62061]: Reconfiguring prometheus.vm06 (dependencies changed)... 
2026-03-09T17:30:41.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:41 vm09.local ceph-mon[62061]: Reconfiguring daemon prometheus.vm06 on vm06 2026-03-09T17:30:41.402 INFO:tasks.workunit.client.1.vm09.stdout:3/845: mkdir d5/d9/d30/d65/d59/d84/d100 0 2026-03-09T17:30:41.424 INFO:tasks.workunit.client.1.vm09.stdout:2/850: write d13/d15/f7e [832530,35808] 0 2026-03-09T17:30:41.427 INFO:tasks.workunit.client.1.vm09.stdout:2/851: dread d13/d15/d3b/ddf/f97 [0,4194304] 0 2026-03-09T17:30:41.430 INFO:tasks.workunit.client.1.vm09.stdout:2/852: link d13/d15/d3b/ddf/d90/f92 d13/d15/d34/d37/d6f/dde/f112 0 2026-03-09T17:30:41.433 INFO:tasks.workunit.client.1.vm09.stdout:2/853: truncate d13/f73 132069 0 2026-03-09T17:30:41.433 INFO:tasks.workunit.client.1.vm09.stdout:4/866: write d11/d1e/d45/f70 [252971,67390] 0 2026-03-09T17:30:41.434 INFO:tasks.workunit.client.1.vm09.stdout:2/854: readlink d13/d15/d3b/ldc 0 2026-03-09T17:30:41.436 INFO:tasks.workunit.client.1.vm09.stdout:4/867: dread d11/d1e/d45/d60/df1/fbc [0,4194304] 0 2026-03-09T17:30:41.441 INFO:tasks.workunit.client.1.vm09.stdout:4/868: rmdir d11/d1e/d29/d36 39 2026-03-09T17:30:41.467 INFO:tasks.workunit.client.1.vm09.stdout:5/944: unlink d0/d2/d76/d87/cd6 0 2026-03-09T17:30:41.470 INFO:tasks.workunit.client.1.vm09.stdout:8/925: dwrite d1/da/dd/d47/d4c/f67 [0,4194304] 0 2026-03-09T17:30:41.471 INFO:tasks.workunit.client.1.vm09.stdout:5/945: creat d0/d2/d76/d87/d95/d9b/dc0/de6/f130 x:0 0 0 2026-03-09T17:30:41.472 INFO:tasks.workunit.client.1.vm09.stdout:5/946: write d0/d52/f97 [780047,93892] 0 2026-03-09T17:30:41.473 INFO:tasks.workunit.client.1.vm09.stdout:8/926: write d1/da/d23/d71/d101/f10e [1604851,74269] 0 2026-03-09T17:30:41.474 INFO:tasks.workunit.client.1.vm09.stdout:6/888: write d3/d21/d76/d88/f111 [2221231,117764] 0 2026-03-09T17:30:41.479 INFO:tasks.workunit.client.1.vm09.stdout:1/917: dwrite d9/dc/dd/d40/d1d/fa1 [0,4194304] 0 2026-03-09T17:30:41.496 INFO:tasks.workunit.client.1.vm09.stdout:8/927: 
stat d1/da/d23/d71/db6 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:5/947: creat d0/d2/d76/d87/d95/d9b/dc0/de6/f131 x:0 0 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:6/889: fsync d3/d7/d59/d73/f7d 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:5/948: chown d0/d2/d76/cd9 18390 1 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:1/918: creat d9/d38/d61/dff/d109/f11d x:0 0 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:5/949: dwrite d0/d52/f108 [0,4194304] 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:8/928: fdatasync d1/d14/d2a/d42/d5d/d8a/f99 0 2026-03-09T17:30:41.506 INFO:tasks.workunit.client.1.vm09.stdout:1/919: mkdir d9/d38/d61/dff/d11e 0 2026-03-09T17:30:41.509 INFO:tasks.workunit.client.1.vm09.stdout:1/920: symlink d9/d9e/dc0/d37/d3f/d42/d55/l11f 0 2026-03-09T17:30:41.510 INFO:tasks.workunit.client.1.vm09.stdout:6/890: link d3/d21/d76/d5c/d7e/l8e d3/d21/d76/d88/l121 0 2026-03-09T17:30:41.510 INFO:tasks.workunit.client.1.vm09.stdout:5/950: chown d0/d2/d76/d86/lf3 2 1 2026-03-09T17:30:41.510 INFO:tasks.workunit.client.1.vm09.stdout:6/891: stat d3/d21/d76/d5c/d7e/d94/c87 0 2026-03-09T17:30:41.519 INFO:tasks.workunit.client.1.vm09.stdout:5/951: mkdir d0/d9/d74/d10f/d132 0 2026-03-09T17:30:41.520 INFO:tasks.workunit.client.1.vm09.stdout:6/892: creat d3/d21/d25/d26/d86/dbe/f122 x:0 0 0 2026-03-09T17:30:41.523 INFO:tasks.workunit.client.1.vm09.stdout:5/952: mknod d0/d9/d74/d104/c133 0 2026-03-09T17:30:41.530 INFO:tasks.workunit.client.1.vm09.stdout:5/953: symlink d0/dc/d21/l134 0 2026-03-09T17:30:41.530 INFO:tasks.workunit.client.1.vm09.stdout:5/954: readlink d0/d46/l4d 0 2026-03-09T17:30:41.531 INFO:tasks.workunit.client.1.vm09.stdout:5/955: chown d0/d52/d20/l27 22980492 1 2026-03-09T17:30:41.531 INFO:tasks.workunit.client.1.vm09.stdout:3/846: rmdir d5/d9/d90 39 2026-03-09T17:30:41.533 INFO:tasks.workunit.client.1.vm09.stdout:2/855: write 
d13/d15/d36/d72/dc3/fc5 [197542,25384] 0 2026-03-09T17:30:41.536 INFO:tasks.workunit.client.1.vm09.stdout:3/847: symlink d5/d9c/de7/l101 0 2026-03-09T17:30:41.536 INFO:tasks.workunit.client.1.vm09.stdout:5/956: mknod d0/dc/d21/d26/d5e/d68/c135 0 2026-03-09T17:30:41.536 INFO:tasks.workunit.client.1.vm09.stdout:2/856: mknod d13/d15/d36/d72/d94/da7/db0/c113 0 2026-03-09T17:30:41.539 INFO:tasks.workunit.client.1.vm09.stdout:3/848: read d5/d16/d46/f6b [2523977,29333] 0 2026-03-09T17:30:41.540 INFO:tasks.workunit.client.1.vm09.stdout:9/927: rename d5/de/l12f to d5/de/d88/d126/l136 0 2026-03-09T17:30:41.540 INFO:tasks.workunit.client.1.vm09.stdout:2/857: creat d13/d15/d34/d45/d84/dcb/f114 x:0 0 0 2026-03-09T17:30:41.542 INFO:tasks.workunit.client.1.vm09.stdout:0/921: unlink d6/d1d/d24/d32/fbe 0 2026-03-09T17:30:41.542 INFO:tasks.workunit.client.1.vm09.stdout:1/921: dread d9/f6c [0,4194304] 0 2026-03-09T17:30:41.542 INFO:tasks.workunit.client.1.vm09.stdout:0/922: stat d6/d1d/d24 0 2026-03-09T17:30:41.545 INFO:tasks.workunit.client.1.vm09.stdout:1/922: write d9/dc/dd/d9f/de4/dba/fd3 [1377973,121309] 0 2026-03-09T17:30:41.546 INFO:tasks.workunit.client.1.vm09.stdout:2/858: dwrite d13/f100 [0,4194304] 0 2026-03-09T17:30:41.554 INFO:tasks.workunit.client.1.vm09.stdout:4/869: rename d11/d1e/d29/d36/l80 to d11/d1e/d31/db6/l10a 0 2026-03-09T17:30:41.554 INFO:tasks.workunit.client.1.vm09.stdout:5/957: link d0/d9/f34 d0/d46/d4b/db7/f136 0 2026-03-09T17:30:41.556 INFO:tasks.workunit.client.1.vm09.stdout:1/923: dwrite d9/dc/dd/d40/d21/d6f/fd6 [0,4194304] 0 2026-03-09T17:30:41.560 INFO:tasks.workunit.client.1.vm09.stdout:2/859: rmdir d13/d15/d3b/ddf 39 2026-03-09T17:30:41.560 INFO:tasks.workunit.client.1.vm09.stdout:0/923: creat d6/d1d/d24/d5e/dc2/df7/f12d x:0 0 0 2026-03-09T17:30:41.576 INFO:tasks.workunit.client.1.vm09.stdout:3/849: rename d5/d9/d30/d65/cf5 to d5/d16/d31/d37/dae/db4/c102 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:5/958: mkdir 
d0/d2/d76/d87/d95/d9b/d137 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:8/929: dread d1/d14/d2a/d42/f46 [0,4194304] 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:8/930: write d1/d14/d2a/d49/fe2 [5289438,81719] 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:1/924: creat d9/dc/dd/d40/ddb/f120 x:0 0 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:3/850: dwrite d5/d9/d30/d65/d59/fa2 [0,4194304] 0 2026-03-09T17:30:41.586 INFO:tasks.workunit.client.1.vm09.stdout:8/931: dwrite d1/d14/f9c [4194304,4194304] 0 2026-03-09T17:30:41.596 INFO:tasks.workunit.client.1.vm09.stdout:9/928: link d5/de/d29/d90/dc7/fbe d5/d2e/d8b/f137 0 2026-03-09T17:30:41.597 INFO:tasks.workunit.client.1.vm09.stdout:0/924: symlink d6/d1d/d24/l12e 0 2026-03-09T17:30:41.599 INFO:tasks.workunit.client.1.vm09.stdout:9/929: write d5/de/d29/fe1 [320263,73053] 0 2026-03-09T17:30:41.600 INFO:tasks.workunit.client.1.vm09.stdout:0/925: write d6/d64/f105 [291589,16775] 0 2026-03-09T17:30:41.601 INFO:tasks.workunit.client.1.vm09.stdout:1/925: truncate d9/dc/dd/d40/d1d/f77 834583 0 2026-03-09T17:30:41.610 INFO:tasks.workunit.client.1.vm09.stdout:9/930: creat d5/d21/f138 x:0 0 0 2026-03-09T17:30:41.612 INFO:tasks.workunit.client.1.vm09.stdout:4/870: dread d11/d1e/d29/f6d [0,4194304] 0 2026-03-09T17:30:41.614 INFO:tasks.workunit.client.1.vm09.stdout:3/851: link d5/d9c/fd7 d5/d16/d31/d37/d58/d8a/da8/ddf/df9/f103 0 2026-03-09T17:30:41.617 INFO:tasks.workunit.client.1.vm09.stdout:9/931: dwrite d5/d2e/d8b/db4/d132/d99/dc9/dde/fec [4194304,4194304] 0 2026-03-09T17:30:41.625 INFO:tasks.workunit.client.1.vm09.stdout:0/926: unlink d6/d1d/d46/l35 0 2026-03-09T17:30:41.626 INFO:tasks.workunit.client.1.vm09.stdout:2/860: dread d13/d15/d21/f28 [0,4194304] 0 2026-03-09T17:30:41.633 INFO:tasks.workunit.client.1.vm09.stdout:3/852: dread d5/d9/d30/d65/f3e [0,4194304] 0 2026-03-09T17:30:41.635 INFO:tasks.workunit.client.1.vm09.stdout:6/893: sync 
2026-03-09T17:30:41.636 INFO:tasks.workunit.client.1.vm09.stdout:5/959: sync 2026-03-09T17:30:41.636 INFO:tasks.workunit.client.1.vm09.stdout:8/932: sync 2026-03-09T17:30:41.654 INFO:tasks.workunit.client.1.vm09.stdout:9/932: rename d5/f8e to d5/d2e/d8b/db4/f139 0 2026-03-09T17:30:41.673 INFO:tasks.workunit.client.1.vm09.stdout:2/861: truncate d13/d15/d34/d45/f61 24096 0 2026-03-09T17:30:41.690 INFO:tasks.workunit.client.1.vm09.stdout:3/853: symlink d5/d9/d30/d65/l104 0 2026-03-09T17:30:41.693 INFO:tasks.workunit.client.1.vm09.stdout:6/894: chown d3/d7/c63 198837 1 2026-03-09T17:30:41.698 INFO:tasks.workunit.client.1.vm09.stdout:2/862: creat d13/d15/d36/f115 x:0 0 0 2026-03-09T17:30:41.699 INFO:tasks.workunit.client.1.vm09.stdout:4/871: truncate d11/f25 4978999 0 2026-03-09T17:30:41.700 INFO:tasks.workunit.client.1.vm09.stdout:3/854: chown d5/d16/d31/d37/fa5 2840 1 2026-03-09T17:30:41.701 INFO:tasks.workunit.client.1.vm09.stdout:9/933: dwrite d5/de/d29/d90/dc7/fbe [0,4194304] 0 2026-03-09T17:30:41.702 INFO:tasks.workunit.client.1.vm09.stdout:3/855: truncate d5/d16/d31/d37/f94 4568298 0 2026-03-09T17:30:41.731 INFO:tasks.workunit.client.1.vm09.stdout:5/960: symlink d0/dc/d21/d26/d5e/l138 0 2026-03-09T17:30:41.732 INFO:tasks.workunit.client.1.vm09.stdout:2/863: fdatasync d13/f39 0 2026-03-09T17:30:41.735 INFO:tasks.workunit.client.1.vm09.stdout:4/872: mkdir d11/d1e/d45/d60/d71/d10b 0 2026-03-09T17:30:41.754 INFO:tasks.workunit.client.1.vm09.stdout:1/926: dread d9/dc/dd/d40/d1d/f77 [0,4194304] 0 2026-03-09T17:30:41.760 INFO:tasks.workunit.client.1.vm09.stdout:0/927: write d6/d1d/d24/d5e/d6c/ff6 [247730,46044] 0 2026-03-09T17:30:41.762 INFO:tasks.workunit.client.1.vm09.stdout:8/933: dread d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:41.778 INFO:tasks.workunit.client.1.vm09.stdout:6/895: write d3/d21/d76/d5c/d61/d95/fe4 [146365,95539] 0 2026-03-09T17:30:41.806 INFO:tasks.workunit.client.1.vm09.stdout:6/896: creat d3/d7/d59/d73/db0/f123 x:0 0 0 2026-03-09T17:30:41.811 
INFO:tasks.workunit.client.1.vm09.stdout:2/864: creat d13/d15/d3b/ddf/d85/f116 x:0 0 0 2026-03-09T17:30:41.811 INFO:tasks.workunit.client.1.vm09.stdout:4/873: write d11/d1e/d45/d60/d71/db7/d89/f94 [2702578,6337] 0 2026-03-09T17:30:41.812 INFO:tasks.workunit.client.1.vm09.stdout:3/856: write d5/d9/d90/db0/f69 [5813016,123846] 0 2026-03-09T17:30:41.812 INFO:tasks.workunit.client.1.vm09.stdout:1/927: write d9/dc/dd/d40/d21/fb6 [1795469,30767] 0 2026-03-09T17:30:41.813 INFO:tasks.workunit.client.1.vm09.stdout:2/865: dread - d13/d15/d34/d37/d6f/dde/f106 zero size 2026-03-09T17:30:41.817 INFO:tasks.workunit.client.1.vm09.stdout:8/934: write d1/d14/d2a/d49/fac [1627983,104084] 0 2026-03-09T17:30:41.818 INFO:tasks.workunit.client.1.vm09.stdout:9/934: truncate d5/d7e/f100 3034960 0 2026-03-09T17:30:41.820 INFO:tasks.workunit.client.1.vm09.stdout:4/874: dread - d11/d1e/d31/db6/ff4 zero size 2026-03-09T17:30:41.821 INFO:tasks.workunit.client.1.vm09.stdout:5/961: dwrite d0/dc/d21/d33/fe9 [0,4194304] 0 2026-03-09T17:30:41.822 INFO:tasks.workunit.client.1.vm09.stdout:4/875: write f3 [3628425,48007] 0 2026-03-09T17:30:41.829 INFO:tasks.workunit.client.1.vm09.stdout:0/928: rename d6/d1d/d24/d5e/f9e to d6/d64/d97/dc9/d125/f12f 0 2026-03-09T17:30:41.837 INFO:tasks.workunit.client.1.vm09.stdout:9/935: creat d5/de/d29/d33/db8/f13a x:0 0 0 2026-03-09T17:30:41.840 INFO:tasks.workunit.client.1.vm09.stdout:3/857: dwrite d5/d9c/de7/f99 [0,4194304] 0 2026-03-09T17:30:41.843 INFO:tasks.workunit.client.1.vm09.stdout:4/876: read fd [1201687,62586] 0 2026-03-09T17:30:41.850 INFO:tasks.workunit.client.1.vm09.stdout:3/858: read d5/d9/d30/d65/d59/d84/f6e [716512,94133] 0 2026-03-09T17:30:41.857 INFO:tasks.workunit.client.1.vm09.stdout:2/866: symlink d13/d15/d36/d72/d94/da7/l117 0 2026-03-09T17:30:41.860 INFO:tasks.workunit.client.1.vm09.stdout:5/962: mkdir d0/d115/d139 0 2026-03-09T17:30:41.861 INFO:tasks.workunit.client.1.vm09.stdout:5/963: write d0/d2/d76/d87/d95/d9b/dc0/de6/f10b [513162,9213] 0 
2026-03-09T17:30:41.864 INFO:tasks.workunit.client.1.vm09.stdout:2/867: dwrite d13/d15/d34/d45/d84/dcb/f2d [0,4194304] 0 2026-03-09T17:30:41.866 INFO:tasks.workunit.client.1.vm09.stdout:4/877: mknod d11/d1e/d45/d60/df1/d78/c10c 0 2026-03-09T17:30:41.867 INFO:tasks.workunit.client.1.vm09.stdout:3/859: mkdir d5/d9/d90/db0/d105 0 2026-03-09T17:30:41.874 INFO:tasks.workunit.client.1.vm09.stdout:9/936: mkdir d5/d2e/d8b/de0/d125/d13b 0 2026-03-09T17:30:41.874 INFO:tasks.workunit.client.1.vm09.stdout:5/964: unlink d0/d9/d16/c23 0 2026-03-09T17:30:41.878 INFO:tasks.workunit.client.1.vm09.stdout:2/868: rename d13/d15/d36/d72/d94/da7/cca to d13/d15/d36/d72/dc3/c118 0 2026-03-09T17:30:41.880 INFO:tasks.workunit.client.1.vm09.stdout:4/878: unlink d11/d1e/f61 0 2026-03-09T17:30:41.882 INFO:tasks.workunit.client.1.vm09.stdout:9/937: fdatasync d5/d2e/d8b/db4/f139 0 2026-03-09T17:30:41.882 INFO:tasks.workunit.client.1.vm09.stdout:5/965: dwrite d0/d9/fd2 [0,4194304] 0 2026-03-09T17:30:41.889 INFO:tasks.workunit.client.1.vm09.stdout:2/869: write d13/d15/d21/f30 [216972,127838] 0 2026-03-09T17:30:41.889 INFO:tasks.workunit.client.1.vm09.stdout:3/860: creat d5/d16/d31/d37/f106 x:0 0 0 2026-03-09T17:30:41.892 INFO:tasks.workunit.client.1.vm09.stdout:3/861: mknod d5/d9/c107 0 2026-03-09T17:30:41.892 INFO:tasks.workunit.client.1.vm09.stdout:9/938: symlink d5/de/d4e/d128/l13c 0 2026-03-09T17:30:41.893 INFO:tasks.workunit.client.1.vm09.stdout:5/966: mknod d0/dc/d21/d26/c13a 0 2026-03-09T17:30:41.894 INFO:tasks.workunit.client.1.vm09.stdout:3/862: mkdir d5/d9/d30/d65/d59/d108 0 2026-03-09T17:30:41.896 INFO:tasks.workunit.client.1.vm09.stdout:9/939: dread - d5/de/d29/d90/fb9 zero size 2026-03-09T17:30:41.906 INFO:tasks.workunit.client.1.vm09.stdout:1/928: write d9/d9e/dc0/d91/d99/fbc [500502,46043] 0 2026-03-09T17:30:41.909 INFO:tasks.workunit.client.1.vm09.stdout:8/935: dwrite d1/d14/d2a/d42/d43/d44/f5c [0,4194304] 0 2026-03-09T17:30:41.909 INFO:tasks.workunit.client.1.vm09.stdout:9/940: 
mknod d5/de/d29/d90/dc7/da9/d104/d120/c13d 0 2026-03-09T17:30:41.911 INFO:tasks.workunit.client.1.vm09.stdout:0/929: dwrite d6/d1d/f70 [0,4194304] 0 2026-03-09T17:30:41.912 INFO:tasks.workunit.client.1.vm09.stdout:6/897: dwrite d3/d7/d59/ff3 [0,4194304] 0 2026-03-09T17:30:41.914 INFO:tasks.workunit.client.1.vm09.stdout:6/898: stat d3/d48/fc7 0 2026-03-09T17:30:41.921 INFO:tasks.workunit.client.1.vm09.stdout:6/899: mkdir d3/d7/d99/d124 0 2026-03-09T17:30:41.922 INFO:tasks.workunit.client.1.vm09.stdout:3/863: rename d5/d16/d31/d37/dae/db4/c102 to d5/d16/d31/d37/d58/c109 0 2026-03-09T17:30:41.922 INFO:tasks.workunit.client.1.vm09.stdout:6/900: chown d3/d21/d76/d81/c108 108590 1 2026-03-09T17:30:41.925 INFO:tasks.workunit.client.1.vm09.stdout:9/941: chown d5/de/d4e/dca/d84/db7/lf2 815407 1 2026-03-09T17:30:41.931 INFO:tasks.workunit.client.1.vm09.stdout:1/929: rename d9/dc/dd/d40/d21/d6f/f85 to d9/de5/dfb/f121 0 2026-03-09T17:30:41.937 INFO:tasks.workunit.client.1.vm09.stdout:8/936: mknod d1/da/c11f 0 2026-03-09T17:30:41.942 INFO:tasks.workunit.client.1.vm09.stdout:9/942: fdatasync d5/d2e/f5e 0 2026-03-09T17:30:41.943 INFO:tasks.workunit.client.1.vm09.stdout:1/930: mknod d9/d9e/dc0/d8b/c122 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:0/930: mkdir d6/d1d/d24/d5e/dc2/d11c/d128/d130 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:0/931: fdatasync d6/d1d/d24/d32/d59/fb0 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:8/937: fdatasync d1/d14/d2a/d49/fa5 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:4/879: dwrite d11/d1e/d45/d60/d71/db7/d89/fba [0,4194304] 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:8/938: chown d1/da/d23/d71/dde 31 1 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:3/864: creat d5/d9/f10a x:0 0 0 2026-03-09T17:30:41.953 INFO:tasks.workunit.client.1.vm09.stdout:2/870: dwrite d13/d15/d34/d45/f61 [0,4194304] 0 2026-03-09T17:30:41.968 
INFO:tasks.workunit.client.1.vm09.stdout:1/931: mkdir d9/d123 0 2026-03-09T17:30:41.968 INFO:tasks.workunit.client.1.vm09.stdout:9/943: unlink d5/de/d29/dd4/df0/f96 0 2026-03-09T17:30:41.973 INFO:tasks.workunit.client.1.vm09.stdout:4/880: mknod d11/dc8/d107/c10d 0 2026-03-09T17:30:41.975 INFO:tasks.workunit.client.1.vm09.stdout:3/865: mknod d5/c10b 0 2026-03-09T17:30:41.976 INFO:tasks.workunit.client.1.vm09.stdout:8/939: mkdir d1/da/d3a/d103/d120 0 2026-03-09T17:30:41.977 INFO:tasks.workunit.client.1.vm09.stdout:9/944: chown d5/de/d29/c7c 100966 1 2026-03-09T17:30:41.979 INFO:tasks.workunit.client.1.vm09.stdout:4/881: chown d11/d1e/d29/d36/f6a 24690924 1 2026-03-09T17:30:41.979 INFO:tasks.workunit.client.1.vm09.stdout:4/882: fsync d11/f16 0 2026-03-09T17:30:41.979 INFO:tasks.workunit.client.1.vm09.stdout:8/940: mkdir d1/da/d3a/d103/d121 0 2026-03-09T17:30:41.979 INFO:tasks.workunit.client.1.vm09.stdout:3/866: write d5/d16/d31/d37/d58/d8a/da8/faf [3249726,409] 0 2026-03-09T17:30:41.979 INFO:tasks.workunit.client.1.vm09.stdout:8/941: stat d1/c1a 0 2026-03-09T17:30:41.980 INFO:tasks.workunit.client.1.vm09.stdout:4/883: stat d11/d1e/d45/d60/df1 0 2026-03-09T17:30:41.980 INFO:tasks.workunit.client.1.vm09.stdout:2/871: rename d13/d15/d34/d37/d6f/f7b to d13/d15/f119 0 2026-03-09T17:30:41.981 INFO:tasks.workunit.client.1.vm09.stdout:9/945: rename d5/d2e to d5/d2e/d8b/de0/df1/d13e 22 2026-03-09T17:30:41.981 INFO:tasks.workunit.client.1.vm09.stdout:3/867: fsync d5/d9/f1e 0 2026-03-09T17:30:41.983 INFO:tasks.workunit.client.1.vm09.stdout:8/942: symlink d1/d14/d2a/l122 0 2026-03-09T17:30:41.984 INFO:tasks.workunit.client.1.vm09.stdout:4/884: symlink d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/l10e 0 2026-03-09T17:30:41.985 INFO:tasks.workunit.client.1.vm09.stdout:9/946: mkdir d5/de/d29/d90/dc7/d101/d13f 0 2026-03-09T17:30:41.985 INFO:tasks.workunit.client.1.vm09.stdout:8/943: read d1/d14/d2a/fe9 [4137257,33113] 0 2026-03-09T17:30:41.985 
INFO:tasks.workunit.client.1.vm09.stdout:3/868: rename d5/d9/d90/db0/fcd to d5/d16/d31/d37/d58/d64/f10c 0 2026-03-09T17:30:41.987 INFO:tasks.workunit.client.1.vm09.stdout:8/944: chown d1/da/d23/dc2/l9b 7 1 2026-03-09T17:30:42.002 INFO:tasks.workunit.client.1.vm09.stdout:2/872: creat d13/d15/d21/df5/f11a x:0 0 0 2026-03-09T17:30:42.002 INFO:tasks.workunit.client.1.vm09.stdout:9/947: link d5/de/d29/f12e d5/de/d88/f140 0 2026-03-09T17:30:42.002 INFO:tasks.workunit.client.1.vm09.stdout:2/873: creat d13/d4d/daa/f11b x:0 0 0 2026-03-09T17:30:42.002 INFO:tasks.workunit.client.1.vm09.stdout:4/885: dwrite d11/d1e/d45/d60/d71/db7/d89/fec [0,4194304] 0 2026-03-09T17:30:42.002 INFO:tasks.workunit.client.1.vm09.stdout:9/948: rmdir d5 39 2026-03-09T17:30:42.006 INFO:tasks.workunit.client.1.vm09.stdout:3/869: dread d5/d16/d31/d37/f5b [0,4194304] 0 2026-03-09T17:30:42.008 INFO:tasks.workunit.client.1.vm09.stdout:3/870: symlink d5/d16/d31/d37/d58/d8a/l10d 0 2026-03-09T17:30:42.008 INFO:tasks.workunit.client.1.vm09.stdout:3/871: readlink d5/d16/l83 0 2026-03-09T17:30:42.013 INFO:tasks.workunit.client.1.vm09.stdout:5/967: write d0/d46/f56 [1661154,41782] 0 2026-03-09T17:30:42.016 INFO:tasks.workunit.client.1.vm09.stdout:5/968: symlink d0/d2/d76/d87/d95/d9b/dc0/dde/l13b 0 2026-03-09T17:30:42.023 INFO:tasks.workunit.client.1.vm09.stdout:1/932: sync 2026-03-09T17:30:42.027 INFO:tasks.workunit.client.1.vm09.stdout:6/901: write d3/d21/d76/d5c/d7e/dc5/d98/fa6 [617878,52963] 0 2026-03-09T17:30:42.037 INFO:tasks.workunit.client.1.vm09.stdout:8/945: dwrite d1/d14/d2a/d49/fa5 [0,4194304] 0 2026-03-09T17:30:42.044 INFO:tasks.workunit.client.1.vm09.stdout:1/933: fsync d9/dc/dd/d40/d1d/f98 0 2026-03-09T17:30:42.044 INFO:tasks.workunit.client.1.vm09.stdout:4/886: rename d11/dc8/d107 to d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/d10f 0 2026-03-09T17:30:42.048 INFO:tasks.workunit.client.1.vm09.stdout:6/902: mknod d3/d21/d76/d5c/d9f/c125 0 2026-03-09T17:30:42.051 
INFO:tasks.workunit.client.1.vm09.stdout:0/932: dwrite d6/d1d/fc5 [0,4194304] 0 2026-03-09T17:30:42.052 INFO:tasks.workunit.client.1.vm09.stdout:8/946: mkdir d1/d123 0 2026-03-09T17:30:42.053 INFO:tasks.workunit.client.1.vm09.stdout:1/934: dwrite d9/dc/dd/d40/d1d/fa1 [0,4194304] 0 2026-03-09T17:30:42.068 INFO:tasks.workunit.client.1.vm09.stdout:0/933: read d6/d1d/d24/d32/d59/d9c/dac/dcc/fe9 [1074044,14262] 0 2026-03-09T17:30:42.069 INFO:tasks.workunit.client.1.vm09.stdout:2/874: dwrite d13/d15/f2a [0,4194304] 0 2026-03-09T17:30:42.079 INFO:tasks.workunit.client.1.vm09.stdout:8/947: symlink d1/da/dd/d77/l124 0 2026-03-09T17:30:42.079 INFO:tasks.workunit.client.1.vm09.stdout:1/935: symlink d9/dc/l124 0 2026-03-09T17:30:42.084 INFO:tasks.workunit.client.1.vm09.stdout:1/936: symlink d9/dc/dd/d40/d21/d35/l125 0 2026-03-09T17:30:42.085 INFO:tasks.workunit.client.1.vm09.stdout:0/934: getdents d6/d1d 0 2026-03-09T17:30:42.087 INFO:tasks.workunit.client.1.vm09.stdout:2/875: creat d13/d15/d36/d72/d94/da7/f11c x:0 0 0 2026-03-09T17:30:42.089 INFO:tasks.workunit.client.1.vm09.stdout:1/937: rename d9/dc/dd/d40/f92 to d9/d5a/f126 0 2026-03-09T17:30:42.089 INFO:tasks.workunit.client.1.vm09.stdout:8/948: creat d1/d14/d2a/d42/d43/f125 x:0 0 0 2026-03-09T17:30:42.090 INFO:tasks.workunit.client.1.vm09.stdout:0/935: rmdir d6/d1d/d24/d32/d59/d9c/dac/dcc 39 2026-03-09T17:30:42.097 INFO:tasks.workunit.client.1.vm09.stdout:3/872: write d5/d9/d30/d65/d59/fd6 [499978,121641] 0 2026-03-09T17:30:42.097 INFO:tasks.workunit.client.1.vm09.stdout:0/936: truncate d6/d1d/d24/d5e/f8a 2316175 0 2026-03-09T17:30:42.100 INFO:tasks.workunit.client.1.vm09.stdout:9/949: dwrite d5/d2e/d8b/fac [0,4194304] 0 2026-03-09T17:30:42.110 INFO:tasks.workunit.client.1.vm09.stdout:8/949: dread d1/da/dd/f22 [0,4194304] 0 2026-03-09T17:30:42.110 INFO:tasks.workunit.client.1.vm09.stdout:9/950: fsync d5/d2e/d8b/db4/f115 0 2026-03-09T17:30:42.110 INFO:tasks.workunit.client.1.vm09.stdout:9/951: dread - d5/de/df7/f107 zero 
size 2026-03-09T17:30:42.110 INFO:tasks.workunit.client.1.vm09.stdout:9/952: read d5/de/d88/f110 [818468,98466] 0 2026-03-09T17:30:42.110 INFO:tasks.workunit.client.1.vm09.stdout:4/887: write d11/d1e/d29/f3b [2983649,20957] 0 2026-03-09T17:30:42.112 INFO:tasks.workunit.client.1.vm09.stdout:5/969: dwrite d0/d52/d20/f63 [0,4194304] 0 2026-03-09T17:30:42.114 INFO:tasks.workunit.client.1.vm09.stdout:1/938: link d9/d9e/dc0/d37/d3f/f68 d9/dc/dd/d40/d1d/f127 0 2026-03-09T17:30:42.115 INFO:tasks.workunit.client.1.vm09.stdout:1/939: write d9/dc/dd/d9f/de4/dba/fd3 [1231269,67458] 0 2026-03-09T17:30:42.116 INFO:tasks.workunit.client.1.vm09.stdout:1/940: dread - d9/de5/dfb/f108 zero size 2026-03-09T17:30:42.117 INFO:tasks.workunit.client.1.vm09.stdout:9/953: dwrite d5/de/d29/d90/dc7/da9/d104/d11c/f12d [0,4194304] 0 2026-03-09T17:30:42.119 INFO:tasks.workunit.client.1.vm09.stdout:0/937: dwrite d6/d1d/d24/d32/d59/d81/d8c/fa3 [0,4194304] 0 2026-03-09T17:30:42.119 INFO:tasks.workunit.client.1.vm09.stdout:3/873: creat d5/d16/d31/d3d/db3/df3/f10e x:0 0 0 2026-03-09T17:30:42.120 INFO:tasks.workunit.client.1.vm09.stdout:9/954: fdatasync d5/de/d4e/dca/de7/d93/fb0 0 2026-03-09T17:30:42.125 INFO:tasks.workunit.client.1.vm09.stdout:8/950: truncate d1/d14/fa8 57147 0 2026-03-09T17:30:42.126 INFO:tasks.workunit.client.1.vm09.stdout:5/970: rename d0/dc to d0/d46/d11f/d13c 0 2026-03-09T17:30:42.127 INFO:tasks.workunit.client.1.vm09.stdout:4/888: readlink d11/d1e/d29/db5/lf2 0 2026-03-09T17:30:42.127 INFO:tasks.workunit.client.1.vm09.stdout:1/941: unlink d9/dc/dd/d40/d21/d35/l3e 0 2026-03-09T17:30:42.140 INFO:tasks.workunit.client.1.vm09.stdout:3/874: sync 2026-03-09T17:30:42.147 INFO:tasks.workunit.client.1.vm09.stdout:9/955: symlink d5/de/d29/d90/l141 0 2026-03-09T17:30:42.147 INFO:tasks.workunit.client.1.vm09.stdout:1/942: stat d9/d9e/dc0/d37/d3f/l57 0 2026-03-09T17:30:42.160 INFO:tasks.workunit.client.1.vm09.stdout:1/943: rename d9/dc/dd/d9f/de4/fc8 to d9/d9e/dc0/d37/d3f/d42/d55/df0/f128 0 
2026-03-09T17:30:42.163 INFO:tasks.workunit.client.1.vm09.stdout:4/889: getdents d11/d1e/d45 0 2026-03-09T17:30:42.165 INFO:tasks.workunit.client.1.vm09.stdout:8/951: truncate d1/d14/d2a/d42/d5d/d8a/f94 681441 0 2026-03-09T17:30:42.166 INFO:tasks.workunit.client.1.vm09.stdout:4/890: chown d11/d1e/d45/d60/d71/db7/d89/ld9 1357 1 2026-03-09T17:30:42.169 INFO:tasks.workunit.client.1.vm09.stdout:4/891: mkdir d11/d1e/d31/d110 0 2026-03-09T17:30:42.169 INFO:tasks.workunit.client.1.vm09.stdout:8/952: symlink d1/d14/l126 0 2026-03-09T17:30:42.172 INFO:tasks.workunit.client.1.vm09.stdout:4/892: chown d11/d1e/d31/cde 2029 1 2026-03-09T17:30:42.173 INFO:tasks.workunit.client.1.vm09.stdout:8/953: symlink d1/da/dd/d79/l127 0 2026-03-09T17:30:42.173 INFO:tasks.workunit.client.1.vm09.stdout:4/893: creat d11/d1e/d45/daf/f111 x:0 0 0 2026-03-09T17:30:42.174 INFO:tasks.workunit.client.1.vm09.stdout:8/954: creat d1/da/d23/d71/db6/f128 x:0 0 0 2026-03-09T17:30:42.177 INFO:tasks.workunit.client.1.vm09.stdout:8/955: dread - d1/da/d23/f7d zero size 2026-03-09T17:30:42.177 INFO:tasks.workunit.client.1.vm09.stdout:4/894: fdatasync d11/d1e/d45/d60/df1/fbc 0 2026-03-09T17:30:42.178 INFO:tasks.workunit.client.1.vm09.stdout:8/956: chown d1/da/d23/d71/d101/ca7 158008 1 2026-03-09T17:30:42.179 INFO:tasks.workunit.client.1.vm09.stdout:8/957: fsync d1/da/d23/d6c/fb2 0 2026-03-09T17:30:42.181 INFO:tasks.workunit.client.1.vm09.stdout:8/958: mknod d1/c129 0 2026-03-09T17:30:42.182 INFO:tasks.workunit.client.1.vm09.stdout:4/895: rename d11/d1e/d31/f5a to d11/d1e/d29/d36/dd7/f112 0 2026-03-09T17:30:42.183 INFO:tasks.workunit.client.1.vm09.stdout:6/903: write d3/d7/fd3 [694432,129857] 0 2026-03-09T17:30:42.185 INFO:tasks.workunit.client.1.vm09.stdout:8/959: rmdir d1/da/d23/d6c/ddd/dcb/d97/dc5 39 2026-03-09T17:30:42.189 INFO:tasks.workunit.client.1.vm09.stdout:2/876: write d13/d15/f2b [182421,5574] 0 2026-03-09T17:30:42.206 INFO:tasks.workunit.client.1.vm09.stdout:2/877: read d13/d4d/f81 [438608,52732] 0 
2026-03-09T17:30:42.207 INFO:tasks.workunit.client.1.vm09.stdout:8/960: rename d1/da/dd/d47/d4c to d1/d14/d2a/d42/d12a 0 2026-03-09T17:30:42.209 INFO:tasks.workunit.client.1.vm09.stdout:0/938: write d6/d1d/d24/d32/fec [218727,93285] 0 2026-03-09T17:30:42.213 INFO:tasks.workunit.client.1.vm09.stdout:0/939: rmdir d6/d1d/d39 39 2026-03-09T17:30:42.213 INFO:tasks.workunit.client.1.vm09.stdout:2/878: rmdir d13/d15/d3b/ddf/d90 39 2026-03-09T17:30:42.215 INFO:tasks.workunit.client.1.vm09.stdout:8/961: dread d1/d14/f3c [0,4194304] 0 2026-03-09T17:30:42.216 INFO:tasks.workunit.client.1.vm09.stdout:2/879: chown d13/d15/d34/d37/c64 814958 1 2026-03-09T17:30:42.221 INFO:tasks.workunit.client.1.vm09.stdout:0/940: write d6/d1d/d24/d5e/db2/fb9 [128318,127782] 0 2026-03-09T17:30:42.222 INFO:tasks.workunit.client.1.vm09.stdout:9/956: dread d5/d2e/f5a [0,4194304] 0 2026-03-09T17:30:42.223 INFO:tasks.workunit.client.1.vm09.stdout:0/941: chown d6/d1d/d24/d32/d59/f5c 525673653 1 2026-03-09T17:30:42.226 INFO:tasks.workunit.client.1.vm09.stdout:5/971: write d0/d46/f4c [1002095,104817] 0 2026-03-09T17:30:42.226 INFO:tasks.workunit.client.1.vm09.stdout:2/880: creat d13/d15/d3b/ddf/f11d x:0 0 0 2026-03-09T17:30:42.228 INFO:tasks.workunit.client.1.vm09.stdout:8/962: creat d1/da/d23/d6c/ddd/dcb/d97/dc5/d112/f12b x:0 0 0 2026-03-09T17:30:42.230 INFO:tasks.workunit.client.1.vm09.stdout:8/963: chown d1/da/d23/d71 129077140 1 2026-03-09T17:30:42.233 INFO:tasks.workunit.client.1.vm09.stdout:2/881: rmdir d13/d15/d34/d37/d6f/dde 39 2026-03-09T17:30:42.233 INFO:tasks.workunit.client.1.vm09.stdout:2/882: readlink d13/d15/d34/lb4 0 2026-03-09T17:30:42.236 INFO:tasks.workunit.client.1.vm09.stdout:0/942: write d6/d1d/df0/f10d [986100,40053] 0 2026-03-09T17:30:42.240 INFO:tasks.workunit.client.1.vm09.stdout:8/964: rmdir d1/da/dd/d77 39 2026-03-09T17:30:42.240 INFO:tasks.workunit.client.1.vm09.stdout:2/883: write d13/d4d/daa/f11b [167701,81266] 0 2026-03-09T17:30:42.240 
INFO:tasks.workunit.client.1.vm09.stdout:1/944: write d9/d9e/dc0/d91/d99/fa5 [735469,84981] 0 2026-03-09T17:30:42.241 INFO:tasks.workunit.client.1.vm09.stdout:8/965: chown d1/da/d23/d71/db6/d115 365 1 2026-03-09T17:30:42.248 INFO:tasks.workunit.client.1.vm09.stdout:3/875: dwrite d5/d9/d30/f61 [0,4194304] 0 2026-03-09T17:30:42.254 INFO:tasks.workunit.client.1.vm09.stdout:2/884: mknod d13/da4/c11e 0 2026-03-09T17:30:42.264 INFO:tasks.workunit.client.1.vm09.stdout:0/943: unlink d6/d1d/d39/l78 0 2026-03-09T17:30:42.265 INFO:tasks.workunit.client.1.vm09.stdout:0/944: write d6/d64/f105 [432186,97552] 0 2026-03-09T17:30:42.266 INFO:tasks.workunit.client.1.vm09.stdout:4/896: write d11/d1e/d29/d36/f82 [1788282,95882] 0 2026-03-09T17:30:42.269 INFO:tasks.workunit.client.1.vm09.stdout:2/885: stat d13/d15/d3b/ddf/d90/f9f 0 2026-03-09T17:30:42.270 INFO:tasks.workunit.client.1.vm09.stdout:1/945: mknod d9/d9e/dc0/d91/c129 0 2026-03-09T17:30:42.271 INFO:tasks.workunit.client.1.vm09.stdout:6/904: dwrite d3/d21/db1/ff0 [0,4194304] 0 2026-03-09T17:30:42.279 INFO:tasks.workunit.client.1.vm09.stdout:0/945: stat d6/d1d/d24/c28 0 2026-03-09T17:30:42.281 INFO:tasks.workunit.client.1.vm09.stdout:5/972: dwrite d0/d52/f1c [0,4194304] 0 2026-03-09T17:30:42.286 INFO:tasks.workunit.client.1.vm09.stdout:4/897: fsync d11/d1e/d45/d60/d71/db7/f90 0 2026-03-09T17:30:42.288 INFO:tasks.workunit.client.1.vm09.stdout:2/886: truncate d13/f4c 3905908 0 2026-03-09T17:30:42.289 INFO:tasks.workunit.client.1.vm09.stdout:0/946: creat d6/d64/d97/dc9/f131 x:0 0 0 2026-03-09T17:30:42.291 INFO:tasks.workunit.client.1.vm09.stdout:5/973: creat d0/d52/d20/f13d x:0 0 0 2026-03-09T17:30:42.292 INFO:tasks.workunit.client.1.vm09.stdout:4/898: creat d11/d1e/d31/f113 x:0 0 0 2026-03-09T17:30:42.294 INFO:tasks.workunit.client.1.vm09.stdout:2/887: stat d13/d15/c56 0 2026-03-09T17:30:42.296 INFO:tasks.workunit.client.1.vm09.stdout:0/947: dwrite d6/d64/ffe [0,4194304] 0 2026-03-09T17:30:42.301 
INFO:tasks.workunit.client.1.vm09.stdout:2/888: chown d13/f39 27101 1 2026-03-09T17:30:42.301 INFO:tasks.workunit.client.1.vm09.stdout:6/905: getdents d3/d7/d59/d73 0 2026-03-09T17:30:42.301 INFO:tasks.workunit.client.1.vm09.stdout:1/946: creat d9/f12a x:0 0 0 2026-03-09T17:30:42.301 INFO:tasks.workunit.client.1.vm09.stdout:5/974: rename d0/d46/d11f/d13c/d21/d26/d5e to d0/d46/d11f/d13c/d21/d26/d13e 0 2026-03-09T17:30:42.307 INFO:tasks.workunit.client.1.vm09.stdout:0/948: creat d6/d1d/d24/d5e/dc2/d11c/d128/f132 x:0 0 0 2026-03-09T17:30:42.309 INFO:tasks.workunit.client.1.vm09.stdout:4/899: getdents d11/d1e/d45 0 2026-03-09T17:30:42.309 INFO:tasks.workunit.client.1.vm09.stdout:1/947: readlink d9/d9e/dc0/d91/ld4 0 2026-03-09T17:30:42.311 INFO:tasks.workunit.client.1.vm09.stdout:1/948: write d9/d9e/dc0/d91/d99/fa5 [1267358,61276] 0 2026-03-09T17:30:42.312 INFO:tasks.workunit.client.1.vm09.stdout:5/975: dwrite d0/d46/d11f/d13c/d21/d6f/f5f [0,4194304] 0 2026-03-09T17:30:42.314 INFO:tasks.workunit.client.1.vm09.stdout:6/906: getdents d3/d21/d25/d26/d6b/d100 0 2026-03-09T17:30:42.318 INFO:tasks.workunit.client.1.vm09.stdout:0/949: dwrite d6/f63 [0,4194304] 0 2026-03-09T17:30:42.329 INFO:tasks.workunit.client.1.vm09.stdout:9/957: write d5/d21/f2a [572441,87881] 0 2026-03-09T17:30:42.331 INFO:tasks.workunit.client.1.vm09.stdout:5/976: dwrite d0/d46/d11f/d13c/d21/d26/d13e/fbc [0,4194304] 0 2026-03-09T17:30:42.336 INFO:tasks.workunit.client.1.vm09.stdout:8/966: dwrite d1/d14/d2a/f62 [4194304,4194304] 0 2026-03-09T17:30:42.344 INFO:tasks.workunit.client.1.vm09.stdout:0/950: mkdir d6/d93/d133 0 2026-03-09T17:30:42.345 INFO:tasks.workunit.client.1.vm09.stdout:0/951: write d6/d1d/d24/d32/d59/d9c/dac/f112 [757872,121652] 0 2026-03-09T17:30:42.345 INFO:tasks.workunit.client.1.vm09.stdout:6/907: rename d3/d21/d25/d26/d6b/f79 to d3/d21/d25/d26/d6b/dbf/f126 0 2026-03-09T17:30:42.347 INFO:tasks.workunit.client.1.vm09.stdout:9/958: mkdir d5/de/d29/dd4/d131/d142 0 2026-03-09T17:30:42.349 
INFO:tasks.workunit.client.1.vm09.stdout:0/952: mknod d6/d1d/d24/d5e/d6c/ded/c134 0 2026-03-09T17:30:42.357 INFO:tasks.workunit.client.1.vm09.stdout:9/959: creat d5/d2e/d8b/db4/f143 x:0 0 0 2026-03-09T17:30:42.359 INFO:tasks.workunit.client.1.vm09.stdout:9/960: mkdir d5/d2e/d8b/de0/d125/d13b/d144 0 2026-03-09T17:30:42.365 INFO:tasks.workunit.client.1.vm09.stdout:9/961: fsync d5/de/d4e/dca/f75 0 2026-03-09T17:30:42.366 INFO:tasks.workunit.client.1.vm09.stdout:9/962: chown d5/de/d4e/dca/d84/d97/f103 204 1 2026-03-09T17:30:42.368 INFO:tasks.workunit.client.1.vm09.stdout:9/963: fdatasync d5/d21/f2b 0 2026-03-09T17:30:42.369 INFO:tasks.workunit.client.1.vm09.stdout:9/964: fdatasync d5/f34 0 2026-03-09T17:30:42.371 INFO:tasks.workunit.client.1.vm09.stdout:9/965: creat d5/de/d29/d90/dc7/da9/d104/d120/f145 x:0 0 0 2026-03-09T17:30:42.372 INFO:tasks.workunit.client.1.vm09.stdout:9/966: dread - d5/de/d29/d33/db8/ff5 zero size 2026-03-09T17:30:42.374 INFO:tasks.workunit.client.1.vm09.stdout:9/967: link d5/de/d29/dd4/cda d5/de/d29/da7/c146 0 2026-03-09T17:30:42.375 INFO:tasks.workunit.client.1.vm09.stdout:9/968: stat d5/de/d29/d90/dc7/f121 0 2026-03-09T17:30:42.378 INFO:tasks.workunit.client.1.vm09.stdout:9/969: fsync d5/de/d29/d33/f4a 0 2026-03-09T17:30:42.380 INFO:tasks.workunit.client.1.vm09.stdout:5/977: sync 2026-03-09T17:30:42.380 INFO:tasks.workunit.client.1.vm09.stdout:0/953: sync 2026-03-09T17:30:42.381 INFO:tasks.workunit.client.1.vm09.stdout:0/954: write d6/d1d/d24/d32/d59/d81/ff3 [911352,46448] 0 2026-03-09T17:30:42.382 INFO:tasks.workunit.client.1.vm09.stdout:5/978: mknod d0/d46/c13f 0 2026-03-09T17:30:42.384 INFO:tasks.workunit.client.1.vm09.stdout:0/955: creat d6/d64/dbd/f135 x:0 0 0 2026-03-09T17:30:42.385 INFO:tasks.workunit.client.1.vm09.stdout:9/970: getdents d5/de/d29/d33/db8 0 2026-03-09T17:30:42.385 INFO:tasks.workunit.client.1.vm09.stdout:0/956: chown d6/d1d/d24/d32/d59/ld4 15355 1 2026-03-09T17:30:42.387 INFO:tasks.workunit.client.1.vm09.stdout:0/957: 
readlink d6/d1d/l1f 0 2026-03-09T17:30:42.397 INFO:tasks.workunit.client.1.vm09.stdout:3/876: fsync d5/d16/d31/d37/f94 0 2026-03-09T17:30:42.399 INFO:tasks.workunit.client.1.vm09.stdout:9/971: dread d5/de/d29/f89 [0,4194304] 0 2026-03-09T17:30:42.401 INFO:tasks.workunit.client.1.vm09.stdout:3/877: rename d5/d16/d31/d37/dae/ce6 to d5/d16/d31/d37/d58/d8a/c10f 0 2026-03-09T17:30:42.402 INFO:tasks.workunit.client.1.vm09.stdout:3/878: chown d5/d16/d31/d37/d58/f73 27741210 1 2026-03-09T17:30:42.403 INFO:tasks.workunit.client.1.vm09.stdout:0/958: sync 2026-03-09T17:30:42.406 INFO:tasks.workunit.client.1.vm09.stdout:0/959: chown d6/f6d 837 1 2026-03-09T17:30:42.406 INFO:tasks.workunit.client.1.vm09.stdout:3/879: mkdir d5/d16/d31/d3d/d110 0 2026-03-09T17:30:42.407 INFO:tasks.workunit.client.1.vm09.stdout:4/900: write d11/d1e/d45/d60/df1/d78/fd6 [57630,73017] 0 2026-03-09T17:30:42.410 INFO:tasks.workunit.client.1.vm09.stdout:9/972: dread d5/d2e/d8b/de0/ffe [0,4194304] 0 2026-03-09T17:30:42.411 INFO:tasks.workunit.client.1.vm09.stdout:1/949: write d9/dc/dd/d40/d1d/f77 [1783151,685] 0 2026-03-09T17:30:42.415 INFO:tasks.workunit.client.1.vm09.stdout:0/960: rmdir d6/d1d/d24/d5e/dc2/d11a 0 2026-03-09T17:30:42.416 INFO:tasks.workunit.client.1.vm09.stdout:3/880: dwrite d5/d9/d30/d65/d59/d84/fa7 [0,4194304] 0 2026-03-09T17:30:42.418 INFO:tasks.workunit.client.1.vm09.stdout:4/901: dread d11/d1e/d31/fbf [0,4194304] 0 2026-03-09T17:30:42.423 INFO:tasks.workunit.client.1.vm09.stdout:2/889: dread d13/f4c [0,4194304] 0 2026-03-09T17:30:42.423 INFO:tasks.workunit.client.1.vm09.stdout:3/881: dread - d5/d16/d31/d3d/db3/df3/f10e zero size 2026-03-09T17:30:42.424 INFO:tasks.workunit.client.1.vm09.stdout:1/950: getdents d9/ddd 0 2026-03-09T17:30:42.425 INFO:tasks.workunit.client.1.vm09.stdout:1/951: chown d9/f59 19799082 1 2026-03-09T17:30:42.425 INFO:tasks.workunit.client.1.vm09.stdout:3/882: mknod d5/d9c/de7/c111 0 2026-03-09T17:30:42.426 INFO:tasks.workunit.client.1.vm09.stdout:9/973: creat 
d5/de/d4e/dca/f147 x:0 0 0 2026-03-09T17:30:42.430 INFO:tasks.workunit.client.1.vm09.stdout:3/883: write d5/d9/d30/d65/d59/fd9 [918422,120894] 0 2026-03-09T17:30:42.433 INFO:tasks.workunit.client.1.vm09.stdout:9/974: dread d5/de/d29/d90/dc7/fbe [0,4194304] 0 2026-03-09T17:30:42.437 INFO:tasks.workunit.client.1.vm09.stdout:3/884: dread d5/d9/da9/fc9 [0,4194304] 0 2026-03-09T17:30:42.437 INFO:tasks.workunit.client.1.vm09.stdout:3/885: chown d5/d16/d25/f2c 47 1 2026-03-09T17:30:42.437 INFO:tasks.workunit.client.1.vm09.stdout:1/952: getdents d9/dc/dd/d9f/de4/dba 0 2026-03-09T17:30:42.441 INFO:tasks.workunit.client.1.vm09.stdout:0/961: dread d6/d1d/d24/d32/d59/d81/fc0 [0,4194304] 0 2026-03-09T17:30:42.442 INFO:tasks.workunit.client.1.vm09.stdout:1/953: rename d9/d9e/dc0/d91/d99/dcf/f116 to d9/d9e/dc0/d8b/f12b 0 2026-03-09T17:30:42.443 INFO:tasks.workunit.client.1.vm09.stdout:8/967: truncate d1/d14/d2a/f2e 2891434 0 2026-03-09T17:30:42.443 INFO:tasks.workunit.client.1.vm09.stdout:3/886: mknod d5/d16/d31/d37/d58/d8a/c112 0 2026-03-09T17:30:42.446 INFO:tasks.workunit.client.1.vm09.stdout:9/975: getdents d5/d2e/d8b/db4/d132/d99/dc9/dde 0 2026-03-09T17:30:42.448 INFO:tasks.workunit.client.1.vm09.stdout:3/887: rename d5/d16/d25/c21 to d5/d9/d30/c113 0 2026-03-09T17:30:42.449 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:42 vm09.local ceph-mon[62061]: pgmap v8: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 35 MiB/s rd, 55 MiB/s wr, 174 op/s 2026-03-09T17:30:42.449 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:42 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:42.449 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:42 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:42.449 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:42 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:42.449 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:42 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:42.449 INFO:tasks.workunit.client.1.vm09.stdout:1/954: mkdir d9/dc/d12c 0 2026-03-09T17:30:42.450 INFO:tasks.workunit.client.1.vm09.stdout:0/962: creat d6/d1d/d24/d32/d59/d9c/dac/dcc/f136 x:0 0 0 2026-03-09T17:30:42.452 INFO:tasks.workunit.client.1.vm09.stdout:0/963: write d6/d1d/d24/d32/f45 [4420592,115570] 0 2026-03-09T17:30:42.457 INFO:tasks.workunit.client.1.vm09.stdout:8/968: sync 2026-03-09T17:30:42.459 INFO:tasks.workunit.client.1.vm09.stdout:4/902: dread d11/f16 [0,4194304] 0 2026-03-09T17:30:42.461 INFO:tasks.workunit.client.1.vm09.stdout:1/955: creat d9/dc/dd/d40/d21/d35/d88/f12d x:0 0 0 2026-03-09T17:30:42.462 INFO:tasks.workunit.client.1.vm09.stdout:0/964: stat d6/d1d/d24/d32/d59/d9c/dac/dd1/c104 0 2026-03-09T17:30:42.463 INFO:tasks.workunit.client.1.vm09.stdout:1/956: sync 2026-03-09T17:30:42.471 INFO:tasks.workunit.client.1.vm09.stdout:1/957: chown d9/d9e/dc0/d37/d3f/f68 3550809 1 2026-03-09T17:30:42.477 INFO:tasks.workunit.client.1.vm09.stdout:1/958: mkdir d9/de5/dfb/d12e 0 2026-03-09T17:30:42.477 INFO:tasks.workunit.client.1.vm09.stdout:6/908: dwrite d3/d21/d25/fdb [0,4194304] 0 2026-03-09T17:30:42.483 INFO:tasks.workunit.client.1.vm09.stdout:8/969: link d1/da/d23/d6c/d32/c92 d1/da/d23/d71/dde/c12c 0 2026-03-09T17:30:42.497 INFO:tasks.workunit.client.1.vm09.stdout:5/979: dwrite d0/d46/d11f/d13c/d21/d26/fc6 [0,4194304] 0 2026-03-09T17:30:42.497 INFO:tasks.workunit.client.1.vm09.stdout:1/959: symlink d9/dc/dd/d40/d21/d35/l12f 0 2026-03-09T17:30:42.501 INFO:tasks.workunit.client.1.vm09.stdout:2/890: dwrite d13/d15/d34/d45/d84/dcb/db1/ff4 [0,4194304] 0 2026-03-09T17:30:42.504 INFO:tasks.workunit.client.1.vm09.stdout:8/970: fsync d1/d14/f3c 0 
2026-03-09T17:30:42.504 INFO:tasks.workunit.client.1.vm09.stdout:6/909: mknod d3/d21/d76/d5c/df6/c127 0
2026-03-09T17:30:42.504 INFO:tasks.workunit.client.1.vm09.stdout:0/965: write d6/d64/db5/f114 [2764076,36702] 0
2026-03-09T17:30:42.505 INFO:tasks.workunit.client.1.vm09.stdout:6/910: write d3/d21/d76/d5c/d9f/f105 [1019242,123348] 0
2026-03-09T17:30:42.505 INFO:tasks.workunit.client.1.vm09.stdout:8/971: chown d1/da/dd/d47/f82 397547 1
2026-03-09T17:30:42.507 INFO:tasks.workunit.client.1.vm09.stdout:3/888: dwrite d5/fa1 [0,4194304] 0
2026-03-09T17:30:42.511 INFO:tasks.workunit.client.1.vm09.stdout:4/903: dwrite d11/d1e/d31/f7c [0,4194304] 0
2026-03-09T17:30:42.523 INFO:tasks.workunit.client.1.vm09.stdout:8/972: fsync d1/f6e 0
2026-03-09T17:30:42.523 INFO:tasks.workunit.client.1.vm09.stdout:1/960: fsync d9/d38/d61/fd0 0
2026-03-09T17:30:42.524 INFO:tasks.workunit.client.1.vm09.stdout:9/976: dwrite d5/d2e/d8b/db4/d132/fdb [0,4194304] 0
2026-03-09T17:30:42.529 INFO:tasks.workunit.client.1.vm09.stdout:8/973: sync
2026-03-09T17:30:42.531 INFO:tasks.workunit.client.1.vm09.stdout:2/891: truncate d13/d15/d34/f44 2022558 0
2026-03-09T17:30:42.533 INFO:tasks.workunit.client.1.vm09.stdout:8/974: chown d1/d14/d2a/d42/d12a/c11e 5579 1
2026-03-09T17:30:42.535 INFO:tasks.workunit.client.1.vm09.stdout:4/904: dwrite d11/d1e/d45/ffd [0,4194304] 0
2026-03-09T17:30:42.538 INFO:tasks.workunit.client.1.vm09.stdout:2/892: chown d13/d15/d34/l8d 55951 1
2026-03-09T17:30:42.539 INFO:tasks.workunit.client.1.vm09.stdout:4/905: fdatasync d11/d1e/d45/d60/df1/d78/fd6 0
2026-03-09T17:30:42.542 INFO:tasks.workunit.client.1.vm09.stdout:4/906: chown d11/d1e/d31/fbf 0 1
2026-03-09T17:30:42.543 INFO:tasks.workunit.client.1.vm09.stdout:0/966: rename d6/d1d/d24/d5e/dc8 to d6/d1d/d24/d137 0
2026-03-09T17:30:42.557 INFO:tasks.workunit.client.1.vm09.stdout:9/977: dwrite d5/d2e/d8b/db4/d132/d99/dc9/dde/fec [0,4194304] 0
2026-03-09T17:30:42.557 INFO:tasks.workunit.client.1.vm09.stdout:3/889: link d5/d9c/de7/cf0 d5/d16/d31/c114 0
2026-03-09T17:30:42.557 INFO:tasks.workunit.client.1.vm09.stdout:2/893: creat d13/d15/d3b/ddf/d90/f11f x:0 0 0
2026-03-09T17:30:42.557 INFO:tasks.workunit.client.1.vm09.stdout:2/894: write d13/f89 [5670448,111975] 0
2026-03-09T17:30:42.557 INFO:tasks.workunit.client.1.vm09.stdout:3/890: fsync d5/d9/d30/d65/d59/fa2 0
2026-03-09T17:30:42.562 INFO:tasks.workunit.client.1.vm09.stdout:2/895: mkdir d13/d15/d21/d120 0
2026-03-09T17:30:42.563 INFO:tasks.workunit.client.1.vm09.stdout:4/907: dwrite d11/d1e/d45/d60/d71/db7/d89/fba [0,4194304] 0
2026-03-09T17:30:42.563 INFO:tasks.workunit.client.1.vm09.stdout:2/896: readlink d13/d15/d34/d37/l55 0
2026-03-09T17:30:42.572 INFO:tasks.workunit.client.1.vm09.stdout:2/897: rename d13/d15/fdb to d13/d15/d21/d120/f121 0
2026-03-09T17:30:42.574 INFO:tasks.workunit.client.1.vm09.stdout:4/908: getdents d11 0
2026-03-09T17:30:42.588 INFO:tasks.workunit.client.1.vm09.stdout:2/898: dwrite d13/d15/d34/d45/d84/dcb/db1/ff4 [4194304,4194304] 0
2026-03-09T17:30:42.614 INFO:tasks.workunit.client.1.vm09.stdout:4/909: dread d11/d1e/d31/f3a [0,4194304] 0
2026-03-09T17:30:42.617 INFO:tasks.workunit.client.1.vm09.stdout:4/910: dwrite d11/d1e/d31/f3a [0,4194304] 0
2026-03-09T17:30:42.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:42 vm06.local ceph-mon[57307]: pgmap v8: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 35 MiB/s rd, 55 MiB/s wr, 174 op/s
2026-03-09T17:30:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:42 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei'
2026-03-09T17:30:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:42 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei'
2026-03-09T17:30:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:42 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T17:30:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:42 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:30:42.645 INFO:tasks.workunit.client.1.vm09.stdout:4/911: dread d11/d1e/d29/d36/f7f [0,4194304] 0
2026-03-09T17:30:42.648 INFO:tasks.workunit.client.1.vm09.stdout:4/912: chown d11/d1e/d45/d60/df1/lc4 9844268 1
2026-03-09T17:30:42.653 INFO:tasks.workunit.client.1.vm09.stdout:1/961: dread d9/dc/dd/d40/d1d/fab [0,4194304] 0
2026-03-09T17:30:42.654 INFO:tasks.workunit.client.1.vm09.stdout:4/913: creat d11/d1e/d45/d60/f114 x:0 0 0
2026-03-09T17:30:42.654 INFO:tasks.workunit.client.1.vm09.stdout:4/914: stat d11/d1e/d45/d60/df1/fa0 0
2026-03-09T17:30:42.657 INFO:tasks.workunit.client.1.vm09.stdout:1/962: creat d9/dc/dd/d9f/f130 x:0 0 0
2026-03-09T17:30:42.659 INFO:tasks.workunit.client.1.vm09.stdout:4/915: chown d11/d1e/d45/d60/d71/db7/d89/d8b/l54 67819 1
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:1/963: getdents d9/dc/d12c 0
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:4/916: rmdir d11/d1e/d29/d36 39
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:1/964: unlink d9/dc/dd/d40/d21/d35/d88/f12d 0
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:4/917: stat d11/d1e/d29/d36/f40 0
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:4/918: stat d11/d1e/d45/c88 0
2026-03-09T17:30:42.663 INFO:tasks.workunit.client.1.vm09.stdout:1/965: write d9/dc/dd/d9f/d9c/f9b [3667359,86372] 0
2026-03-09T17:30:42.664 INFO:tasks.workunit.client.1.vm09.stdout:1/966: chown d9/d38 129601 1
2026-03-09T17:30:42.667 INFO:tasks.workunit.client.1.vm09.stdout:4/919: mkdir d11/d1e/d115 0
2026-03-09T17:30:42.670 INFO:tasks.workunit.client.1.vm09.stdout:4/920: mknod d11/d1e/d29/d36/df7/c116 0
2026-03-09T17:30:42.671 INFO:tasks.workunit.client.1.vm09.stdout:1/967: dwrite d9/dc/dd/d40/f86 [0,4194304] 0
2026-03-09T17:30:42.683 INFO:tasks.workunit.client.1.vm09.stdout:1/968: write d9/d9e/dc0/d37/d3f/d42/f95 [733357,58938] 0
2026-03-09T17:30:42.684 INFO:tasks.workunit.client.1.vm09.stdout:4/921: dwrite f3 [0,4194304] 0
2026-03-09T17:30:42.691 INFO:tasks.workunit.client.1.vm09.stdout:1/969: symlink d9/d9e/dc0/d37/d3f/l131 0
2026-03-09T17:30:42.692 INFO:tasks.workunit.client.1.vm09.stdout:4/922: truncate d11/d1e/d45/d60/df1/fa0 236420 0
2026-03-09T17:30:42.697 INFO:tasks.workunit.client.1.vm09.stdout:1/970: creat d9/de5/dea/d102/f132 x:0 0 0
2026-03-09T17:30:42.700 INFO:tasks.workunit.client.1.vm09.stdout:1/971: chown d9/dc/dd/d40/d21/d6f/lec 3358 1
2026-03-09T17:30:42.704 INFO:tasks.workunit.client.1.vm09.stdout:4/923: dwrite d11/f15 [0,4194304] 0
2026-03-09T17:30:42.716 INFO:tasks.workunit.client.1.vm09.stdout:4/924: unlink d11/d1e/d29/d36/lcd 0
2026-03-09T17:30:42.732 INFO:tasks.workunit.client.1.vm09.stdout:4/925: rename d11/d1e/d45/c85 to d11/d1e/d45/d60/d71/db7/d89/d8b/d58/c117 0
2026-03-09T17:30:42.733 INFO:tasks.workunit.client.1.vm09.stdout:4/926: readlink d11/d1e/d45/d60/d71/db7/d89/ld9 0
2026-03-09T17:30:42.758 INFO:tasks.workunit.client.1.vm09.stdout:4/927: sync
2026-03-09T17:30:42.758 INFO:tasks.workunit.client.1.vm09.stdout:4/928: readlink d11/d1e/d31/l39 0
2026-03-09T17:30:42.767 INFO:tasks.workunit.client.1.vm09.stdout:4/929: read d11/d1e/d45/d60/df1/f67 [1203178,31299] 0
2026-03-09T17:30:42.792 INFO:tasks.workunit.client.1.vm09.stdout:6/911: dwrite d3/d21/d25/d26/d86/dbe/ff5 [0,4194304] 0
2026-03-09T17:30:42.800 INFO:tasks.workunit.client.1.vm09.stdout:6/912: sync
2026-03-09T17:30:42.807 INFO:tasks.workunit.client.1.vm09.stdout:6/913: creat d3/d21/d76/d3f/d11d/f128 x:0 0 0
2026-03-09T17:30:42.811 INFO:tasks.workunit.client.1.vm09.stdout:6/914: creat d3/d48/d10f/f129 x:0 0 0
2026-03-09T17:30:42.815 INFO:tasks.workunit.client.1.vm09.stdout:6/915: creat d3/d21/d76/d5c/d7e/f12a x:0 0 0
2026-03-09T17:30:42.828 INFO:tasks.workunit.client.1.vm09.stdout:6/916: dread d3/d21/d76/d5c/fbd [0,4194304] 0
2026-03-09T17:30:42.834 INFO:tasks.workunit.client.1.vm09.stdout:6/917: creat d3/d21/d25/d26/d86/f12b x:0 0 0
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.847+0000 7f1606816700 1 -- 192.168.123.106:0/1274292659 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600071a60 msgr2=0x7f1600071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.847+0000 7f1606816700 1 --2- 192.168.123.106:0/1274292659 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600071a60 0x7f1600071e70 secure :-1 s=READY pgs=341 cs=0 l=1 rev1=1 crypto rx=0x7f15fc009b00 tx=0x7f15fc009e10 comp rx=0 tx=0).stop
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 -- 192.168.123.106:0/1274292659 shutdown_connections
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 --2- 192.168.123.106:0/1274292659 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600072440 0x7f160010be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 --2- 192.168.123.106:0/1274292659 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600071a60 0x7f1600071e70 unknown :-1 s=CLOSED pgs=341 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:42.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 -- 192.168.123.106:0/1274292659 >> 192.168.123.106:0/1274292659 conn(0x7f160006d1a0 msgr2=0x7f160006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:30:42.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 -- 192.168.123.106:0/1274292659 shutdown_connections
2026-03-09T17:30:42.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.849+0000 7f1606816700 1 -- 192.168.123.106:0/1274292659 wait complete.
2026-03-09T17:30:42.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 Processor -- start
2026-03-09T17:30:42.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 -- start start
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 0x7f16001a49b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 0x7f16001a4ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16001a5510 con 0x7f1600072440
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1606816700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16001a5650 con 0x7f1600071a60
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1605013700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 0x7f16001a4ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1605814700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 0x7f16001a49b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1605013700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 0x7f16001a4ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56466/0 (socket says 192.168.123.106:56466)
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.850+0000 7f1605013700 1 -- 192.168.123.106:0/1949152027 learned_addr learned my addr 192.168.123.106:0/1949152027 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.851+0000 7f1605814700 1 -- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 msgr2=0x7f16001a4ef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.851+0000 7f1605814700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 0x7f16001a4ef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.851+0000 7f1605814700 1 -- 192.168.123.106:0/1949152027 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15fc0097e0 con 0x7f1600071a60
2026-03-09T17:30:42.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.851+0000 7f1605814700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 0x7f16001a49b0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f15fc000c00 tx=0x7f15fc004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:30:42.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.853+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15fc01d070 con 0x7f1600071a60
2026-03-09T17:30:42.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.853+0000 7f1606816700 1 -- 192.168.123.106:0/1949152027 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16000770e0 con 0x7f1600071a60
2026-03-09T17:30:42.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.853+0000 7f1606816700 1 -- 192.168.123.106:0/1949152027 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f16000775a0 con 0x7f1600071a60
2026-03-09T17:30:42.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.853+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15fc00bc50 con 0x7f1600071a60
2026-03-09T17:30:42.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.853+0000 7f1606816700 1 -- 192.168.123.106:0/1949152027 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f160019eba0 con 0x7f1600071a60
2026-03-09T17:30:42.856 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.854+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15fc00f670 con 0x7f1600071a60
2026-03-09T17:30:42.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.855+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f15fc00f890 con 0x7f1600071a60
2026-03-09T17:30:42.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.856+0000 7f15f6ffd700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 0x7f15ec079ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:30:42.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.856+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f15fc09b530 con 0x7f1600071a60
2026-03-09T17:30:42.858 INFO:tasks.workunit.client.1.vm09.stdout:5/980: truncate d0/d2/d76/d87/d95/f9d 52429 0
2026-03-09T17:30:42.859 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.856+0000 7f1605013700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 0x7f15ec079ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:30:42.859 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.857+0000 7f1605013700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 0x7f15ec079ad0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f15f000bd10 tx=0x7f15f000b480 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:30:42.860 INFO:tasks.workunit.client.1.vm09.stdout:5/981: creat d0/d46/d4b/f140 x:0 0 0
2026-03-09T17:30:42.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:42.860+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f15fc064170 con 0x7f1600071a60
2026-03-09T17:30:42.863 INFO:tasks.workunit.client.1.vm09.stdout:5/982: truncate d0/f22 3298128 0
2026-03-09T17:30:42.879 INFO:tasks.workunit.client.1.vm09.stdout:9/978: write d5/d21/f9d [3157974,31627] 0
2026-03-09T17:30:42.883 INFO:tasks.workunit.client.1.vm09.stdout:9/979: creat d5/de/d4e/d128/f148 x:0 0 0
2026-03-09T17:30:42.886 INFO:tasks.workunit.client.1.vm09.stdout:5/983: dread d0/d2/d76/d87/faf [0,4194304] 0
2026-03-09T17:30:42.886 INFO:tasks.workunit.client.1.vm09.stdout:9/980: mkdir d5/de/d4e/dca/de7/d93/d149 0
2026-03-09T17:30:42.887 INFO:tasks.workunit.client.1.vm09.stdout:9/981: stat d5/de/d29/da7 0
2026-03-09T17:30:42.894 INFO:tasks.workunit.client.1.vm09.stdout:0/967: dwrite d6/d1d/f57 [4194304,4194304] 0
2026-03-09T17:30:42.895 INFO:tasks.workunit.client.1.vm09.stdout:3/891: write d5/d16/d46/f6b [1406637,88478] 0
2026-03-09T17:30:42.896 INFO:tasks.workunit.client.1.vm09.stdout:9/982: creat d5/de/d4e/f14a x:0 0 0
2026-03-09T17:30:42.903 INFO:tasks.workunit.client.1.vm09.stdout:3/892: rename d5/d16/d46/lfe to d5/d9c/de7/de1/l115 0
2026-03-09T17:30:42.906 INFO:tasks.workunit.client.1.vm09.stdout:9/983: fsync d5/de/d29/d33/db8/ff5 0
2026-03-09T17:30:42.911 INFO:tasks.workunit.client.1.vm09.stdout:9/984: creat d5/de/d4e/dca/de7/f14b x:0 0 0
2026-03-09T17:30:42.912 INFO:tasks.workunit.client.1.vm09.stdout:3/893: getdents d5/d9c 0
2026-03-09T17:30:42.913 INFO:tasks.workunit.client.1.vm09.stdout:3/894: chown d5/d9/da9 5588962 1
2026-03-09T17:30:42.915 INFO:tasks.workunit.client.1.vm09.stdout:9/985: creat d5/de/d29/d90/dc7/da9/f14c x:0 0 0
2026-03-09T17:30:42.917 INFO:tasks.workunit.client.1.vm09.stdout:9/986: mkdir d5/de/d29/da7/d14d 0
2026-03-09T17:30:42.918 INFO:tasks.workunit.client.1.vm09.stdout:3/895: getdents d5/d9c/de7 0
2026-03-09T17:30:42.926 INFO:tasks.workunit.client.1.vm09.stdout:9/987: rmdir d5/de/d29/da7/d14d 0
2026-03-09T17:30:42.927 INFO:tasks.workunit.client.1.vm09.stdout:3/896: rename d5/d16/d31/cc2 to d5/d16/d46/c116 0
2026-03-09T17:30:42.930 INFO:tasks.workunit.client.1.vm09.stdout:9/988: creat d5/d2e/d8b/de0/d125/d13b/d144/f14e x:0 0 0
2026-03-09T17:30:42.930 INFO:tasks.workunit.client.1.vm09.stdout:3/897: mkdir d5/d9/d90/db0/dbb/d117 0
2026-03-09T17:30:42.935 INFO:tasks.workunit.client.1.vm09.stdout:0/968: dread d6/d1d/d24/d32/f45 [4194304,4194304] 0
2026-03-09T17:30:42.943 INFO:tasks.workunit.client.1.vm09.stdout:9/989: sync
2026-03-09T17:30:42.945 INFO:tasks.workunit.client.1.vm09.stdout:3/898: truncate d5/d9/f4e 7300438 0
2026-03-09T17:30:42.951 INFO:tasks.workunit.client.1.vm09.stdout:0/969: dread d6/d1d/f70 [0,4194304] 0
2026-03-09T17:30:42.951 INFO:tasks.workunit.client.1.vm09.stdout:3/899: rename d5/d16/d31/d3d/feb to d5/d9/d90/db0/dbb/f118 0
2026-03-09T17:30:42.953 INFO:tasks.workunit.client.1.vm09.stdout:2/899: truncate d13/d15/d21/d120/f121 446296 0
2026-03-09T17:30:42.955 INFO:tasks.workunit.client.1.vm09.stdout:2/900: creat d13/d15/d34/d45/d84/dcb/f122 x:0 0 0
2026-03-09T17:30:42.956 INFO:tasks.workunit.client.1.vm09.stdout:0/970: mknod d6/d1d/d24/d32/d59/d9c/dac/c138 0
2026-03-09T17:30:42.958 INFO:tasks.workunit.client.1.vm09.stdout:2/901: rename d13/da4/ca8 to d13/d4d/daa/dff/c123 0
2026-03-09T17:30:42.958 INFO:tasks.workunit.client.1.vm09.stdout:0/971: chown d6/d1d/d24/d5e/f67 24 1
2026-03-09T17:30:42.962 INFO:tasks.workunit.client.1.vm09.stdout:2/902: unlink d13/dc8/fee 0
2026-03-09T17:30:42.970 INFO:tasks.workunit.client.1.vm09.stdout:2/903: write d13/d15/d3b/ddf/f97 [84267,59483] 0
2026-03-09T17:30:42.986 INFO:tasks.workunit.client.1.vm09.stdout:4/930: rmdir d11/d1e 39
2026-03-09T17:30:42.986 INFO:tasks.workunit.client.1.vm09.stdout:8/975: write d1/d14/d2a/f2b [606708,99657] 0
2026-03-09T17:30:42.987 INFO:tasks.workunit.client.1.vm09.stdout:1/972: dwrite d9/f6c [0,4194304] 0
2026-03-09T17:30:42.999 INFO:tasks.workunit.client.1.vm09.stdout:0/972: link d6/l9d d6/d1d/d24/d32/d59/d9c/dac/dcc/d111/l139 0
2026-03-09T17:30:42.999 INFO:tasks.workunit.client.1.vm09.stdout:0/973: chown d6/f9 1748 1
2026-03-09T17:30:42.999 INFO:tasks.workunit.client.1.vm09.stdout:4/931: fsync d11/d1e/d45/d60/f64 0
2026-03-09T17:30:43.002 INFO:tasks.workunit.client.1.vm09.stdout:1/973: mknod d9/d123/c133 0
2026-03-09T17:30:43.008 INFO:tasks.workunit.client.1.vm09.stdout:8/976: creat d1/da/d23/d6c/ddd/dcb/d97/f12d x:0 0 0
2026-03-09T17:30:43.011 INFO:tasks.workunit.client.1.vm09.stdout:1/974: creat d9/de5/f134 x:0 0 0
2026-03-09T17:30:43.012 INFO:tasks.workunit.client.1.vm09.stdout:1/975: truncate d9/d5a/f126 4870685 0
2026-03-09T17:30:43.013 INFO:tasks.workunit.client.1.vm09.stdout:1/976: dread - d9/dc/dd/d9f/f130 zero size
2026-03-09T17:30:43.014 INFO:tasks.workunit.client.1.vm09.stdout:4/932: fdatasync d11/d1e/d45/d60/d71/db7/d89/d8b/f38 0
2026-03-09T17:30:43.014 INFO:tasks.workunit.client.1.vm09.stdout:4/933: fsync d11/d1e/d31/f3a 0
2026-03-09T17:30:43.015 INFO:tasks.workunit.client.1.vm09.stdout:8/977: rmdir d1/da/d23/d114 39
2026-03-09T17:30:43.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.017+0000 7f1606816700 1 -- 192.168.123.106:0/1949152027 --> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1600061190 con 0x7f15ec077620
2026-03-09T17:30:43.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.019+0000 7f15f6ffd700 1 -- 192.168.123.106:0/1949152027 <== mgr.14712 v2:192.168.123.106:6800/1431796821 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f1600061190 con 0x7f15ec077620
2026-03-09T17:30:43.023 INFO:tasks.workunit.client.1.vm09.stdout:1/977: mknod d9/dc/dd/d40/d21/d35/db9/c135 0
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.022+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 msgr2=0x7f15ec079ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 0x7f15ec079ad0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f15f000bd10 tx=0x7f15f000b480 comp rx=0 tx=0).stop
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 msgr2=0x7f16001a49b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 0x7f16001a49b0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f15fc000c00 tx=0x7f15fc004930 comp rx=0 tx=0).stop
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 shutdown_connections
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f15ec077620 0x7f15ec079ad0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1600071a60 0x7f16001a49b0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 --2- 192.168.123.106:0/1949152027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1600072440 0x7f16001a4ef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:43.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.023+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 >> 192.168.123.106:0/1949152027 conn(0x7f160006d1a0 msgr2=0x7f160010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:30:43.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.029+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 shutdown_connections
2026-03-09T17:30:43.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.029+0000 7f15f4ef9700 1 -- 192.168.123.106:0/1949152027 wait complete.
2026-03-09T17:30:43.032 INFO:tasks.workunit.client.1.vm09.stdout:0/974: getdents d6/d1d/d24/d5e/dc2/d11c/d128 0
2026-03-09T17:30:43.033 INFO:tasks.workunit.client.1.vm09.stdout:4/934: creat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f118 x:0 0 0
2026-03-09T17:30:43.034 INFO:tasks.workunit.client.1.vm09.stdout:8/978: chown d1/da/d23/d71/cef 117953 1
2026-03-09T17:30:43.034 INFO:tasks.workunit.client.1.vm09.stdout:6/918: write d3/d21/d76/d88/fc1 [42345,92917] 0
2026-03-09T17:30:43.045 INFO:tasks.workunit.client.1.vm09.stdout:4/935: truncate d11/d1e/d29/fb1 917102 0
2026-03-09T17:30:43.045 INFO:tasks.workunit.client.1.vm09.stdout:5/984: write d0/d9/d16/fe1 [664492,2452] 0
2026-03-09T17:30:43.048 INFO:teuthology.orchestra.run.vm06.stdout:true
2026-03-09T17:30:43.052 INFO:tasks.workunit.client.1.vm09.stdout:4/936: sync
2026-03-09T17:30:43.078 INFO:tasks.workunit.client.1.vm09.stdout:5/985: truncate d0/d52/d20/f7c 1198298 0
2026-03-09T17:30:43.080 INFO:tasks.workunit.client.1.vm09.stdout:1/978: creat d9/d9e/dc0/d37/d3f/d42/f136 x:0 0 0
2026-03-09T17:30:43.080 INFO:tasks.workunit.client.1.vm09.stdout:6/919: symlink d3/d21/d76/d5c/d61/d6a/df2/l12c 0
2026-03-09T17:30:43.081 INFO:tasks.workunit.client.1.vm09.stdout:1/979: write d9/d38/d61/dff/f119 [185714,2285] 0
2026-03-09T17:30:43.081 INFO:tasks.workunit.client.1.vm09.stdout:0/975: rmdir d6/d64/d97/dc9/dfc/d11d 0
2026-03-09T17:30:43.084 INFO:tasks.workunit.client.1.vm09.stdout:8/979: getdents d1/da/d23/d6c/ddd/dcb/d97/dc5 0
2026-03-09T17:30:43.098 INFO:tasks.workunit.client.1.vm09.stdout:8/980: creat d1/d14/d2a/d42/d43/d44/df5/f12e x:0 0 0
2026-03-09T17:30:43.098 INFO:tasks.workunit.client.1.vm09.stdout:0/976: stat d6/d1d/d46/l66 0
2026-03-09T17:30:43.098 INFO:tasks.workunit.client.1.vm09.stdout:5/986: dread d0/d9/d8b/fe7 [0,4194304] 0
2026-03-09T17:30:43.109 INFO:tasks.workunit.client.1.vm09.stdout:3/900: write d5/d9/f1e [5154590,65316] 0
2026-03-09T17:30:43.111 INFO:tasks.workunit.client.1.vm09.stdout:9/990: dwrite d5/de/d4e/dca/de7/d93/f74 [0,4194304] 0
2026-03-09T17:30:43.111 INFO:tasks.workunit.client.1.vm09.stdout:8/981: fdatasync d1/da/d23/d114/f119 0
2026-03-09T17:30:43.119 INFO:tasks.workunit.client.1.vm09.stdout:5/987: unlink d0/d2/d76/d87/d95/d9b/dc0/dce/ff8 0
2026-03-09T17:30:43.119 INFO:tasks.workunit.client.1.vm09.stdout:3/901: fsync d5/d9/d30/f61 0
2026-03-09T17:30:43.126 INFO:tasks.workunit.client.1.vm09.stdout:5/988: sync
2026-03-09T17:30:43.130 INFO:tasks.workunit.client.1.vm09.stdout:0/977: rename d6/d1d/d24/d5e/dc2/df7/l12a to d6/d64/d97/dc9/l13a 0
2026-03-09T17:30:43.135 INFO:tasks.workunit.client.1.vm09.stdout:1/980: link d9/dc/l18 d9/d9e/dc0/d37/da4/l137 0
2026-03-09T17:30:43.135 INFO:tasks.workunit.client.1.vm09.stdout:6/920: dread d3/d48/f6c [0,4194304] 0
2026-03-09T17:30:43.135 INFO:tasks.workunit.client.1.vm09.stdout:5/989: fsync d0/d46/d11f/d13c/d21/d26/d13e/d68/d79/fc7 0
2026-03-09T17:30:43.137 INFO:tasks.workunit.client.1.vm09.stdout:9/991: symlink d5/d2e/d8b/d116/l14f 0
2026-03-09T17:30:43.138 INFO:tasks.workunit.client.1.vm09.stdout:5/990: write d0/d46/d4b/feb [4106615,47308] 0
2026-03-09T17:30:43.139 INFO:tasks.workunit.client.1.vm09.stdout:3/902: symlink d5/d9/d90/db0/dbb/l119 0
2026-03-09T17:30:43.141 INFO:tasks.workunit.client.1.vm09.stdout:1/981: dread - d9/d9e/fdf zero size
2026-03-09T17:30:43.146 INFO:tasks.workunit.client.1.vm09.stdout:0/978: symlink d6/d1d/d24/d5e/dc2/l13b 0
2026-03-09T17:30:43.148 INFO:tasks.workunit.client.1.vm09.stdout:1/982: write d9/d9e/dc0/d37/d3f/d42/d55/f69 [3491908,72728] 0
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 -- 192.168.123.106:0/1695018365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 msgr2=0x7f34c00a46e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 --2- 192.168.123.106:0/1695018365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c00a46e0 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f34b8009b00 tx=0x7f34b8009e10 comp rx=0 tx=0).stop
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 -- 192.168.123.106:0/1695018365 shutdown_connections
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 --2- 192.168.123.106:0/1695018365 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c00a5880 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 --2- 192.168.123.106:0/1695018365 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c00a46e0 unknown :-1 s=CLOSED pgs=342 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 -- 192.168.123.106:0/1695018365 >> 192.168.123.106:0/1695018365 conn(0x7f34c009f7a0 msgr2=0x7f34c00a1bf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.145+0000 7f34c7fff700 1 -- 192.168.123.106:0/1695018365 shutdown_connections
2026-03-09T17:30:43.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 -- 192.168.123.106:0/1695018365 wait complete.
2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 Processor -- start 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 -- start start 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c0142290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34c0142df0 con 0x7f34c00a42d0 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c7fff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34c0142f30 con 0x7f34c00a5410 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.146+0000 7f34c6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c0142290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:53768/0 (socket says 192.168.123.106:53768) 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 -- 192.168.123.106:0/3222653309 learned_addr learned my addr 192.168.123.106:0/3222653309 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 -- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 msgr2=0x7f34c0142290 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c0142290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 -- 192.168.123.106:0/3222653309 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34b80097e0 con 0x7f34c00a5410 2026-03-09T17:30:43.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34c67fc700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f34bc00d8d0 tx=0x7f34bc00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.149 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.147+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34bc009940 con 0x7f34c00a5410 2026-03-09T17:30:43.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.148+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f34bc010460 con 0x7f34c00a5410 2026-03-09T17:30:43.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.148+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34bc00f5d0 con 0x7f34c00a5410 2026-03-09T17:30:43.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.148+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34c01479e0 con 0x7f34c00a5410 2026-03-09T17:30:43.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.148+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f34c0147eb0 con 0x7f34c00a5410 2026-03-09T17:30:43.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.149+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f34c0004eb0 con 0x7f34c00a5410 2026-03-09T17:30:43.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.150+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f34bc009aa0 con 0x7f34c00a5410 2026-03-09T17:30:43.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.151+0000 7f34affff700 1 --2- 
192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 0x7f34b0079900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.151+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f34bc099c00 con 0x7f34c00a5410 2026-03-09T17:30:43.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.151+0000 7f34c6ffd700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 0x7f34b0079900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.154 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.152+0000 7f34c6ffd700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 0x7f34b0079900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f34b800b5c0 tx=0x7f34b8005fb0 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.154 INFO:tasks.workunit.client.1.vm09.stdout:3/903: dread d5/d9/d30/d65/d59/fd9 [0,4194304] 0 2026-03-09T17:30:43.154 INFO:tasks.workunit.client.1.vm09.stdout:5/991: dread - d0/d46/d11f/d13c/d21/d26/d13e/d68/d6d/ffe zero size 2026-03-09T17:30:43.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.154+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f34bc0620c0 con 0x7f34c00a5410 2026-03-09T17:30:43.160 INFO:tasks.workunit.client.1.vm09.stdout:2/904: write d13/d15/d34/f5b [5561250,79250] 0 
2026-03-09T17:30:43.161 INFO:tasks.workunit.client.1.vm09.stdout:6/921: mknod d3/d21/d25/d26/d86/dbc/dfa/c12d 0 2026-03-09T17:30:43.167 INFO:tasks.workunit.client.1.vm09.stdout:1/983: rename d9/d5a to d9/dc/dd/d40/d21/d35/d138 0 2026-03-09T17:30:43.172 INFO:tasks.workunit.client.1.vm09.stdout:5/992: creat d0/d9/d74/d75/dbd/f141 x:0 0 0 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:0/979: unlink d6/d1d/d46/le0 0 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:3/904: mkdir d5/d16/d31/d3d/db3/df3/d11a 0 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:3/905: readlink d5/d9/d30/dc4/lf2 0 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:2/905: dread - d13/d15/d3b/ddf/d90/f9f zero size 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:3/906: chown d5/d16/d31/d37/d58 368 1 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:2/906: read - d13/d15/d34/d45/d84/dcb/f114 zero size 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:5/993: dread - d0/d2/ff1 zero size 2026-03-09T17:30:43.179 INFO:tasks.workunit.client.1.vm09.stdout:0/980: symlink d6/d1d/d24/d5e/db2/l13c 0 2026-03-09T17:30:43.183 INFO:tasks.workunit.client.1.vm09.stdout:2/907: dwrite d13/d15/d3b/f105 [0,4194304] 0 2026-03-09T17:30:43.192 INFO:tasks.workunit.client.1.vm09.stdout:1/984: creat d9/dc/dd/d9f/d107/f139 x:0 0 0 2026-03-09T17:30:43.196 INFO:tasks.workunit.client.1.vm09.stdout:2/908: creat d13/d15/d36/d72/d94/da7/db0/dd6/f124 x:0 0 0 2026-03-09T17:30:43.197 INFO:tasks.workunit.client.1.vm09.stdout:5/994: getdents d0/d9/d74/d75/d9f/db6/d111 0 2026-03-09T17:30:43.202 INFO:tasks.workunit.client.1.vm09.stdout:0/981: dread d6/d93/fb7 [0,4194304] 0 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:0/982: chown d6/d1d/d24/l120 2 1 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:1/985: unlink d9/dc/dd/d9f/de4/dba/fb4 0 2026-03-09T17:30:43.211 
INFO:tasks.workunit.client.1.vm09.stdout:0/983: chown d6/d1d/d24/d32/d59/d9c/fa2 0 1 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:1/986: chown d9/dc/dd/d40/d21/d35/db9/lc6 98826 1 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:5/995: mkdir d0/d46/d4b/db7/d142 0 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:0/984: dread d6/d1d/d24/f5d [4194304,4194304] 0 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:2/909: unlink d13/d15/d34/d37/d66/le0 0 2026-03-09T17:30:43.211 INFO:tasks.workunit.client.1.vm09.stdout:2/910: chown d13/d15/d36/d72/dc3/fc5 0 1 2026-03-09T17:30:43.214 INFO:tasks.workunit.client.1.vm09.stdout:1/987: creat d9/d9e/dc0/d37/d3f/d42/d55/f13a x:0 0 0 2026-03-09T17:30:43.215 INFO:tasks.workunit.client.1.vm09.stdout:2/911: rename d13/d15/d36/d72/d94/da7/c6c to d13/d4d/daa/c125 0 2026-03-09T17:30:43.216 INFO:tasks.workunit.client.1.vm09.stdout:2/912: chown d13/d15/d34/d45/d84/db5/dcf/lf8 336 1 2026-03-09T17:30:43.218 INFO:tasks.workunit.client.1.vm09.stdout:0/985: mknod d6/d1d/d46/c13d 0 2026-03-09T17:30:43.219 INFO:tasks.workunit.client.1.vm09.stdout:0/986: write d6/d1d/d24/d5e/d6c/ff6 [1084567,91299] 0 2026-03-09T17:30:43.221 INFO:tasks.workunit.client.1.vm09.stdout:2/913: dwrite d13/d15/d34/f5b [4194304,4194304] 0 2026-03-09T17:30:43.236 INFO:tasks.workunit.client.1.vm09.stdout:0/987: mknod d6/d1d/d24/d32/d59/d9c/dac/dd1/c13e 0 2026-03-09T17:30:43.237 INFO:tasks.workunit.client.1.vm09.stdout:0/988: dread - d6/d64/d97/dc9/f122 zero size 2026-03-09T17:30:43.239 INFO:tasks.workunit.client.1.vm09.stdout:2/914: creat d13/d15/d34/d45/d84/db5/f126 x:0 0 0 2026-03-09T17:30:43.242 INFO:tasks.workunit.client.1.vm09.stdout:0/989: rename d6/d1d/d24/d5e/d86/ce4 to d6/d116/d117/c13f 0 2026-03-09T17:30:43.242 INFO:tasks.workunit.client.1.vm09.stdout:0/990: readlink d6/d1d/d24/d32/l34 0 2026-03-09T17:30:43.247 INFO:tasks.workunit.client.1.vm09.stdout:2/915: write d13/d15/f20 [2149996,79149] 0 
2026-03-09T17:30:43.248 INFO:tasks.workunit.client.1.vm09.stdout:0/991: creat d6/d1d/d24/d137/f140 x:0 0 0 2026-03-09T17:30:43.249 INFO:tasks.workunit.client.1.vm09.stdout:2/916: creat d13/d15/d36/d72/d94/da7/db0/dd6/f127 x:0 0 0 2026-03-09T17:30:43.250 INFO:tasks.workunit.client.1.vm09.stdout:0/992: symlink d6/d116/l141 0 2026-03-09T17:30:43.250 INFO:tasks.workunit.client.1.vm09.stdout:2/917: truncate d13/d15/d34/d45/d84/db5/f126 293099 0 2026-03-09T17:30:43.251 INFO:tasks.workunit.client.1.vm09.stdout:2/918: fdatasync d13/da4/fea 0 2026-03-09T17:30:43.254 INFO:tasks.workunit.client.1.vm09.stdout:2/919: unlink d13/d4d/daa/ce4 0 2026-03-09T17:30:43.257 INFO:tasks.workunit.client.1.vm09.stdout:2/920: mknod d13/d15/d34/d37/c128 0 2026-03-09T17:30:43.266 INFO:tasks.workunit.client.1.vm09.stdout:2/921: link d13/d15/d3b/ddf/fcd d13/d15/d34/d45/d84/dcb/f129 0 2026-03-09T17:30:43.266 INFO:tasks.workunit.client.1.vm09.stdout:2/922: dread - d13/d15/d34/d45/d84/dcb/f122 zero size 2026-03-09T17:30:43.267 INFO:tasks.workunit.client.1.vm09.stdout:2/923: chown d13/d15/d3b/ddf/d90/f11f 42115 1 2026-03-09T17:30:43.269 INFO:tasks.workunit.client.1.vm09.stdout:2/924: mkdir d13/d15/d21/d88/db8/dd1/de5/d12a 0 2026-03-09T17:30:43.272 INFO:tasks.workunit.client.1.vm09.stdout:2/925: rename d13/d4d/f10b to d13/d15/d3b/ddf/d90/f12b 0 2026-03-09T17:30:43.275 INFO:tasks.workunit.client.1.vm09.stdout:2/926: read d13/f79 [35761,115252] 0 2026-03-09T17:30:43.277 INFO:tasks.workunit.client.1.vm09.stdout:2/927: mknod d13/d4d/daa/dff/c12c 0 2026-03-09T17:30:43.279 INFO:tasks.workunit.client.1.vm09.stdout:2/928: fsync d13/d15/d34/f5e 0 2026-03-09T17:30:43.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.296+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 --> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f34c00a9d60 con 0x7f34b0077450 2026-03-09T17:30:43.300 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.297+0000 7f34affff700 1 -- 192.168.123.106:0/3222653309 <== mgr.14712 v2:192.168.123.106:6800/1431796821 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f34c00a9d60 con 0x7f34b0077450 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/929: write d13/d4d/daa/ff0 [683938,25451] 0 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/930: dread d13/d15/f2b [0,4194304] 0 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/931: unlink d13/d15/d34/f44 0 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/932: unlink d13/d15/d3b/d43/f10a 0 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/933: truncate d13/da4/fea 429175 0 2026-03-09T17:30:43.300 INFO:tasks.workunit.client.1.vm09.stdout:2/934: dwrite d13/d15/d3b/ddf/d85/f116 [0,4194304] 0 2026-03-09T17:30:43.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.304+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 msgr2=0x7f34b0079900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.304+0000 7f34c7fff700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 0x7f34b0079900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f34b800b5c0 tx=0x7f34b8005fb0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.304+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 msgr2=0x7f34c01427d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.304+0000 7f34c7fff700 1 --2- 
192.168.123.106:0/3222653309 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f34bc00d8d0 tx=0x7f34bc00dc90 comp rx=0 tx=0).stop 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 shutdown_connections 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f34b0077450 0x7f34b0079900 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f34c00a42d0 0x7f34c0142290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 --2- 192.168.123.106:0/3222653309 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f34c00a5410 0x7f34c01427d0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 >> 192.168.123.106:0/3222653309 conn(0x7f34c009f7a0 msgr2=0x7f34c00a8640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 shutdown_connections 2026-03-09T17:30:43.307 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.305+0000 7f34c7fff700 1 -- 192.168.123.106:0/3222653309 wait complete. 
2026-03-09T17:30:43.312 INFO:tasks.workunit.client.1.vm09.stdout:2/935: unlink d13/d15/d34/d45/f108 0 2026-03-09T17:30:43.318 INFO:tasks.workunit.client.1.vm09.stdout:2/936: creat d13/d15/d36/d72/d94/da7/f12d x:0 0 0 2026-03-09T17:30:43.321 INFO:tasks.workunit.client.1.vm09.stdout:2/937: write d13/d15/d34/d45/d84/dcb/f10d [169658,126559] 0 2026-03-09T17:30:43.328 INFO:tasks.workunit.client.1.vm09.stdout:2/938: creat d13/d15/d21/d88/db8/dd1/f12e x:0 0 0 2026-03-09T17:30:43.336 INFO:tasks.workunit.client.1.vm09.stdout:2/939: chown d13/d15/d36/d72/d94 973 1 2026-03-09T17:30:43.368 INFO:tasks.workunit.client.1.vm09.stdout:2/940: sync 2026-03-09T17:30:43.371 INFO:tasks.workunit.client.1.vm09.stdout:2/941: read d13/d15/d34/d45/d84/dcb/db1/ff4 [2125294,74175] 0 2026-03-09T17:30:43.385 INFO:tasks.workunit.client.1.vm09.stdout:2/942: rename d13/d15/d3b/ddf/d90/d111 to d13/d15/d36/d72/d94/da7/db0/dd6/d12f 0 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.392+0000 7f91d0244700 1 -- 192.168.123.106:0/195052852 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8071980 msgr2=0x7f91c8071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.392+0000 7f91d0244700 1 --2- 192.168.123.106:0/195052852 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8071980 0x7f91c8071d90 secure :-1 s=READY pgs=343 cs=0 l=1 rev1=1 crypto rx=0x7f91c4007780 tx=0x7f91c400c050 comp rx=0 tx=0).stop 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 -- 192.168.123.106:0/195052852 shutdown_connections 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 --2- 192.168.123.106:0/195052852 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8072360 0x7f91c80770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 --2- 192.168.123.106:0/195052852 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8071980 0x7f91c8071d90 unknown :-1 s=CLOSED pgs=343 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.395 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 -- 192.168.123.106:0/195052852 >> 192.168.123.106:0/195052852 conn(0x7f91c806d1a0 msgr2=0x7f91c806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.395 INFO:tasks.workunit.client.1.vm09.stdout:2/943: symlink d13/d15/d21/d88/db8/dd1/l130 0 2026-03-09T17:30:43.396 INFO:tasks.workunit.client.1.vm09.stdout:4/937: write d11/d1e/d31/f74 [1561597,20595] 0 2026-03-09T17:30:43.397 INFO:tasks.workunit.client.1.vm09.stdout:4/938: write d11/d1e/d31/db6/ffe [225889,78635] 0 2026-03-09T17:30:43.398 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 -- 192.168.123.106:0/195052852 shutdown_connections 2026-03-09T17:30:43.398 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.393+0000 7f91d0244700 1 -- 192.168.123.106:0/195052852 wait complete. 2026-03-09T17:30:43.398 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 Processor -- start 2026-03-09T17:30:43.398 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 -- start start 2026-03-09T17:30:43.399 INFO:tasks.workunit.client.1.vm09.stdout:4/939: chown d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f118 24168 1 2026-03-09T17:30:43.400 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:43 vm06.local ceph-mon[57307]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:43.400 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:43 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:43.400 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:43 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:43.400 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:43 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:30:43.400 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:43 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:43.400 INFO:tasks.workunit.client.1.vm09.stdout:2/944: dread d13/d15/d3b/f105 [0,4194304] 0 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8072360 0x7f91c8131340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 -- --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91c8131d80 con 0x7f91c8131880 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.394+0000 7f91d0244700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91c8131ef0 con 0x7f91c8072360 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56514/0 (socket says 192.168.123.106:56514) 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 -- 192.168.123.106:0/1174433475 learned_addr learned my addr 192.168.123.106:0/1174433475 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 -- 192.168.123.106:0/1174433475 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8072360 msgr2=0x7f91c8131340 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:30:43.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8072360 0x7f91c8131340 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.402 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.395+0000 7f91cd7df700 1 -- 192.168.123.106:0/1174433475 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91c4007430 con 0x7f91c8131880 2026-03-09T17:30:43.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.401+0000 7f91cd7df700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f91c000bf40 tx=0x7f91c000bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.402+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91c000cb40 con 0x7f91c8131880 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.402+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f91c000cca0 con 0x7f91c8131880 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.402+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91c0012720 con 0x7f91c8131880 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.402+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91c807faa0 con 0x7f91c8131880 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.402+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91c807ffc0 con 
0x7f91c8131880 2026-03-09T17:30:43.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.403+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f91c812b500 con 0x7f91c8131880 2026-03-09T17:30:43.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.408+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f91c0012880 con 0x7f91c8131880 2026-03-09T17:30:43.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.409+0000 7f91beffd700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 0x7f91b4079900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.409+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f91c0099a40 con 0x7f91c8131880 2026-03-09T17:30:43.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.409+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f91c00c8ce0 con 0x7f91c8131880 2026-03-09T17:30:43.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.409+0000 7f91cdfe0700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 0x7f91b4079900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.415 INFO:tasks.workunit.client.1.vm09.stdout:4/940: rmdir 
d11/d1e/d31/d110 0 2026-03-09T17:30:43.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.413+0000 7f91cdfe0700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 0x7f91b4079900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f91c4005b40 tx=0x7f91c4005ab0 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.434 INFO:tasks.workunit.client.1.vm09.stdout:4/941: dread d11/d1e/d45/d60/d71/f76 [0,4194304] 0 2026-03-09T17:30:43.461 INFO:tasks.workunit.client.1.vm09.stdout:8/982: dwrite d1/d14/d2a/d42/f46 [0,4194304] 0 2026-03-09T17:30:43.498 INFO:tasks.workunit.client.1.vm09.stdout:9/992: truncate d5/de/d29/d33/db8/dfb/f118 362616 0 2026-03-09T17:30:43.499 INFO:tasks.workunit.client.1.vm09.stdout:9/993: mkdir d5/de/d29/d33/db8/dfb/d150 0 2026-03-09T17:30:43.500 INFO:tasks.workunit.client.1.vm09.stdout:9/994: write d5/de/d4e/dca/d84/d97/f112 [378889,83653] 0 2026-03-09T17:30:43.505 INFO:tasks.workunit.client.1.vm09.stdout:9/995: getdents d5/de/d88 0 2026-03-09T17:30:43.506 INFO:tasks.workunit.client.1.vm09.stdout:9/996: write d5/d2e/d8b/d116/f135 [529890,90196] 0 2026-03-09T17:30:43.511 INFO:tasks.workunit.client.1.vm09.stdout:9/997: truncate d5/de/d4e/d128/f148 717073 0 2026-03-09T17:30:43.511 INFO:tasks.workunit.client.1.vm09.stdout:6/922: dwrite d3/d7/f11 [0,4194304] 0 2026-03-09T17:30:43.512 INFO:tasks.workunit.client.1.vm09.stdout:9/998: chown d5/de/d88/f110 610 1 2026-03-09T17:30:43.514 INFO:tasks.workunit.client.1.vm09.stdout:6/923: write d3/d21/d76/d3f/fb8 [4509288,49658] 0 2026-03-09T17:30:43.520 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:43 vm09.local ceph-mon[62061]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:30:43.520 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:43 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:43.520 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:43 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:30:43.520 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:43 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:30:43.520 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:43 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:43.529 INFO:tasks.workunit.client.1.vm09.stdout:6/924: rename d3/d21/d25/d26/d86/dbe/ff5 to d3/d21/d25/d26/d86/dbc/dfa/f12e 0 2026-03-09T17:30:43.532 INFO:tasks.workunit.client.1.vm09.stdout:6/925: mkdir d3/d7/d59/d5a/d12f 0 2026-03-09T17:30:43.536 INFO:tasks.workunit.client.1.vm09.stdout:6/926: write d3/d21/d76/d5c/f6d [4390771,59837] 0 2026-03-09T17:30:43.536 INFO:tasks.workunit.client.1.vm09.stdout:6/927: chown d3/d21/d76/d5c/f78 3874065 1 2026-03-09T17:30:43.538 INFO:tasks.workunit.client.1.vm09.stdout:6/928: write d3/d21/d25/d96/fec [75239,111082] 0 2026-03-09T17:30:43.550 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.546+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 --> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] -- mgr_command(tid 0: {"prefix": "orch ps", 
"target": ["mon-mgr", ""]}) v1 -- 0x7f91c802d0a0 con 0x7f91b4077450 2026-03-09T17:30:43.551 INFO:tasks.workunit.client.1.vm09.stdout:3/907: write d5/d9/d30/d65/d59/f87 [881409,85531] 0 2026-03-09T17:30:43.558 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (4m) 7s ago 5m 25.5M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (5m) 7s ago 5m 8409k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (5m) 8s ago 5m 10.7M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (5m) 7s ago 5m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (5m) 8s ago 5m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (4m) 7s ago 5m 90.8M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (3m) 7s ago 3m 14.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (3m) 7s ago 3m 230M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (3m) 8s ago 3m 134M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (3m) 8s ago 3m 16.4M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running 
(22s) 7s ago 6m 595M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (47s) 8s ago 5m 503M - 19.2.3-678-ge911bdeb 654f31e6858e 6994beea5467 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (6m) 7s ago 6m 54.1M 2048M 18.2.0 dc2bc1663786 e0e1a20b1577 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (5m) 8s ago 5m 44.5M 2048M 18.2.0 dc2bc1663786 4c30d1217de3 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (5m) 7s ago 5m 14.6M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (5m) 8s ago 5m 15.7M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (4m) 7s ago 4m 355M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (4m) 7s ago 4m 380M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (4m) 7s ago 4m 322M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (4m) 8s ago 4m 421M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (4m) 8s ago 4m 385M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (3m) 8s ago 3m 381M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:30:43.561 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 starting - - - - 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.554+0000 7f91beffd700 1 -- 192.168.123.106:0/1174433475 <== mgr.14712 
v2:192.168.123.106:6800/1431796821 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f91c802d0a0 con 0x7f91b4077450 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 msgr2=0x7f91b4079900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 0x7f91b4079900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f91c4005b40 tx=0x7f91c4005ab0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 msgr2=0x7f91c807f500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f91c000bf40 tx=0x7f91c000bf70 comp rx=0 tx=0).stop 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 shutdown_connections 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91b4077450 0x7f91b4079900 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.562 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f91c8072360 0x7f91c8131340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 --2- 192.168.123.106:0/1174433475 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f91c8131880 0x7f91c807f500 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.557+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 >> 192.168.123.106:0/1174433475 conn(0x7f91c806d1a0 msgr2=0x7f91c80764d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.558+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 shutdown_connections 2026-03-09T17:30:43.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.558+0000 7f91d0244700 1 -- 192.168.123.106:0/1174433475 wait complete. 
2026-03-09T17:30:43.569 INFO:tasks.workunit.client.1.vm09.stdout:3/908: dread d5/f22 [4194304,4194304] 0 2026-03-09T17:30:43.569 INFO:tasks.workunit.client.1.vm09.stdout:6/929: dread d3/d21/d76/d3f/f42 [0,4194304] 0 2026-03-09T17:30:43.570 INFO:tasks.workunit.client.1.vm09.stdout:6/930: chown d3/d48/d10f/f129 1 1 2026-03-09T17:30:43.571 INFO:tasks.workunit.client.1.vm09.stdout:3/909: creat d5/f11b x:0 0 0 2026-03-09T17:30:43.571 INFO:tasks.workunit.client.1.vm09.stdout:3/910: read d5/d9c/de7/f99 [2458991,92518] 0 2026-03-09T17:30:43.580 INFO:tasks.workunit.client.1.vm09.stdout:6/931: dwrite d3/d21/d76/d5c/d7e/dc5/d9a/fc0 [0,4194304] 0 2026-03-09T17:30:43.586 INFO:tasks.workunit.client.1.vm09.stdout:3/911: rename d5/d16/l75 to d5/d9/d90/db0/dbb/l11c 0 2026-03-09T17:30:43.587 INFO:tasks.workunit.client.1.vm09.stdout:6/932: chown d3/d21/d25/d26/d86/dbc/cc2 4557 1 2026-03-09T17:30:43.589 INFO:tasks.workunit.client.1.vm09.stdout:6/933: fsync d3/d21/d25/d26/d86/f12b 0 2026-03-09T17:30:43.644 INFO:tasks.workunit.client.1.vm09.stdout:3/912: fdatasync d5/d9/da9/fc9 0 2026-03-09T17:30:43.644 INFO:tasks.workunit.client.1.vm09.stdout:6/934: symlink d3/d21/d76/d5c/d61/d6a/df2/l130 0 2026-03-09T17:30:43.644 INFO:tasks.workunit.client.1.vm09.stdout:3/913: fsync d5/d9/d30/d65/f1d 0 2026-03-09T17:30:43.649 INFO:tasks.workunit.client.1.vm09.stdout:3/914: write d5/d9/d90/db0/dbb/f118 [725399,97781] 0 2026-03-09T17:30:43.652 INFO:tasks.workunit.client.1.vm09.stdout:3/915: creat d5/d9/d30/dc4/f11d x:0 0 0 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 -- 192.168.123.106:0/265006183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000a3e40 msgr2=0x7f92000a4290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 --2- 192.168.123.106:0/265006183 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000a3e40 0x7f92000a4290 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto rx=0x7f92040669f0 tx=0x7f92040671e0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 -- 192.168.123.106:0/265006183 shutdown_connections 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 --2- 192.168.123.106:0/265006183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000a3e40 0x7f92000a4290 unknown :-1 s=CLOSED pgs=345 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 --2- 192.168.123.106:0/265006183 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 0x7f92000a5c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.651+0000 7f920b6fb700 1 -- 192.168.123.106:0/265006183 >> 192.168.123.106:0/265006183 conn(0x7f920009f7b0 msgr2=0x7f92000a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.654+0000 7f920b6fb700 1 -- 192.168.123.106:0/265006183 shutdown_connections 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.654+0000 7f920b6fb700 1 -- 192.168.123.106:0/265006183 wait complete. 
2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.654+0000 7f920b6fb700 1 Processor -- start 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920b6fb700 1 -- start start 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920b6fb700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 0x7f92000cfcc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920b6fb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 0x7f9200010f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920b6fb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92000d0700 con 0x7f92000d0200 2026-03-09T17:30:43.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920b6fb700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92000d0870 con 0x7f92000a5800 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f9209ef8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 0x7f9200010f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f9209ef8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 0x7f9200010f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:56526/0 (socket says 192.168.123.106:56526) 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f9209ef8700 1 -- 192.168.123.106:0/2099664328 learned_addr learned my addr 192.168.123.106:0/2099664328 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.655+0000 7f920a6f9700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 0x7f92000cfcc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920a6f9700 1 -- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 msgr2=0x7f9200010f40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920a6f9700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 0x7f9200010f40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920a6f9700 1 -- 192.168.123.106:0/2099664328 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9204067050 con 0x7f92000a5800 2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920a6f9700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 0x7f92000cfcc0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f91f800b6d0 tx=0x7f91f800b9e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:30:43.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91f8011630 con 0x7f92000a5800 2026-03-09T17:30:43.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92000114e0 con 0x7f92000a5800 2026-03-09T17:30:43.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.656+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9200011a00 con 0x7f92000a5800 2026-03-09T17:30:43.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.657+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f91f8011c70 con 0x7f92000a5800 2026-03-09T17:30:43.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.657+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91f8010e80 con 0x7f92000a5800 2026-03-09T17:30:43.660 INFO:tasks.workunit.client.1.vm09.stdout:3/916: unlink d5/d9/d30/d65/f3e 0 2026-03-09T17:30:43.660 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.658+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f91e8005320 con 0x7f92000a5800 2026-03-09T17:30:43.662 INFO:tasks.workunit.client.1.vm09.stdout:3/917: mknod d5/d9/d30/d65/d59/d108/c11e 0 2026-03-09T17:30:43.662 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.660+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 
v2:192.168.123.109:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f91f8010380 con 0x7f92000a5800 2026-03-09T17:30:43.662 INFO:tasks.workunit.client.1.vm09.stdout:3/918: readlink d5/d9/d30/d65/l104 0 2026-03-09T17:30:43.662 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.660+0000 7f91f77fe700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 0x7f91f0079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.662 INFO:tasks.workunit.client.1.vm09.stdout:6/935: dread d3/d21/db1/fd1 [0,4194304] 0 2026-03-09T17:30:43.663 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.661+0000 7f9209ef8700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 0x7f91f0079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.663 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.661+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f91f8099460 con 0x7f92000a5800 2026-03-09T17:30:43.663 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.661+0000 7f9209ef8700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 0x7f91f0079b80 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9204066860 tx=0x7f920407a040 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.665 INFO:tasks.workunit.client.1.vm09.stdout:6/936: symlink d3/d21/d76/d5c/d7e/l131 0 2026-03-09T17:30:43.668 INFO:tasks.workunit.client.1.vm09.stdout:6/937: rename d3/d21/d76/d3f/d11d/f128 to d3/d21/d76/d88/d101/f132 
0 2026-03-09T17:30:43.670 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.668+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f91f8061ff0 con 0x7f92000a5800 2026-03-09T17:30:43.676 INFO:tasks.workunit.client.1.vm09.stdout:6/938: rename d3/d21/d76/d5c/f65 to d3/d21/d76/d88/f133 0 2026-03-09T17:30:43.678 INFO:tasks.workunit.client.1.vm09.stdout:5/996: write d0/d2/f31 [1358830,52018] 0 2026-03-09T17:30:43.686 INFO:tasks.workunit.client.1.vm09.stdout:5/997: mkdir d0/d9/d74/d75/dbd/d143 0 2026-03-09T17:30:43.689 INFO:tasks.workunit.client.1.vm09.stdout:1/988: dwrite d9/dc/dd/d9f/de4/dba/fd7 [0,4194304] 0 2026-03-09T17:30:43.693 INFO:tasks.workunit.client.1.vm09.stdout:5/998: mknod d0/d9/d74/d75/d9f/c144 0 2026-03-09T17:30:43.693 INFO:tasks.workunit.client.1.vm09.stdout:5/999: write d0/d2/f31 [3332674,101275] 0 2026-03-09T17:30:43.699 INFO:tasks.workunit.client.1.vm09.stdout:6/939: rename d3/d21/d76/d5c/d7e/dc5/d9a/fb4 to d3/d21/d76/d5c/d61/f134 0 2026-03-09T17:30:43.699 INFO:tasks.workunit.client.1.vm09.stdout:6/940: stat d3/d7/c63 0 2026-03-09T17:30:43.705 INFO:tasks.workunit.client.1.vm09.stdout:1/989: dwrite d9/dc/dd/d9f/de4/fd1 [0,4194304] 0 2026-03-09T17:30:43.721 INFO:tasks.workunit.client.1.vm09.stdout:1/990: sync 2026-03-09T17:30:43.724 INFO:tasks.workunit.client.1.vm09.stdout:1/991: rename d9/dc/dd/d9f/d9c to d9/d9e/dc0/d37/d3f/d42/d55/db1/d13b 0 2026-03-09T17:30:43.740 INFO:tasks.workunit.client.1.vm09.stdout:3/919: dread d5/d9/d30/d65/d59/d84/fab [0,4194304] 0 2026-03-09T17:30:43.765 INFO:tasks.workunit.client.1.vm09.stdout:3/920: dread d5/d9/d30/d65/d59/fa2 [0,4194304] 0 2026-03-09T17:30:43.769 INFO:tasks.workunit.client.1.vm09.stdout:3/921: dread d5/d16/fcc [0,4194304] 0 2026-03-09T17:30:43.771 INFO:tasks.workunit.client.1.vm09.stdout:3/922: getdents d5/d9/d30/d65/d59/d84/d100 0 
2026-03-09T17:30:43.771 INFO:tasks.workunit.client.1.vm09.stdout:3/923: dread - d5/d9/d30/d65/fdd zero size 2026-03-09T17:30:43.780 INFO:tasks.workunit.client.1.vm09.stdout:3/924: rename d5/d16/l78 to d5/d16/l11f 0 2026-03-09T17:30:43.814 INFO:tasks.workunit.client.1.vm09.stdout:0/993: dwrite d6/d64/db5/f102 [0,4194304] 0 2026-03-09T17:30:43.825 INFO:tasks.workunit.client.1.vm09.stdout:0/994: mknod d6/d1d/d39/c142 0 2026-03-09T17:30:43.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.846+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f91e8005cc0 con 0x7f92000a5800 2026-03-09T17:30:43.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.847+0000 7f91f77fe700 1 -- 192.168.123.106:0/2099664328 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f91f8061740 con 0x7f92000a5800 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:43.850 
INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:30:43.850 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 msgr2=0x7f91f0079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 0x7f91f0079b80 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9204066860 tx=0x7f920407a040 comp rx=0 tx=0).stop 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 msgr2=0x7f92000cfcc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 
0x7f92000cfcc0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f91f800b6d0 tx=0x7f91f800b9e0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 shutdown_connections 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f91f00776d0 0x7f91f0079b80 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f92000a5800 0x7f92000cfcc0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 --2- 192.168.123.106:0/2099664328 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f92000d0200 0x7f9200010f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.851+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 >> 192.168.123.106:0/2099664328 conn(0x7f920009f7b0 msgr2=0x7f92000a1b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.852+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 shutdown_connections 2026-03-09T17:30:43.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.852+0000 7f920b6fb700 1 -- 192.168.123.106:0/2099664328 wait complete. 
2026-03-09T17:30:43.903 INFO:tasks.workunit.client.1.vm09.stdout:2/945: write d13/d15/d34/d45/f6a [1843898,47677] 0 2026-03-09T17:30:43.937 INFO:tasks.workunit.client.1.vm09.stdout:4/942: dwrite d11/d1e/d29/fcc [0,4194304] 0 2026-03-09T17:30:43.939 INFO:tasks.workunit.client.1.vm09.stdout:4/943: readlink d11/d1e/d45/d60/df1/dce/ld0 0 2026-03-09T17:30:43.941 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.938+0000 7f609ad9a700 1 -- 192.168.123.106:0/114985540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094071db0 msgr2=0x7f60940721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.941 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.938+0000 7f609ad9a700 1 --2- 192.168.123.106:0/114985540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094071db0 0x7f60940721c0 secure :-1 s=READY pgs=346 cs=0 l=1 rev1=1 crypto rx=0x7f608c00b3a0 tx=0x7f608c00b6b0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 -- 192.168.123.106:0/114985540 shutdown_connections 2026-03-09T17:30:43.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 --2- 192.168.123.106:0/114985540 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094107d50 0x7f60941081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 --2- 192.168.123.106:0/114985540 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094071db0 0x7f60940721c0 unknown :-1 s=CLOSED pgs=346 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 -- 192.168.123.106:0/114985540 >> 192.168.123.106:0/114985540 conn(0x7f609406d3e0 msgr2=0x7f609406f830 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:30:43.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 -- 192.168.123.106:0/114985540 shutdown_connections 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.942+0000 7f609ad9a700 1 -- 192.168.123.106:0/114985540 wait complete. 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 Processor -- start 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 -- start start 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 0x7f60941a4c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 0x7f60941a5180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60941a57a0 con 0x7f6094107d50 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f609ad9a700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60941aa1b0 con 0x7f6094071db0 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f6098b36700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 0x7f60941a4c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f6098b36700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 0x7f60941a4c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:53822/0 (socket says 192.168.123.106:53822) 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f6098b36700 1 -- 192.168.123.106:0/2574880100 learned_addr learned my addr 192.168.123.106:0/2574880100 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:43.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.943+0000 7f6093fff700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 0x7f60941a5180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6093fff700 1 -- 192.168.123.106:0/2574880100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 msgr2=0x7f60941a4c40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:43.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6093fff700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 0x7f60941a4c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:43.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6093fff700 1 -- 192.168.123.106:0/2574880100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f608c00b050 con 0x7f6094107d50 2026-03-09T17:30:43.946 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6093fff700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 0x7f60941a5180 secure :-1 s=READY pgs=347 cs=0 l=1 rev1=1 crypto rx=0x7f608800b6d0 tx=0x7f608800b9e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6088011630 con 0x7f6094107d50 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6088011c70 con 0x7f6094107d50 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f608800f2e0 con 0x7f6094107d50 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60941aa3b0 con 0x7f6094107d50 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.944+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60941aa8d0 con 0x7f6094107d50 2026-03-09T17:30:43.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.945+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f6094066e40 con 0x7f6094107d50 2026-03-09T17:30:43.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.954+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f6088011790 con 0x7f6094107d50 2026-03-09T17:30:43.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.955+0000 7f6091ffb700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 0x7f607c079b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:43.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.955+0000 7f6098b36700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 0x7f607c079b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:43.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.956+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f6088066610 con 0x7f6094107d50 2026-03-09T17:30:43.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.956+0000 7f6098b36700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 0x7f607c079b60 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f608c007ee0 tx=0x7f608c00bab0 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:43.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:43.957+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+191525 (secure 0 0 0) 0x7f6088062030 con 0x7f6094107d50 2026-03-09T17:30:43.964 INFO:tasks.workunit.client.1.vm09.stdout:4/944: getdents d11/d1e/d29 0 2026-03-09T17:30:43.965 INFO:tasks.workunit.client.1.vm09.stdout:8/983: dwrite d1/da/dd/f45 [0,4194304] 0 2026-03-09T17:30:43.965 INFO:tasks.workunit.client.1.vm09.stdout:4/945: readlink d11/d1e/d45/d60/d71/db7/d89/d8b/d58/l5d 0 2026-03-09T17:30:43.994 INFO:tasks.workunit.client.1.vm09.stdout:8/984: truncate d1/da/d23/d6c/ddd/ffc 4092222 0 2026-03-09T17:30:43.995 INFO:tasks.workunit.client.1.vm09.stdout:8/985: readlink d1/da/d23/d71/d101/l29 0 2026-03-09T17:30:43.998 INFO:tasks.workunit.client.1.vm09.stdout:8/986: mkdir d1/d14/d2a/d42/d43/dfd/d12f 0 2026-03-09T17:30:44.006 INFO:tasks.workunit.client.1.vm09.stdout:8/987: fdatasync d1/da/d23/d6c/d32/f6d 0 2026-03-09T17:30:44.007 INFO:tasks.workunit.client.1.vm09.stdout:9/999: write d5/de/fb1 [771284,56595] 0 2026-03-09T17:30:44.015 INFO:tasks.workunit.client.1.vm09.stdout:8/988: dread d1/da/d23/d6c/d32/f6d [0,4194304] 0 2026-03-09T17:30:44.018 INFO:tasks.workunit.client.1.vm09.stdout:8/989: mknod d1/da/d23/d6c/ddd/dcb/c130 0 2026-03-09T17:30:44.023 INFO:tasks.workunit.client.1.vm09.stdout:8/990: unlink d1/da/d23/dc2/da2/ddf/f105 0 2026-03-09T17:30:44.040 INFO:tasks.workunit.client.1.vm09.stdout:8/991: symlink d1/da/d23/d6c/ddd/dcb/l131 0 2026-03-09T17:30:44.052 INFO:tasks.workunit.client.1.vm09.stdout:6/941: getdents d3/d21/d76/d88/d101 0 2026-03-09T17:30:44.055 INFO:tasks.workunit.client.1.vm09.stdout:6/942: rmdir d3/d21/d76/d5c/d7e/dc5/d9a 39 2026-03-09T17:30:44.057 INFO:tasks.workunit.client.1.vm09.stdout:6/943: rmdir d3/d21/d76/d81 39 2026-03-09T17:30:44.063 INFO:tasks.workunit.client.1.vm09.stdout:1/992: write d9/dc/dd/d40/d1d/f4d [1690786,85661] 0 2026-03-09T17:30:44.071 INFO:tasks.workunit.client.1.vm09.stdout:0/995: dwrite d6/d64/db5/f102 [4194304,4194304] 0 2026-03-09T17:30:44.071 INFO:tasks.workunit.client.1.vm09.stdout:3/925: dwrite d5/d9/f88 
[0,4194304] 0 2026-03-09T17:30:44.074 INFO:tasks.workunit.client.1.vm09.stdout:1/993: creat d9/d9e/dc0/d37/d3f/d42/f13c x:0 0 0 2026-03-09T17:30:44.075 INFO:tasks.workunit.client.1.vm09.stdout:3/926: chown d5/d9/d30/d65/l26 764 1 2026-03-09T17:30:44.076 INFO:tasks.workunit.client.1.vm09.stdout:3/927: truncate d5/d16/d31/ffb 167357 0 2026-03-09T17:30:44.079 INFO:tasks.workunit.client.1.vm09.stdout:3/928: write d5/d16/d46/f6b [93914,110103] 0 2026-03-09T17:30:44.080 INFO:tasks.workunit.client.1.vm09.stdout:3/929: write d5/d16/d31/d37/d58/d8a/da8/ff6 [502716,126312] 0 2026-03-09T17:30:44.088 INFO:tasks.workunit.client.1.vm09.stdout:0/996: rename d6/d1d/l40 to d6/d64/db5/l143 0 2026-03-09T17:30:44.091 INFO:tasks.workunit.client.1.vm09.stdout:1/994: link d9/d9e/dc0/d37/d3f/f80 d9/d9e/dc0/d37/da4/dfd/f13d 0 2026-03-09T17:30:44.098 INFO:tasks.workunit.client.1.vm09.stdout:3/930: dread - d5/d16/dc5/fd1 zero size 2026-03-09T17:30:44.100 INFO:tasks.workunit.client.1.vm09.stdout:3/931: read d5/fa1 [2370877,23342] 0 2026-03-09T17:30:44.101 INFO:tasks.workunit.client.1.vm09.stdout:2/946: dwrite d13/f40 [0,4194304] 0 2026-03-09T17:30:44.101 INFO:tasks.workunit.client.1.vm09.stdout:2/947: dread - d13/d15/d36/f115 zero size 2026-03-09T17:30:44.101 INFO:tasks.workunit.client.1.vm09.stdout:0/997: creat d6/d1d/d24/d5e/dc2/d11c/f144 x:0 0 0 2026-03-09T17:30:44.102 INFO:tasks.workunit.client.1.vm09.stdout:1/995: mkdir d9/d9e/dc9/d13e 0 2026-03-09T17:30:44.109 INFO:tasks.workunit.client.1.vm09.stdout:2/948: fsync d13/d15/d34/d45/ff6 0 2026-03-09T17:30:44.112 INFO:tasks.workunit.client.1.vm09.stdout:1/996: symlink d9/d9e/dc9/l13f 0 2026-03-09T17:30:44.113 INFO:tasks.workunit.client.1.vm09.stdout:1/997: readlink d9/dc/dd/d40/d21/d35/l60 0 2026-03-09T17:30:44.114 INFO:tasks.workunit.client.1.vm09.stdout:1/998: truncate d9/dc/dd/d9f/de4/dba/fd7 4785631 0 2026-03-09T17:30:44.116 INFO:tasks.workunit.client.1.vm09.stdout:2/949: unlink d13/d15/d34/d45/d84/dcb/c54 0 2026-03-09T17:30:44.119 
INFO:tasks.workunit.client.1.vm09.stdout:2/950: dwrite d13/f100 [4194304,4194304] 0 2026-03-09T17:30:44.120 INFO:tasks.workunit.client.1.vm09.stdout:1/999: mkdir d9/dc/dd/d40/d21/d35/d140 0 2026-03-09T17:30:44.140 INFO:tasks.workunit.client.1.vm09.stdout:2/951: unlink d13/d15/d3b/ddf/f97 0 2026-03-09T17:30:44.140 INFO:tasks.workunit.client.1.vm09.stdout:0/998: getdents d6/d1d/d24/d5e/dc2/df7 0 2026-03-09T17:30:44.141 INFO:tasks.workunit.client.1.vm09.stdout:0/999: chown d6/c15 3683 1 2026-03-09T17:30:44.141 INFO:tasks.workunit.client.1.vm09.stdout:2/952: read d13/d15/d3b/d43/f96 [447285,30273] 0 2026-03-09T17:30:44.142 INFO:tasks.workunit.client.1.vm09.stdout:2/953: readlink d13/d15/d3b/ddf/lec 0 2026-03-09T17:30:44.147 INFO:tasks.workunit.client.1.vm09.stdout:2/954: dwrite d13/d15/d34/d45/d84/dcb/f114 [0,4194304] 0 2026-03-09T17:30:44.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.164+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f60941aab80 con 0x7f6094107d50 2026-03-09T17:30:44.167 INFO:tasks.workunit.client.1.vm09.stdout:2/955: sync 2026-03-09T17:30:44.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.165+0000 7f6091ffb700 1 -- 192.168.123.106:0/2574880100 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1851 (secure 0 0 0) 0x7f6088061780 con 0x7f6094107d50 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout 
v2,10=snaprealm v2} 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:30:44.169 
INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:44.169 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] 
compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:44.172 INFO:tasks.workunit.client.1.vm09.stdout:2/956: dwrite d13/d15/d34/d45/d84/dcb/f2d [4194304,4194304] 0 2026-03-09T17:30:44.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.170+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 msgr2=0x7f607c079b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.170+0000 7f609ad9a700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 0x7f607c079b60 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f608c007ee0 tx=0x7f608c00bab0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.170+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 msgr2=0x7f60941a5180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.170+0000 7f609ad9a700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 0x7f60941a5180 secure :-1 s=READY pgs=347 cs=0 l=1 rev1=1 crypto rx=0x7f608800b6d0 tx=0x7f608800b9e0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.171+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 shutdown_connections 2026-03-09T17:30:44.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.171+0000 7f609ad9a700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f607c0776b0 0x7f607c079b60 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.173 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.171+0000 7f609ad9a700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6094071db0 0x7f60941a4c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.171+0000 7f609ad9a700 1 --2- 192.168.123.106:0/2574880100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6094107d50 0x7f60941a5180 unknown :-1 s=CLOSED pgs=347 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.171+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 >> 192.168.123.106:0/2574880100 conn(0x7f609406d3e0 msgr2=0x7f609410af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:44.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.178+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 shutdown_connections 2026-03-09T17:30:44.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.180+0000 7f609ad9a700 1 -- 192.168.123.106:0/2574880100 wait complete. 
2026-03-09T17:30:44.187 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:30:44.192 INFO:tasks.workunit.client.1.vm09.stdout:2/957: mkdir d13/d15/d34/d37/d6f/dde/d131 0 2026-03-09T17:30:44.196 INFO:tasks.workunit.client.1.vm09.stdout:2/958: dread d13/d15/d36/d72/dc3/fc5 [0,4194304] 0 2026-03-09T17:30:44.200 INFO:tasks.workunit.client.1.vm09.stdout:2/959: mkdir d13/d15/d21/d88/d132 0 2026-03-09T17:30:44.203 INFO:tasks.workunit.client.1.vm09.stdout:2/960: readlink d13/d15/d36/d72/l99 0 2026-03-09T17:30:44.204 INFO:tasks.workunit.client.1.vm09.stdout:2/961: chown d13/d15/d3b/ddf/d90/f9f 95 1 2026-03-09T17:30:44.204 INFO:tasks.workunit.client.1.vm09.stdout:2/962: stat d13/da4/c11e 0 2026-03-09T17:30:44.208 INFO:tasks.workunit.client.1.vm09.stdout:2/963: unlink d13/d15/d36/c42 0 2026-03-09T17:30:44.219 INFO:tasks.workunit.client.1.vm09.stdout:2/964: dread d13/d15/d21/d120/f121 [0,4194304] 0 2026-03-09T17:30:44.225 INFO:tasks.workunit.client.1.vm09.stdout:2/965: mkdir d13/d15/d21/d120/d133 0 2026-03-09T17:30:44.241 INFO:tasks.workunit.client.1.vm09.stdout:4/946: dwrite d11/d1e/d29/d36/dd7/fdd [0,4194304] 0 2026-03-09T17:30:44.255 INFO:tasks.workunit.client.1.vm09.stdout:2/966: dread d13/f79 [0,4194304] 0 2026-03-09T17:30:44.265 INFO:tasks.workunit.client.1.vm09.stdout:6/944: dwrite d3/d21/db1/fd9 [0,4194304] 0 2026-03-09T17:30:44.268 INFO:tasks.workunit.client.1.vm09.stdout:8/992: dwrite d1/d14/d2a/d42/d43/d44/f108 [0,4194304] 0 2026-03-09T17:30:44.272 INFO:tasks.workunit.client.1.vm09.stdout:4/947: creat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f119 x:0 0 0 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- 192.168.123.106:0/3636621875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d94071980 msgr2=0x7f4d94071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 
7f4d99c72700 1 --2- 192.168.123.106:0/3636621875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d94071980 0x7f4d94071d90 secure :-1 s=READY pgs=348 cs=0 l=1 rev1=1 crypto rx=0x7f4d84007780 tx=0x7f4d8400c050 comp rx=0 tx=0).stop 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- 192.168.123.106:0/3636621875 shutdown_connections 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3636621875 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 0x7f4d940770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3636621875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d94071980 0x7f4d94071d90 unknown :-1 s=CLOSED pgs=348 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- 192.168.123.106:0/3636621875 >> 192.168.123.106:0/3636621875 conn(0x7f4d9406d1a0 msgr2=0x7f4d9406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- 192.168.123.106:0/3636621875 shutdown_connections 2026-03-09T17:30:44.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- 192.168.123.106:0/3636621875 wait complete. 
2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 Processor -- start 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- start start 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 0x7f4d941313a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d94131de0 con 0x7f4d941318e0 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.270+0000 7f4d99c72700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d94131f20 con 0x7f4d94072360 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:56548/0 (socket says 192.168.123.106:56548) 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 -- 192.168.123.106:0/3255667415 learned_addr learned my addr 192.168.123.106:0/3255667415 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:44.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d937fe700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 0x7f4d941313a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 -- 192.168.123.106:0/3255667415 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 msgr2=0x7f4d941313a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 0x7f4d941313a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.271+0000 7f4d92ffd700 1 -- 192.168.123.106:0/3255667415 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d84007430 con 0x7f4d941318e0 2026-03-09T17:30:44.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.272+0000 7f4d92ffd700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 secure :-1 s=READY pgs=349 cs=0 l=1 rev1=1 crypto rx=0x7f4d8c00bf40 tx=0x7f4d8c009f50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:30:44.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.272+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d8c00cac0 con 0x7f4d941318e0 2026-03-09T17:30:44.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.272+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d9407fb30 con 0x7f4d941318e0 2026-03-09T17:30:44.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.272+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d940800b0 con 0x7f4d941318e0 2026-03-09T17:30:44.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.273+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d8c00cc20 con 0x7f4d941318e0 2026-03-09T17:30:44.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.273+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d8c012750 con 0x7f4d941318e0 2026-03-09T17:30:44.277 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.275+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d9412b500 con 0x7f4d941318e0 2026-03-09T17:30:44.278 INFO:tasks.workunit.client.1.vm09.stdout:4/948: chown d11/d1e/d45/d60/d71/db7/d89/d8b/d58/l5d 12213866 1 2026-03-09T17:30:44.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.278+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 
0x7f4d8c014440 con 0x7f4d941318e0 2026-03-09T17:30:44.280 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.278+0000 7f4d90ff9700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 0x7f4d7c079b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.282+0000 7f4d937fe700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 0x7f4d7c079b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.282+0000 7f4d937fe700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 0x7f4d7c079b30 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f4d840073a0 tx=0x7f4d840072b0 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:44.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.282+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f4d8c066440 con 0x7f4d941318e0 2026-03-09T17:30:44.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.283+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f4d8c061f10 con 0x7f4d941318e0 2026-03-09T17:30:44.302 INFO:tasks.workunit.client.1.vm09.stdout:4/949: mknod d11/d1e/d29/c11a 0 2026-03-09T17:30:44.308 INFO:tasks.workunit.client.1.vm09.stdout:8/993: mknod d1/d14/d2a/d42/c132 0 
2026-03-09T17:30:44.308 INFO:tasks.workunit.client.1.vm09.stdout:6/945: rmdir d3/d7/d99/d124 0 2026-03-09T17:30:44.309 INFO:tasks.workunit.client.1.vm09.stdout:2/967: getdents d13/d15/d34/d45 0 2026-03-09T17:30:44.311 INFO:tasks.workunit.client.1.vm09.stdout:8/994: symlink d1/d14/d2a/d42/d5d/d8a/l133 0 2026-03-09T17:30:44.313 INFO:tasks.workunit.client.1.vm09.stdout:8/995: fdatasync d1/d14/d2a/f102 0 2026-03-09T17:30:44.314 INFO:tasks.workunit.client.1.vm09.stdout:2/968: dwrite d13/da4/fea [0,4194304] 0 2026-03-09T17:30:44.316 INFO:tasks.workunit.client.1.vm09.stdout:2/969: stat d13/d15/d3b/d43 0 2026-03-09T17:30:44.324 INFO:tasks.workunit.client.1.vm09.stdout:4/950: sync 2026-03-09T17:30:44.324 INFO:tasks.workunit.client.1.vm09.stdout:6/946: sync 2026-03-09T17:30:44.330 INFO:tasks.workunit.client.1.vm09.stdout:6/947: creat d3/d21/d76/d3f/d8f/f135 x:0 0 0 2026-03-09T17:30:44.331 INFO:tasks.workunit.client.1.vm09.stdout:8/996: dread d1/d14/d2a/f62 [4194304,4194304] 0 2026-03-09T17:30:44.337 INFO:tasks.workunit.client.1.vm09.stdout:8/997: fsync d1/d14/d2a/f81 0 2026-03-09T17:30:44.342 INFO:tasks.workunit.client.1.vm09.stdout:2/970: rename d13/d15/d36/d72/d94/da7/db0/dd6/f127 to d13/d15/d3b/f134 0 2026-03-09T17:30:44.342 INFO:tasks.workunit.client.1.vm09.stdout:2/971: read - d13/d15/d3b/f134 zero size 2026-03-09T17:30:44.344 INFO:tasks.workunit.client.1.vm09.stdout:2/972: sync 2026-03-09T17:30:44.354 INFO:tasks.workunit.client.1.vm09.stdout:6/948: dread d3/d21/f8c [0,4194304] 0 2026-03-09T17:30:44.357 INFO:tasks.workunit.client.1.vm09.stdout:3/932: dwrite d5/f53 [0,4194304] 0 2026-03-09T17:30:44.359 INFO:tasks.workunit.client.1.vm09.stdout:2/973: mkdir d13/d15/d34/d37/d135 0 2026-03-09T17:30:44.359 INFO:tasks.workunit.client.1.vm09.stdout:8/998: creat d1/f134 x:0 0 0 2026-03-09T17:30:44.364 INFO:tasks.workunit.client.1.vm09.stdout:6/949: creat d3/d21/d76/d5c/d7e/d94/f136 x:0 0 0 2026-03-09T17:30:44.364 INFO:tasks.workunit.client.1.vm09.stdout:2/974: readlink 
d13/d15/d3b/ddf/lec 0 2026-03-09T17:30:44.367 INFO:tasks.workunit.client.1.vm09.stdout:6/950: fdatasync d3/d7/f112 0 2026-03-09T17:30:44.368 INFO:tasks.workunit.client.1.vm09.stdout:6/951: write d3/d21/d76/d88/f118 [5085606,102183] 0 2026-03-09T17:30:44.375 INFO:tasks.workunit.client.1.vm09.stdout:8/999: chown d1/d14/ca6 3403053 1 2026-03-09T17:30:44.378 INFO:tasks.workunit.client.1.vm09.stdout:2/975: mknod d13/d15/d21/d88/db8/dd1/de5/d12a/c136 0 2026-03-09T17:30:44.381 INFO:tasks.workunit.client.1.vm09.stdout:2/976: rename d13/d15/d3b/ddf/d90/f92 to d13/d4d/daa/f137 0 2026-03-09T17:30:44.388 INFO:tasks.workunit.client.1.vm09.stdout:2/977: truncate d13/d15/fd2 901321 0 2026-03-09T17:30:44.391 INFO:tasks.workunit.client.1.vm09.stdout:2/978: rmdir d13/d15/d34/d37/d66 39 2026-03-09T17:30:44.400 INFO:tasks.workunit.client.1.vm09.stdout:2/979: dread d13/d15/d3b/ddf/d85/faf [0,4194304] 0 2026-03-09T17:30:44.404 INFO:tasks.workunit.client.1.vm09.stdout:3/933: dread d5/d16/d31/d37/d58/d8a/da8/faf [0,4194304] 0 2026-03-09T17:30:44.404 INFO:tasks.workunit.client.1.vm09.stdout:3/934: chown d5/d9/d30/dc4/lf2 2 1 2026-03-09T17:30:44.407 INFO:tasks.workunit.client.1.vm09.stdout:3/935: dread - d5/d9/d90/ffa zero size 2026-03-09T17:30:44.414 INFO:tasks.workunit.client.1.vm09.stdout:3/936: dread d5/d9/d90/db0/dbb/fbd [0,4194304] 0 2026-03-09T17:30:44.417 INFO:tasks.workunit.client.1.vm09.stdout:3/937: creat d5/d9/d30/d65/d59/d84/f120 x:0 0 0 2026-03-09T17:30:44.421 INFO:tasks.workunit.client.1.vm09.stdout:3/938: dwrite d5/d16/d31/d37/fef [0,4194304] 0 2026-03-09T17:30:44.427 INFO:tasks.workunit.client.1.vm09.stdout:3/939: dwrite d5/d9/d90/db0/dbb/f118 [0,4194304] 0 2026-03-09T17:30:44.450 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.445+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 --> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4d94132090 
con 0x7f4d7c077680 2026-03-09T17:30:44.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.446+0000 7f4d90ff9700 1 -- 192.168.123.106:0/3255667415 <== mgr.14712 v2:192.168.123.106:6800/1431796821 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f4d94132090 con 0x7f4d7c077680 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: Upgrade: Updating mgr.vm09.lqzvkh 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: Deploying daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: from='client.24543 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: pgmap v9: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 54 MiB/s rd, 110 MiB/s wr, 285 op/s 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: from='client.24547 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/2099664328' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:44.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:44 vm06.local ceph-mon[57307]: from='client.? 
192.168.123.106:0/2574880100' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "2/23 daemons upgraded", 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading mgr daemons", 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 msgr2=0x7f4d7c079b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 0x7f4d7c079b30 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f4d840073a0 tx=0x7f4d840072b0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 msgr2=0x7f4d9407f590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 secure :-1 s=READY pgs=349 cs=0 l=1 rev1=1 crypto rx=0x7f4d8c00bf40 tx=0x7f4d8c009f50 comp rx=0 tx=0).stop 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 shutdown_connections 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f4d7c077680 0x7f4d7c079b30 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4d94072360 0x7f4d941313a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.450+0000 7f4d99c72700 1 --2- 192.168.123.106:0/3255667415 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4d941318e0 0x7f4d9407f590 unknown :-1 s=CLOSED pgs=349 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.451+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 >> 192.168.123.106:0/3255667415 conn(0x7f4d9406d1a0 msgr2=0x7f4d94076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.451+0000 7f4d99c72700 1 -- 
192.168.123.106:0/3255667415 shutdown_connections 2026-03-09T17:30:44.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.451+0000 7f4d99c72700 1 -- 192.168.123.106:0/3255667415 wait complete. 2026-03-09T17:30:44.479 INFO:tasks.workunit.client.1.vm09.stdout:4/951: fdatasync d11/d1e/d29/d36/dd7/f112 0 2026-03-09T17:30:44.485 INFO:tasks.workunit.client.1.vm09.stdout:4/952: mknod d11/d1e/d29/d36/de3/d101/c11b 0 2026-03-09T17:30:44.490 INFO:tasks.workunit.client.1.vm09.stdout:4/953: rename d11/d1e/d45/d60/df1/d78/c10c to d11/d1e/d45/d60/df1/dce/c11c 0 2026-03-09T17:30:44.510 INFO:tasks.workunit.client.1.vm09.stdout:4/954: symlink d11/d1e/d45/d60/df1/d78/l11d 0 2026-03-09T17:30:44.521 INFO:tasks.workunit.client.1.vm09.stdout:6/952: dwrite d3/d21/d76/d5c/d7e/dc5/fca [0,4194304] 0 2026-03-09T17:30:44.523 INFO:tasks.workunit.client.1.vm09.stdout:6/953: chown d3/d21/d25/f5f 14005793 1 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- 192.168.123.106:0/2374084082 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8071980 msgr2=0x7f12c8071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 --2- 192.168.123.106:0/2374084082 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8071980 0x7f12c8071d90 secure :-1 s=READY pgs=350 cs=0 l=1 rev1=1 crypto rx=0x7f12b8008790 tx=0x7f12b800ae50 comp rx=0 tx=0).stop 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- 192.168.123.106:0/2374084082 shutdown_connections 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 --2- 192.168.123.106:0/2374084082 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8072360 0x7f12c80770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 --2- 192.168.123.106:0/2374084082 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8071980 0x7f12c8071d90 unknown :-1 s=CLOSED pgs=350 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- 192.168.123.106:0/2374084082 >> 192.168.123.106:0/2374084082 conn(0x7f12c806d1a0 msgr2=0x7f12c806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- 192.168.123.106:0/2374084082 shutdown_connections 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- 192.168.123.106:0/2374084082 wait complete. 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 Processor -- start 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.532+0000 7f12ccc59700 1 -- start start 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12ccc59700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8072360 0x7f12c80824e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12ccc59700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12ccc59700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12c812dd80 con 0x7f12c8072360 
2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12ccc59700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12c812def0 con 0x7f12c8082a20 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8072360 0x7f12c80824e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:53858/0 (socket says 192.168.123.106:53858) 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 -- 192.168.123.106:0/780889172 learned_addr learned my addr 192.168.123.106:0/780889172 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 -- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8072360 msgr2=0x7f12c80824e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 --2- 
192.168.123.106:0/780889172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8072360 0x7f12c80824e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 -- 192.168.123.106:0/780889172 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f12b8008440 con 0x7f12c8082a20 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.533+0000 7f12c5d9b700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f12c0009fc0 tx=0x7f12c00076a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:44.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.534+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f12c0010040 con 0x7f12c8082a20 2026-03-09T17:30:44.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.534+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f12c812e1d0 con 0x7f12c8082a20 2026-03-09T17:30:44.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.534+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f12c812e720 con 0x7f12c8082a20 2026-03-09T17:30:44.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.535+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f12c00091e0 con 0x7f12c8082a20 2026-03-09T17:30:44.537 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.535+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f12c0008870 con 0x7f12c8082a20 2026-03-09T17:30:44.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.536+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f12c0008ad0 con 0x7f12c8082a20 2026-03-09T17:30:44.539 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.537+0000 7f12b77fe700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 0x7f12b0079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:30:44.539 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.537+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f12c009a6c0 con 0x7f12c8082a20 2026-03-09T17:30:44.540 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.538+0000 7f12c659c700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 0x7f12b0079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:30:44.540 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.538+0000 7f12c659c700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 0x7f12b0079c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f12b8008760 tx=0x7f12b800b320 comp rx=0 tx=0).ready entity=mgr.14712 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:30:44.540 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.538+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f12a8005320 con 0x7f12c8082a20 2026-03-09T17:30:44.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.541+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f12c0063300 con 0x7f12c8082a20 2026-03-09T17:30:44.546 INFO:tasks.workunit.client.1.vm09.stdout:2/980: write d13/d15/d34/d45/d84/dcb/fe7 [3711696,67367] 0 2026-03-09T17:30:44.550 INFO:tasks.workunit.client.1.vm09.stdout:2/981: dwrite d13/d15/d36/d72/d94/da7/db0/dd6/f124 [0,4194304] 0 2026-03-09T17:30:44.588 INFO:tasks.workunit.client.1.vm09.stdout:3/940: write d5/d9/d30/d65/f4f [1094683,17544] 0 2026-03-09T17:30:44.588 INFO:tasks.workunit.client.1.vm09.stdout:3/941: chown d5/d9/d90/ffa 6 1 2026-03-09T17:30:44.597 INFO:tasks.workunit.client.1.vm09.stdout:2/982: creat d13/d15/d34/d45/f138 x:0 0 0 2026-03-09T17:30:44.599 INFO:tasks.workunit.client.1.vm09.stdout:6/954: truncate d3/d48/f68 558036 0 2026-03-09T17:30:44.602 INFO:tasks.workunit.client.1.vm09.stdout:4/955: dwrite d11/d1e/d45/d60/df1/fbc [0,4194304] 0 2026-03-09T17:30:44.609 INFO:tasks.workunit.client.1.vm09.stdout:2/983: dread d13/d15/d34/d45/f61 [0,4194304] 0 2026-03-09T17:30:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: Upgrade: Updating mgr.vm09.lqzvkh 2026-03-09T17:30:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: Deploying daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:30:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: from='client.24543 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: pgmap v9: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 54 MiB/s rd, 110 MiB/s wr, 285 op/s 2026-03-09T17:30:44.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: from='client.24547 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:44.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: from='client.? 192.168.123.106:0/2099664328' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:44.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:44 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/2574880100' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:30:44.694 INFO:tasks.workunit.client.1.vm09.stdout:6/955: truncate d3/d7/d59/d73/f7d 4253352 0 2026-03-09T17:30:44.702 INFO:tasks.workunit.client.1.vm09.stdout:4/956: mkdir d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/d11e 0 2026-03-09T17:30:44.709 INFO:tasks.workunit.client.1.vm09.stdout:2/984: rename d13/d4d/daa/ff0 to d13/dc8/f139 0 2026-03-09T17:30:44.718 INFO:tasks.workunit.client.1.vm09.stdout:6/956: sync 2026-03-09T17:30:44.719 INFO:tasks.workunit.client.1.vm09.stdout:6/957: dread - d3/d7/d59/d73/de1/f11b zero size 2026-03-09T17:30:44.730 INFO:tasks.workunit.client.1.vm09.stdout:2/985: mkdir d13/d15/d36/d13a 0 2026-03-09T17:30:44.734 INFO:tasks.workunit.client.1.vm09.stdout:6/958: creat d3/d7/d99/f137 x:0 0 0 2026-03-09T17:30:44.735 INFO:tasks.workunit.client.1.vm09.stdout:4/957: truncate d11/f12 2168994 0 2026-03-09T17:30:44.737 INFO:tasks.workunit.client.1.vm09.stdout:3/942: getdents d5/d9c/de7/de1 0 2026-03-09T17:30:44.738 INFO:tasks.workunit.client.1.vm09.stdout:3/943: readlink d5/d9/d30/d65/d59/d84/l9e 0 2026-03-09T17:30:44.739 INFO:tasks.workunit.client.1.vm09.stdout:2/986: chown d13/d4d/f7d 6211 1 2026-03-09T17:30:44.740 INFO:tasks.workunit.client.1.vm09.stdout:6/959: symlink d3/d21/d76/d3f/d8f/l138 0 2026-03-09T17:30:44.743 INFO:tasks.workunit.client.1.vm09.stdout:4/958: unlink d11/d1e/def/ff6 0 2026-03-09T17:30:44.748 INFO:tasks.workunit.client.1.vm09.stdout:2/987: mkdir d13/d15/d36/d72/d94/da7/db0/dd6/d13b 0 2026-03-09T17:30:44.748 INFO:tasks.workunit.client.1.vm09.stdout:2/988: chown d13/d15/d3b/f3f 493776 1 2026-03-09T17:30:44.749 INFO:tasks.workunit.client.1.vm09.stdout:2/989: chown d13/d15/f2b 932067 1 2026-03-09T17:30:44.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.747+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": 
"detail"} v 0) v1 -- 0x7f12a8005190 con 0x7f12c8082a20 2026-03-09T17:30:44.749 INFO:tasks.workunit.client.1.vm09.stdout:2/990: chown d13/d15/d21/df5 2319 1 2026-03-09T17:30:44.751 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.749+0000 7f12b77fe700 1 -- 192.168.123.106:0/780889172 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f12c001b090 con 0x7f12c8082a20 2026-03-09T17:30:44.751 INFO:tasks.workunit.client.1.vm09.stdout:6/960: fsync d3/d21/d25/d26/d6b/dbf/f126 0 2026-03-09T17:30:44.752 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:30:44.752 INFO:tasks.workunit.client.1.vm09.stdout:6/961: chown d3/d21/d25/d26/d86/dbc/dfa/f12e 20205230 1 2026-03-09T17:30:44.752 INFO:tasks.workunit.client.1.vm09.stdout:6/962: fdatasync d3/d21/d25/fdb 0 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.752+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 msgr2=0x7f12b0079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.752+0000 7f12ccc59700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 0x7f12b0079c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f12b8008760 tx=0x7f12b800b320 comp rx=0 tx=0).stop 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.752+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 msgr2=0x7f12c8082e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.752+0000 7f12ccc59700 1 --2- 192.168.123.106:0/780889172 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f12c0009fc0 tx=0x7f12c00076a0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 shutdown_connections 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:6800/1431796821,v1:192.168.123.106:6801/1431796821] conn(0x7f12b0077790 0x7f12b0079c40 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12c8072360 0x7f12c80824e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 --2- 192.168.123.106:0/780889172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12c8082a20 0x7f12c8082e90 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 >> 192.168.123.106:0/780889172 conn(0x7f12c806d1a0 msgr2=0x7f12c806e190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 shutdown_connections 2026-03-09T17:30:44.755 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:30:44.753+0000 7f12ccc59700 1 -- 192.168.123.106:0/780889172 wait complete. 
2026-03-09T17:30:44.765 INFO:tasks.workunit.client.1.vm09.stdout:3/944: write d5/d9/d90/db0/fa3 [2272042,75618] 0 2026-03-09T17:30:44.770 INFO:tasks.workunit.client.1.vm09.stdout:2/991: rename d13/d15/d34/d45/d84/dcb/f122 to d13/d4d/daa/dff/f13c 0 2026-03-09T17:30:44.771 INFO:tasks.workunit.client.1.vm09.stdout:2/992: readlink d13/d15/d36/d72/dc3/le1 0 2026-03-09T17:30:44.772 INFO:tasks.workunit.client.1.vm09.stdout:6/963: truncate d3/d21/d76/d81/fa2 559138 0 2026-03-09T17:30:44.782 INFO:tasks.workunit.client.1.vm09.stdout:4/959: dread d11/d1e/d45/d60/f95 [0,4194304] 0 2026-03-09T17:30:44.783 INFO:tasks.workunit.client.1.vm09.stdout:4/960: dread - d11/d1e/d45/fb3 zero size 2026-03-09T17:30:44.784 INFO:tasks.workunit.client.1.vm09.stdout:6/964: symlink d3/l139 0 2026-03-09T17:30:44.785 INFO:tasks.workunit.client.1.vm09.stdout:6/965: truncate d3/d7/f112 110250 0 2026-03-09T17:30:44.791 INFO:tasks.workunit.client.1.vm09.stdout:6/966: dread d3/d7/f40 [0,4194304] 0 2026-03-09T17:30:44.792 INFO:tasks.workunit.client.1.vm09.stdout:2/993: getdents d13/d4d/daa/dff 0 2026-03-09T17:30:44.792 INFO:tasks.workunit.client.1.vm09.stdout:2/994: stat d13/d15/f2a 0 2026-03-09T17:30:44.794 INFO:tasks.workunit.client.1.vm09.stdout:6/967: mkdir d3/d21/db1/d13a 0 2026-03-09T17:30:44.795 INFO:tasks.workunit.client.1.vm09.stdout:3/945: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 2026-03-09T17:30:44.803 INFO:tasks.workunit.client.1.vm09.stdout:2/995: dwrite d13/d15/d3b/f134 [0,4194304] 0 2026-03-09T17:30:44.811 INFO:tasks.workunit.client.1.vm09.stdout:4/961: dwrite d11/d1e/d45/d60/d71/db7/d89/d8b/d58/f7d [0,4194304] 0 2026-03-09T17:30:44.831 INFO:tasks.workunit.client.1.vm09.stdout:2/996: unlink d13/d15/d3b/ddf/lbc 0 2026-03-09T17:30:44.832 INFO:tasks.workunit.client.1.vm09.stdout:2/997: dread - d13/d15/d36/d72/d94/da7/f12d zero size 2026-03-09T17:30:44.834 INFO:tasks.workunit.client.1.vm09.stdout:4/962: mknod d11/d1e/d31/c11f 0 2026-03-09T17:30:44.835 
INFO:tasks.workunit.client.1.vm09.stdout:2/998: truncate d13/d15/d34/d45/f138 319939 0 2026-03-09T17:30:44.836 INFO:tasks.workunit.client.1.vm09.stdout:2/999: write d13/d15/d34/d45/f138 [1345952,97215] 0 2026-03-09T17:30:44.836 INFO:tasks.workunit.client.1.vm09.stdout:3/946: creat d5/d9/d90/f121 x:0 0 0 2026-03-09T17:30:44.839 INFO:tasks.workunit.client.1.vm09.stdout:6/968: rename d3/d21/db1/cc3 to d3/d21/d25/c13b 0 2026-03-09T17:30:44.848 INFO:tasks.workunit.client.1.vm09.stdout:6/969: creat d3/f13c x:0 0 0 2026-03-09T17:30:44.850 INFO:tasks.workunit.client.1.vm09.stdout:6/970: creat d3/d21/d76/d3f/d11d/f13d x:0 0 0 2026-03-09T17:30:44.863 INFO:tasks.workunit.client.1.vm09.stdout:6/971: sync 2026-03-09T17:30:44.871 INFO:tasks.workunit.client.1.vm09.stdout:6/972: creat d3/d21/d76/d5c/f13e x:0 0 0 2026-03-09T17:30:44.874 INFO:tasks.workunit.client.1.vm09.stdout:6/973: chown d3/d21/d25/d26/d86/d113 26 1 2026-03-09T17:30:44.875 INFO:tasks.workunit.client.1.vm09.stdout:6/974: chown d3/d21/d76/d5c/d7e/l131 5682 1 2026-03-09T17:30:44.883 INFO:tasks.workunit.client.1.vm09.stdout:3/947: dwrite d5/d16/d31/d37/f5b [0,4194304] 0 2026-03-09T17:30:44.885 INFO:tasks.workunit.client.1.vm09.stdout:3/948: readlink d5/d16/d31/d37/dae/lde 0 2026-03-09T17:30:44.887 INFO:tasks.workunit.client.1.vm09.stdout:6/975: write d3/d21/d76/d5c/d7e/d94/f10e [324534,40918] 0 2026-03-09T17:30:44.889 INFO:tasks.workunit.client.1.vm09.stdout:3/949: rmdir d5/d9/d90/db0 39 2026-03-09T17:30:44.896 INFO:tasks.workunit.client.1.vm09.stdout:3/950: dwrite d5/d9/d30/d65/f18 [0,4194304] 0 2026-03-09T17:30:44.901 INFO:tasks.workunit.client.1.vm09.stdout:4/963: dwrite d11/d1e/d29/d36/f6a [0,4194304] 0 2026-03-09T17:30:44.905 INFO:tasks.workunit.client.1.vm09.stdout:6/976: chown d3/d21/d76/d5c/d61/d6a/df2/l130 1652008 1 2026-03-09T17:30:44.911 INFO:tasks.workunit.client.1.vm09.stdout:6/977: symlink d3/d21/d76/d88/d101/l13f 0 2026-03-09T17:30:44.916 INFO:tasks.workunit.client.1.vm09.stdout:4/964: dread 
d11/d1e/d31/db6/ffe [0,4194304] 0 2026-03-09T17:30:44.917 INFO:tasks.workunit.client.1.vm09.stdout:6/978: fdatasync d3/d21/f28 0 2026-03-09T17:30:44.923 INFO:tasks.workunit.client.1.vm09.stdout:4/965: symlink d11/d1e/d45/d60/l120 0 2026-03-09T17:30:44.931 INFO:tasks.workunit.client.1.vm09.stdout:3/951: dread d5/d16/d46/f63 [0,4194304] 0 2026-03-09T17:30:44.935 INFO:tasks.workunit.client.1.vm09.stdout:3/952: stat d5/d16/f54 0 2026-03-09T17:30:44.935 INFO:tasks.workunit.client.1.vm09.stdout:3/953: stat d5/d16/d31/d37/d58/f91 0 2026-03-09T17:30:44.950 INFO:tasks.workunit.client.1.vm09.stdout:3/954: rename d5/d16/d31/d3d to d5/d16/dc5/d122 0 2026-03-09T17:30:44.954 INFO:tasks.workunit.client.1.vm09.stdout:6/979: write d3/d21/d25/d26/d86/dbc/fd8 [918651,69334] 0 2026-03-09T17:30:44.978 INFO:tasks.workunit.client.1.vm09.stdout:6/980: truncate d3/d21/d76/d5c/d61/d6a/ffc 1526199 0 2026-03-09T17:30:44.983 INFO:tasks.workunit.client.1.vm09.stdout:6/981: fsync d3/d21/d76/d5c/d61/d95/fe4 0 2026-03-09T17:30:44.984 INFO:tasks.workunit.client.1.vm09.stdout:3/955: dwrite d5/d16/d31/d37/f6d [0,4194304] 0 2026-03-09T17:30:44.984 INFO:tasks.workunit.client.1.vm09.stdout:6/982: chown d3/d21/d76/d5c/d7e/d94/c87 0 1 2026-03-09T17:30:45.007 INFO:tasks.workunit.client.1.vm09.stdout:6/983: rename d3/d48/fc7 to d3/d21/d76/d5c/d61/d6a/f140 0 2026-03-09T17:30:45.008 INFO:tasks.workunit.client.1.vm09.stdout:6/984: fdatasync d3/d21/d76/d5c/d7e/dc5/d98/fa6 0 2026-03-09T17:30:45.014 INFO:tasks.workunit.client.1.vm09.stdout:3/956: write d5/d9c/de7/fed [951657,118622] 0 2026-03-09T17:30:45.017 INFO:tasks.workunit.client.1.vm09.stdout:4/966: dread d11/d1e/d29/f2e [0,4194304] 0 2026-03-09T17:30:45.026 INFO:tasks.workunit.client.1.vm09.stdout:6/985: getdents d3/d21/d76 0 2026-03-09T17:30:45.027 INFO:tasks.workunit.client.1.vm09.stdout:4/967: symlink d11/d1e/d45/d60/df1/d78/l121 0 2026-03-09T17:30:45.031 INFO:tasks.workunit.client.1.vm09.stdout:6/986: sync 2026-03-09T17:30:45.036 
INFO:tasks.workunit.client.1.vm09.stdout:6/987: write d3/d21/d76/d88/d101/f132 [998774,113668] 0 2026-03-09T17:30:45.039 INFO:tasks.workunit.client.1.vm09.stdout:4/968: mkdir d11/d1e/d45/d60/d122 0 2026-03-09T17:30:45.048 INFO:tasks.workunit.client.1.vm09.stdout:3/957: dread d5/d16/d25/f2c [0,4194304] 0 2026-03-09T17:30:45.050 INFO:tasks.workunit.client.1.vm09.stdout:3/958: chown d5/f11b 26 1 2026-03-09T17:30:45.051 INFO:tasks.workunit.client.1.vm09.stdout:6/988: dread d3/d7/d59/d9c/fdc [0,4194304] 0 2026-03-09T17:30:45.053 INFO:tasks.workunit.client.1.vm09.stdout:4/969: dread d11/f23 [0,4194304] 0 2026-03-09T17:30:45.056 INFO:tasks.workunit.client.1.vm09.stdout:6/989: rmdir d3/d21/d25/d26/d6b 39 2026-03-09T17:30:45.062 INFO:tasks.workunit.client.1.vm09.stdout:3/959: creat d5/d16/d31/d37/d58/d8a/f123 x:0 0 0 2026-03-09T17:30:45.063 INFO:tasks.workunit.client.1.vm09.stdout:3/960: stat d5/d9/d30/d65/d59/c82 0 2026-03-09T17:30:45.066 INFO:tasks.workunit.client.1.vm09.stdout:3/961: dwrite d5/f53 [0,4194304] 0 2026-03-09T17:30:45.073 INFO:tasks.workunit.client.1.vm09.stdout:4/970: getdents d11/d1e/d31/db6 0 2026-03-09T17:30:45.085 INFO:tasks.workunit.client.1.vm09.stdout:3/962: dwrite d5/d16/d31/d37/dae/db4/ffd [0,4194304] 0 2026-03-09T17:30:45.086 INFO:tasks.workunit.client.1.vm09.stdout:6/990: getdents d3/d7/d59/d9c 0 2026-03-09T17:30:45.105 INFO:tasks.workunit.client.1.vm09.stdout:6/991: read - d3/d21/d25/d26/db7/f10c zero size 2026-03-09T17:30:45.106 INFO:tasks.workunit.client.1.vm09.stdout:3/963: link d5/d16/d31/d37/dae/db4/cc7 d5/d9/d30/c124 0 2026-03-09T17:30:45.107 INFO:tasks.workunit.client.1.vm09.stdout:6/992: dread d3/d21/d76/d3f/f42 [0,4194304] 0 2026-03-09T17:30:45.108 INFO:tasks.workunit.client.1.vm09.stdout:3/964: creat d5/d16/dc5/d122/d9f/f125 x:0 0 0 2026-03-09T17:30:45.108 INFO:tasks.workunit.client.1.vm09.stdout:6/993: fdatasync d3/d7/d59/d73/fa3 0 2026-03-09T17:30:45.114 INFO:tasks.workunit.client.1.vm09.stdout:6/994: creat d3/d7/d59/d73/db0/f141 x:0 
0 0 2026-03-09T17:30:45.115 INFO:tasks.workunit.client.1.vm09.stdout:3/965: mknod d5/d9/d90/db0/d105/c126 0 2026-03-09T17:30:45.116 INFO:tasks.workunit.client.1.vm09.stdout:4/971: dwrite d11/f19 [0,4194304] 0 2026-03-09T17:30:45.116 INFO:tasks.workunit.client.1.vm09.stdout:6/995: truncate d3/d21/d76/d5c/f13e 788021 0 2026-03-09T17:30:45.128 INFO:tasks.workunit.client.1.vm09.stdout:6/996: creat d3/d21/d76/d81/f142 x:0 0 0 2026-03-09T17:30:45.129 INFO:tasks.workunit.client.1.vm09.stdout:3/966: unlink d5/d16/d31/d37/d58/lb5 0 2026-03-09T17:30:45.131 INFO:tasks.workunit.client.1.vm09.stdout:6/997: stat d3/d21/d76/d5c/d61/d95 0 2026-03-09T17:30:45.132 INFO:tasks.workunit.client.1.vm09.stdout:4/972: dread d11/d1e/d45/d60/d71/db7/d89/d8b/f38 [0,4194304] 0 2026-03-09T17:30:45.133 INFO:tasks.workunit.client.1.vm09.stdout:4/973: truncate d11/d1e/d45/d60/f114 775120 0 2026-03-09T17:30:45.134 INFO:tasks.workunit.client.1.vm09.stdout:3/967: fsync d5/d9/f4e 0 2026-03-09T17:30:45.140 INFO:tasks.workunit.client.1.vm09.stdout:4/974: link d11/d1e/d45/d60/df1/f67 d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/f123 0 2026-03-09T17:30:45.145 INFO:tasks.workunit.client.1.vm09.stdout:4/975: creat d11/d1e/d45/d60/f124 x:0 0 0 2026-03-09T17:30:45.148 INFO:tasks.workunit.client.1.vm09.stdout:4/976: mknod d11/d1e/d45/d60/c125 0 2026-03-09T17:30:45.151 INFO:tasks.workunit.client.1.vm09.stdout:6/998: dread d3/d48/f6c [0,4194304] 0 2026-03-09T17:30:45.152 INFO:tasks.workunit.client.1.vm09.stdout:3/968: write d5/d9/d30/d65/f43 [3696795,28219] 0 2026-03-09T17:30:45.157 INFO:tasks.workunit.client.1.vm09.stdout:6/999: symlink d3/d48/l143 0 2026-03-09T17:30:45.159 INFO:tasks.workunit.client.1.vm09.stdout:3/969: creat d5/d16/d31/d37/d58/d8a/da8/ddf/df9/f127 x:0 0 0 2026-03-09T17:30:45.163 INFO:tasks.workunit.client.1.vm09.stdout:4/977: link d11/d1e/d31/c11f d11/d1e/d29/db5/c126 0 2026-03-09T17:30:45.164 INFO:tasks.workunit.client.1.vm09.stdout:3/970: creat d5/d9/d30/d65/d59/f128 x:0 0 0 
2026-03-09T17:30:45.165 INFO:tasks.workunit.client.1.vm09.stdout:4/978: fsync fe 0 2026-03-09T17:30:45.165 INFO:tasks.workunit.client.1.vm09.stdout:3/971: fsync d5/d9/d30/d65/d59/d84/f6e 0 2026-03-09T17:30:45.172 INFO:tasks.workunit.client.1.vm09.stdout:3/972: symlink d5/d9/d30/d65/d59/d84/d100/l129 0 2026-03-09T17:30:45.173 INFO:tasks.workunit.client.1.vm09.stdout:3/973: readlink d5/d9/d30/d65/d59/l97 0 2026-03-09T17:30:45.174 INFO:tasks.workunit.client.1.vm09.stdout:3/974: chown d5/d16/d31/d37/d58/d8a/da8/lf4 706187 1 2026-03-09T17:30:45.176 INFO:tasks.workunit.client.1.vm09.stdout:4/979: write d11/d1e/d31/f65 [1147167,62272] 0 2026-03-09T17:30:45.180 INFO:tasks.workunit.client.1.vm09.stdout:3/975: dwrite d5/d16/d31/d37/f106 [0,4194304] 0 2026-03-09T17:30:45.183 INFO:tasks.workunit.client.1.vm09.stdout:3/976: fsync d5/d16/d31/d37/f5b 0 2026-03-09T17:30:45.186 INFO:tasks.workunit.client.1.vm09.stdout:4/980: truncate d11/f15 8533393 0 2026-03-09T17:30:45.194 INFO:tasks.workunit.client.1.vm09.stdout:3/977: creat d5/d16/dc5/f12a x:0 0 0 2026-03-09T17:30:45.194 INFO:tasks.workunit.client.1.vm09.stdout:4/981: fdatasync d11/d1e/d29/d36/f6a 0 2026-03-09T17:30:45.195 INFO:tasks.workunit.client.1.vm09.stdout:3/978: creat d5/d9c/de7/f12b x:0 0 0 2026-03-09T17:30:45.195 INFO:tasks.workunit.client.1.vm09.stdout:4/982: symlink d11/d1e/d29/d36/l127 0 2026-03-09T17:30:45.203 INFO:tasks.workunit.client.1.vm09.stdout:3/979: rename d5/c8d to d5/d9/d90/db0/dbb/c12c 0 2026-03-09T17:30:45.215 INFO:tasks.workunit.client.1.vm09.stdout:3/980: mkdir d5/d16/dc5/d12d 0 2026-03-09T17:30:45.219 INFO:tasks.workunit.client.1.vm09.stdout:4/983: link d11/d1e/def/lf8 d11/d1e/d45/daf/dff/l128 0 2026-03-09T17:30:45.221 INFO:tasks.workunit.client.1.vm09.stdout:4/984: read d11/d1e/d45/d60/d71/db7/f90 [1247740,79242] 0 2026-03-09T17:30:45.221 INFO:tasks.workunit.client.1.vm09.stdout:3/981: dwrite d5/f11b [0,4194304] 0 2026-03-09T17:30:45.237 INFO:tasks.workunit.client.1.vm09.stdout:3/982: symlink 
d5/d9/d30/d65/l12e 0 2026-03-09T17:30:45.244 INFO:tasks.workunit.client.1.vm09.stdout:3/983: write d5/d16/dc5/f12a [1015896,63227] 0 2026-03-09T17:30:45.247 INFO:tasks.workunit.client.1.vm09.stdout:3/984: chown d5/d9c 426218 1 2026-03-09T17:30:45.260 INFO:tasks.workunit.client.1.vm09.stdout:3/985: symlink d5/d16/d31/d37/d58/d8a/da8/ddf/l12f 0 2026-03-09T17:30:45.271 INFO:tasks.workunit.client.1.vm09.stdout:4/985: dwrite d11/d1e/f8c [0,4194304] 0 2026-03-09T17:30:45.281 INFO:tasks.workunit.client.1.vm09.stdout:3/986: write d5/d16/d31/d37/d58/f91 [918202,98463] 0 2026-03-09T17:30:45.289 INFO:tasks.workunit.client.1.vm09.stdout:3/987: symlink d5/l130 0 2026-03-09T17:30:45.294 INFO:tasks.workunit.client.1.vm09.stdout:4/986: getdents d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb 0 2026-03-09T17:30:45.318 INFO:tasks.workunit.client.1.vm09.stdout:4/987: sync 2026-03-09T17:30:45.319 INFO:tasks.workunit.client.1.vm09.stdout:3/988: write d5/d9/d90/ffa [292299,57560] 0 2026-03-09T17:30:45.319 INFO:tasks.workunit.client.1.vm09.stdout:4/988: sync 2026-03-09T17:30:45.344 INFO:tasks.workunit.client.1.vm09.stdout:4/989: dread d11/d1e/d31/f3a [0,4194304] 0 2026-03-09T17:30:45.344 INFO:tasks.workunit.client.1.vm09.stdout:3/989: dread d5/d9c/de7/f99 [0,4194304] 0 2026-03-09T17:30:45.363 INFO:tasks.workunit.client.1.vm09.stdout:3/990: truncate d5/d9/d30/d65/f15 1743824 0 2026-03-09T17:30:45.374 INFO:tasks.workunit.client.1.vm09.stdout:3/991: fdatasync d5/d9/f4e 0 2026-03-09T17:30:45.374 INFO:tasks.workunit.client.1.vm09.stdout:3/992: read d5/d16/d31/d37/f106 [3612421,79584] 0 2026-03-09T17:30:45.391 INFO:tasks.workunit.client.1.vm09.stdout:4/990: rename d11/d1e/d45/d60/d71/db7/d89/d8b/l91 to d11/d1e/d45/d60/d71/db7/d89/l129 0 2026-03-09T17:30:45.401 INFO:tasks.workunit.client.1.vm09.stdout:4/991: unlink f3 0 2026-03-09T17:30:45.411 INFO:tasks.workunit.client.1.vm09.stdout:3/993: write d5/d9/d30/d65/d59/fd9 [42135,90653] 0 2026-03-09T17:30:45.421 
INFO:tasks.workunit.client.1.vm09.stdout:3/994: dwrite d5/d9/d30/d65/f4f [0,4194304] 0 2026-03-09T17:30:45.424 INFO:tasks.workunit.client.1.vm09.stdout:4/992: rename d11/d1e/d45/d60/cd3 to d11/d1e/d45/d60/d71/db7/d89/d8b/c12a 0 2026-03-09T17:30:45.430 INFO:tasks.workunit.client.1.vm09.stdout:3/995: symlink d5/d16/d31/d37/dae/db4/l131 0 2026-03-09T17:30:45.431 INFO:tasks.workunit.client.1.vm09.stdout:3/996: chown d5/d9/d90/db0/d105/c126 147708690 1 2026-03-09T17:30:45.449 INFO:tasks.workunit.client.1.vm09.stdout:3/997: dread d5/d16/d31/d37/f76 [0,4194304] 0 2026-03-09T17:30:45.453 INFO:tasks.workunit.client.1.vm09.stdout:3/998: truncate d5/d16/dc5/d122/db3/df3/f10e 608128 0 2026-03-09T17:30:45.453 INFO:tasks.workunit.client.1.vm09.stdout:3/999: readlink d5/d9/d30/d65/l26 0 2026-03-09T17:30:45.465 INFO:tasks.workunit.client.1.vm09.stdout:4/993: mkdir d11/d1e/d45/d60/df1/d78/d12b 0 2026-03-09T17:30:45.470 INFO:tasks.workunit.client.1.vm09.stdout:4/994: fsync d11/d1e/d31/db6/fc5 0 2026-03-09T17:30:45.474 INFO:tasks.workunit.client.1.vm09.stdout:4/995: creat d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/f12c x:0 0 0 2026-03-09T17:30:45.477 INFO:tasks.workunit.client.1.vm09.stdout:4/996: chown d11/d1e/d29/d36/dd7 105 1 2026-03-09T17:30:45.485 INFO:tasks.workunit.client.1.vm09.stdout:4/997: write d11/d1e/d45/d60/d71/db7/d89/d8b/d58/deb/df9/f12c [989330,12600] 0 2026-03-09T17:30:45.501 INFO:tasks.workunit.client.1.vm09.stdout:4/998: dwrite d11/f12 [0,4194304] 0 2026-03-09T17:30:45.544 INFO:tasks.workunit.client.1.vm09.stdout:4/999: dread d11/d1e/d45/d60/f7b [0,4194304] 0 2026-03-09T17:30:45.556 INFO:tasks.workunit.client.1.vm09.stderr:+ rm -rf -- ./tmp.KQiywNAPAT 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 vm06.local ceph-mon[57307]: from='client.14762 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 
vm06.local ceph-mon[57307]: from='client.? 192.168.123.106:0/780889172' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:45.634 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:45 vm06.local ceph-mon[57307]: pgmap v10: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 47 MiB/s rd, 96 MiB/s wr, 248 op/s 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: from='client.14762 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: from='client.? 
192.168.123.106:0/780889172' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:45.670 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:45 vm09.local ceph-mon[62061]: pgmap v10: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 47 MiB/s rd, 96 MiB/s wr, 248 op/s 2026-03-09T17:30:47.889 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: pgmap v11: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 47 MiB/s rd, 96 MiB/s wr, 248 op/s 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 
2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:47.890 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:47 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: pgmap v11: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 47 MiB/s rd, 96 MiB/s wr, 248 op/s 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:48.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:47 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: Upgrade: Setting container_image for all mgr 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]: dispatch 2026-03-09T17:30:50.122 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]': finished 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]': finished 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: pgmap v12: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 57 MiB/s rd, 113 MiB/s wr, 323 op/s 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:30:50.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:49 vm06.local ceph-mon[57307]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: Upgrade: Setting container_image for all mgr 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local 
ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.pbgzei"}]': finished 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm09.lqzvkh"}]': finished 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: pgmap v12: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 
GiB / 120 GiB avail; 57 MiB/s rd, 113 MiB/s wr, 323 op/s 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='mgr.vm06.pbgzei' 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:30:50.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:49 vm09.local ceph-mon[62061]: from='mgr.14712 192.168.123.106:0/1410464639' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:30:50.555 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local systemd[1]: Stopping Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:30:50.555 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[57303]: 2026-03-09T17:30:50.437+0000 7f8728a40700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:30:50.555 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[57303]: 2026-03-09T17:30:50.437+0000 7f8728a40700 -1 mon.vm06@0(leader) e2 *** Got Signal Terminated *** 2026-03-09T17:30:50.555 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local podman[109708]: 2026-03-09 17:30:50.515890702 +0000 UTC m=+0.106923012 container died e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, RELEASE=HEAD, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, ceph=True, org.label-schema.build-date=20231212, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1) 2026-03-09T17:30:50.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local podman[109708]: 2026-03-09 17:30:50.573961925 +0000 UTC m=+0.164994246 container remove e0e1a20b15774e123118b89b6bcd72097e9e605c2aae3b546764a23ffc53003d (image=quay.io/ceph/ceph:v18.2.0, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-09T17:30:50.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local bash[109708]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06 2026-03-09T17:30:50.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service: Deactivated successfully. 2026-03-09T17:30:50.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local systemd[1]: Stopped Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:30:50.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:50 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service: Consumed 6.872s CPU time. 2026-03-09T17:30:51.234 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local systemd[1]: Starting Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local podman[109817]: 2026-03-09 17:30:51.308334831 +0000 UTC m=+0.031556956 container create 86c27c9946b5a993ea6fd12734f7abf110998ceaec3e60c817d3a6dde96f73ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local podman[109817]: 2026-03-09 17:30:51.351215744 +0000 UTC m=+0.074437858 container init 86c27c9946b5a993ea6fd12734f7abf110998ceaec3e60c817d3a6dde96f73ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, 
org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local podman[109817]: 2026-03-09 17:30:51.359272593 +0000 UTC m=+0.082494718 container start 86c27c9946b5a993ea6fd12734f7abf110998ceaec3e60c817d3a6dde96f73ab (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3) 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local bash[109817]: 86c27c9946b5a993ea6fd12734f7abf110998ceaec3e60c817d3a6dde96f73ab 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local podman[109817]: 2026-03-09 17:30:51.288716101 +0000 UTC m=+0.011938226 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local systemd[1]: Started Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: pidfile_write: ignore empty --pid-file 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: load: jerasure load: lrc 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: RocksDB version: 7.9.2 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Git sha 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: DB SUMMARY 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: DB Session ID: ABL97NIC5F3F19KUF8G7 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: CURRENT file: CURRENT 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: MANIFEST file: MANIFEST-000015 size: 776 Bytes 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm06/store.db 
dir, Total Num: 1, files: 000023.sst 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm06/store.db: 000021.log size: 2063632 ; 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.error_if_exists: 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.create_if_missing: 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.paranoid_checks: 1 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.env: 0x563596cc4dc0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.info_log: 0x563599169900 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.statistics: (nil) 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.use_fsync: 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_log_file_size: 0 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T17:30:51.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_fallocate: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.use_direct_reads: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.db_log_dir: 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local 
ceph-mon[109831]: rocksdb: Options.wal_dir: 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.write_buffer_manager: 0x56359916d900 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T17:30:51.643 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.unordered_write: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.row_cache: None 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.wal_filter: None 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.two_write_queues: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.wal_compression: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.atomic_flush: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.log_readahead_size: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_background_jobs: 2 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_background_compactions: -1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_subcompactions: 1 2026-03-09T17:30:51.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local 
ceph-mon[109831]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_open_files: -1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_background_flushes: -1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Compression algorithms supported: 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kZSTD supported: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kXpressCompression supported: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kBZip2Compression supported: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local 
ceph-mon[109831]: rocksdb: kLZ4Compression supported: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kZlibCompression supported: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: kSnappyCompression supported: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000015 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.merge_operator: 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_filter: None 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563599168500) 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_top_level_index_and_filter: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_type: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_index_type: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_shortening: 1 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: checksum: 4 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: no_block_cache: 0 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache: 0x56359918d350 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_name: BinnedLRUCache 2026-03-09T17:30:51.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_options: 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: capacity : 536870912 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_shard_bits : 4 
2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: strict_capacity_limit : 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: high_pri_pool_ratio: 0.000 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_compressed: (nil) 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: persistent_cache: (nil) 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size: 4096 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size_deviation: 10 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_restart_interval: 16 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_block_restart_interval: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_block_size: 4096 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: partition_filters: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: use_delta_encoding: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: filter_policy: bloomfilter 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: whole_key_filtering: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: verify_compression: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: read_amp_bytes_per_bit: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: format_version: 5 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_index_compression: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_align: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_auto_readahead_size: 262144 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: prepopulate_block_cache: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout: initial_auto_readahead_size: 8192 2026-03-09T17:30:51.645 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression: NoCompression 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.num_levels: 7 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T17:30:51.645 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local 
ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.inplace_update_support: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.bloom_locality: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.max_successive_merges: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 
vm06.local ceph-mon[109831]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.ttl: 2592000 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enable_blob_files: false 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.min_blob_size: 0 2026-03-09T17:30:51.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 8271, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 486416ba-d76b-4929-a454-49d49e04ccf4 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077451419746, "job": 1, "event": "recovery_started", "wal_files": [21]} 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:30:51 vm06.local ceph-mon[109831]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077451429128, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 1870595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8272, "largest_seqno": 8909, "table_properties": {"data_size": 1866384, "index_size": 2338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 7747, "raw_average_key_size": 24, "raw_value_size": 1859085, "raw_average_value_size": 5791, "num_data_blocks": 112, "num_entries": 321, "num_filter_entries": 321, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773077451, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "486416ba-d76b-4929-a454-49d49e04ccf4", "db_session_id": "ABL97NIC5F3F19KUF8G7", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077451429222, "job": 1, "event": "recovery_finished"} 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:5047] Creating manifest 28 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm06/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56359918ee00 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: DB pointer 0x56359929a000 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** DB Stats ** 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L0 1/0 1.78 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 251.9 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L6 1/0 6.34 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Sum 2/0 8.12 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 251.9 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 251.9 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 251.9 0.01 0.00 1 0.007 0 0 0.0 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative compaction: 0.00 GB write, 85.46 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval compaction: 0.00 GB write, 85.46 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T17:30:51.647 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache BinnedLRUCache@0x56359918d350#2 capacity: 512.00 MB usage: 27.59 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.4e-05 secs_since: 0 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache entry stats(count,size,portion): FilterBlock(2,8.94 KB,0.00170469%) IndexBlock(2,18.66 KB,0.0035584%) Misc(1,0.00 KB,0%) 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: starting mon.vm06 rank 0 at public addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] at bind addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon_data /var/lib/ceph/mon/ceph-vm06 fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???) 
e2 preinit fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).mds e13 new map 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).mds e13 print_map 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: e13 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: legacy client fscid: 1 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Filesystem 'cephfs' (1) 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: fs_name cephfs 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: epoch 11 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: tableserver 0 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: root 0 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_timeout 60 2026-03-09T17:30:51.648 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_autoclose 300 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_file_size 1099511627776 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_xattr_size 65536 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: required_client_features {} 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure 0 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure_osd_epoch 0 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_mds 1 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: in 0 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: up {0=24291} 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: failed 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: damaged 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: stopped 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_pools [3] 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_pool 2 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: inline_data disabled 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: balancer 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: bal_rank_mask -1 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: standby_count_wanted 1 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: qdb_cluster leader: 0 members: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 
[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Standby daemons: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).osd e41 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).osd e41 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).osd e41 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).osd e41 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-09T17:30:51.648 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:51 vm06.local ceph-mon[109831]: mon.vm06@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-09T17:30:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: from='mgr.? 
192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: pgmap v13: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 32 MiB/s rd, 74 MiB/s wr, 199 op/s 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: mon.vm06 calling monitor election 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: monmap epoch 2 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: last_changed 2026-03-09T17:25:35.571460+0000 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: created 2026-03-09T17:24:17.098520+0000 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: min_mon_release 18 (reef) 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: election_strategy: 1 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: 0: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: 1: [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] mon.vm09 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: 
fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: mgrmap e30: vm06.pbgzei(active, since 20s), standbys: vm09.lqzvkh 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: overall HEALTH_OK 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: from='mgr.14712 ' entity='' 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: Standby manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:52 vm06.local ceph-mon[109831]: mgrmap e31: vm06.pbgzei(active, since 20s), standbys: vm09.lqzvkh 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: from='mgr.? 
192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/829281664' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: pgmap v13: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 32 MiB/s rd, 74 MiB/s wr, 199 op/s 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: mon.vm06 calling monitor election 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: monmap epoch 2 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: last_changed 2026-03-09T17:25:35.571460+0000 2026-03-09T17:30:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: created 2026-03-09T17:24:17.098520+0000 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: min_mon_release 18 (reef) 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: election_strategy: 1 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: 0: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T17:30:52.395 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: 1: [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] mon.vm09 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: osdmap e41: 6 total, 6 up, 6 in 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: mgrmap e30: vm06.pbgzei(active, since 20s), standbys: vm09.lqzvkh 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: overall HEALTH_OK 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: from='mgr.14712 ' entity='' 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: Standby manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:52 vm09.local ceph-mon[62061]: mgrmap e31: vm06.pbgzei(active, since 20s), standbys: vm09.lqzvkh 2026-03-09T17:30:54.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:54 vm06.local ceph-mon[109831]: mgrmap e32: vm06.pbgzei(active, since 21s), standbys: vm09.lqzvkh 2026-03-09T17:30:54.391 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:54 vm09.local ceph-mon[62061]: mgrmap e32: vm06.pbgzei(active, since 21s), standbys: vm09.lqzvkh 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:56 vm06.local ceph-mon[109831]: Standby manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:30:56 vm06.local ceph-mon[109831]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:56 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:56 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:56 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:56.521 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:56 vm06.local ceph-mon[109831]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: Standby manager daemon vm09.lqzvkh restarted 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: Standby manager daemon vm09.lqzvkh started 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/crt"}]: dispatch 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: from='mgr.? 
192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm09.lqzvkh/key"}]: dispatch 2026-03-09T17:30:56.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:56 vm09.local ceph-mon[62061]: from='mgr.? 192.168.123.109:0/1853894193' entity='mgr.vm09.lqzvkh' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T17:30:57.577 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:57 vm06.local ceph-mon[109831]: mgrmap e33: vm06.pbgzei(active, since 25s), standbys: vm09.lqzvkh 2026-03-09T17:30:57.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:57 vm09.local ceph-mon[62061]: mgrmap e33: vm06.pbgzei(active, since 25s), standbys: vm09.lqzvkh 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: Activating manager daemon vm06.pbgzei 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: mgrmap e34: vm06.pbgzei(active, starting, since 0.0413751s), standbys: vm09.lqzvkh 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:58.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:58 vm06.local ceph-mon[109831]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:30:58.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local 
ceph-mon[62061]: Active manager daemon vm06.pbgzei restarted 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: Activating manager daemon vm06.pbgzei 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: mgrmap e34: vm06.pbgzei(active, starting, since 0.0413751s), standbys: vm09.lqzvkh 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": 
"cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm06.pbgzei", "id": "vm06.pbgzei"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr metadata", "who": "vm09.lqzvkh", "id": "vm09.lqzvkh"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:30:58.645 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T17:30:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:58 vm09.local ceph-mon[62061]: Manager daemon vm06.pbgzei is now available 2026-03-09T17:30:59.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:59 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:59 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:59 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:30:59 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:30:59.851 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:59 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:30:59.851 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:30:59.851 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/mirror_snapshot_schedule"}]: dispatch 2026-03-09T17:30:59.851 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:30:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.pbgzei/trash_purge_schedule"}]: dispatch 2026-03-09T17:31:00.638 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:00 vm09.local ceph-mon[62061]: mgrmap e35: vm06.pbgzei(active, since 1.1753s), standbys: vm09.lqzvkh 2026-03-09T17:31:00.638 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:00 vm09.local ceph-mon[62061]: pgmap v3: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-09T17:31:00.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:00 vm06.local ceph-mon[109831]: mgrmap e35: vm06.pbgzei(active, since 1.1753s), standbys: vm09.lqzvkh 2026-03-09T17:31:00.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:00 vm06.local ceph-mon[109831]: pgmap v3: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-09T17:31:01.795 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: mgrmap e36: vm06.pbgzei(active, since 2s), standbys: vm09.lqzvkh 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: [09/Mar/2026:17:31:00] ENGINE Bus STARTING 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: [09/Mar/2026:17:31:00] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: [09/Mar/2026:17:31:00] ENGINE Client ('192.168.123.106', 39428) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: [09/Mar/2026:17:31:01] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: [09/Mar/2026:17:31:01] ENGINE Bus STARTED 2026-03-09T17:31:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:01 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: mgrmap e36: vm06.pbgzei(active, since 2s), standbys: vm09.lqzvkh 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:01.891 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: [09/Mar/2026:17:31:00] ENGINE Bus STARTING 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: [09/Mar/2026:17:31:00] ENGINE Serving on https://192.168.123.106:7150 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: [09/Mar/2026:17:31:00] ENGINE Client ('192.168.123.106', 39428) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: [09/Mar/2026:17:31:01] ENGINE Serving on http://192.168.123.106:8765 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: [09/Mar/2026:17:31:01] ENGINE Bus STARTED 2026-03-09T17:31:01.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:02 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:02 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:02 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:03.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:03.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:02 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:03.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: pgmap v5: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: mgrmap e37: vm06.pbgzei(active, since 4s), standbys: vm09.lqzvkh 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.125 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:03 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 17:31:03 vm06.local ceph-mon[109831]: pgmap v5: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: mgrmap e37: vm06.pbgzei(active, since 4s), standbys: vm09.lqzvkh 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:31:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: Updating vm09:/etc/ceph/ceph.conf 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.conf 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:04.236 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T17:31:04.236 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: Updating 
vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 
2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local ceph-mon[62061]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:05.133 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:04 vm09.local systemd[1]: Stopping Ceph mon.vm09 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: Updating vm09:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: Updating vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: Updating vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/config/ceph.client.admin.keyring 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:31:05.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09[62057]: 2026-03-09T17:31:05.130+0000 7f1d40301700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm09 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09[62057]: 2026-03-09T17:31:05.130+0000 7f1d40301700 -1 mon.vm09@1(peon) e2 *** Got Signal Terminated *** 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97872]: 2026-03-09 17:31:05.213089114 +0000 UTC m=+0.117753760 container died 4c30d1217de39037a09ad45db9ae6dc5126b8d099cde71b99ea143fe8838928d (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, ceph=True, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, GIT_CLEAN=True) 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97872]: 2026-03-09 17:31:05.237845679 +0000 UTC m=+0.142510325 container remove 4c30d1217de39037a09ad45db9ae6dc5126b8d099cde71b99ea143fe8838928d (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True) 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local bash[97872]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm09.service: Deactivated successfully. 2026-03-09T17:31:05.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local systemd[1]: Stopped Ceph mon.vm09 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:31:05.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm09.service: Consumed 3.941s CPU time. 
2026-03-09T17:31:05.710 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local systemd[1]: Starting Ceph mon.vm09 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97981]: 2026-03-09 17:31:05.708733883 +0000 UTC m=+0.051401006 container create 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97981]: 2026-03-09 17:31:05.666793461 +0000 UTC m=+0.009460584 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97981]: 2026-03-09 17:31:05.822978203 +0000 UTC m=+0.165645326 container init 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local podman[97981]: 2026-03-09 17:31:05.831742633 +0000 UTC m=+0.174409746 container start 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local bash[97981]: 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa 2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local systemd[1]: Started Ceph mon.vm09 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: set uid:gid to 167:167 (ceph:ceph)
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: pidfile_write: ignore empty --pid-file
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: load: jerasure load: lrc
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: RocksDB version: 7.9.2
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Git sha 0
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Compile date 2026-02-25 18:11:04
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: DB SUMMARY
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: DB Session ID: BNM4K75E0SK0J99891XR
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: CURRENT file: CURRENT
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: IDENTITY file: IDENTITY
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: MANIFEST file: MANIFEST-000010 size: 669 Bytes
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm09/store.db dir, Total Num: 1, files: 000018.sst
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm09/store.db: 000016.log size: 6500908 ;
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.error_if_exists: 0
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.create_if_missing: 0
2026-03-09T17:31:05.962 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.paranoid_checks: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.env: 0x5648ccb3ddc0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.fs: PosixFileSystem
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.info_log: 0x5648cdc9b900
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_file_opening_threads: 16
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.statistics: (nil)
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.use_fsync: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_log_file_size: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.keep_log_file_num: 1000
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.recycle_log_file_num: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_fallocate: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_mmap_reads: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_mmap_writes: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.use_direct_reads: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.create_missing_column_families: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.db_log_dir:
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.wal_dir:
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.advise_random_on_open: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.db_write_buffer_size: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.write_buffer_manager: 0x5648cdc9f900
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.rate_limiter: (nil)
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.wal_recovery_mode: 2
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enable_thread_tracking: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enable_pipelined_write: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.unordered_write: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.row_cache: None
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.wal_filter: None
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_ingest_behind: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.two_write_queues: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.manual_wal_flush: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.wal_compression: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.atomic_flush: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.log_readahead_size: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.best_efforts_recovery: 0
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-09T17:31:05.963 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.allow_data_in_errors: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.db_host_id: __hostname__
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_background_jobs: 2
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_background_compactions: -1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_subcompactions: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_total_wal_size: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_open_files: -1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bytes_per_sync: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_readahead_size: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_background_flushes: -1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Compression algorithms supported:
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kZSTD supported: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kXpressCompression supported: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kBZip2Compression supported: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kZSTDNotFinalCompression supported: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kLZ4Compression supported: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kZlibCompression supported: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kLZ4HCCompression supported: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: 	kSnappyCompression supported: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm09/store.db/MANIFEST-000010
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.merge_operator:
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_filter: None
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_filter_factory: None
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.sst_partitioner_factory: None
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5648cdc9b580)
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: cache_index_and_filter_blocks: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: pin_top_level_index_and_filter: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: index_type: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: data_block_index_type: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: index_shortening: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: checksum: 4
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: no_block_cache: 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_cache: 0x5648cdcbe9b0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_cache_name: BinnedLRUCache
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_cache_options:
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: capacity : 536870912
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: num_shard_bits : 4
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: strict_capacity_limit : 0
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: high_pri_pool_ratio: 0.000
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_cache_compressed: (nil)
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: persistent_cache: (nil)
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_size: 4096
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_size_deviation: 10
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_restart_interval: 16
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: index_block_restart_interval: 1
2026-03-09T17:31:05.964 INFO:journalctl@ceph.mon.vm09.vm09.stdout: metadata_block_size: 4096
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: partition_filters: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: use_delta_encoding: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: filter_policy: bloomfilter
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: whole_key_filtering: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: verify_compression: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: read_amp_bytes_per_bit: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: format_version: 5
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: enable_index_compression: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: block_align: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: max_auto_readahead_size: 262144
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: prepopulate_block_cache: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: initial_auto_readahead_size: 8192
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout: num_file_reads_for_auto_readahead: 2
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.write_buffer_size: 33554432
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_write_buffer_number: 2
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression: NoCompression
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression: Disabled
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.prefix_extractor: nullptr
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.num_levels: 7
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.level: 32767
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.strategy: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.enabled: false
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.target_file_size_base: 67108864
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.arena_block_size: 1048576
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-09T17:31:05.965 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.disable_auto_compactions: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.inplace_update_support: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.bloom_locality: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.max_successive_merges: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.paranoid_file_checks: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.force_consistency_checks: 1
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.report_bg_io_stats: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.ttl: 2592000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enable_blob_files: false
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.min_blob_size: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_file_size: 268435456
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.blob_file_starting_level: 0
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm09/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 20, last_sequence is 8256, log_number is 16,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 16
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 16
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0466a380-049f-4e23-8acb-3dfb6f08a8bf
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077465896803, "job": 1, "event": "recovery_started", "wal_files": [16]}
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #16 mode 2
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077465938522, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 21, "file_size": 3906378, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8261, "largest_seqno": 9368, "table_properties": {"data_size": 3900285, "index_size": 3767, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 12133, "raw_average_key_size": 23, "raw_value_size": 3888937, "raw_average_value_size": 7655, "num_data_blocks": 178, "num_entries": 508, "num_filter_entries": 508, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773077465, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0466a380-049f-4e23-8acb-3dfb6f08a8bf", "db_session_id": "BNM4K75E0SK0J99891XR", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773077465939874, "job": 1, "event": "recovery_finished"}
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:5047] Creating manifest 23
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm09/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5648cdcc0e00
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: DB pointer 0x5648cdcd0000
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: rocksdb: [db/db_impl/db_impl.cc:1111]
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout: ** DB Stats **
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Uptime(secs): 0.1 total, 0.1 interval
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-09T17:31:05.966 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-09T17:31:05.967
INFO:journalctl@ceph.mon.vm09.vm09.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: ** Compaction Stats [default] ** 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: L0 1/0 3.73 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 146.1 0.03 0.00 1 0.026 0 0 0.0 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: L6 1/0 6.34 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Sum 2/0 10.06 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 146.1 0.03 0.00 1 0.026 0 0 0.0 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 146.1 0.03 0.00 1 0.026 0 0 0.0 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: ** Compaction Stats [default] ** 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 146.1 0.03 0.00 1 0.026 0 0 0.0 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Uptime(secs): 0.1 total, 0.1 interval 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Flush(GB): cumulative 0.004, interval 0.004 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Cumulative compaction: 0.00 GB write, 65.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Interval compaction: 0.00 GB write, 65.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T17:31:05.967 
INFO:journalctl@ceph.mon.vm09.vm09.stdout: Block cache BinnedLRUCache@0x5648cdcbe9b0#2 capacity: 512.00 MB usage: 49.83 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 0.000224 secs_since: 0 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Block cache entry stats(count,size,portion): FilterBlock(2,9.44 KB,0.00180006%) IndexBlock(2,20.16 KB,0.0038445%) Misc(4,20.23 KB,0.0038594%) 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: starting mon.vm09 rank 1 at public addrs [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] at bind addrs [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] mon_data /var/lib/ceph/mon/ceph-vm09 fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???) 
e2 preinit fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:31:05.967 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).mds e13 new map 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).mds e13 print_map 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: e13 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: legacy client fscid: 1 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Filesystem 'cephfs' (1) 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: fs_name cephfs 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: epoch 11 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: tableserver 0 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: root 0 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: session_timeout 60 2026-03-09T17:31:06.267 
INFO:journalctl@ceph.mon.vm09.vm09.stdout: session_autoclose 300 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: max_file_size 1099511627776 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: max_xattr_size 65536 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: required_client_features {} 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: last_failure 0 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: last_failure_osd_epoch 0 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: max_mds 1 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: in 0 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: up {0=24291} 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: failed 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: damaged 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: stopped 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: data_pools [3] 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: metadata_pool 2 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: inline_data disabled 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: balancer 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: bal_rank_mask -1 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: standby_count_wanted 1 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: qdb_cluster leader: 0 members: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 
[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: [mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: Standby daemons: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: [mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout: [mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:06.267 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).osd e42 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T17:31:06.268 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T17:31:06.268 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T17:31:06.268 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-09T17:31:06.268 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:05 vm09.local ceph-mon[97995]: mon.vm09@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-09T17:31:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: pgmap v7: 65 pgs: 65 active+clean; 1.3 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 597 KiB/s rd, 545 KiB/s wr, 87 op/s 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: mon.vm09 calling monitor election 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: mon.vm06 calling monitor election 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: monmap epoch 3 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: last_changed 2026-03-09T17:31:06.718735+0000 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: created 2026-03-09T17:24:17.098520+0000 2026-03-09T17:31:07.895 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: min_mon_release 19 (squid) 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: election_strategy: 1 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: 0: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: 1: [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] mon.vm09 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: mgrmap e37: vm06.pbgzei(active, since 8s), standbys: vm09.lqzvkh 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: overall HEALTH_OK 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:07.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:08.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: pgmap v7: 65 pgs: 65 active+clean; 1.3 GiB data, 6.4 GiB 
used, 114 GiB / 120 GiB avail; 597 KiB/s rd, 545 KiB/s wr, 87 op/s 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: mon.vm09 calling monitor election 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: mon.vm06 calling monitor election 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: mon.vm06 is new leader, mons vm06,vm09 in quorum (ranks 0,1) 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mon metadata", "id": "vm09"}]: dispatch 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: monmap epoch 3 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: last_changed 2026-03-09T17:31:06.718735+0000 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: created 2026-03-09T17:24:17.098520+0000 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: min_mon_release 19 (squid) 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: election_strategy: 1 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: 0: 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: 1: [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] mon.vm09 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: osdmap e42: 6 total, 6 up, 6 in 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: mgrmap e37: vm06.pbgzei(active, since 8s), standbys: vm09.lqzvkh 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: overall HEALTH_OK 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:08.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:09.563 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:09 vm09.local ceph-mon[97995]: pgmap v8: 65 pgs: 65 active+clean; 1.3 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 462 KiB/s rd, 422 KiB/s wr, 67 op/s 2026-03-09T17:31:09.563 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.563 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:09 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.563 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.563 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:09 vm06.local ceph-mon[109831]: pgmap v8: 65 pgs: 65 active+clean; 1.3 GiB data, 6.4 GiB used, 114 GiB / 120 GiB avail; 462 KiB/s rd, 422 KiB/s wr, 67 op/s 2026-03-09T17:31:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: 
dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: pgmap v9: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 917 KiB/s rd, 947 KiB/s wr, 155 op/s 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: Reconfiguring mon.vm06 (monmap changed)... 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: Reconfiguring daemon mon.vm06 on vm06 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: Reconfiguring mgr.vm06.pbgzei (monmap changed)... 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: Reconfiguring daemon mgr.vm06.pbgzei on vm06 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: Reconfiguring ceph-exporter.vm06 (monmap changed)... 
2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: dispatch 2026-03-09T17:31:11.532 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.645 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: pgmap v9: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 917 KiB/s rd, 947 KiB/s wr, 155 op/s 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: Reconfiguring mon.vm06 (monmap changed)... 
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: Reconfiguring daemon mon.vm06 on vm06 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: Reconfiguring mgr.vm06.pbgzei (monmap changed)... 
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.pbgzei", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: Reconfiguring daemon mgr.vm06.pbgzei on vm06
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T17:31:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: dispatch
2026-03-09T17:31:11.646 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:12.660 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-09T17:31:12.660 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Unable to update caps for client.ceph-exporter.vm06
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Reconfiguring osd.0 (monmap changed)...
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.0 on vm06
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T17:31:13.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Unable to update caps for client.ceph-exporter.vm06
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Reconfiguring osd.0 (monmap changed)...
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.0 on vm06
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T17:31:13.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: pgmap v10: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 829 KiB/s rd, 856 KiB/s wr, 140 op/s
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: Reconfiguring osd.1 (monmap changed)...
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.1 on vm06
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: Reconfiguring osd.2 (monmap changed)...
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.2 on vm06
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T17:31:14.137 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: pgmap v10: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 829 KiB/s rd, 856 KiB/s wr, 140 op/s
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: Reconfiguring osd.1 (monmap changed)...
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.1 on vm06
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: Reconfiguring osd.2 (monmap changed)...
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.2 on vm06
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T17:31:14.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- 192.168.123.106:0/2750063142 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c071950 msgr2=0x7f927c071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 --2- 192.168.123.106:0/2750063142 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c071950 0x7f927c071d60 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f926c007780 tx=0x7f926c00c050 comp rx=0 tx=0).stop
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- 192.168.123.106:0/2750063142 shutdown_connections
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 --2- 192.168.123.106:0/2750063142 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c072330 0x7f927c0770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 --2- 192.168.123.106:0/2750063142 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c071950 0x7f927c071d60 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- 192.168.123.106:0/2750063142 >> 192.168.123.106:0/2750063142 conn(0x7f927c06d1a0 msgr2=0x7f927c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- 192.168.123.106:0/2750063142 shutdown_connections
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- 192.168.123.106:0/2750063142 wait complete.
2026-03-09T17:31:14.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 Processor -- start
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- start start
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c071950 0x7f927c0824b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f927c082f30 con 0x7f927c071950
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.897+0000 7f92817ca700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f927c083070 con 0x7f927c072330
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:39180/0 (socket says 192.168.123.106:39180)
2026-03-09T17:31:14.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 -- 192.168.123.106:0/956004155 learned_addr learned my addr 192.168.123.106:0/956004155 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927bfff700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c071950 0x7f927c0824b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 -- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c071950 msgr2=0x7f927c0824b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c071950 0x7f927c0824b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 -- 192.168.123.106:0/956004155 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f926c007430 con 0x7f927c072330
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.898+0000 7f927b7fe700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f927400c370 tx=0x7f927400c730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.900+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f927400e050 con 0x7f927c072330
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.900+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f927400f040 con 0x7f927c072330
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.900+0000 7f92817ca700 1 -- 192.168.123.106:0/956004155 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f927c0832a0 con 0x7f927c072330
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.900+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9274013610 con 0x7f927c072330
2026-03-09T17:31:14.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.900+0000 7f92817ca700 1 -- 192.168.123.106:0/956004155 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f927c1b2dc0 con 0x7f927c072330
2026-03-09T17:31:14.904 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.901+0000 7f92817ca700 1 -- 192.168.123.106:0/956004155 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f927c07c8a0 con 0x7f927c072330
2026-03-09T17:31:14.904 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.902+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f92740090d0 con 0x7f927c072330
2026-03-09T17:31:14.906 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.903+0000 7f92797fa700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 0x7f9264079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:14.906 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.903+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9274099a30 con 0x7f927c072330
2026-03-09T17:31:14.907 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.905+0000 7f927bfff700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 0x7f9264079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:14.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.906+0000 7f927bfff700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 0x7f9264079fb0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f926c00c420 tx=0x7f926c0046e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:31:14.913 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:14.907+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f92740620c0 con 0x7f927c072330
2026-03-09T17:31:15.056 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.053+0000 7f92817ca700 1 -- 192.168.123.106:0/956004155 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f927c061190 con 0x7f9264077b00
2026-03-09T17:31:15.060 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.056+0000 7f92797fa700 1 -- 192.168.123.106:0/956004155 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f927c061190 con 0x7f9264077b00
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.059+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 msgr2=0x7f9264079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.059+0000 7f9262ffd700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 0x7f9264079fb0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f926c00c420 tx=0x7f926c0046e0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.059+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 msgr2=0x7f927c0829f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.059+0000 7f9262ffd700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f927400c370 tx=0x7f927400c730 comp rx=0 tx=0).stop
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 shutdown_connections
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9264077b00 0x7f9264079fb0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f927c071950 0x7f927c0824b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 --2- 192.168.123.106:0/956004155 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f927c072330 0x7f927c0829f0 secure :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f927400c370 tx=0x7f927400c730 comp rx=0 tx=0).stop
2026-03-09T17:31:15.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 >> 192.168.123.106:0/956004155 conn(0x7f927c06d1a0 msgr2=0x7f927c0758d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:31:15.063 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.060+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 shutdown_connections
2026-03-09T17:31:15.065 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.062+0000 7f9262ffd700 1 -- 192.168.123.106:0/956004155 wait complete.
2026-03-09T17:31:15.080 INFO:teuthology.orchestra.run.vm06.stdout:true
2026-03-09T17:31:15.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 -- 192.168.123.106:0/1381886930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 msgr2=0x7f95901012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 --2- 192.168.123.106:0/1381886930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f95901012a0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f9580009b00 tx=0x7f9580009e10 comp rx=0 tx=0).stop
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 -- 192.168.123.106:0/1381886930 shutdown_connections
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 --2- 192.168.123.106:0/1381886930 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 0x7f9590103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 --2- 192.168.123.106:0/1381886930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f95901012a0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 -- 192.168.123.106:0/1381886930 >> 192.168.123.106:0/1381886930 conn(0x7f95900faa70 msgr2=0x7f95900fcee0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 -- 192.168.123.106:0/1381886930 shutdown_connections
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.235+0000 7f9595ba7700 1 -- 192.168.123.106:0/1381886930 wait complete.
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 Processor -- start
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 -- start start
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 0x7f959019c950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f959019cee0 con 0x7f95900fee80
2026-03-09T17:31:15.238 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.236+0000 7f9595ba7700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f959019d020 con 0x7f95901017e0
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59330/0 (socket says 192.168.123.106:59330)
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 -- 192.168.123.106:0/1434726923 learned_addr learned my addr 192.168.123.106:0/1434726923 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958effd700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 0x7f959019c950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 -- 192.168.123.106:0/1434726923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 msgr2=0x7f959019c950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 0x7f959019c950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:15.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.237+0000 7f958f7fe700 1 -- 192.168.123.106:0/1434726923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95800097e0 con 0x7f95900fee80
2026-03-09T17:31:15.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f958f7fe700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto
rx=0x7f9580009fd0 tx=0x7f9580004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f958001d070 con 0x7f95900fee80 2026-03-09T17:31:15.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f958000bc50 con 0x7f95900fee80 2026-03-09T17:31:15.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f958000f7e0 con 0x7f95900fee80 2026-03-09T17:31:15.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f9595ba7700 1 -- 192.168.123.106:0/1434726923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95901a1a80 con 0x7f95900fee80 2026-03-09T17:31:15.242 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.238+0000 7f9595ba7700 1 -- 192.168.123.106:0/1434726923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95901a1f40 con 0x7f95900fee80 2026-03-09T17:31:15.242 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.239+0000 7f9595ba7700 1 -- 192.168.123.106:0/1434726923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9590196590 con 0x7f95900fee80 2026-03-09T17:31:15.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.249+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7f958000f940 con 0x7f95900fee80 2026-03-09T17:31:15.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.249+0000 7f958cff9700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 0x7f9578079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.250+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f958000bdc0 con 0x7f95900fee80 2026-03-09T17:31:15.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.250+0000 7f958effd700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 0x7f9578079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.255 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.252+0000 7f958effd700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 0x7f9578079ea0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f9584005ea0 tx=0x7f9584005e30 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.255 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.253+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9580064600 con 0x7f95900fee80 2026-03-09T17:31:15.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.409+0000 7f9595ba7700 1 -- 192.168.123.106:0/1434726923 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9590061190 con 0x7f95780779f0 2026-03-09T17:31:15.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.411+0000 7f958cff9700 1 -- 192.168.123.106:0/1434726923 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f9590061190 con 0x7f95780779f0 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 msgr2=0x7f9578079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 0x7f9578079ea0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f9584005ea0 tx=0x7f9584005e30 comp rx=0 tx=0).stop 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 msgr2=0x7f959019c410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f9580009fd0 tx=0x7f9580004ab0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 shutdown_connections 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 
--2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f95780779f0 0x7f9578079ea0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95900fee80 0x7f959019c410 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 --2- 192.168.123.106:0/1434726923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f95901017e0 0x7f959019c950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 >> 192.168.123.106:0/1434726923 conn(0x7f95900faa70 msgr2=0x7f95900fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 shutdown_connections 2026-03-09T17:31:15.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.414+0000 7f95767fc700 1 -- 192.168.123.106:0/1434726923 wait complete. 
2026-03-09T17:31:15.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 -- 192.168.123.106:0/2409126625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68800722f0 msgr2=0x7f6880077070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 --2- 192.168.123.106:0/2409126625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68800722f0 0x7f6880077070 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f687800d3f0 tx=0x7f687800d700 comp rx=0 tx=0).stop 2026-03-09T17:31:15.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 -- 192.168.123.106:0/2409126625 shutdown_connections 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: Reconfiguring mds.cephfs.vm06.vmzmbb (monmap changed)... 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: Reconfiguring daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: Reconfiguring mds.cephfs.vm06.gzymac (monmap changed)... 
2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: Reconfiguring daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm09"}]: dispatch 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:31:15.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:15.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 --2- 192.168.123.106:0/2409126625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68800722f0 0x7f6880077070 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 --2- 192.168.123.106:0/2409126625 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6880071910 0x7f6880071d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.502+0000 7f688728f700 1 -- 192.168.123.106:0/2409126625 >> 192.168.123.106:0/2409126625 conn(0x7f688006d160 msgr2=0x7f688006f5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:15.506 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.503+0000 7f688728f700 1 -- 192.168.123.106:0/2409126625 shutdown_connections 2026-03-09T17:31:15.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.503+0000 7f688728f700 1 -- 192.168.123.106:0/2409126625 wait complete. 2026-03-09T17:31:15.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.503+0000 7f688728f700 1 Processor -- start 2026-03-09T17:31:15.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.503+0000 7f688728f700 1 -- start start 2026-03-09T17:31:15.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688728f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6880071910 0x7f68801313b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.506 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688728f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 0x7f688007f510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688728f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6880131df0 con 0x7f68801318f0 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688728f700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6880131f60 con 0x7f6880071910 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688482a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 0x7f688007f510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.507 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688482a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 0x7f688007f510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59350/0 (socket says 192.168.123.106:59350) 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.504+0000 7f688482a700 1 -- 192.168.123.106:0/387722479 learned_addr learned my addr 192.168.123.106:0/387722479 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688482a700 1 -- 192.168.123.106:0/387722479 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6880071910 msgr2=0x7f68801313b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688482a700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6880071910 0x7f68801313b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688482a700 1 -- 192.168.123.106:0/387722479 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6878007ed0 con 0x7f68801318f0 2026-03-09T17:31:15.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688482a700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 0x7f688007f510 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6878004010 tx=0x7f68780040f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.507 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f687801c070 con 0x7f68801318f0 2026-03-09T17:31:15.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f688007fa50 con 0x7f68801318f0 2026-03-09T17:31:15.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.505+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f688007ff90 con 0x7f68801318f0 2026-03-09T17:31:15.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.506+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6878004760 con 0x7f68801318f0 2026-03-09T17:31:15.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.506+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6878017850 con 0x7f68801318f0 2026-03-09T17:31:15.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.507+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f68780179b0 con 0x7f68801318f0 2026-03-09T17:31:15.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.507+0000 7f68767fc700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 0x7f686c079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.510 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.508+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6878013070 con 0x7f68801318f0 2026-03-09T17:31:15.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.508+0000 7f688502b700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 0x7f686c079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.508+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6864005320 con 0x7f68801318f0 2026-03-09T17:31:15.511 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.509+0000 7f688502b700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 0x7f686c079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f687c0098a0 tx=0x7f687c006d90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.516 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.514+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6878064d80 con 0x7f68801318f0 2026-03-09T17:31:15.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.642+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6864000bf0 con 
0x7f686c077a40 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: Reconfiguring mds.cephfs.vm06.vmzmbb (monmap changed)... 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: Reconfiguring daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: Reconfiguring mds.cephfs.vm06.gzymac (monmap changed)... 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: Reconfiguring daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "auth caps", "entity": 
"client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm09"}]: dispatch 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:31:15.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (5m) 14s ago 6m 25.6M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (6m) 14s ago 6m 8442k - 18.2.0 
dc2bc1663786 518b33d98521 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (5m) 7s ago 5m 11.1M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (6m) 14s ago 6m 7402k - 18.2.0 dc2bc1663786 8c6366ef2954 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (5m) 7s ago 5m 7419k - 18.2.0 dc2bc1663786 78af352f0367 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (5m) 14s ago 6m 92.2M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (4m) 14s ago 4m 15.5M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (4m) 14s ago 4m 238M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (4m) 7s ago 4m 148M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (4m) 7s ago 4m 17.4M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (54s) 14s ago 6m 591M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (30s) 7s ago 5m 487M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (24s) 14s ago 6m 42.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (9s) 7s ago 5m 37.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:31:15.652 
INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (6m) 14s ago 6m 14.1M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (5m) 7s ago 5m 15.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (5m) 14s ago 5m 354M 4096M 18.2.0 dc2bc1663786 7a07f019bdd7 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (5m) 14s ago 5m 380M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (4m) 14s ago 4m 319M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (4m) 7s ago 4m 424M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (4m) 7s ago 4m 405M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (4m) 7s ago 4m 347M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (34s) 14s ago 5m 52.5M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:31:15.652 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.649+0000 7f68767fc700 1 -- 192.168.123.106:0/387722479 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f6864000bf0 con 0x7f686c077a40 2026-03-09T17:31:15.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 msgr2=0x7f686c079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.654 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 0x7f686c079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f687c0098a0 tx=0x7f687c006d90 comp rx=0 tx=0).stop 2026-03-09T17:31:15.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 msgr2=0x7f688007f510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68801318f0 0x7f688007f510 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6878004010 tx=0x7f68780040f0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 shutdown_connections 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f686c077a40 0x7f686c079ef0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6880071910 0x7f68801313b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 --2- 192.168.123.106:0/387722479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f68801318f0 0x7f688007f510 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.652+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 >> 192.168.123.106:0/387722479 conn(0x7f688006d160 msgr2=0x7f6880076460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.653+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 shutdown_connections 2026-03-09T17:31:15.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.653+0000 7f688728f700 1 -- 192.168.123.106:0/387722479 wait complete. 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- 192.168.123.106:0/494404151 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a0072330 msgr2=0x7fa7a00770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 --2- 192.168.123.106:0/494404151 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a0072330 0x7fa7a00770b0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fa79800b3a0 tx=0x7fa79800b6b0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- 192.168.123.106:0/494404151 shutdown_connections 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 --2- 192.168.123.106:0/494404151 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a0072330 0x7fa7a00770b0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 --2- 192.168.123.106:0/494404151 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a0071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- 192.168.123.106:0/494404151 >> 192.168.123.106:0/494404151 conn(0x7fa7a006d1a0 msgr2=0x7fa7a006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- 192.168.123.106:0/494404151 shutdown_connections 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- 192.168.123.106:0/494404151 wait complete. 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 Processor -- start 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- start start 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a01b0a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a01b0fc0 0x7fa7a01b33a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7a01b39e0 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.737+0000 7fa7a732c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7fa7a01b3b50 con 0x7fa7a01b0fc0 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a01b0a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a01b0a80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59372/0 (socket says 192.168.123.106:59372) 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 -- 192.168.123.106:0/344911054 learned_addr learned my addr 192.168.123.106:0/344911054 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a5b29700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a01b0fc0 0x7fa7a01b33a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 -- 192.168.123.106:0/344911054 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a01b0fc0 msgr2=0x7fa7a01b33a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a01b0fc0 0x7fa7a01b33a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 -- 192.168.123.106:0/344911054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa79c009710 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7a632a700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a01b0a80 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fa79c00eee0 tx=0x7fa79c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.738+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa79c009a70 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.739+0000 7fa7a732c700 1 -- 192.168.123.106:0/344911054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa79800b050 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.739+0000 7fa7a732c700 1 -- 192.168.123.106:0/344911054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa7a007b970 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.739+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa79c00cdb0 con 0x7fa7a0071950 2026-03-09T17:31:15.741 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.739+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 
==== 327+0+0 (secure 0 0 0) 0x7fa79c020900 con 0x7fa7a0071950 2026-03-09T17:31:15.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.741+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa79c020ac0 con 0x7fa7a0071950 2026-03-09T17:31:15.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.741+0000 7fa7977fe700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 0x7fa78c079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:15.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.741+0000 7fa7a732c700 1 -- 192.168.123.106:0/344911054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa7a004ea50 con 0x7fa7a0071950 2026-03-09T17:31:15.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.742+0000 7fa7a5b29700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 0x7fa78c079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:15.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.742+0000 7fa7a5b29700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 0x7fa78c079f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa7a0083920 tx=0x7fa798006040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:15.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.742+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 
6136+0+0 (secure 0 0 0) 0x7fa79c014070 con 0x7fa7a0071950 2026-03-09T17:31:15.751 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.747+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa79c0639c0 con 0x7fa7a0071950 2026-03-09T17:31:15.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.960+0000 7fa7a732c700 1 -- 192.168.123.106:0/344911054 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa7a007cda0 con 0x7fa7a0071950 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.961+0000 7fa7977fe700 1 -- 192.168.123.106:0/344911054 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7fa79c063110 con 0x7fa7a0071950 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:15.964 
INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 10, 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:31:15.964 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:31:15.966 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.964+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 msgr2=0x7fa78c079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.966 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.964+0000 7fa7957fa700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 0x7fa78c079f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa7a0083920 tx=0x7fa798006040 comp rx=0 tx=0).stop 2026-03-09T17:31:15.966 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.964+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 msgr2=0x7fa7a01b0a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:15.966 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.964+0000 7fa7957fa700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 
0x7fa7a01b0a80 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fa79c00eee0 tx=0x7fa79c00c5b0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 shutdown_connections 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa78c077a50 0x7fa78c079f00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7a0071950 0x7fa7a01b0a80 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 --2- 192.168.123.106:0/344911054 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7a01b0fc0 0x7fa7a01b33a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 >> 192.168.123.106:0/344911054 conn(0x7fa7a006d1a0 msgr2=0x7fa7a00753c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 shutdown_connections 2026-03-09T17:31:15.967 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:15.965+0000 7fa7957fa700 1 -- 192.168.123.106:0/344911054 wait complete. 
2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.063+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2163115716 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8071980 msgr2=0x7f9cc8071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.063+0000 7f9ccd4ed700 1 --2- 192.168.123.106:0/2163115716 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8071980 0x7f9cc8071d90 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9cb8007780 tx=0x7f9cb800c050 comp rx=0 tx=0).stop 2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2163115716 shutdown_connections 2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 --2- 192.168.123.106:0/2163115716 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9cc8072360 0x7f9cc80770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 --2- 192.168.123.106:0/2163115716 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8071980 0x7f9cc8071d90 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.066 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2163115716 >> 192.168.123.106:0/2163115716 conn(0x7f9cc806d1a0 msgr2=0x7f9cc806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2163115716 shutdown_connections 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2163115716 
wait complete. 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 Processor -- start 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- start start 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9cc8082a80 0x7f9cc8082ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cc81b2a90 con 0x7f9cc8072360 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.064+0000 7f9ccd4ed700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cc81b2bd0 con 0x7f9cc8082a80 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:59378/0 (socket says 192.168.123.106:59378) 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 -- 192.168.123.106:0/2050515068 learned_addr learned my addr 192.168.123.106:0/2050515068 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 -- 192.168.123.106:0/2050515068 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9cc8082a80 msgr2=0x7f9cc8082ef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9cc8082a80 0x7f9cc8082ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.065+0000 7f9cc6ffd700 1 -- 192.168.123.106:0/2050515068 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cb8007430 con 0x7f9cc8072360 2026-03-09T17:31:16.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.066+0000 7f9cc6ffd700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f9cb8005950 tx=0x7f9cb800ca60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.066+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cb800f050 con 0x7f9cc8072360 2026-03-09T17:31:16.069 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.066+0000 
7f9ccd4ed700 1 -- 192.168.123.106:0/2050515068 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cc81b2d10 con 0x7f9cc8072360 2026-03-09T17:31:16.069 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.066+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2050515068 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cc81b3180 con 0x7f9cc8072360 2026-03-09T17:31:16.069 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.067+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2050515068 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cc804ea50 con 0x7f9cc8072360 2026-03-09T17:31:16.069 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.067+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9cb800ced0 con 0x7f9cc8072360 2026-03-09T17:31:16.069 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.067+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cb8008250 con 0x7f9cc8072360 2026-03-09T17:31:16.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.069+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9cb801a040 con 0x7f9cc8072360 2026-03-09T17:31:16.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.069+0000 7f9caffff700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 0x7f9cb007c110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.069+0000 7f9caffff700 1 -- 
192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9cb809a880 con 0x7f9cc8072360 2026-03-09T17:31:16.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.070+0000 7f9cc67fc700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 0x7f9cb007c110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.070+0000 7f9cc67fc700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 0x7f9cb007c110 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9cc00060b0 tx=0x7f9cc0006040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.071+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9cb8062f10 con 0x7f9cc8072360 2026-03-09T17:31:16.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.270+0000 7f9ccd4ed700 1 -- 192.168.123.106:0/2050515068 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9cc81b3550 con 0x7f9cc8072360 2026-03-09T17:31:16.275 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.273+0000 7f9caffff700 1 -- 192.168.123.106:0/2050515068 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7f9cb8062660 con 0x7f9cc8072360 2026-03-09T17:31:16.275 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:31:16.276 
INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:31:16.276 
INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout: 
2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:16.276 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:16.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.276+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 msgr2=0x7f9cb007c110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.276+0000 7f9cadffb700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 0x7f9cb007c110 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9cc00060b0 tx=0x7f9cc0006040 comp rx=0 tx=0).stop 2026-03-09T17:31:16.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.276+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 msgr2=0x7f9cc8082540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.278 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.276+0000 7f9cadffb700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f9cb8005950 tx=0x7f9cb800ca60 comp 
rx=0 tx=0).stop 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 shutdown_connections 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9cb0079c60 0x7f9cb007c110 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9cc8072360 0x7f9cc8082540 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 --2- 192.168.123.106:0/2050515068 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9cc8082a80 0x7f9cc8082ef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 >> 192.168.123.106:0/2050515068 conn(0x7f9cc806d1a0 msgr2=0x7f9cc8070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 shutdown_connections 2026-03-09T17:31:16.279 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.277+0000 7f9cadffb700 1 -- 192.168.123.106:0/2050515068 wait complete. 
2026-03-09T17:31:16.285 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- 192.168.123.106:0/1239234923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4072360 msgr2=0x7f33f40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 --2- 192.168.123.106:0/1239234923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4072360 0x7f33f40770e0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f33ec00d3f0 tx=0x7f33ec00d700 comp rx=0 tx=0).stop 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- 192.168.123.106:0/1239234923 shutdown_connections 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 --2- 192.168.123.106:0/1239234923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4072360 0x7f33f40770e0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 --2- 192.168.123.106:0/1239234923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f33f4071980 0x7f33f4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- 192.168.123.106:0/1239234923 >> 192.168.123.106:0/1239234923 conn(0x7f33f406d1a0 msgr2=0x7f33f406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- 192.168.123.106:0/1239234923 shutdown_connections 2026-03-09T17:31:16.403 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- 192.168.123.106:0/1239234923 wait complete. 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 Processor -- start 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- start start 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f33f4131960 0x7f33f407f5f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33f4131e60 con 0x7f33f4131960 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.400+0000 7f33faed1700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33f4131fd0 con 0x7f33f4071980 2026-03-09T17:31:16.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:39296/0 (socket says 192.168.123.106:39296) 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 -- 192.168.123.106:0/4250975971 learned_addr learned my addr 192.168.123.106:0/4250975971 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 -- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f33f4131960 msgr2=0x7f33f407f5f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f33f4131960 0x7f33f407f5f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 -- 192.168.123.106:0/4250975971 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33ec007ed0 con 0x7f33f4071980 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f8c6d700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f33e400c8a0 tx=0x7f33e400cc60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33e400cea0 con 
0x7f33f4071980 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.401+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33f407fb90 con 0x7f33f4071980 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.402+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33f40800b0 con 0x7f33f4071980 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.402+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f33e4004830 con 0x7f33f4071980 2026-03-09T17:31:16.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.402+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33e4005630 con 0x7f33f4071980 2026-03-09T17:31:16.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.404+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f33e4004e10 con 0x7f33f4071980 2026-03-09T17:31:16.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.404+0000 7f33f1ffb700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 0x7f33dc079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.404+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f33e4099760 con 0x7f33f4071980 2026-03-09T17:31:16.406 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.404+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f33e0005320 con 0x7f33f4071980 2026-03-09T17:31:16.407 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.405+0000 7f33f3fff700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 0x7f33dc079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.407 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.405+0000 7f33f3fff700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 0x7f33dc079ef0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f33ec00db80 tx=0x7f33ec0061f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.408+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f33e4061d70 con 0x7f33f4071980 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring ceph-exporter.vm09 (monmap changed)... 
2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Unable to update caps for client.ceph-exporter.vm09 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring daemon ceph-exporter.vm09 on vm09 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: pgmap v11: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 829 KiB/s rd, 856 KiB/s wr, 140 op/s 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring crash.vm09 (monmap changed)... 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring daemon crash.vm09 on vm09 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='client.34128 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring mgr.vm09.lqzvkh (monmap changed)... 
2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='client.34132 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring mon.vm09 (monmap changed)... 
2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: Reconfiguring daemon mon.vm09 on vm09 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/344911054' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.568 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:16 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/2050515068' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:31:16.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.565+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f33e0000bf0 con 0x7f33dc077a40 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.571+0000 7f33f1ffb700 1 -- 192.168.123.106:0/4250975971 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f33e0000bf0 con 0x7f33dc077a40 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "4/23 daemons upgraded", 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading mon daemons", 2026-03-09T17:31:16.573 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:31:16.574 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 
>> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 msgr2=0x7f33dc079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 0x7f33dc079ef0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f33ec00db80 tx=0x7f33ec0061f0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 msgr2=0x7f33f4131420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f33e400c8a0 tx=0x7f33e400cc60 comp rx=0 tx=0).stop 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 shutdown_connections 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f33dc077a40 0x7f33dc079ef0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f33f4071980 0x7f33f4131420 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 --2- 192.168.123.106:0/4250975971 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f33f4131960 0x7f33f407f5f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 >> 192.168.123.106:0/4250975971 conn(0x7f33f406d1a0 msgr2=0x7f33f4076500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 shutdown_connections 2026-03-09T17:31:16.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.574+0000 7f33faed1700 1 -- 192.168.123.106:0/4250975971 wait complete. 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring ceph-exporter.vm09 (monmap changed)... 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Unable to update caps for client.ceph-exporter.vm09 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring daemon ceph-exporter.vm09 on vm09 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: pgmap v11: 65 pgs: 65 active+clean; 654 MiB data, 4.4 GiB used, 116 GiB / 120 GiB avail; 829 KiB/s rd, 856 KiB/s wr, 140 op/s 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring crash.vm09 (monmap changed)... 
2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring daemon crash.vm09 on vm09 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='client.34128 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring mgr.vm09.lqzvkh (monmap changed)... 
2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm09.lqzvkh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring daemon mgr.vm09.lqzvkh on vm09 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='client.34132 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring mon.vm09 (monmap changed)... 
2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: Reconfiguring daemon mon.vm09 on vm09 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/344911054' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:16.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:16 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/2050515068' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:31:16.652 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/2976521069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4072470 msgr2=0x7fb1c410beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.652 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 --2- 192.168.123.106:0/2976521069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4072470 0x7fb1c410beb0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fb1bc00b3a0 tx=0x7fb1bc00b6b0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.652 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/2976521069 shutdown_connections 2026-03-09T17:31:16.653 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 --2- 192.168.123.106:0/2976521069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4072470 0x7fb1c410beb0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.653 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 --2- 192.168.123.106:0/2976521069 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 0x7fb1c4071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.653 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.650+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/2976521069 >> 192.168.123.106:0/2976521069 conn(0x7fb1c406d1a0 msgr2=0x7fb1c406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.653 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.651+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/2976521069 shutdown_connections 2026-03-09T17:31:16.654 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.651+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/2976521069 wait complete. 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 Processor -- start 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 -- start start 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 0x7fb1c4116b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1c4117560 con 0x7fb1c4117060 2026-03-09T17:31:16.654 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.652+0000 7fb1cbb5f700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1c41176d0 con 0x7fb1c4071a90 2026-03-09T17:31:16.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59400/0 (socket says 192.168.123.106:59400) 2026-03-09T17:31:16.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 -- 192.168.123.106:0/156363596 learned_addr learned my addr 192.168.123.106:0/156363596 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:16.655 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c98fb700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 0x7fb1c4116b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 -- 192.168.123.106:0/156363596 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 msgr2=0x7fb1c4116b20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 0x7fb1c4116b20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.653+0000 7fb1c90fa700 1 -- 192.168.123.106:0/156363596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb1bc00b050 con 0x7fb1c4117060 2026-03-09T17:31:16.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.654+0000 7fb1c90fa700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto 
rx=0x7fb1bc007b60 tx=0x7fb1bc00bce0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.655+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1bc00e050 con 0x7fb1c4117060 2026-03-09T17:31:16.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.655+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb1bc004730 con 0x7fb1c4117060 2026-03-09T17:31:16.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.655+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1bc01bbf0 con 0x7fb1c4117060 2026-03-09T17:31:16.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.655+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/156363596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb1c41b2e30 con 0x7fb1c4117060 2026-03-09T17:31:16.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.655+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/156363596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb1c41b3190 con 0x7fb1c4117060 2026-03-09T17:31:16.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.657+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/156363596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1c4110c20 con 0x7fb1c4117060 2026-03-09T17:31:16.662 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.659+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7fb1bc01bd50 con 0x7fb1c4117060 2026-03-09T17:31:16.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.659+0000 7fb1baffd700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 0x7fb1b0079cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:16.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.661+0000 7fb1c98fb700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 0x7fb1b0079cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:16.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.662+0000 7fb1c98fb700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 0x7fb1b0079cd0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fb1b400a850 tx=0x7fb1b4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:16.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.663+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb1bc0681d0 con 0x7fb1c4117060 2026-03-09T17:31:16.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.663+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb1bc063bf0 con 0x7fb1c4117060 2026-03-09T17:31:16.868 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.865+0000 7fb1cbb5f700 1 -- 192.168.123.106:0/156363596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb1c404ea50 con 0x7fb1c4117060 2026-03-09T17:31:16.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.869+0000 7fb1baffd700 1 -- 192.168.123.106:0/156363596 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb1bc01f090 con 0x7fb1c4117060 2026-03-09T17:31:16.871 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.872+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 msgr2=0x7fb1b0079cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.872+0000 7fb1b8fb9700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 0x7fb1b0079cd0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fb1b400a850 tx=0x7fb1b4008040 comp rx=0 tx=0).stop 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 msgr2=0x7fb1c41b28f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fb1bc007b60 tx=0x7fb1bc00bce0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 shutdown_connections 2026-03-09T17:31:16.875 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb1b0077820 0x7fb1b0079cd0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb1c4071a90 0x7fb1c4116b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 --2- 192.168.123.106:0/156363596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb1c4117060 0x7fb1c41b28f0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 >> 192.168.123.106:0/156363596 conn(0x7fb1c406d1a0 msgr2=0x7fb1c410b300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 shutdown_connections 2026-03-09T17:31:16.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:16.873+0000 7fb1b8fb9700 1 -- 192.168.123.106:0/156363596 wait complete. 2026-03-09T17:31:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring osd.3 (monmap changed)... 
2026-03-09T17:31:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.3 on vm09 2026-03-09T17:31:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: pgmap v12: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 235 op/s 2026-03-09T17:31:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='client.44113 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring osd.4 (monmap changed)... 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.4 on vm09 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/156363596' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring osd.5 (monmap changed)... 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:17.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:17 vm09.local ceph-mon[97995]: Reconfiguring daemon osd.5 on vm09 2026-03-09T17:31:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring osd.3 (monmap changed)... 
2026-03-09T17:31:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.3 on vm09 2026-03-09T17:31:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: pgmap v12: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.2 MiB/s wr, 235 op/s 2026-03-09T17:31:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='client.44113 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring osd.4 (monmap changed)... 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.4 on vm09 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/156363596' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring osd.5 (monmap changed)... 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:17 vm06.local ceph-mon[109831]: Reconfiguring daemon osd.5 on vm09 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: Reconfiguring mds.cephfs.vm09.cjcawy (monmap changed)... 
2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: Reconfiguring daemon mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: Reconfiguring mds.cephfs.vm09.drzmdt (monmap changed)... 
2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: Reconfiguring daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:18.950 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:18.951 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:18.951 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch 2026-03-09T17:31:18.951 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished 2026-03-09T17:31:18.951 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm09"}]: dispatch 2026-03-09T17:31:18.951 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm09"}]': finished 2026-03-09T17:31:19.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: Reconfiguring mds.cephfs.vm09.cjcawy (monmap changed)... 
2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: Reconfiguring daemon mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: Reconfiguring mds.cephfs.vm09.drzmdt (monmap changed)... 
2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: Reconfiguring daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:19.145 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm09"}]: dispatch 2026-03-09T17:31:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm09"}]': finished 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: pgmap v13: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 821 KiB/s rd, 938 KiB/s wr, 185 op/s 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all mon 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: Upgrade: Updating crash.vm06 (1/2) 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": 
"client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:20.296 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:19 vm06.local ceph-mon[109831]: Deploying daemon crash.vm06 on vm06 2026-03-09T17:31:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: pgmap v13: 65 pgs: 65 active+clean; 294 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 821 KiB/s rd, 938 KiB/s wr, 185 op/s 2026-03-09T17:31:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all mon 2026-03-09T17:31:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: Upgrade: Updating crash.vm06 (1/2) 2026-03-09T17:31:20.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:20.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:31:20.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:20.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:19 vm09.local ceph-mon[97995]: Deploying daemon crash.vm06 on vm06 2026-03-09T17:31:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local 
ceph-mon[109831]: pgmap v14: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 285 op/s 2026-03-09T17:31:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: Upgrade: Updating crash.vm09 (2/2) 2026-03-09T17:31:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T17:31:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:31:21.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:21 vm06.local ceph-mon[109831]: Deploying daemon crash.vm09 on vm09 2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: pgmap v14: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 285 op/s 2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:31:22.078 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: Upgrade: Updating crash.vm09 (2/2)
2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm09", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:22.078 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:21 vm09.local ceph-mon[97995]: Deploying daemon crash.vm09 on vm09
2026-03-09T17:31:23.373 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:23.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:23.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:23.374 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:23 vm06.local ceph-mon[109831]: pgmap v15: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 800 KiB/s rd, 790 KiB/s wr, 194 op/s
2026-03-09T17:31:23.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:23.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:23.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:23.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:23 vm09.local ceph-mon[97995]: pgmap v15: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 800 KiB/s rd, 790 KiB/s wr, 194 op/s
2026-03-09T17:31:25.094 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.094 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.094 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.094 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: pgmap v16: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 799 KiB/s rd, 790 KiB/s wr, 194 op/s
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.377 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: pgmap v16: 65 pgs: 65 active+clean; 286 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 799 KiB/s rd, 790 KiB/s wr, 194 op/s
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:26.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: pgmap v17: 65 pgs: 65 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 306 op/s
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm09"}]: dispatch
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm09"}]': finished
2026-03-09T17:31:28.358 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: pgmap v17: 65 pgs: 65 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 306 op/s
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm09"}]: dispatch
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm09"}]': finished
2026-03-09T17:31:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T17:31:29.309 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all crash
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: Upgrade: osd.0 is safe to restart
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: Upgrade: Updating osd.0
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: Deploying daemon osd.0 on vm06
2026-03-09T17:31:29.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:29.641 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:29 vm06.local systemd[1]: Stopping Ceph osd.0 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:31:29.642 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:31:29.448+0000 7f155429e700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T17:31:29.642 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:31:29.448+0000 7f155429e700 -1 osd.0 42 *** Got signal Terminated ***
2026-03-09T17:31:29.642 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:29 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[74161]: 2026-03-09T17:31:29.448+0000 7f155429e700 -1 osd.0 42 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all crash
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: Upgrade: osd.0 is safe to restart
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: Upgrade: Updating osd.0
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T17:31:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:29.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: Deploying daemon osd.0 on vm06
2026-03-09T17:31:29.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:30.543 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115242]: 2026-03-09 17:31:30.287988258 +0000 UTC m=+0.872346331 container died 7a07f019bdd7516e308edc6b7cbca3c96bd9f6386c3b50ec6eff25f16cac46d2 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
2026-03-09T17:31:30.543 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115242]: 2026-03-09 17:31:30.310685198 +0000 UTC m=+0.895043271 container remove 7a07f019bdd7516e308edc6b7cbca3c96bd9f6386c3b50ec6eff25f16cac46d2 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, io.buildah.version=1.29.1)
2026-03-09T17:31:30.543 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local bash[115242]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0
2026-03-09T17:31:30.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:30 vm06.local ceph-mon[109831]: pgmap v18: 65 pgs: 65 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 898 KiB/s rd, 899 KiB/s wr, 211 op/s
2026-03-09T17:31:30.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:30 vm06.local ceph-mon[109831]: osd.0 marked itself down and dead
2026-03-09T17:31:30.545 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:30 vm09.local ceph-mon[97995]: pgmap v18: 65 pgs: 65 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 898 KiB/s rd, 899 KiB/s wr, 211 op/s
2026-03-09T17:31:30.545 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:30 vm09.local ceph-mon[97995]: osd.0 marked itself down and dead
2026-03-09T17:31:30.814 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.589708089 +0000 UTC m=+0.041599892 container create 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
2026-03-09T17:31:30.814 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.648043301 +0000 UTC m=+0.099935104 container init 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223)
2026-03-09T17:31:30.814 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.65247233 +0000 UTC m=+0.104364133 container start 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-09T17:31:30.814 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.653982628 +0000 UTC m=+0.105874441 container attach 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3)
2026-03-09T17:31:30.814 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.576857918 +0000 UTC m=+0.028749731 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.81433006 +0000 UTC m=+0.266221863 container died 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local podman[115307]: 2026-03-09 17:31:30.846773573 +0000 UTC m=+0.298665376 container remove 025d927cd8199c236464b2a269d9f39c4736dd56b9d198c7e7231d9ea146d854 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team )
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0.service: Deactivated successfully.
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0.service: Unit process 115322 (conmon) remains running after unit stopped.
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local systemd[1]: Stopped Ceph osd.0 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:30 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0.service: Consumed 31.718s CPU time, 531.9M memory peak.
2026-03-09T17:31:31.106 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local systemd[1]: Starting Ceph osd.0 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:31:31.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:31 vm06.local ceph-mon[109831]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T17:31:31.357 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:31 vm06.local ceph-mon[109831]: osdmap e43: 6 total, 5 up, 6 in
2026-03-09T17:31:31.369 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local podman[115412]: 2026-03-09 17:31:31.298463193 +0000 UTC m=+0.049775653 container create 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T17:31:31.369 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local podman[115412]: 2026-03-09 17:31:31.261827705 +0000 UTC m=+0.013140185 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T17:31:31.369 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local podman[115412]: 2026-03-09 17:31:31.364161407 +0000 UTC m=+0.115473876 container init 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, ceph=True, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T17:31:31.542 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:31 vm09.local ceph-mon[97995]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T17:31:31.542 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:31 vm09.local ceph-mon[97995]: osdmap e43: 6 total, 5 up, 6 in
2026-03-09T17:31:31.615 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local podman[115412]: 2026-03-09 17:31:31.383737737 +0000 UTC m=+0.135050206 container start 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
2026-03-09T17:31:31.615 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local podman[115412]: 2026-03-09 17:31:31.412604405 +0000 UTC m=+0.163916874 container attach 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-09T17:31:31.615 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:31.615 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local bash[115412]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:31.616 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:31.616 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:31 vm06.local bash[115412]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:32.446 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-mon[109831]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 891 KiB/s rd, 924 KiB/s wr, 270 op/s
2026-03-09T17:31:32.446 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-mon[109831]: osdmap e44: 6 total, 5 up, 6 in
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:32.446 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:31:32.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:32 vm09.local ceph-mon[97995]: pgmap v20: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 891 KiB/s rd, 924 KiB/s wr, 270 op/s
2026-03-09T17:31:32.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:32 vm09.local ceph-mon[97995]: osdmap e44: 6 total, 5 up, 6 in
2026-03-09T17:31:32.745 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-09T17:31:32.745 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-09T17:31:32.745 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-435218dd-6a88-475e-9c01-ac35a10afc3b/osd-block-7b23c291-c26f-47f6-aa9d-2b35b2448578 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
2026-03-09T17:31:32.745 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-435218dd-6a88-475e-9c01-ac35a10afc3b/osd-block-7b23c291-c26f-47f6-aa9d-2b35b2448578 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/ln -snf /dev/ceph-435218dd-6a88-475e-9c01-ac35a10afc3b/osd-block-7b23c291-c26f-47f6-aa9d-2b35b2448578 /var/lib/ceph/osd/ceph-0/block
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/ln -snf /dev/ceph-435218dd-6a88-475e-9c01-ac35a10afc3b/osd-block-7b23c291-c26f-47f6-aa9d-2b35b2448578 /var/lib/ceph/osd/ceph-0/block
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate[115423]: --> ceph-volume lvm activate successful for osd ID: 0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local bash[115412]: --> ceph-volume lvm activate successful for osd ID: 0
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local conmon[115423]: conmon 17fdbf39cf6fff6f9eca : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891.scope/container/memory.events
2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local podman[115412]: 2026-03-09 17:31:32.815945787 +0000 UTC m=+1.567258256 container died 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/,
org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:31:33.011 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:32 vm06.local podman[115412]: 2026-03-09 17:31:32.856793769 +0000 UTC m=+1.608106238 container remove 17fdbf39cf6fff6f9eca3783760dfac1d5c28391bcc6f913161b15c16c090891 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-activate, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local podman[115677]: 2026-03-09 17:31:33.009251294 +0000 UTC m=+0.025321943 container create 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, 
org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local podman[115677]: 2026-03-09 17:31:33.055157946 +0000 UTC m=+0.071228595 container init 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local podman[115677]: 2026-03-09 17:31:33.062882753 +0000 UTC m=+0.078953402 container start 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local bash[115677]: 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local podman[115677]: 2026-03-09 17:31:32.999600952 +0000 UTC m=+0.015671612 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:31:33.393 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local systemd[1]: Started Ceph osd.0 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:31:33.830 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:33 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:31:33.529+0000 7f9b236b8740 -1 Falling back to public interface
2026-03-09T17:31:34.461 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:34 vm06.local ceph-mon[109831]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 338 op/s
2026-03-09T17:31:34.461 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:34 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:34.461 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:34 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:34.461 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:34 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:34 vm09.local ceph-mon[97995]: pgmap v22: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 338 op/s
2026-03-09T17:31:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:34 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:34 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:34 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:35.552 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:35 vm06.local ceph-mon[109831]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 468 KiB/s rd, 401 KiB/s wr, 169 op/s
2026-03-09T17:31:35.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:35 vm09.local ceph-mon[97995]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 468 KiB/s rd, 401 KiB/s wr, 169 op/s
2026-03-09T17:31:37.263 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.263 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.263 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.263 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:37.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:38 vm06.local ceph-mon[109831]: pgmap v24: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 415 KiB/s rd, 1.0 MiB/s wr, 328 op/s; 7085/47790 objects degraded (14.825%)
2026-03-09T17:31:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:38 vm06.local ceph-mon[109831]: Health check failed: Degraded data redundancy: 7085/47790 objects degraded (14.825%), 33 pgs degraded (PG_DEGRADED)
2026-03-09T17:31:38.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:38 vm09.local ceph-mon[97995]: pgmap v24: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 415 KiB/s rd, 1.0 MiB/s wr, 328 op/s; 7085/47790 objects degraded (14.825%)
2026-03-09T17:31:38.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:38 vm09.local ceph-mon[97995]: Health check failed: Degraded data redundancy: 7085/47790 objects degraded (14.825%), 33 pgs degraded (PG_DEGRADED)
2026-03-09T17:31:39.141 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:38 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:31:38.831+0000 7f9b236b8740 -1 osd.0 0 read_superblock omap replica is missing.
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: pgmap v25: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 649 KiB/s wr, 155 op/s; 7085/47790 objects degraded (14.825%)
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T17:31:39.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T17:31:39.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline)
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: pgmap v25: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 292 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 649 KiB/s wr, 155 op/s; 7085/47790 objects degraded (14.825%)
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T17:31:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:39 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline)
2026-03-09T17:31:40.141 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:39 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:31:39.721+0000 7f9b236b8740 -1 osd.0 42 log_to_monitors true
2026-03-09T17:31:40.627 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:40 vm09.local ceph-mon[97995]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-09T17:31:40.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:40 vm06.local ceph-mon[109831]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-09T17:31:40.641 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:31:40 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:31:40.388+0000 7f9b1b452640 -1 osd.0 42 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T17:31:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:41 vm06.local ceph-mon[109831]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 419 KiB/s rd, 994 KiB/s wr, 238 op/s; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:41 vm06.local ceph-mon[109831]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-09T17:31:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:41 vm06.local ceph-mon[109831]: osdmap e45: 6 total, 5 up, 6 in
2026-03-09T17:31:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:41 vm06.local ceph-mon[109831]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-09T17:31:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:41 vm09.local ceph-mon[97995]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 419 KiB/s rd, 994 KiB/s wr, 238 op/s; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:41 vm09.local ceph-mon[97995]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-09T17:31:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:41 vm09.local ceph-mon[97995]: osdmap e45: 6 total, 5 up, 6 in
2026-03-09T17:31:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:41 vm09.local ceph-mon[97995]: from='osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch
2026-03-09T17:31:42.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:42 vm06.local ceph-mon[109831]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T17:31:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:42 vm06.local ceph-mon[109831]: osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635] boot
2026-03-09T17:31:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:42 vm06.local ceph-mon[109831]: osdmap e46: 6 total, 6 up, 6 in
2026-03-09T17:31:42.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:31:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:42 vm09.local ceph-mon[97995]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-09T17:31:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:42 vm09.local ceph-mon[97995]: osd.0 [v2:192.168.123.106:6802/1436141635,v1:192.168.123.106:6803/1436141635] boot
2026-03-09T17:31:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:42 vm09.local ceph-mon[97995]: osdmap e46: 6 total, 6 up, 6 in
2026-03-09T17:31:42.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T17:31:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:43 vm06.local ceph-mon[109831]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 523 KiB/s rd, 1.2 MiB/s wr, 297 op/s; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:43 vm06.local ceph-mon[109831]: osdmap e47: 6 total, 6 up, 6 in
2026-03-09T17:31:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:43 vm06.local ceph-mon[109831]: osdmap e48: 6 total, 6 up, 6 in
2026-03-09T17:31:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:43 vm09.local ceph-mon[97995]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 523 KiB/s rd, 1.2 MiB/s wr, 297 op/s; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:43 vm09.local ceph-mon[97995]: osdmap e47: 6 total, 6 up, 6 in
2026-03-09T17:31:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:31:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:31:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:43 vm09.local ceph-mon[97995]: osdmap e48: 6 total, 6 up, 6 in
2026-03-09T17:31:45.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:45 vm06.local ceph-mon[109831]: pgmap v32: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:45.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:45 vm06.local ceph-mon[109831]: osdmap e49: 6 total, 6 up, 6 in
2026-03-09T17:31:45.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:45 vm09.local ceph-mon[97995]: pgmap v32: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 288 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 6648/44859 objects degraded (14.820%)
2026-03-09T17:31:45.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:45 vm09.local ceph-mon[97995]: osdmap e49: 6 total, 6 up, 6 in
2026-03-09T17:31:46.986 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:46 vm06.local ceph-mon[109831]: osdmap e50: 6 total, 6 up, 6 in
2026-03-09T17:31:46.986 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:46 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 6648/44859 objects degraded (14.820%), 33 pgs degraded (PG_DEGRADED)
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.003+0000 7f56bdead700 1 -- 192.168.123.106:0/1120291828 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 msgr2=0x7f56b8071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.003+0000 7f56bdead700 1 --2- 192.168.123.106:0/1120291828 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b8071d60 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f56a8007780 tx=0x7f56a800c050 comp rx=0 tx=0).stop
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 -- 192.168.123.106:0/1120291828 shutdown_connections
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 --2- 192.168.123.106:0/1120291828 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56b8072330 0x7f56b80770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 --2- 192.168.123.106:0/1120291828 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b8071d60 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 -- 192.168.123.106:0/1120291828 >> 192.168.123.106:0/1120291828 conn(0x7f56b806d1a0 msgr2=0x7f56b806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:31:47.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 -- 192.168.123.106:0/1120291828 shutdown_connections
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.004+0000 7f56bdead700 1 -- 192.168.123.106:0/1120291828 wait complete.
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 Processor -- start
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 -- start start
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56b8072330 0x7f56b8131730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56b8131d30 con 0x7f56b8072330
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bdead700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56b8131ea0 con 0x7f56b8071950
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34714/0 (socket says 192.168.123.106:34714)
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 -- 192.168.123.106:0/3652198240 learned_addr learned my addr 192.168.123.106:0/3652198240 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 -- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56b8072330 msgr2=0x7f56b8131730 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56b8072330 0x7f56b8131730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.005+0000 7f56bceab700 1 -- 192.168.123.106:0/3652198240 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56a8007430 con 0x7f56b8071950
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.006+0000 7f56bceab700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56a800e010 tx=0x7f56a800cdd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:31:47.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.006+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56a800f040 con 0x7f56b8071950
2026-03-09T17:31:47.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.006+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56b807f4c0 con 0x7f56b8071950
2026-03-09T17:31:47.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.006+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56b807f9b0 con 0x7f56b8071950
2026-03-09T17:31:47.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.007+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f56a8004680 con 0x7f56b8071950
2026-03-09T17:31:47.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.007+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56a800a9e0 con 0x7f56b8071950
2026-03-09T17:31:47.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.007+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56a4005320 con 0x7f56b8071950
2026-03-09T17:31:47.010 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.008+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f56a800a1d0 con 0x7f56b8071950
2026-03-09T17:31:47.011
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.009+0000 7f56b5ffb700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 0x7f56a0079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.009+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f56a809b380 con 0x7f56b8071950 2026-03-09T17:31:47.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.010+0000 7f56b7fff700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 0x7f56a0079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.010+0000 7f56b7fff700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 0x7f56a0079ef0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f56b00060b0 tx=0x7f56b0006040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:47.013 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.011+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f56a8063a10 con 0x7f56b8071950 2026-03-09T17:31:47.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:46 vm09.local ceph-mon[97995]: osdmap e50: 6 total, 6 up, 6 in 2026-03-09T17:31:47.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:46 vm09.local ceph-mon[97995]: Health check update: 
Degraded data redundancy: 6648/44859 objects degraded (14.820%), 33 pgs degraded (PG_DEGRADED) 2026-03-09T17:31:47.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.175+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f56a4000bf0 con 0x7f56a0077a40 2026-03-09T17:31:47.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.176+0000 7f56b5ffb700 1 -- 192.168.123.106:0/3652198240 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f56a4000bf0 con 0x7f56a0077a40 2026-03-09T17:31:47.181 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.178+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 msgr2=0x7f56a0079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.181 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.178+0000 7f56bdead700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 0x7f56a0079ef0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f56b00060b0 tx=0x7f56b0006040 comp rx=0 tx=0).stop 2026-03-09T17:31:47.181 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.178+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 msgr2=0x7f56b81311f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.181 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.178+0000 7f56bdead700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56a800e010 
tx=0x7f56a800cdd0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 shutdown_connections 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f56a0077a40 0x7f56a0079ef0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f56b8071950 0x7f56b81311f0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 --2- 192.168.123.106:0/3652198240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56b8072330 0x7f56b8131730 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 >> 192.168.123.106:0/3652198240 conn(0x7f56b806d1a0 msgr2=0x7f56b80758d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 shutdown_connections 2026-03-09T17:31:47.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.180+0000 7f56bdead700 1 -- 192.168.123.106:0/3652198240 wait complete. 
2026-03-09T17:31:47.194 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 -- 192.168.123.106:0/1743800890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c072360 msgr2=0x7fe23c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 --2- 192.168.123.106:0/1743800890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c072360 0x7fe23c0770e0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe234009230 tx=0x7fe234009260 comp rx=0 tx=0).stop 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 -- 192.168.123.106:0/1743800890 shutdown_connections 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 --2- 192.168.123.106:0/1743800890 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c072360 0x7fe23c0770e0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 --2- 192.168.123.106:0/1743800890 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.261+0000 7fe242c21700 1 -- 192.168.123.106:0/1743800890 >> 192.168.123.106:0/1743800890 conn(0x7fe23c06d1a0 msgr2=0x7fe23c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 -- 192.168.123.106:0/1743800890 shutdown_connections 2026-03-09T17:31:47.267 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 -- 192.168.123.106:0/1743800890 wait complete. 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 Processor -- start 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 -- start start 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c082a70 0x7fe23c082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe23c1b2a90 con 0x7fe23c082a70 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe242c21700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe23c1b2bd0 con 0x7fe23c071980 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.264+0000 7fe2409bd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34740/0 (socket says 192.168.123.106:34740) 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 -- 192.168.123.106:0/4291436538 learned_addr learned my addr 192.168.123.106:0/4291436538 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe23bfff700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c082a70 0x7fe23c082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 -- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c082a70 msgr2=0x7fe23c082ee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c082a70 0x7fe23c082ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 -- 192.168.123.106:0/4291436538 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe234008ee0 con 0x7fe23c071980 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe2409bd700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto 
rx=0x7fe22c00eb40 tx=0x7fe22c00ef00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:47.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.265+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe22c00ccf0 con 0x7fe23c071980 2026-03-09T17:31:47.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.266+0000 7fe242c21700 1 -- 192.168.123.106:0/4291436538 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe23c1b2d10 con 0x7fe23c071980 2026-03-09T17:31:47.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.266+0000 7fe242c21700 1 -- 192.168.123.106:0/4291436538 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe23c1b3180 con 0x7fe23c071980 2026-03-09T17:31:47.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.266+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe22c00ce50 con 0x7fe23c071980 2026-03-09T17:31:47.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.266+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe22c018900 con 0x7fe23c071980 2026-03-09T17:31:47.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.267+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe22c010c30 con 0x7fe23c071980 2026-03-09T17:31:47.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.268+0000 7fe239ffb700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 0x7fe224079f00 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.268+0000 7fe23bfff700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 0x7fe224079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.268+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fe22c014070 con 0x7fe23c071980 2026-03-09T17:31:47.271 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.269+0000 7fe242c21700 1 -- 192.168.123.106:0/4291436538 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe228005320 con 0x7fe23c071980 2026-03-09T17:31:47.271 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.269+0000 7fe23bfff700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 0x7fe224079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fe23400ebb0 tx=0x7fe234010040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:47.274 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.272+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe22c062d20 con 0x7fe23c071980 2026-03-09T17:31:47.427 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.425+0000 7fe242c21700 1 -- 192.168.123.106:0/4291436538 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- 
mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe228000bf0 con 0x7fe224077a50 2026-03-09T17:31:47.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.426+0000 7fe239ffb700 1 -- 192.168.123.106:0/4291436538 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe228000bf0 con 0x7fe224077a50 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 msgr2=0x7fe224079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 0x7fe224079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fe23400ebb0 tx=0x7fe234010040 comp rx=0 tx=0).stop 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 msgr2=0x7fe23c082530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fe22c00eb40 tx=0x7fe22c00ef00 comp rx=0 tx=0).stop 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 shutdown_connections 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 --2- 
192.168.123.106:0/4291436538 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe224077a50 0x7fe224079f00 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe23c071980 0x7fe23c082530 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 --2- 192.168.123.106:0/4291436538 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe23c082a70 0x7fe23c082ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 >> 192.168.123.106:0/4291436538 conn(0x7fe23c06d1a0 msgr2=0x7fe23c0764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 shutdown_connections 2026-03-09T17:31:47.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.429+0000 7fe2237fe700 1 -- 192.168.123.106:0/4291436538 wait complete. 
2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 -- 192.168.123.106:0/3577857513 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0072470 msgr2=0x7f2da010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 --2- 192.168.123.106:0/3577857513 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0072470 0x7f2da010beb0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f2d9800b3a0 tx=0x7f2d9800b6b0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 -- 192.168.123.106:0/3577857513 shutdown_connections 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 --2- 192.168.123.106:0/3577857513 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0072470 0x7f2da010beb0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 --2- 192.168.123.106:0/3577857513 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0071a90 0x7f2da0071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 -- 192.168.123.106:0/3577857513 >> 192.168.123.106:0/3577857513 conn(0x7f2da006d1a0 msgr2=0x7f2da006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.556 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 -- 192.168.123.106:0/3577857513 shutdown_connections 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.554+0000 7f2da6a20700 1 -- 192.168.123.106:0/3577857513 
wait complete. 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 Processor -- start 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 -- start start 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0116f40 0x7f2da01b27b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2da0117440 con 0x7f2da0116f40 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2da6a20700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2da01175b0 con 0x7f2da0071a90 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2d9ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2d9ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:34752/0 (socket says 192.168.123.106:34752) 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2d9ffff700 1 -- 192.168.123.106:0/137172165 learned_addr learned my addr 192.168.123.106:0/137172165 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:47.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.558+0000 7f2d9f7fe700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0116f40 0x7f2da01b27b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.559+0000 7f2d9ffff700 1 -- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0116f40 msgr2=0x7f2da01b27b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.559+0000 7f2d9ffff700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0116f40 0x7f2da01b27b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.559+0000 7f2d9ffff700 1 -- 192.168.123.106:0/137172165 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d9800b050 con 0x7f2da0071a90 2026-03-09T17:31:47.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.559+0000 7f2d9ffff700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f2d9000b770 tx=0x7f2d9000bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:31:47.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.559+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d9000f820 con 0x7f2da0071a90 2026-03-09T17:31:47.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.560+0000 7f2da6a20700 1 -- 192.168.123.106:0/137172165 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2da01b2d50 con 0x7f2da0071a90 2026-03-09T17:31:47.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.560+0000 7f2da6a20700 1 -- 192.168.123.106:0/137172165 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2da01b3210 con 0x7f2da0071a90 2026-03-09T17:31:47.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.560+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d9000fe60 con 0x7f2da0071a90 2026-03-09T17:31:47.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.560+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d9000d610 con 0x7f2da0071a90 2026-03-09T17:31:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.563+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f2d9000f980 con 0x7f2da0071a90 2026-03-09T17:31:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.563+0000 7f2d9d7fa700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 0x7f2d88079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.566 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.564+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2d900998e0 con 0x7f2da0071a90 2026-03-09T17:31:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.564+0000 7f2d9f7fe700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 0x7f2d88079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.564+0000 7f2da6a20700 1 -- 192.168.123.106:0/137172165 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d8c005320 con 0x7f2da0071a90 2026-03-09T17:31:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.565+0000 7f2d9f7fe700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 0x7f2d88079ef0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f2d9800bb30 tx=0x7f2d98014040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:47.574 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.568+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2d90061ef0 con 0x7f2da0071a90 2026-03-09T17:31:47.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.736+0000 7f2da6a20700 1 -- 192.168.123.106:0/137172165 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2d8c000bf0 con 
0x7f2d88077a40 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.742+0000 7f2d9d7fa700 1 -- 192.168.123.106:0/137172165 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f2d8c000bf0 con 0x7f2d88077a40 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (6m) 11s ago 6m 25.9M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (6m) 11s ago 6m 8740k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (6m) 24s ago 6m 11.1M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (27s) 11s ago 6m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (25s) 24s ago 6m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (6m) 11s ago 6m 94.1M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (4m) 11s ago 4m 16.0M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (4m) 11s ago 4m 240M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (4m) 24s ago 4m 145M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (4m) 24s ago 
4m 17.5M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (86s) 11s ago 7m 619M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (63s) 24s ago 6m 487M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (56s) 11s ago 7m 56.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (42s) 24s ago 6m 48.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:31:47.744 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (6m) 11s ago 6m 14.7M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (6m) 24s ago 6m 15.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (14s) 11s ago 5m 32.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (5m) 11s ago 5m 367M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (5m) 11s ago 5m 316M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (5m) 24s ago 5m 427M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (5m) 24s ago 5m 410M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:31:47.745 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (5m) 24s ago 5m 334M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:31:47.745 
INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (66s) 11s ago 6m 53.4M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:31:47.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 msgr2=0x7f2d88079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 0x7f2d88079ef0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f2d9800bb30 tx=0x7f2d98014040 comp rx=0 tx=0).stop 2026-03-09T17:31:47.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 msgr2=0x7f2da0116a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f2d9000b770 tx=0x7f2d9000bb30 comp rx=0 tx=0).stop 2026-03-09T17:31:47.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 shutdown_connections 2026-03-09T17:31:47.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d88077a40 0x7f2d88079ef0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.749 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2da0071a90 0x7f2da0116a00 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 --2- 192.168.123.106:0/137172165 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2da0116f40 0x7f2da01b27b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.746+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 >> 192.168.123.106:0/137172165 conn(0x7f2da006d1a0 msgr2=0x7f2da010b1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.747+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 shutdown_connections 2026-03-09T17:31:47.749 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.747+0000 7f2d86ffd700 1 -- 192.168.123.106:0/137172165 wait complete. 
2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 -- 192.168.123.106:0/1009055581 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c072360 msgr2=0x7f885c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 --2- 192.168.123.106:0/1009055581 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c072360 0x7f885c0770e0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f885400d3f0 tx=0x7f885400d700 comp rx=0 tx=0).stop 2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 -- 192.168.123.106:0/1009055581 shutdown_connections 2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 --2- 192.168.123.106:0/1009055581 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c072360 0x7f885c0770e0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 --2- 192.168.123.106:0/1009055581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c071980 0x7f885c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.843 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 -- 192.168.123.106:0/1009055581 >> 192.168.123.106:0/1009055581 conn(0x7f885c06d1a0 msgr2=0x7f885c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:47.844 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:47 vm06.local ceph-mon[109831]: pgmap v35: 65 pgs: 12 active+recovery_wait+degraded, 1 active+undersized+remapped, 2 active+recovering, 50 active+clean; 284 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 951 KiB/s rd, 1.3 
MiB/s wr, 394 op/s; 875/40596 objects degraded (2.155%); 2.0 MiB/s, 6 objects/s recovering 2026-03-09T17:31:47.844 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:47 vm06.local ceph-mon[109831]: from='client.44121 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 -- 192.168.123.106:0/1009055581 shutdown_connections 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.841+0000 7f88616a4700 1 -- 192.168.123.106:0/1009055581 wait complete. 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.844+0000 7f88616a4700 1 Processor -- start 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.844+0000 7f88616a4700 1 -- start start 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f88616a4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f88616a4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c1b6550 0x7f885c07f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f88616a4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f885c1b69c0 con 0x7f885c1b6550 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f88616a4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f885c1b6b30 con 0x7f885c071980 2026-03-09T17:31:47.847 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34768/0 (socket says 192.168.123.106:34768) 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 -- 192.168.123.106:0/4043691974 learned_addr learned my addr 192.168.123.106:0/4043691974 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885a7fc700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c1b6550 0x7f885c07f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 -- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c1b6550 msgr2=0x7f885c07f530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:47.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c1b6550 0x7f885c07f530 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:47.848 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.845+0000 7f885affd700 1 -- 192.168.123.106:0/4043691974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8854007ed0 con 0x7f885c071980 2026-03-09T17:31:47.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.846+0000 7f885affd700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f884c00b770 tx=0x7f884c00bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:47.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.846+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f884c00f820 con 0x7f885c071980 2026-03-09T17:31:47.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.846+0000 7f88616a4700 1 -- 192.168.123.106:0/4043691974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f885c07fa70 con 0x7f885c071980 2026-03-09T17:31:47.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.846+0000 7f88616a4700 1 -- 192.168.123.106:0/4043691974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f885c07ff60 con 0x7f885c071980 2026-03-09T17:31:47.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.847+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f884c00fe60 con 0x7f885c071980 2026-03-09T17:31:47.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.847+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f884c00d610 con 
0x7f885c071980 2026-03-09T17:31:47.849 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.847+0000 7f88616a4700 1 -- 192.168.123.106:0/4043691974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8848005320 con 0x7f885c071980 2026-03-09T17:31:47.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.848+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f884c00f980 con 0x7f885c071980 2026-03-09T17:31:47.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.849+0000 7f8843fff700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 0x7f8844079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:47.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.849+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f884c099b10 con 0x7f885c071980 2026-03-09T17:31:47.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.849+0000 7f885a7fc700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 0x7f8844079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:47.852 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.850+0000 7f885a7fc700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 0x7f8844079ef0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f8854007590 tx=0x7f88540074a0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-09T17:31:47.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:47.851+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f884c0621a0 con 0x7f885c071980 2026-03-09T17:31:48.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.044+0000 7f88616a4700 1 -- 192.168.123.106:0/4043691974 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8848005cc0 con 0x7f885c071980 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.046+0000 7f8843fff700 1 -- 192.168.123.106:0/4043691974 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f884c0618f0 con 0x7f885c071980 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 
2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:31:48.048 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:31:48.050 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.048+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 msgr2=0x7f8844079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.048+0000 7f8841ffb700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 0x7f8844079ef0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f8854007590 tx=0x7f88540074a0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 msgr2=0x7f885c1b6010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 --2- 
192.168.123.106:0/4043691974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f884c00b770 tx=0x7f884c00bb30 comp rx=0 tx=0).stop 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 shutdown_connections 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8844077a40 0x7f8844079ef0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f885c071980 0x7f885c1b6010 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 --2- 192.168.123.106:0/4043691974 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f885c1b6550 0x7f885c07f530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 >> 192.168.123.106:0/4043691974 conn(0x7f885c06d1a0 msgr2=0x7f885c0763b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 shutdown_connections 2026-03-09T17:31:48.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.049+0000 7f8841ffb700 1 -- 192.168.123.106:0/4043691974 wait complete. 
2026-03-09T17:31:48.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:47 vm09.local ceph-mon[97995]: pgmap v35: 65 pgs: 12 active+recovery_wait+degraded, 1 active+undersized+remapped, 2 active+recovering, 50 active+clean; 284 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 951 KiB/s rd, 1.3 MiB/s wr, 394 op/s; 875/40596 objects degraded (2.155%); 2.0 MiB/s, 6 objects/s recovering 2026-03-09T17:31:48.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:47 vm09.local ceph-mon[97995]: from='client.44121 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:48.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/2903609756 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8071980 msgr2=0x7f8cd8071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 --2- 192.168.123.106:0/2903609756 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8071980 0x7f8cd8071d90 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f8cc8009a60 tx=0x7f8cc8009d70 comp rx=0 tx=0).stop 2026-03-09T17:31:48.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/2903609756 shutdown_connections 2026-03-09T17:31:48.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 --2- 192.168.123.106:0/2903609756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8cd8072360 0x7f8cd80770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 --2- 192.168.123.106:0/2903609756 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8071980 0x7f8cd8071d90 unknown :-1 s=CLOSED 
pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/2903609756 >> 192.168.123.106:0/2903609756 conn(0x7f8cd806d1a0 msgr2=0x7f8cd806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/2903609756 shutdown_connections 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.155+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/2903609756 wait complete. 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 Processor -- start 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 -- start start 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8cd8082a10 0x7f8cd8082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8cd81b2a90 con 0x7f8cd8082a10 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cdcbe2700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8cd81b2bd0 con 0x7f8cd8072360 2026-03-09T17:31:48.158 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34788/0 (socket says 192.168.123.106:34788) 2026-03-09T17:31:48.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 -- 192.168.123.106:0/4079081556 learned_addr learned my addr 192.168.123.106:0/4079081556 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:48.159 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 -- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8cd8082a10 msgr2=0x7f8cd8082e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.159 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8cd8082a10 0x7f8cd8082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.159 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.156+0000 7f8cd659c700 1 -- 192.168.123.106:0/4079081556 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8cc8009710 con 0x7f8cd8072360 2026-03-09T17:31:48.159 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.157+0000 7f8cd659c700 1 --2- 
192.168.123.106:0/4079081556 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8cc8004570 tx=0x7f8cc8005d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.159 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.157+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc801c070 con 0x7f8cd8072360 2026-03-09T17:31:48.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.157+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/4079081556 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8cd81b2d10 con 0x7f8cd8072360 2026-03-09T17:31:48.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.157+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/4079081556 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8cd81b3180 con 0x7f8cd8072360 2026-03-09T17:31:48.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.158+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8cc8003c30 con 0x7f8cd8072360 2026-03-09T17:31:48.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.158+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc8020620 con 0x7f8cd8072360 2026-03-09T17:31:48.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.158+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/4079081556 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8cb8005320 con 0x7f8cd8072360 2026-03-09T17:31:48.161 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.159+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8cc800f460 con 0x7f8cd8072360 2026-03-09T17:31:48.162 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.160+0000 7f8cc77fe700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 0x7f8cc007c100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.162 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.160+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8cc809b460 con 0x7f8cd8072360 2026-03-09T17:31:48.162 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.160+0000 7f8cd5d9b700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 0x7f8cc007c100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.163 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.161+0000 7f8cd5d9b700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 0x7f8cc007c100 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8cd00076d0 tx=0x7f8cd0007820 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.162+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8cc8063af0 con 0x7f8cd8072360 
2026-03-09T17:31:48.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.348+0000 7f8cdcbe2700 1 -- 192.168.123.106:0/4079081556 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8cb8005cc0 con 0x7f8cd8072360 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.349+0000 7f8cc77fe700 1 -- 192.168.123.106:0/4079081556 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7f8cc8063240 con 0x7f8cd8072360 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:31:48.351 
INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:31:48.351 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:31:48.352 
INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:48.352 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:31:48.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 msgr2=0x7f8cc007c100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 0x7f8cc007c100 secure :-1 
s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8cd00076d0 tx=0x7f8cd0007820 comp rx=0 tx=0).stop 2026-03-09T17:31:48.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 192.168.123.106:0/4079081556 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 msgr2=0x7f8cd80824d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8cc8004570 tx=0x7f8cc8005d40 comp rx=0 tx=0).stop 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 192.168.123.106:0/4079081556 shutdown_connections 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8cc0079c50 0x7f8cc007c100 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8cd8072360 0x7f8cd80824d0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 --2- 192.168.123.106:0/4079081556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8cd8082a10 0x7f8cd8082e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 
192.168.123.106:0/4079081556 >> 192.168.123.106:0/4079081556 conn(0x7f8cd806d1a0 msgr2=0x7f8cd8070590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 192.168.123.106:0/4079081556 shutdown_connections 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.352+0000 7f8cc57fa700 1 -- 192.168.123.106:0/4079081556 wait complete. 2026-03-09T17:31:48.355 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/3994315861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64071980 msgr2=0x7f9d64071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 --2- 192.168.123.106:0/3994315861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64071980 0x7f9d64071d90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f9d600077e0 tx=0x7f9d60007af0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/3994315861 shutdown_connections 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 --2- 192.168.123.106:0/3994315861 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64072360 0x7f9d640770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 --2- 192.168.123.106:0/3994315861 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64071980 0x7f9d64071d90 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/3994315861 >> 192.168.123.106:0/3994315861 conn(0x7f9d6406d1a0 msgr2=0x7f9d6406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/3994315861 shutdown_connections 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.438+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/3994315861 wait complete. 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 Processor -- start 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 -- start start 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64072360 0x7f9d64082550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d6412dd80 con 0x7f9d64072360 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d6b2e6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d6412def0 con 0x7f9d64082a90 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d69082700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64072360 0x7f9d64082550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d68881700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d68881700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34810/0 (socket says 192.168.123.106:34810) 2026-03-09T17:31:48.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.439+0000 7f9d68881700 1 -- 192.168.123.106:0/21370473 learned_addr learned my addr 192.168.123.106:0/21370473 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:48.443 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d68881700 1 -- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64072360 msgr2=0x7f9d64082550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.443 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d68881700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64072360 0x7f9d64082550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.443 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d68881700 1 -- 192.168.123.106:0/21370473 --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d60007430 con 0x7f9d64082a90 2026-03-09T17:31:48.443 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d68881700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9d5c007f00 tx=0x7f9d5c00d3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.443 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d5c00dcf0 con 0x7f9d64082a90 2026-03-09T17:31:48.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/21370473 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d6412e170 con 0x7f9d64082a90 2026-03-09T17:31:48.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.441+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/21370473 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d6412e6c0 con 0x7f9d64082a90 2026-03-09T17:31:48.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.442+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d5c00f040 con 0x7f9d64082a90 2026-03-09T17:31:48.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.442+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d5c0127c0 con 0x7f9d64082a90 2026-03-09T17:31:48.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.442+0000 7f9d6b2e6700 1 -- 
192.168.123.106:0/21370473 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d48005320 con 0x7f9d64082a90 2026-03-09T17:31:48.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.443+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9d5c004ad0 con 0x7f9d64082a90 2026-03-09T17:31:48.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.444+0000 7f9d5a7fc700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 0x7f9d50079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.444+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9d5c099410 con 0x7f9d64082a90 2026-03-09T17:31:48.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.447+0000 7f9d69082700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 0x7f9d50079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.447+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9d5c061aa0 con 0x7f9d64082a90 2026-03-09T17:31:48.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.481+0000 7f9d69082700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 
0x7f9d50079f00 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9d60000c00 tx=0x7f9d60014040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.652 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.650+0000 7f9d6b2e6700 1 -- 192.168.123.106:0/21370473 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9d48000bf0 con 0x7f9d50077a50 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.651+0000 7f9d5a7fc700 1 -- 192.168.123.106:0/21370473 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9d48000bf0 con 0x7f9d50077a50 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "crash", 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:31:48.653 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "7/23 daemons upgraded", 2026-03-09T17:31:48.654 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T17:31:48.654 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:31:48.654 
INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:31:48.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.655+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 msgr2=0x7f9d50079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.655+0000 7f9d4ffff700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 0x7f9d50079f00 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9d60000c00 tx=0x7f9d60014040 comp rx=0 tx=0).stop 2026-03-09T17:31:48.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.655+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 msgr2=0x7f9d64082f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.657 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.655+0000 7f9d4ffff700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9d5c007f00 tx=0x7f9d5c00d3b0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 shutdown_connections 2026-03-09T17:31:48.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9d50077a50 0x7f9d50079f00 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 --2- 192.168.123.106:0/21370473 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9d64072360 0x7f9d64082550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 --2- 192.168.123.106:0/21370473 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9d64082a90 0x7f9d64082f00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 >> 192.168.123.106:0/21370473 conn(0x7f9d6406d1a0 msgr2=0x7f9d6406e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 shutdown_connections 2026-03-09T17:31:48.659 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.656+0000 7f9d4ffff700 1 -- 192.168.123.106:0/21370473 wait complete. 
2026-03-09T17:31:48.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.765+0000 7f2d1eda2700 1 -- 192.168.123.106:0/3017523522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18075700 msgr2=0x7f2d18075b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.765+0000 7f2d1eda2700 1 --2- 192.168.123.106:0/3017523522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18075700 0x7f2d18075b10 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2d1000b3a0 tx=0x7f2d1000b6b0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.766+0000 7f2d1eda2700 1 -- 192.168.123.106:0/3017523522 shutdown_connections 2026-03-09T17:31:48.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.766+0000 7f2d1eda2700 1 --2- 192.168.123.106:0/3017523522 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d18076950 0x7f2d18076dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.766+0000 7f2d1eda2700 1 --2- 192.168.123.106:0/3017523522 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18075700 0x7f2d18075b10 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.766+0000 7f2d1eda2700 1 -- 192.168.123.106:0/3017523522 >> 192.168.123.106:0/3017523522 conn(0x7f2d180fda80 msgr2=0x7f2d180ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:48.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.767+0000 7f2d1eda2700 1 -- 192.168.123.106:0/3017523522 shutdown_connections 2026-03-09T17:31:48.769 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.767+0000 7f2d1eda2700 1 -- 192.168.123.106:0/3017523522 
wait complete. 2026-03-09T17:31:48.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 Processor -- start 2026-03-09T17:31:48.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 -- start start 2026-03-09T17:31:48.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18076950 0x7f2d1819c1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d1819cc20 con 0x7f2d18076950 2026-03-09T17:31:48.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.773+0000 7f2d1eda2700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d1819cd90 con 0x7f2d1819c720 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:34830/0 (socket says 192.168.123.106:34830) 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 -- 192.168.123.106:0/4107724703 learned_addr learned my addr 192.168.123.106:0/4107724703 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 -- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18076950 msgr2=0x7f2d1819c1e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18076950 0x7f2d1819c1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 -- 192.168.123.106:0/4107724703 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d1000b050 con 0x7f2d1819c720 2026-03-09T17:31:48.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.774+0000 7f2d1d59f700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2d1400eb10 tx=0x7f2d1400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.779 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.777+0000 7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d1400cca0 con 0x7f2d1819c720 2026-03-09T17:31:48.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.778+0000 
7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d1400ce00 con 0x7f2d1819c720 2026-03-09T17:31:48.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.778+0000 7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d14018910 con 0x7f2d1819c720 2026-03-09T17:31:48.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.778+0000 7f2d1eda2700 1 -- 192.168.123.106:0/4107724703 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d181a1d30 con 0x7f2d1819c720 2026-03-09T17:31:48.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.778+0000 7f2d1eda2700 1 -- 192.168.123.106:0/4107724703 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d181a2250 con 0x7f2d1819c720 2026-03-09T17:31:48.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.778+0000 7f2d1eda2700 1 -- 192.168.123.106:0/4107724703 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d1804ea50 con 0x7f2d1819c720 2026-03-09T17:31:48.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.780+0000 7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f2d14018a70 con 0x7f2d1819c720 2026-03-09T17:31:48.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.781+0000 7f2d0effd700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 0x7f2d04079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:31:48.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.781+0000 7f2d0effd700 1 -- 
192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2d14014070 con 0x7f2d1819c720 2026-03-09T17:31:48.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.781+0000 7f2d1dda0700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 0x7f2d04079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:31:48.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.782+0000 7f2d1dda0700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 0x7f2d04079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f2d1000ba80 tx=0x7f2d10008d00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:31:48.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:48.784+0000 7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2d14063390 con 0x7f2d1819c720 2026-03-09T17:31:49.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.025+0000 7f2d1eda2700 1 -- 192.168.123.106:0/4107724703 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f2d181a2660 con 0x7f2d1819c720 2026-03-09T17:31:49.027 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:48 vm06.local ceph-mon[109831]: from='client.44125 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:49.028 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:48 vm06.local ceph-mon[109831]: from='client.44129 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:49.028 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:48 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4043691974' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:49.028 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:48 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4079081556' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.027+0000 7f2d0effd700 1 -- 192.168.123.106:0/4107724703 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+917 (secure 0 0 0) 0x7f2d14062ae0 con 0x7f2d1819c720 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_WARN Degraded data redundancy: 875/40596 objects degraded (2.155%), 12 pgs degraded 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 875/40596 objects degraded (2.155%), 12 pgs degraded 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1 is active+recovery_wait+degraded, acting [0,4,2] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.6 is active+recovery_wait+degraded, acting [0,1,4] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.c is active+recovery_wait+degraded, acting [5,0,3] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.f is active+recovery_wait+degraded, acting [5,3,0] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1] 
2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.17 is active+recovery_wait+degraded, acting [0,5,2] 2026-03-09T17:31:49.029 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1] 2026-03-09T17:31:49.030 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1b is active+recovery_wait+degraded, acting [0,4,3] 2026-03-09T17:31:49.030 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1f is active+recovery_wait+degraded, acting [0,3,2] 2026-03-09T17:31:49.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.030+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 msgr2=0x7f2d04079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:49.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.031+0000 7f2d0cff9700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 0x7f2d04079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f2d1000ba80 tx=0x7f2d10008d00 comp rx=0 tx=0).stop 2026-03-09T17:31:49.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.031+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 msgr2=0x7f2d181a1790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:31:49.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.031+0000 7f2d0cff9700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2d1400eb10 tx=0x7f2d1400eed0 comp rx=0 tx=0).stop 2026-03-09T17:31:49.035 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 shutdown_connections 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2d04077b10 0x7f2d04079fc0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2d18076950 0x7f2d1819c1e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 --2- 192.168.123.106:0/4107724703 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2d1819c720 0x7f2d181a1790 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 >> 192.168.123.106:0/4107724703 conn(0x7f2d180fda80 msgr2=0x7f2d180ffd20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 shutdown_connections 2026-03-09T17:31:49.035 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:31:49.033+0000 7f2d0cff9700 1 -- 192.168.123.106:0/4107724703 wait complete. 
2026-03-09T17:31:49.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:48 vm09.local ceph-mon[97995]: from='client.44125 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:49.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:48 vm09.local ceph-mon[97995]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:49.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:48 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4043691974' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:31:49.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:48 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4079081556' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:31:50.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:49 vm06.local ceph-mon[109831]: pgmap v36: 65 pgs: 12 active+recovery_wait+degraded, 1 active+undersized+remapped, 2 active+recovering, 50 active+clean; 284 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 656 KiB/s rd, 935 KiB/s wr, 271 op/s; 875/40596 objects degraded (2.155%); 1.4 MiB/s, 4 objects/s recovering 2026-03-09T17:31:50.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:49 vm06.local ceph-mon[109831]: from='client.44141 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:50.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:49 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/4107724703' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:31:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:49 vm09.local ceph-mon[97995]: pgmap v36: 65 pgs: 12 active+recovery_wait+degraded, 1 active+undersized+remapped, 2 active+recovering, 50 active+clean; 284 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 656 KiB/s rd, 935 KiB/s wr, 271 op/s; 875/40596 objects degraded (2.155%); 1.4 MiB/s, 4 objects/s recovering 2026-03-09T17:31:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:49 vm09.local ceph-mon[97995]: from='client.44141 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:31:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:49 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4107724703' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:31:52.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:51 vm09.local ceph-mon[97995]: pgmap v37: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.7 MiB/s wr, 449 op/s; 875/36819 objects degraded (2.376%); 1.2 MiB/s, 16 objects/s recovering 2026-03-09T17:31:52.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:51 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 875/36819 objects degraded (2.376%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:31:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:51 vm06.local ceph-mon[109831]: pgmap v37: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.7 MiB/s wr, 449 op/s; 875/36819 objects degraded (2.376%); 1.2 MiB/s, 16 objects/s recovering 2026-03-09T17:31:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:31:51 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 875/36819 objects degraded (2.376%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:31:54.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:54 vm06.local ceph-mon[109831]: pgmap v38: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.4 MiB/s wr, 423 op/s; 875/35826 objects degraded (2.442%); 1020 KiB/s, 14 objects/s recovering 2026-03-09T17:31:54.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:54 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:31:54.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:54 vm09.local ceph-mon[97995]: pgmap v38: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.4 MiB/s wr, 423 op/s; 875/35826 objects degraded (2.442%); 1020 KiB/s, 14 objects/s recovering 2026-03-09T17:31:54.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:54 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:31:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:55 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:31:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:55 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T17:31:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:55 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:31:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:55 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T17:31:56.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:56 vm06.local ceph-mon[109831]: pgmap v39: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 831 KiB/s rd, 652 KiB/s wr, 213 op/s; 875/35826 objects degraded (2.442%); 0 B/s, 9 objects/s recovering 2026-03-09T17:31:56.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:56 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 875/35826 objects degraded (2.442%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:31:56.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:56 vm09.local ceph-mon[97995]: pgmap v39: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 831 KiB/s rd, 652 KiB/s wr, 213 op/s; 875/35826 objects degraded (2.442%); 0 B/s, 9 objects/s recovering 2026-03-09T17:31:56.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:56 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 875/35826 objects degraded (2.442%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:31:57.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:57 vm06.local ceph-mon[109831]: pgmap v40: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 282 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.1 MiB/s wr, 514 op/s; 810/33108 objects degraded (2.447%); 0 B/s, 15 objects/s recovering 2026-03-09T17:31:57.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:57 vm09.local ceph-mon[97995]: pgmap v40: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 282 MiB 
data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.1 MiB/s wr, 514 op/s; 810/33108 objects degraded (2.447%); 0 B/s, 15 objects/s recovering 2026-03-09T17:31:59.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:31:59.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:00.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:31:59 vm06.local ceph-mon[109831]: pgmap v41: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 282 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.0 MiB/s wr, 459 op/s; 810/33108 objects degraded (2.447%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:00.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:31:59 vm09.local ceph-mon[97995]: pgmap v41: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 282 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.0 MiB/s wr, 459 op/s; 810/33108 objects degraded (2.447%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:02.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:01 vm09.local ceph-mon[97995]: pgmap v42: 65 pgs: 10 active+recovery_wait+degraded, 2 active+recovering, 53 active+clean; 276 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.6 MiB/s wr, 590 op/s; 749/29838 objects degraded (2.510%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:02.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:01 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 749/29838 objects degraded (2.510%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:02.393 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:01 vm06.local ceph-mon[109831]: pgmap v42: 65 pgs: 10 active+recovery_wait+degraded, 2 active+recovering, 53 active+clean; 276 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.6 MiB/s rd, 1.6 MiB/s wr, 590 op/s; 749/29838 objects degraded (2.510%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:02.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:01 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 749/29838 objects degraded (2.510%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:04.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:04 vm06.local ceph-mon[109831]: pgmap v43: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 274 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 1.1 MiB/s wr, 495 op/s; 749/28908 objects degraded (2.591%); 0 B/s, 9 objects/s recovering 2026-03-09T17:32:04.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:04 vm09.local ceph-mon[97995]: pgmap v43: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 274 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 1.1 MiB/s wr, 495 op/s; 749/28908 objects degraded (2.591%); 0 B/s, 9 objects/s recovering 2026-03-09T17:32:05.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:05 vm06.local ceph-mon[109831]: pgmap v44: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 274 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 1.1 MiB/s wr, 467 op/s; 749/28908 objects degraded (2.591%); 0 B/s, 9 objects/s recovering 2026-03-09T17:32:05.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:05 vm09.local ceph-mon[97995]: pgmap v44: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 274 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 1.1 MiB/s wr, 467 op/s; 749/28908 objects degraded (2.591%); 0 B/s, 9 objects/s 
recovering 2026-03-09T17:32:07.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:06 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 749/28908 objects degraded (2.591%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:07.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:06 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 749/28908 objects degraded (2.591%), 10 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:08.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:08 vm06.local ceph-mon[109831]: pgmap v45: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.6 MiB/s wr, 616 op/s; 749/24348 objects degraded (3.076%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:08 vm09.local ceph-mon[97995]: pgmap v45: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.6 MiB/s wr, 616 op/s; 749/24348 objects degraded (3.076%); 0 B/s, 13 objects/s recovering 2026-03-09T17:32:09.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:09.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:10 vm06.local ceph-mon[109831]: pgmap v46: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 984 KiB/s rd, 1.1 MiB/s wr, 307 op/s; 749/24348 objects 
degraded (3.076%); 0 B/s, 6 objects/s recovering 2026-03-09T17:32:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:10 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:10 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T17:32:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:10 vm09.local ceph-mon[97995]: pgmap v46: 65 pgs: 10 active+recovery_wait+degraded, 1 active+recovering, 54 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 984 KiB/s rd, 1.1 MiB/s wr, 307 op/s; 749/24348 objects degraded (3.076%); 0 B/s, 6 objects/s recovering 2026-03-09T17:32:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:10 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:10 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T17:32:12.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:12 vm09.local ceph-mon[97995]: pgmap v47: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 266 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 431 op/s; 674/21156 objects degraded (3.186%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:12.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:12 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 674/21156 objects degraded (3.186%), 9 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:12.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:12 vm06.local ceph-mon[109831]: pgmap v47: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 266 MiB 
data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 431 op/s; 674/21156 objects degraded (3.186%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:12.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:12 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 674/21156 objects degraded (3.186%), 9 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:14.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:14 vm06.local ceph-mon[109831]: pgmap v48: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 261 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 913 KiB/s rd, 949 KiB/s wr, 318 op/s; 674/20508 objects degraded (3.287%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:14.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:14.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:14 vm09.local ceph-mon[97995]: pgmap v48: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 261 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 913 KiB/s rd, 949 KiB/s wr, 318 op/s; 674/20508 objects degraded (3.287%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:14.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:15 vm06.local ceph-mon[109831]: pgmap v49: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 261 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 913 KiB/s rd, 949 KiB/s wr, 290 op/s; 674/20508 objects degraded (3.287%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:15.894 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:15 vm09.local ceph-mon[97995]: pgmap v49: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 261 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 913 KiB/s rd, 949 KiB/s wr, 290 op/s; 674/20508 objects degraded (3.287%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:18.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:18 vm06.local ceph-mon[109831]: pgmap v50: 65 pgs: 8 active+recovery_wait+degraded, 2 active+recovering, 55 active+clean; 260 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 437 op/s; 601/16458 objects degraded (3.652%); 0 B/s, 11 objects/s recovering 2026-03-09T17:32:18.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:18 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 601/16458 objects degraded (3.652%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:18 vm09.local ceph-mon[97995]: pgmap v50: 65 pgs: 8 active+recovery_wait+degraded, 2 active+recovering, 55 active+clean; 260 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 437 op/s; 601/16458 objects degraded (3.652%); 0 B/s, 11 objects/s recovering 2026-03-09T17:32:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:18 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 601/16458 objects degraded (3.652%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 -- 192.168.123.106:0/1534755774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68072360 msgr2=0x7fda680770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 --2- 192.168.123.106:0/1534755774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fda68072360 0x7fda680770e0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fda6000a200 tx=0x7fda6000a510 comp rx=0 tx=0).stop 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 -- 192.168.123.106:0/1534755774 shutdown_connections 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 --2- 192.168.123.106:0/1534755774 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68072360 0x7fda680770e0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 --2- 192.168.123.106:0/1534755774 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68071980 0x7fda68071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 -- 192.168.123.106:0/1534755774 >> 192.168.123.106:0/1534755774 conn(0x7fda6806d1a0 msgr2=0x7fda6806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 -- 192.168.123.106:0/1534755774 shutdown_connections 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.143+0000 7fda6da7f700 1 -- 192.168.123.106:0/1534755774 wait complete. 
2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 Processor -- start 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 -- start start 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68071980 0x7fda68082500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda681b2a90 con 0x7fda68082a40 2026-03-09T17:32:19.146 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda6da7f700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda681b2bd0 con 0x7fda68071980 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda66ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda66ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:58336/0 (socket says 192.168.123.106:58336) 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda66ffd700 1 -- 192.168.123.106:0/1176615696 learned_addr learned my addr 192.168.123.106:0/1176615696 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.144+0000 7fda677fe700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68071980 0x7fda68082500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda66ffd700 1 -- 192.168.123.106:0/1176615696 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68071980 msgr2=0x7fda68082500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda66ffd700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68071980 0x7fda68082500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda66ffd700 1 -- 192.168.123.106:0/1176615696 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda60009e30 con 0x7fda68082a40 2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda66ffd700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fda60000f80 tx=0x7fda600079a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:32:19.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda6000a860 con 0x7fda68082a40 2026-03-09T17:32:19.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda6da7f700 1 -- 192.168.123.106:0/1176615696 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda681b2d10 con 0x7fda68082a40 2026-03-09T17:32:19.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.145+0000 7fda6da7f700 1 -- 192.168.123.106:0/1176615696 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda681b3200 con 0x7fda68082a40 2026-03-09T17:32:19.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.146+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fda60004890 con 0x7fda68082a40 2026-03-09T17:32:19.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.146+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda60027430 con 0x7fda68082a40 2026-03-09T17:32:19.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.147+0000 7fda6da7f700 1 -- 192.168.123.106:0/1176615696 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fda6807c8b0 con 0x7fda68082a40 2026-03-09T17:32:19.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.148+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fda6000a9c0 con 0x7fda68082a40 2026-03-09T17:32:19.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.148+0000 
7fda64ff9700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fda50077a50 0x7fda50079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.148+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fda6002c080 con 0x7fda68082a40 2026-03-09T17:32:19.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.148+0000 7fda677fe700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fda50077a50 0x7fda50079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.150+0000 7fda677fe700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fda50077a50 0x7fda50079f00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fda58005fd0 tx=0x7fda58014040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.153 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.151+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fda600630c0 con 0x7fda68082a40 2026-03-09T17:32:19.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.309+0000 7fda6da7f700 1 -- 192.168.123.106:0/1176615696 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fda6802d080 con 0x7fda50077a50 
2026-03-09T17:32:19.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.310+0000 7fda64ff9700 1 -- 192.168.123.106:0/1176615696 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fda6802d080 con 0x7fda50077a50 2026-03-09T17:32:19.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.313+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fda50077a50 msgr2=0x7fda50079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.313+0000 7fda4e7fc700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fda50077a50 0x7fda50079f00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fda58005fd0 tx=0x7fda58014040 comp rx=0 tx=0).stop 2026-03-09T17:32:19.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.313+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 msgr2=0x7fda68082eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.313+0000 7fda4e7fc700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fda60000f80 tx=0x7fda600079a0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 shutdown_connections 2026-03-09T17:32:19.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
conn(0x7fda50077a50 0x7fda50079f00 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fda68071980 0x7fda68082500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 --2- 192.168.123.106:0/1176615696 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda68082a40 0x7fda68082eb0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 >> 192.168.123.106:0/1176615696 conn(0x7fda6806d1a0 msgr2=0x7fda68076460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 shutdown_connections 2026-03-09T17:32:19.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.316+0000 7fda4e7fc700 1 -- 192.168.123.106:0/1176615696 wait complete. 
2026-03-09T17:32:19.330 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:32:19.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.399+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/276433322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 msgr2=0x7f5bd81006f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.401 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.399+0000 7f5bdfc9c700 1 --2- 192.168.123.106:0/276433322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81006f0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f5bd4009b50 tx=0x7f5bd4009e60 comp rx=0 tx=0).stop 2026-03-09T17:32:19.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/276433322 shutdown_connections 2026-03-09T17:32:19.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 --2- 192.168.123.106:0/276433322 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 0x7f5bd8101930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 --2- 192.168.123.106:0/276433322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81006f0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/276433322 >> 192.168.123.106:0/276433322 conn(0x7f5bd80fb890 msgr2=0x7f5bd80fdcc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.402 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/276433322 shutdown_connections 2026-03-09T17:32:19.402 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.400+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/276433322 wait complete. 2026-03-09T17:32:19.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.401+0000 7f5bdfc9c700 1 Processor -- start 2026-03-09T17:32:19.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.401+0000 7f5bdfc9c700 1 -- start start 2026-03-09T17:32:19.403 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.401+0000 7f5bdfc9c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81959e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81959e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdfc9c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 0x7f5bd8195f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81959e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58352/0 (socket says 192.168.123.106:58352) 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 -- 192.168.123.106:0/1769618274 learned_addr learned my addr 192.168.123.106:0/1769618274 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:19.404 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bd8196540 con 0x7f5bd81002e0 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bd8196680 con 0x7f5bd81014e0 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdd237700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 0x7f5bd8195f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 -- 192.168.123.106:0/1769618274 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 msgr2=0x7f5bd8195f20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 0x7f5bd8195f20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.404 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.402+0000 7f5bdda38700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5bd40097e0 con 0x7f5bd81002e0 2026-03-09T17:32:19.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.403+0000 7f5bdda38700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f5bd81002e0 0x7f5bd81959e0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f5bd4005740 tx=0x7f5bd4005770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.405 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.403+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bd401d070 con 0x7f5bd81002e0 2026-03-09T17:32:19.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.403+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5bd819b0d0 con 0x7f5bd81002e0 2026-03-09T17:32:19.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.403+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5bd819b5c0 con 0x7f5bd81002e0 2026-03-09T17:32:19.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.404+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5bd400bcb0 con 0x7f5bd81002e0 2026-03-09T17:32:19.406 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.404+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bd400f810 con 0x7f5bd81002e0 2026-03-09T17:32:19.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.404+0000 7f5bdfc9c700 1 -- 192.168.123.106:0/1769618274 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5bbc005320 con 0x7f5bd81002e0 2026-03-09T17:32:19.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.406+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f5bd400f970 con 0x7f5bd81002e0 2026-03-09T17:32:19.408 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.406+0000 7f5bceffd700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 0x7f5bc4079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.409 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.406+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f5bd409b8c0 con 0x7f5bd81002e0 2026-03-09T17:32:19.409 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.407+0000 7f5bdd237700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 0x7f5bc4079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.410 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.408+0000 7f5bdd237700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 0x7f5bc4079eb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f5bc8005950 tx=0x7f5bc800b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.408+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5bd4064000 con 0x7f5bd81002e0 2026-03-09T17:32:19.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.558+0000 7f5bdfc9c700 1 -- 
192.168.123.106:0/1769618274 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5bbc000bf0 con 0x7f5bc4077a00 2026-03-09T17:32:19.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.559+0000 7f5bceffd700 1 -- 192.168.123.106:0/1769618274 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f5bbc000bf0 con 0x7f5bc4077a00 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 msgr2=0x7f5bc4079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 0x7f5bc4079eb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f5bc8005950 tx=0x7f5bc800b410 comp rx=0 tx=0).stop 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 msgr2=0x7f5bd81959e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81959e0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f5bd4005740 tx=0x7f5bd4005770 comp rx=0 tx=0).stop 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 shutdown_connections 
2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5bc4077a00 0x7f5bc4079eb0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5bd81002e0 0x7f5bd81959e0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 --2- 192.168.123.106:0/1769618274 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5bd81014e0 0x7f5bd8195f20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 >> 192.168.123.106:0/1769618274 conn(0x7f5bd80fb890 msgr2=0x7f5bd80fdb50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 shutdown_connections 2026-03-09T17:32:19.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.565+0000 7f5bccff9700 1 -- 192.168.123.106:0/1769618274 wait complete. 
2026-03-09T17:32:19.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.642+0000 7fb97bfff700 1 -- 192.168.123.106:0/2822648486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c071a60 msgr2=0x7fb97c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.644 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.642+0000 7fb97bfff700 1 --2- 192.168.123.106:0/2822648486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c071a60 0x7fb97c071e70 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fb96c009b00 tx=0x7fb96c009e10 comp rx=0 tx=0).stop 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- 192.168.123.106:0/2822648486 shutdown_connections 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 --2- 192.168.123.106:0/2822648486 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c072440 0x7fb97c10be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 --2- 192.168.123.106:0/2822648486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c071a60 0x7fb97c071e70 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- 192.168.123.106:0/2822648486 >> 192.168.123.106:0/2822648486 conn(0x7fb97c06d1a0 msgr2=0x7fb97c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- 192.168.123.106:0/2822648486 shutdown_connections 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- 192.168.123.106:0/2822648486 
wait complete. 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 Processor -- start 2026-03-09T17:32:19.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- start start 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 0x7fb97c1a4c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 0x7fb97c1a51b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb97c1a57d0 con 0x7fb97c072440 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.643+0000 7fb97bfff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb97c1aa190 con 0x7fb97c071a60 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb97affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 0x7fb97c1a4c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb973fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 0x7fb97c1a51b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb97affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 0x7fb97c1a4c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:44578/0 (socket says 192.168.123.106:44578) 2026-03-09T17:32:19.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb97affd700 1 -- 192.168.123.106:0/276763882 learned_addr learned my addr 192.168.123.106:0/276763882 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:19.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb973fff700 1 -- 192.168.123.106:0/276763882 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 msgr2=0x7fb97c1a4c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb973fff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 0x7fb97c1a4c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.644+0000 7fb973fff700 1 -- 192.168.123.106:0/276763882 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb96c0097e0 con 0x7fb97c072440 2026-03-09T17:32:19.651 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.649+0000 7fb973fff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 0x7fb97c1a51b0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb97c107dc0 tx=0x7fb96400dc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.656 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.652+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9640098e0 con 0x7fb97c072440 2026-03-09T17:32:19.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.653+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb96400de60 con 0x7fb97c072440 2026-03-09T17:32:19.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.653+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb96400f3c0 con 0x7fb97c072440 2026-03-09T17:32:19.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.653+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb97c1aa390 con 0x7fb97c072440 2026-03-09T17:32:19.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.653+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb97c1aa8b0 con 0x7fb97c072440 2026-03-09T17:32:19.656 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.654+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb97c19eba0 con 0x7fb97c072440 2026-03-09T17:32:19.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.655+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb964010460 con 0x7fb97c072440 2026-03-09T17:32:19.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.656+0000 7fb978ff9700 1 --2- 
192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 0x7fb968079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.658 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.656+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb964099a30 con 0x7fb97c072440 2026-03-09T17:32:19.660 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.658+0000 7fb97affd700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 0x7fb968079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.661 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.659+0000 7fb97affd700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 0x7fb968079eb0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fb96c005fd0 tx=0x7fb96c01a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.661 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.659+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb964062180 con 0x7fb97c072440 2026-03-09T17:32:19.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.807+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fb97c061190 con 0x7fb968077a00 2026-03-09T17:32:19.814 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.812+0000 7fb978ff9700 1 -- 192.168.123.106:0/276763882 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fb97c061190 con 0x7fb968077a00 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (6m) 43s ago 7m 25.9M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (7m) 43s ago 7m 8740k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (6m) 56s ago 6m 11.1M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (59s) 43s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (57s) 56s ago 6m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (6m) 43s ago 7m 94.1M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (5m) 43s ago 5m 16.0M - 18.2.0 dc2bc1663786 4b4cbdf0c640 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (5m) 43s ago 5m 240M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (5m) 56s ago 5m 145M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (5m) 56s ago 5m 17.5M - 18.2.0 dc2bc1663786 
8dc8a0159213 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (118s) 43s ago 7m 619M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (95s) 56s ago 6m 487M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (88s) 43s ago 8m 56.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (74s) 56s ago 6m 48.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (7m) 43s ago 7m 14.7M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (6m) 56s ago 6m 15.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (46s) 43s ago 6m 32.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (6m) 43s ago 6m 367M 4096M 18.2.0 dc2bc1663786 decf7e88ebaf 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (6m) 43s ago 6m 316M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (5m) 56s ago 5m 427M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (5m) 56s ago 5m 410M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (5m) 56s ago 5m 334M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:32:19.815 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 
vm06 *:9095 running (98s) 43s ago 7m 53.4M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.818+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 msgr2=0x7fb968079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.818+0000 7fb97bfff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 0x7fb968079eb0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fb96c005fd0 tx=0x7fb96c01a040 comp rx=0 tx=0).stop 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.818+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 msgr2=0x7fb97c1a51b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.818+0000 7fb97bfff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 0x7fb97c1a51b0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb97c107dc0 tx=0x7fb96400dc20 comp rx=0 tx=0).stop 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 shutdown_connections 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb968077a00 0x7fb968079eb0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 --2- 
192.168.123.106:0/276763882 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb97c071a60 0x7fb97c1a4c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 --2- 192.168.123.106:0/276763882 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb97c072440 0x7fb97c1a51b0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 >> 192.168.123.106:0/276763882 conn(0x7fb97c06d1a0 msgr2=0x7fb97c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 shutdown_connections 2026-03-09T17:32:19.821 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.819+0000 7fb97bfff700 1 -- 192.168.123.106:0/276763882 wait complete. 
2026-03-09T17:32:19.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 -- 192.168.123.106:0/3977676692 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698071980 msgr2=0x7f9698071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.923 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 --2- 192.168.123.106:0/3977676692 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698071980 0x7f9698071d90 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f9694005fd0 tx=0x7f96940078c0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 -- 192.168.123.106:0/3977676692 shutdown_connections 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 --2- 192.168.123.106:0/3977676692 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9698072360 0x7f96980770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 --2- 192.168.123.106:0/3977676692 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698071980 0x7f9698071d90 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 -- 192.168.123.106:0/3977676692 >> 192.168.123.106:0/3977676692 conn(0x7f969806d1a0 msgr2=0x7f969806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 -- 192.168.123.106:0/3977676692 shutdown_connections 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.921+0000 7f969fe4d700 1 -- 192.168.123.106:0/3977676692 
wait complete. 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 Processor -- start 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 -- start start 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9698072360 0x7f9698082560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96981b2a90 con 0x7f9698072360 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969fe4d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96981b2bd0 con 0x7f9698082aa0 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969d3e8700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969d3e8700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:44588/0 (socket says 192.168.123.106:44588) 2026-03-09T17:32:19.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.922+0000 7f969d3e8700 1 -- 192.168.123.106:0/337362580 learned_addr learned my addr 192.168.123.106:0/337362580 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:19.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969d3e8700 1 -- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9698072360 msgr2=0x7f9698082560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:19.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969d3e8700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9698072360 0x7f9698082560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:19.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969d3e8700 1 -- 192.168.123.106:0/337362580 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96940072c0 con 0x7f9698082aa0 2026-03-09T17:32:19.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969d3e8700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f969000e3c0 tx=0x7f969000e780 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.925 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f969000c170 con 0x7f9698082aa0 2026-03-09T17:32:19.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969fe4d700 1 
-- 192.168.123.106:0/337362580 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96981b2d70 con 0x7f9698082aa0 2026-03-09T17:32:19.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.923+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96981b3260 con 0x7f9698082aa0 2026-03-09T17:32:19.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.924+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f969000f040 con 0x7f9698082aa0 2026-03-09T17:32:19.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.924+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9690014720 con 0x7f9698082aa0 2026-03-09T17:32:19.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.925+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f96900148e0 con 0x7f9698082aa0 2026-03-09T17:32:19.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.927+0000 7f968effd700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 0x7f968407c100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:19.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.927+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f969009b040 con 0x7f9698082aa0 2026-03-09T17:32:19.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.927+0000 7f969dbe9700 1 --2- 192.168.123.106:0/337362580 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 0x7f968407c100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:19.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.927+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f967c005320 con 0x7f9698082aa0 2026-03-09T17:32:19.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.927+0000 7f969dbe9700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 0x7f968407c100 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f969400afd0 tx=0x7f969400c330 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:19.936 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:19.931+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9690063650 con 0x7f9698082aa0 2026-03-09T17:32:20.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.155+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f967c005cc0 con 0x7f9698082aa0 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.161+0000 7f968effd700 1 -- 192.168.123.106:0/337362580 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f9690062da0 con 0x7f9698082aa0 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: 
"mon": { 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:32:20.163 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 msgr2=0x7f968407c100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 0x7f968407c100 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f969400afd0 tx=0x7f969400c330 comp rx=0 tx=0).stop 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 msgr2=0x7f9698082f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f969000e3c0 tx=0x7f969000e780 comp rx=0 tx=0).stop 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 shutdown_connections 2026-03-09T17:32:20.166 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9684079c50 0x7f968407c100 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:20.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9698072360 0x7f9698082560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:20.167 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 --2- 192.168.123.106:0/337362580 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9698082aa0 0x7f9698082f10 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:20.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 >> 192.168.123.106:0/337362580 conn(0x7f969806d1a0 msgr2=0x7f9698070620 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:20.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 shutdown_connections 2026-03-09T17:32:20.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.164+0000 7f969fe4d700 1 -- 192.168.123.106:0/337362580 wait complete. 2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- 192.168.123.106:0/3474184000 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278072360 msgr2=0x7f92780770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 --2- 192.168.123.106:0/3474184000 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278072360 0x7f92780770e0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f927000d3f0 tx=0x7f927000d700 comp rx=0 tx=0).stop 2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- 192.168.123.106:0/3474184000 shutdown_connections 2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 --2- 192.168.123.106:0/3474184000 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278072360 0x7f92780770e0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 --2- 192.168.123.106:0/3474184000 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9278071980 0x7f9278071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:20.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- 192.168.123.106:0/3474184000 >> 192.168.123.106:0/3474184000 conn(0x7f927806d1a0 msgr2=0x7f927806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:20.262 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:20 vm06.local ceph-mon[109831]: pgmap v51: 65 pgs: 8 active+recovery_wait+degraded, 2 active+recovering, 55 active+clean; 260 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 968 KiB/s wr, 288 op/s; 601/16458 objects degraded (3.652%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- 192.168.123.106:0/3474184000 shutdown_connections 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- 192.168.123.106:0/3474184000 wait complete. 
2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 Processor -- start 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- start start 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9278071980 0x7f92780825c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92781b2a90 con 0x7f9278071980 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.260+0000 7f927f0a1700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92781b2bd0 con 0x7f9278082b00 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:44604/0 (socket says 192.168.123.106:44604) 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 -- 192.168.123.106:0/897676848 learned_addr learned my addr 192.168.123.106:0/897676848 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 -- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9278071980 msgr2=0x7f92780825c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9278071980 0x7f92780825c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:20.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 -- 192.168.123.106:0/897676848 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9270007ed0 con 0x7f9278082b00 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.261+0000 7f9277fff700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9270003fd0 tx=0x7f92700040b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.262+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f927001c070 con 0x7f9278082b00 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.262+0000 7f927f0a1700 1 -- 
192.168.123.106:0/897676848 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92781b2d70 con 0x7f9278082b00 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.262+0000 7f927f0a1700 1 -- 192.168.123.106:0/897676848 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92781b3260 con 0x7f9278082b00 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.262+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f927000f660 con 0x7f9278082b00 2026-03-09T17:32:20.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.262+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92700177f0 con 0x7f9278082b00 2026-03-09T17:32:20.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.264+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9270004660 con 0x7f9278082b00 2026-03-09T17:32:20.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.264+0000 7f9275ffb700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 0x7f926007c100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:20.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.265+0000 7f927ce3d700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 0x7f926007c100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:20.267 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.265+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9270013070 con 0x7f9278082b00 2026-03-09T17:32:20.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.263+0000 7f927f0a1700 1 -- 192.168.123.106:0/897676848 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9264005320 con 0x7f9278082b00 2026-03-09T17:32:20.269 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.267+0000 7f927ce3d700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 0x7f926007c100 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f9268005950 tx=0x7f92680058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:20.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.268+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9270063aa0 con 0x7f9278082b00 2026-03-09T17:32:20.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.439+0000 7f927f0a1700 1 -- 192.168.123.106:0/897676848 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9264005cc0 con 0x7f9278082b00 2026-03-09T17:32:20.442 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.440+0000 7f9275ffb700 1 -- 192.168.123.106:0/897676848 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7f9270025070 con 0x7f9278082b00 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:32:20.444 
INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:32:20.444 
INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout: 
2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons:
2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:32:20.444 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T17:32:20.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.446+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 msgr2=0x7f926007c100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.446+0000 7f925f7fe700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 0x7f926007c100 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f9268005950 tx=0x7f92680058e0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.446+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 msgr2=0x7f9278082f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.446+0000 7f925f7fe700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9270003fd0 tx=0x7f92700040b0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.446+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 shutdown_connections
2026-03-09T17:32:20.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.447+0000 7f925f7fe700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9260079c50 0x7f926007c100 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.447+0000 7f925f7fe700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9278071980 0x7f92780825c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.447+0000 7f925f7fe700 1 --2- 192.168.123.106:0/897676848 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9278082b00 0x7f9278082f70 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.447+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 >> 192.168.123.106:0/897676848 conn(0x7f927806d1a0 msgr2=0x7f9278076500 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:20.450 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.448+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 shutdown_connections
2026-03-09T17:32:20.450 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.448+0000 7f925f7fe700 1 -- 192.168.123.106:0/897676848 wait complete.
2026-03-09T17:32:20.452 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13
2026-03-09T17:32:20.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 -- 192.168.123.106:0/1544165140 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 msgr2=0x7f9e1010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 --2- 192.168.123.106:0/1544165140 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 0x7f9e1010beb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f9e0000b3a0 tx=0x7f9e0000b6b0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 -- 192.168.123.106:0/1544165140 shutdown_connections
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 --2- 192.168.123.106:0/1544165140 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 0x7f9e1010beb0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 --2- 192.168.123.106:0/1544165140 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e10071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 -- 192.168.123.106:0/1544165140 >> 192.168.123.106:0/1544165140 conn(0x7f9e1006d1a0 msgr2=0x7f9e1006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.542+0000 7f9e15050700 1 -- 192.168.123.106:0/1544165140 shutdown_connections
2026-03-09T17:32:20.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.543+0000 7f9e15050700 1 -- 192.168.123.106:0/1544165140 wait complete.
2026-03-09T17:32:20.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 Processor -- start
2026-03-09T17:32:20.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 -- start start
2026-03-09T17:32:20.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 0x7f9e101a4fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e101a55f0 con 0x7f9e10072470
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e15050700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e101a5730 con 0x7f9e10071a90
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e0ed9d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e0ed9d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:44624/0 (socket says 192.168.123.106:44624)
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.544+0000 7f9e0ed9d700 1 -- 192.168.123.106:0/1333217863 learned_addr learned my addr 192.168.123.106:0/1333217863 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.545+0000 7f9e0ed9d700 1 -- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 msgr2=0x7f9e101a4fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.545+0000 7f9e0ed9d700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 0x7f9e101a4fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.545+0000 7f9e0ed9d700 1 -- 192.168.123.106:0/1333217863 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9e0000b050 con 0x7f9e10071a90
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.545+0000 7f9e0ed9d700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9df800b700 tx=0x7f9df800bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:20.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.545+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df8010840 con 0x7f9e10071a90
2026-03-09T17:32:20.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.546+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e1010f620 con 0x7f9e10071a90
2026-03-09T17:32:20.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.546+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e1010fbf0 con 0x7f9e10071a90
2026-03-09T17:32:20.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.546+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9df8010e80 con 0x7f9e10071a90
2026-03-09T17:32:20.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.547+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df800d590 con 0x7f9e10071a90
2026-03-09T17:32:20.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.547+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9df800d770 con 0x7f9e10071a90
2026-03-09T17:32:20.550 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.548+0000 7f9e07fff700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 0x7f9dfc07a030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.550 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.548+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9df809a7c0 con 0x7f9e10071a90
2026-03-09T17:32:20.550 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.548+0000 7f9e0e59c700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 0x7f9dfc07a030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:20.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.549+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9df0005320 con 0x7f9e10071a90
2026-03-09T17:32:20.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.550+0000 7f9e0e59c700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 0x7f9dfc07a030 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f9e00009250 tx=0x7f9e0000bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:20.555 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.553+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9df8062e50 con 0x7f9e10071a90
2026-03-09T17:32:20.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:20 vm09.local ceph-mon[97995]: pgmap v51: 65 pgs: 8 active+recovery_wait+degraded, 2 active+recovering, 55 active+clean; 260 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 968 KiB/s wr, 288 op/s; 601/16458 objects degraded (3.652%); 0 B/s, 7 objects/s recovering
2026-03-09T17:32:20.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.695+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9df0000bf0 con 0x7f9dfc077b80
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.696+0000 7f9e07fff700 1 -- 192.168.123.106:0/1333217863 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9df0000bf0 con 0x7f9dfc077b80
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true,
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "mon",
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "crash",
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: "mgr"
2026-03-09T17:32:20.698 INFO:teuthology.orchestra.run.vm06.stdout: ],
2026-03-09T17:32:20.699 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "7/23 daemons upgraded",
2026-03-09T17:32:20.699 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T17:32:20.699 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false
2026-03-09T17:32:20.699 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:32:20.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.699+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 msgr2=0x7f9dfc07a030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.699+0000 7f9e15050700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 0x7f9dfc07a030 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f9e00009250 tx=0x7f9e0000bf90 comp rx=0 tx=0).stop
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 msgr2=0x7f9e101a4a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9df800b700 tx=0x7f9df800bac0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 shutdown_connections
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9dfc077b80 0x7f9dfc07a030 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9e10071a90 0x7f9e101a4a90 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.700+0000 7f9e15050700 1 --2- 192.168.123.106:0/1333217863 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e10072470 0x7f9e101a4fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.701+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 >> 192.168.123.106:0/1333217863 conn(0x7f9e1006d1a0 msgr2=0x7f9e1010b4e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:20.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.702+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 shutdown_connections
2026-03-09T17:32:20.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.702+0000 7f9e15050700 1 -- 192.168.123.106:0/1333217863 wait complete.
2026-03-09T17:32:20.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.783+0000 7fb61a955700 1 -- 192.168.123.106:0/3371026968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 msgr2=0x7fb614102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:20.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.783+0000 7fb61a955700 1 --2- 192.168.123.106:0/3371026968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb614102b70 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc009b50 tx=0x7fb5fc009e60 comp rx=0 tx=0).stop
2026-03-09T17:32:20.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 -- 192.168.123.106:0/3371026968 shutdown_connections
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 --2- 192.168.123.106:0/3371026968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614103960 0x7fb614103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 --2- 192.168.123.106:0/3371026968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614102760 0x7fb614102b70 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 -- 192.168.123.106:0/3371026968 >> 192.168.123.106:0/3371026968 conn(0x7fb6140fdcf0 msgr2=0x7fb614100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 -- 192.168.123.106:0/3371026968 shutdown_connections
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.784+0000 7fb61a955700 1 -- 192.168.123.106:0/3371026968 wait complete.
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 Processor -- start
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 -- start start
2026-03-09T17:32:20.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 0x7fb614198570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb614198bb0 con 0x7fb614103960
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb61a955700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb614198d20 con 0x7fb614102760
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.785+0000 7fb613fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb613fff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:44646/0 (socket says 192.168.123.106:44646)
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb613fff700 1 -- 192.168.123.106:0/4186962049 learned_addr learned my addr 192.168.123.106:0/4186962049 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb613fff700 1 -- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 msgr2=0x7fb614198570 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T17:32:20.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb6137fe700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 0x7fb614198570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:20.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb613fff700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 0x7fb614198570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:20.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.786+0000 7fb613fff700 1 -- 192.168.123.106:0/4186962049 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5fc0097e0 con 0x7fb614102760
2026-03-09T17:32:20.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb613fff700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc004ce0 tx=0x7fb5fc0057f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:20.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5fc01d070 con 0x7fb614102760
2026-03-09T17:32:20.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb61a955700 1 -- 192.168.123.106:0/4186962049 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb6140792f0 con 0x7fb614102760
2026-03-09T17:32:20.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb61a955700 1 -- 192.168.123.106:0/4186962049 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb6141a2090 con 0x7fb614102760
2026-03-09T17:32:20.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb5fc00bc30 con 0x7fb614102760
2026-03-09T17:32:20.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.787+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5fc00f780 con 0x7fb614102760
2026-03-09T17:32:20.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.788+0000 7fb6137fe700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 0x7fb614198570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.789+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb5fc00f8e0 con 0x7fb614102760
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.789+0000 7fb61a955700 1 -- 192.168.123.106:0/4186962049 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5f4005320 con 0x7fb614102760
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.789+0000 7fb6117fa700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 0x7fb600079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.789+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb5fc09b2b0 con 0x7fb614102760
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.789+0000 7fb6137fe700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 0x7fb600079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:20.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.790+0000 7fb6137fe700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 0x7fb600079eb0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb604005fd0 tx=0x7fb604005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:20.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:20.793+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb5fc063970 con 0x7fb614102760
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.007+0000 7fb61a955700 1 -- 192.168.123.106:0/4186962049 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb5f4005190 con 0x7fb614102760
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.010+0000 7fb6117fa700 1 -- 192.168.123.106:0/4186962049 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+672 (secure 0 0 0) 0x7fb5fc0630c0 con 0x7fb614102760
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_WARN Degraded data redundancy: 601/16458 objects degraded (3.652%), 8 pgs degraded
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 601/16458 objects degraded (3.652%), 8 pgs degraded
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1 is active+recovery_wait+degraded, acting [0,4,2]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.c is active+recovery_wait+degraded, acting [5,0,3]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.f is active+recovery_wait+degraded, acting [5,3,0]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.17 is active+recovery_wait+degraded, acting [0,5,2]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1b is active+recovery_wait+degraded, acting [0,4,3]
2026-03-09T17:32:21.012 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.1f is active+recovery_wait+degraded, acting [0,3,2]
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 msgr2=0x7fb600079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 0x7fb600079eb0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fb604005fd0 tx=0x7fb604005ee0 comp rx=0 tx=0).stop
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 msgr2=0x7fb614198030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb5fc004ce0 tx=0x7fb5fc0057f0 comp rx=0 tx=0).stop
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 shutdown_connections
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb600077a00 0x7fb600079eb0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb614102760 0x7fb614198030 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 --2- 192.168.123.106:0/4186962049 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb614103960 0x7fb614198570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:21.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 >> 192.168.123.106:0/4186962049 conn(0x7fb6140fdcf0 msgr2=0x7fb614106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:21.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 shutdown_connections
2026-03-09T17:32:21.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:21.014+0000 7fb60affd700 1 -- 192.168.123.106:0/4186962049 wait complete.
2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.34166 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.34174 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/337362580' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/897676848' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:32:21.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:21 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/4186962049' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.34166 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.34170 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.34174 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/337362580' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/897676848' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:32:21.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:21 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/4186962049' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:32:22.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:22 vm09.local ceph-mon[97995]: pgmap v52: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 250 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.3 MiB/s wr, 403 op/s; 601/13428 objects degraded (4.476%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:22.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:22 vm09.local ceph-mon[97995]: from='client.44167 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:22.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:22 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 601/13428 objects degraded (4.476%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:22.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:22 vm06.local ceph-mon[109831]: pgmap v52: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 250 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.3 MiB/s wr, 403 op/s; 601/13428 objects degraded (4.476%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:22.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:22 vm06.local ceph-mon[109831]: from='client.44167 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:32:22.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:22 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 601/13428 objects degraded (4.476%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:24 vm06.local ceph-mon[109831]: pgmap v53: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 249 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 904 KiB/s 
rd, 931 KiB/s wr, 294 op/s; 601/12885 objects degraded (4.664%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:24 vm09.local ceph-mon[97995]: pgmap v53: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 249 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 904 KiB/s rd, 931 KiB/s wr, 294 op/s; 601/12885 objects degraded (4.664%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:25 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:25.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:25 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-09T17:32:25.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:25 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:25.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:25 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-09T17:32:26.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:26 vm06.local ceph-mon[109831]: pgmap v54: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 249 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 904 KiB/s rd, 931 KiB/s wr, 277 op/s; 601/12885 objects degraded (4.664%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:26.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:26 vm09.local ceph-mon[97995]: pgmap v54: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 249 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 904 KiB/s rd, 931 KiB/s wr, 277 op/s; 601/12885 objects degraded (4.664%); 0 B/s, 7 objects/s recovering 2026-03-09T17:32:27.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:27 vm06.local ceph-mon[109831]: pgmap v55: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 402 op/s; 454/9639 objects degraded (4.710%); 0 B/s, 12 objects/s recovering 2026-03-09T17:32:27.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:27 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 601/12885 objects degraded (4.664%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:27.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:27 vm09.local ceph-mon[97995]: pgmap v55: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 402 op/s; 454/9639 objects degraded (4.710%); 0 B/s, 12 objects/s recovering 2026-03-09T17:32:27.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:27 
vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 601/12885 objects degraded (4.664%), 8 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:28.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:28.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:29.853 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 2026-03-09T17:32:29.854 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-09T17:32:30.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:29 vm06.local ceph-mon[109831]: pgmap v56: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 842 KiB/s rd, 890 KiB/s wr, 255 op/s; 454/9639 objects degraded (4.710%); 0 B/s, 8 objects/s recovering 2026-03-09T17:32:30.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:29 vm09.local ceph-mon[97995]: pgmap v56: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 842 KiB/s rd, 890 KiB/s wr, 255 op/s; 454/9639 objects degraded (4.710%); 0 B/s, 8 objects/s recovering 2026-03-09T17:32:30.749 DEBUG:teuthology.parallel:result is None 2026-03-09T17:32:32.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:31 vm06.local ceph-mon[109831]: pgmap v57: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 238 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 374 op/s; 454/6663 objects degraded (6.814%); 0 B/s, 
12 objects/s recovering 2026-03-09T17:32:32.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:31 vm06.local ceph-mon[109831]: mgrmap e38: vm06.pbgzei(active, since 92s), standbys: vm09.lqzvkh 2026-03-09T17:32:32.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:31 vm09.local ceph-mon[97995]: pgmap v57: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 238 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 374 op/s; 454/6663 objects degraded (6.814%); 0 B/s, 12 objects/s recovering 2026-03-09T17:32:32.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:31 vm09.local ceph-mon[97995]: mgrmap e38: vm06.pbgzei(active, since 92s), standbys: vm09.lqzvkh 2026-03-09T17:32:33.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:32 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 454/6663 objects degraded (6.814%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:33.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:32 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 454/6663 objects degraded (6.814%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:34 vm06.local ceph-mon[109831]: pgmap v58: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 237 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 901 KiB/s rd, 976 KiB/s wr, 273 op/s; 454/6144 objects degraded (7.389%); 0 B/s, 15 objects/s recovering 2026-03-09T17:32:34.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:34 vm09.local ceph-mon[97995]: pgmap v58: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 237 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 901 KiB/s rd, 976 KiB/s wr, 273 op/s; 454/6144 objects degraded (7.389%); 0 B/s, 15 objects/s recovering 2026-03-09T17:32:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:36 vm06.local 
ceph-mon[109831]: pgmap v59: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 237 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 901 KiB/s rd, 976 KiB/s wr, 258 op/s; 454/6144 objects degraded (7.389%); 0 B/s, 12 objects/s recovering 2026-03-09T17:32:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:36 vm09.local ceph-mon[97995]: pgmap v59: 65 pgs: 6 active+recovery_wait+degraded, 2 active+recovering, 57 active+clean; 237 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 901 KiB/s rd, 976 KiB/s wr, 258 op/s; 454/6144 objects degraded (7.389%); 0 B/s, 12 objects/s recovering 2026-03-09T17:32:36.981 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 2026-03-09T17:32:36.981 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-09T17:32:37.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:37 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 454/6144 objects degraded (7.389%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:37.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:37 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 454/6144 objects degraded (7.389%), 6 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:37.425 DEBUG:teuthology.parallel:result is None 2026-03-09T17:32:37.425 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T17:32:37.467 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T17:32:37.467 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T17:32:37.517 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T17:32:37.517 DEBUG:teuthology.parallel:result is None 2026-03-09T17:32:38.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:38 vm06.local ceph-mon[109831]: pgmap v60: 65 pgs: 4 
active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 230 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 372 op/s; 291/3333 objects degraded (8.731%); 0 B/s, 19 objects/s recovering 2026-03-09T17:32:38.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:38 vm09.local ceph-mon[97995]: pgmap v60: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 230 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 372 op/s; 291/3333 objects degraded (8.731%); 0 B/s, 19 objects/s recovering 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.535 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.535 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:40.392 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:40 vm06.local systemd[1]: Stopping Ceph osd.1 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:32:40.392 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:32:40.203+0000 7f60a1467700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:32:40.392 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:32:40.203+0000 7f60a1467700 -1 osd.1 50 *** Got signal Terminated *** 2026-03-09T17:32:40.392 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[80006]: 2026-03-09T17:32:40.203+0000 7f60a1467700 -1 osd.1 50 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: pgmap v61: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 230 MiB data, 
3.1 GiB used, 117 GiB / 120 GiB avail; 764 KiB/s rd, 833 KiB/s wr, 246 op/s; 291/3333 objects degraded (8.731%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: Upgrade: osd.1 is safe to restart 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T17:32:40.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: pgmap v61: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 230 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 764 KiB/s rd, 833 KiB/s wr, 246 op/s; 291/3333 objects degraded (8.731%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T17:32:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: Upgrade: osd.1 is safe to restart 2026-03-09T17:32:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:40.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-09T17:32:40.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:41.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:41 vm06.local ceph-mon[109831]: Upgrade: Updating osd.1 2026-03-09T17:32:41.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:41 vm06.local ceph-mon[109831]: Deploying daemon osd.1 on vm06 2026-03-09T17:32:41.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:41 vm06.local ceph-mon[109831]: osd.1 marked itself down and dead 2026-03-09T17:32:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:41 vm09.local ceph-mon[97995]: Upgrade: Updating osd.1 2026-03-09T17:32:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:41 vm09.local ceph-mon[97995]: Deploying daemon osd.1 on vm06 2026-03-09T17:32:41.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:41 vm09.local ceph-mon[97995]: osd.1 marked itself down and dead 2026-03-09T17:32:41.679 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119556]: 2026-03-09 17:32:41.411761638 +0000 UTC m=+1.224794330 container died decf7e88ebaf795af138a09f268ceab7aa25e37eb4b4dc8699307412de72e899 
(image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, RELEASE=HEAD, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, GIT_BRANCH=HEAD) 2026-03-09T17:32:41.679 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119556]: 2026-03-09 17:32:41.430372641 +0000 UTC m=+1.243405344 container remove decf7e88ebaf795af138a09f268ceab7aa25e37eb4b4dc8699307412de72e899 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.build-date=20231212) 2026-03-09T17:32:41.679 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local bash[119556]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1 2026-03-09T17:32:41.680 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.601194199 +0000 UTC m=+0.031268306 container create 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, 
CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T17:32:41.680 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.64247248 +0000 UTC m=+0.072546587 container init 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True) 2026-03-09T17:32:41.680 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.645956149 +0000 UTC m=+0.076030256 container start 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:32:41.680 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.649331768 +0000 UTC m=+0.079405885 container attach 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2) 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.579623265 +0000 UTC 
m=+0.009697381 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.781942285 +0000 UTC m=+0.212016392 container died 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local podman[119622]: 2026-03-09 17:32:41.801245175 +0000 UTC m=+0.231319282 container remove 6608bdffee01147c30a4e32ba6c274cbb4aacb4e13a1d3cb2d371d655c4d7f72 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service: Deactivated successfully. 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service: Unit process 119633 (conmon) remains running after unit stopped. 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service: Unit process 119642 (podman) remains running after unit stopped. 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local systemd[1]: Stopped Ceph osd.1 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:41 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service: Consumed 49.076s CPU time, 589.5M memory peak. 2026-03-09T17:32:42.022 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local systemd[1]: Starting Ceph osd.1 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local podman[119724]: 2026-03-09 17:32:42.116137629 +0000 UTC m=+0.015890292 container create 08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local podman[119724]: 2026-03-09 17:32:42.155484114 +0000 UTC m=+0.055236777 container init 08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local podman[119724]: 2026-03-09 17:32:42.159926469 +0000 UTC m=+0.059679132 container start 08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local podman[119724]: 2026-03-09 17:32:42.161131664 +0000 UTC m=+0.060884327 container attach 08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True, 
CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS) 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local podman[119724]: 2026-03-09 17:32:42.109526986 +0000 UTC m=+0.009279649 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:42.339 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:42.340 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-mon[109831]: pgmap v62: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 225 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.1 MiB/s wr, 338 op/s; 291/690 objects degraded (42.174%); 0 B/s, 18 objects/s recovering 2026-03-09T17:32:42.340 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-mon[109831]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T17:32:42.340 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-mon[109831]: osdmap e51: 6 total, 5 up, 6 in 2026-03-09T17:32:42.340 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 291/690 objects degraded (42.174%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:42 vm09.local ceph-mon[97995]: pgmap v62: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 225 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.1 MiB/s wr, 338 op/s; 291/690 objects degraded (42.174%); 0 B/s, 18 objects/s recovering 2026-03-09T17:32:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:42 vm09.local ceph-mon[97995]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T17:32:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:42 vm09.local ceph-mon[97995]: osdmap e51: 6 total, 5 up, 6 in 2026-03-09T17:32:42.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:42 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 291/690 objects degraded (42.174%), 4 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2aca2dfe-9b99-448e-ab72-8e1fd5cffc17/osd-block-e31bb3c2-190d-419c-bb90-f0909a02113b --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T17:32:43.075 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:42 vm06.local bash[119724]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2aca2dfe-9b99-448e-ab72-8e1fd5cffc17/osd-block-e31bb3c2-190d-419c-bb90-f0909a02113b --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/ln -snf /dev/ceph-2aca2dfe-9b99-448e-ab72-8e1fd5cffc17/osd-block-e31bb3c2-190d-419c-bb90-f0909a02113b /var/lib/ceph/osd/ceph-1/block 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119724]: Running command: /usr/bin/ln -snf /dev/ceph-2aca2dfe-9b99-448e-ab72-8e1fd5cffc17/osd-block-e31bb3c2-190d-419c-bb90-f0909a02113b /var/lib/ceph/osd/ceph-1/block 2026-03-09T17:32:43.389 
INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119724]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119724]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119724]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate[119734]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119724]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local conmon[119734]: conmon 08016adc24aace78f449 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5.scope/container/memory.events 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local podman[119724]: 2026-03-09 17:32:43.103024668 +0000 UTC m=+1.002777331 container died 
08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS) 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local podman[119724]: 2026-03-09 17:32:43.12176788 +0000 UTC m=+1.021520543 container remove 08016adc24aace78f449dfd1a7c4f68d32c25931e18b85cf33b2e68fbc0f38f5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 
vm06.local podman[119983]: 2026-03-09 17:32:43.213846894 +0000 UTC m=+0.015706609 container create b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local podman[119983]: 2026-03-09 17:32:43.255687598 +0000 UTC m=+0.057547313 container init b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 
2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local podman[119983]: 2026-03-09 17:32:43.259230148 +0000 UTC m=+0.061089853 container start b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local bash[119983]: b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local podman[119983]: 2026-03-09 17:32:43.207301033 +0000 UTC m=+0.009160748 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:32:43.389 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local systemd[1]: Started Ceph osd.1 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:32:43.390 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-mon[109831]: osdmap e52: 6 total, 5 up, 6 in 2026-03-09T17:32:43.390 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-mon[109831]: pgmap v65: 65 pgs: 12 stale+active+clean, 4 active+recovery_wait+degraded, 1 active+recovering, 48 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 899 KiB/s wr, 327 op/s; 291/243 objects degraded (119.753%); 0 B/s, 20 objects/s recovering 2026-03-09T17:32:43.390 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:43.390 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:43.390 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:43.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:43 vm09.local ceph-mon[97995]: osdmap e52: 6 total, 5 up, 6 in 2026-03-09T17:32:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:43 vm09.local ceph-mon[97995]: pgmap v65: 65 pgs: 12 stale+active+clean, 4 active+recovery_wait+degraded, 1 active+recovering, 48 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1005 KiB/s rd, 899 KiB/s wr, 327 op/s; 291/243 objects degraded (119.753%); 0 B/s, 20 objects/s recovering 2026-03-09T17:32:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:32:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:44.008 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:43 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:32:43.838+0000 7f2c3bf6b740 -1 Falling back to public interface 2026-03-09T17:32:44.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:44.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:44.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:44.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:32:46.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:45 vm06.local ceph-mon[109831]: pgmap v66: 65 pgs: 12 stale+active+clean, 4 active+recovery_wait+degraded, 1 active+recovering, 48 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 451 KiB/s rd, 394 KiB/s wr, 156 op/s; 291/243 objects degraded (119.753%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:46.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:45 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:45 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:45 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:45 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:45 vm09.local ceph-mon[97995]: pgmap v66: 65 pgs: 12 stale+active+clean, 4 active+recovery_wait+degraded, 1 active+recovering, 48 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 451 KiB/s rd, 394 KiB/s wr, 156 op/s; 291/243 objects degraded (119.753%); 0 B/s, 10 objects/s recovering 2026-03-09T17:32:46.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:46.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: pgmap v67: 65 pgs: 22 active+undersized, 2 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 488 KiB/s rd, 431 KiB/s wr, 159 op/s; 189/237 objects degraded (79.747%); 0 B/s, 20 objects/s recovering 2026-03-09T17:32:47.892 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-09T17:32:47.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 189/237 objects degraded (79.747%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: pgmap v67: 65 pgs: 22 active+undersized, 2 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 488 KiB/s rd, 431 KiB/s wr, 159 op/s; 189/237 objects degraded (79.747%); 0 B/s, 20 objects/s recovering 2026-03-09T17:32:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth 
get", "entity": "client.admin"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-09T17:32:47.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:47 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 189/237 objects degraded (79.747%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T17:32:48.277 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:47 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:32:47.992+0000 7f2c3bf6b740 -1 osd.1 0 read_superblock omap replica is missing. 2026-03-09T17:32:48.639 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:48 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:32:48.274+0000 7f2c3bf6b740 -1 osd.1 50 log_to_monitors true 2026-03-09T17:32:48.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:48 vm06.local ceph-mon[109831]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T17:32:48.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:48 vm09.local ceph-mon[97995]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T17:32:49.891 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:32:49.578+0000 7f2c33504640 -1 osd.1 50 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:32:49.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-mon[109831]: pgmap v68: 65 pgs: 22 active+undersized, 2 
active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 37 KiB/s wr, 21 op/s; 189/237 objects degraded (79.747%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:49.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-mon[109831]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T17:32:49.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-mon[109831]: osdmap e53: 6 total, 5 up, 6 in 2026-03-09T17:32:49.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-mon[109831]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:32:49.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:49 vm06.local ceph-mon[109831]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' 2026-03-09T17:32:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:49 vm09.local ceph-mon[97995]: pgmap v68: 65 pgs: 22 active+undersized, 2 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 220 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 37 KiB/s wr, 21 op/s; 189/237 objects degraded (79.747%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:49 vm09.local ceph-mon[97995]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T17:32:50.144 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:49 vm09.local ceph-mon[97995]: osdmap e53: 6 total, 5 up, 6 in 2026-03-09T17:32:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:49 vm09.local ceph-mon[97995]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:32:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:49 vm09.local ceph-mon[97995]: from='osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577]' entity='osd.1' 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.100+0000 7f44a790c700 1 -- 192.168.123.106:0/797238191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0103860 msgr2=0x7f44a0103cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.103 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:50 vm06.local ceph-mon[109831]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:32:51.103 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:50 vm06.local ceph-mon[109831]: osd.1 [v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577] boot 2026-03-09T17:32:51.103 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:50 vm06.local ceph-mon[109831]: osdmap e54: 6 total, 6 up, 6 in 2026-03-09T17:32:51.103 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:50 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.100+0000 7f44a790c700 1 --2- 192.168.123.106:0/797238191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0103860 0x7f44a0103cb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f449c009b00 
tx=0x7f449c009e10 comp rx=0 tx=0).stop 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 -- 192.168.123.106:0/797238191 shutdown_connections 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 --2- 192.168.123.106:0/797238191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0103860 0x7f44a0103cb0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 --2- 192.168.123.106:0/797238191 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0102660 0x7f44a0102a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 -- 192.168.123.106:0/797238191 >> 192.168.123.106:0/797238191 conn(0x7f44a00fdc10 msgr2=0x7f44a0100040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 -- 192.168.123.106:0/797238191 shutdown_connections 2026-03-09T17:32:51.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.101+0000 7f44a790c700 1 -- 192.168.123.106:0/797238191 wait complete. 
2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 Processor -- start 2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 -- start start 2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 0x7f44a0197f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 0x7f44a0198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44a0198a60 con 0x7f44a0102660 2026-03-09T17:32:51.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a790c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44a0198ba0 con 0x7f44a0103860 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.102+0000 7f44a4ea7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 0x7f44a0198440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a4ea7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 0x7f44a0198440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:43678/0 (socket says 192.168.123.106:43678) 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a4ea7700 1 -- 192.168.123.106:0/3576698005 learned_addr learned my addr 192.168.123.106:0/3576698005 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a56a8700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 0x7f44a0197f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a56a8700 1 -- 192.168.123.106:0/3576698005 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 msgr2=0x7f44a0198440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a56a8700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 0x7f44a0198440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a56a8700 1 -- 192.168.123.106:0/3576698005 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f449c0097e0 con 0x7f44a0102660 2026-03-09T17:32:51.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.103+0000 7f44a56a8700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 0x7f44a0197f00 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f449000ea80 tx=0x7f449000ed90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:32:51.106 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.104+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f449000cb20 con 0x7f44a0102660 2026-03-09T17:32:51.106 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.104+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4490004500 con 0x7f44a0102660 2026-03-09T17:32:51.106 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.104+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f44a019d650 con 0x7f44a0102660 2026-03-09T17:32:51.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.104+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4490010430 con 0x7f44a0102660 2026-03-09T17:32:51.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.104+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f44a019dba0 con 0x7f44a0102660 2026-03-09T17:32:51.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.105+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f44900106a0 con 0x7f44a0102660 2026-03-09T17:32:51.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.105+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f44a004ea50 con 0x7f44a0102660 2026-03-09T17:32:51.110 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.106+0000 
7f44967fc700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 0x7f448c079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.111 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.106+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4490014070 con 0x7f44a0102660 2026-03-09T17:32:51.111 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.109+0000 7f44a4ea7700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 0x7f448c079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.111 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.109+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f44900636d0 con 0x7f44a0102660 2026-03-09T17:32:51.111 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.109+0000 7f44a4ea7700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 0x7f448c079dd0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f449c005200 tx=0x7f449c01a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:50 vm09.local ceph-mon[97995]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:32:51.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:50 vm09.local ceph-mon[97995]: osd.1 
[v2:192.168.123.106:6810/3817886577,v1:192.168.123.106:6811/3817886577] boot 2026-03-09T17:32:51.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:50 vm09.local ceph-mon[97995]: osdmap e54: 6 total, 6 up, 6 in 2026-03-09T17:32:51.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:50 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T17:32:51.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.287+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f44a019de80 con 0x7f448c077920 2026-03-09T17:32:51.292 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.290+0000 7f44967fc700 1 -- 192.168.123.106:0/3576698005 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f44a019de80 con 0x7f448c077920 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.293+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 msgr2=0x7f448c079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.293+0000 7f44a790c700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 0x7f448c079dd0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f449c005200 tx=0x7f449c01a040 comp rx=0 tx=0).stop 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 msgr2=0x7f44a0197f00 secure 
:-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 0x7f44a0197f00 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f449000ea80 tx=0x7f449000ed90 comp rx=0 tx=0).stop 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 shutdown_connections 2026-03-09T17:32:51.296 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f448c077920 0x7f448c079dd0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.297 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44a0102660 0x7f44a0197f00 secure :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f449000ea80 tx=0x7f449000ed90 comp rx=0 tx=0).stop 2026-03-09T17:32:51.297 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 --2- 192.168.123.106:0/3576698005 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f44a0103860 0x7f44a0198440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.297 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.294+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 >> 192.168.123.106:0/3576698005 conn(0x7f44a00fdc10 msgr2=0x7f44a0106a90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.297 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.295+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 shutdown_connections 2026-03-09T17:32:51.297 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.295+0000 7f44a790c700 1 -- 192.168.123.106:0/3576698005 wait complete. 2026-03-09T17:32:51.307 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.370+0000 7f098610c700 1 -- 192.168.123.106:0/3239838238 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980103990 msgr2=0x7f0980105d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.370+0000 7f098610c700 1 --2- 192.168.123.106:0/3239838238 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980103990 0x7f0980105d70 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f0978009a60 tx=0x7f0978009d70 comp rx=0 tx=0).stop 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.373+0000 7f098610c700 1 -- 192.168.123.106:0/3239838238 shutdown_connections 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.373+0000 7f098610c700 1 --2- 192.168.123.106:0/3239838238 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980103990 0x7f0980105d70 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.373+0000 7f098610c700 1 --2- 192.168.123.106:0/3239838238 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980101070 0x7f0980103450 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.373+0000 7f098610c700 1 -- 192.168.123.106:0/3239838238 >> 192.168.123.106:0/3239838238 conn(0x7f09800fa9f0 msgr2=0x7f09800fce40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.373+0000 
7f098610c700 1 -- 192.168.123.106:0/3239838238 shutdown_connections 2026-03-09T17:32:51.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.374+0000 7f098610c700 1 -- 192.168.123.106:0/3239838238 wait complete. 2026-03-09T17:32:51.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.374+0000 7f098610c700 1 Processor -- start 2026-03-09T17:32:51.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.374+0000 7f098610c700 1 -- start start 2026-03-09T17:32:51.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.374+0000 7f098610c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 0x7f0980195dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f098610c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 0x7f0980196310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f098610c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0980196930 con 0x7f0980103990 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f098610c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0980196a70 con 0x7f0980101070 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f097effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 0x7f0980196310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f097effd700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 0x7f0980196310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:41692/0 (socket says 192.168.123.106:41692) 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f097effd700 1 -- 192.168.123.106:0/1372104968 learned_addr learned my addr 192.168.123.106:0/1372104968 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:51.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.375+0000 7f097f7fe700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 0x7f0980195dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f097f7fe700 1 -- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 msgr2=0x7f0980196310 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f097f7fe700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 0x7f0980196310 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f097f7fe700 1 -- 192.168.123.106:0/1372104968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0978009710 con 0x7f0980101070 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f097f7fe700 1 --2- 192.168.123.106:0/1372104968 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 0x7f0980195dd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f0968009fd0 tx=0x7f096800ec90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f096800cb80 con 0x7f0980101070 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09800feec0 con 0x7f0980101070 2026-03-09T17:32:51.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.376+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09800ff410 con 0x7f0980101070 2026-03-09T17:32:51.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.377+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f096800eed0 con 0x7f0980101070 2026-03-09T17:32:51.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.377+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09680186c0 con 0x7f0980101070 2026-03-09T17:32:51.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.378+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0968018910 con 0x7f0980101070 2026-03-09T17:32:51.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.379+0000 7f097cff9700 1 --2- 
192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 0x7f096c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.379+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0980190020 con 0x7f0980101070 2026-03-09T17:32:51.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.379+0000 7f097effd700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 0x7f096c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.379+0000 7f097effd700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 0x7f096c079d70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f0978003820 tx=0x7f09780095f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.380+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0968014070 con 0x7f0980101070 2026-03-09T17:32:51.385 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.383+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0968063b40 con 0x7f0980101070 2026-03-09T17:32:51.525 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.523+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0980061190 con 0x7f096c0778c0 2026-03-09T17:32:51.526 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.524+0000 7f097cff9700 1 -- 192.168.123.106:0/1372104968 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0980061190 con 0x7f096c0778c0 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 msgr2=0x7f096c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 0x7f096c079d70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f0978003820 tx=0x7f09780095f0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 msgr2=0x7f0980195dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 0x7f0980195dd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f0968009fd0 tx=0x7f096800ec90 comp rx=0 tx=0).stop 2026-03-09T17:32:51.529 
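The mon/mgr audit entries in this log record each command twice, as a JSON payload with a `dispatch` and later a `finished` phase (e.g. `cmd=[{"prefix": "osd crush set-device-class", ...}]: dispatch`). A sketch (hypothetical helper, assuming this exact audit-line shape) of recovering the structured command from such a line:

```python
import json
import re

# Extracts the JSON command array and phase from mon audit lines such as
#   cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
# "finished" lines wrap the payload in single quotes, hence the optional '?.
CMD_RE = re.compile(r"cmd='?(\[\{.*?\}\])'?: (dispatch|finished)")

def parse_cmd(line: str):
    """Return (command_list, phase) from an audit log line, or None."""
    m = CMD_RE.search(line)
    if not m:
        return None
    return json.loads(m.group(1)), m.group(2)

sample = ('cmd=[{"prefix": "osd crush set-device-class", '
          '"class": "hdd", "ids": ["1"]}]: dispatch')
cmds, phase = parse_cmd(sample)
print(cmds[0]["prefix"], phase)  # osd crush set-device-class dispatch
```

Pairing `dispatch` lines with their `finished` counterparts by payload is one way to spot commands that never completed during the upgrade.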
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 shutdown_connections 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f096c0778c0 0x7f096c079d70 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0980101070 0x7f0980195dd0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 --2- 192.168.123.106:0/1372104968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0980103990 0x7f0980196310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 >> 192.168.123.106:0/1372104968 conn(0x7f09800fa9f0 msgr2=0x7f09800fce40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 shutdown_connections 2026-03-09T17:32:51.529 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.527+0000 7f098610c700 1 -- 192.168.123.106:0/1372104968 wait complete. 
2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 -- 192.168.123.106:0/639323110 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 msgr2=0x7f3d10102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 --2- 192.168.123.106:0/639323110 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10102b70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f3d00009b00 tx=0x7f3d00009e10 comp rx=0 tx=0).stop 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 -- 192.168.123.106:0/639323110 shutdown_connections 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 --2- 192.168.123.106:0/639323110 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 0x7f3d10103e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 --2- 192.168.123.106:0/639323110 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10102b70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.601+0000 7f3d163cc700 1 -- 192.168.123.106:0/639323110 >> 192.168.123.106:0/639323110 conn(0x7f3d100fddb0 msgr2=0x7f3d101001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.602+0000 7f3d163cc700 1 -- 192.168.123.106:0/639323110 shutdown_connections 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.602+0000 7f3d163cc700 1 -- 192.168.123.106:0/639323110 wait 
complete. 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.602+0000 7f3d163cc700 1 Processor -- start 2026-03-09T17:32:51.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.602+0000 7f3d163cc700 1 -- start start 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d163cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:41704/0 (socket says 192.168.123.106:41704) 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d163cc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 0x7f3d101985a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d163cc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d10198bc0 con 0x7f3d10102760 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d163cc700 1 -- --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d10198d00 con 0x7f3d10103a00 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 -- 192.168.123.106:0/3689017787 learned_addr learned my addr 192.168.123.106:0/3689017787 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:51.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d075ff700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 0x7f3d101985a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 -- 192.168.123.106:0/3689017787 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 msgr2=0x7f3d101985a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 0x7f3d101985a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.603+0000 7f3d0ffff700 1 -- 192.168.123.106:0/3689017787 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d000097e0 con 0x7f3d10102760 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.604+0000 7f3d0ffff700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f3d00004a30 tx=0x7f3d00004b10 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.604+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d0001d070 con 0x7f3d10102760 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.604+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d1019d750 con 0x7f3d10102760 2026-03-09T17:32:51.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.604+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d1019dc40 con 0x7f3d10102760 2026-03-09T17:32:51.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.604+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d0000bc50 con 0x7f3d10102760 2026-03-09T17:32:51.607 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.605+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d00021830 con 0x7f3d10102760 2026-03-09T17:32:51.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.606+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d0002b430 con 0x7f3d10102760 2026-03-09T17:32:51.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.606+0000 7f3d0dffb700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 0x7f3cfc079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-09T17:32:51.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.606+0000 7f3d075ff700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 0x7f3cfc079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.607+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3cf0005320 con 0x7f3d10102760 2026-03-09T17:32:51.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.607+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3d0009b080 con 0x7f3d10102760 2026-03-09T17:32:51.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.607+0000 7f3d075ff700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 0x7f3cfc079e90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3cf8006fd0 tx=0x7f3cf8008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.610+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d00064910 con 0x7f3d10102760 2026-03-09T17:32:51.735 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.733+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", 
""]}) v1 -- 0x7f3cf0000bf0 con 0x7f3cfc0779e0
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.738+0000 7f3d0dffb700 1 -- 192.168.123.106:0/3689017787 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f3cf0000bf0 con 0x7f3cfc0779e0
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (7m) 7s ago 7m 25.9M - 0.25.0 c8568f914cd2 b5fa36858876
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (7m) 7s ago 7m 9126k - 18.2.0 dc2bc1663786 518b33d98521
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (7m) 88s ago 7m 11.1M - 18.2.0 dc2bc1663786 4486b60e6311
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (91s) 7s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (89s) 88s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57
2026-03-09T17:32:51.740 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (7m) 7s ago 7m 94.8M - 9.4.7 954c08fa6188 d808369f1a53
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (5m) 7s ago 5m 16.6M - 18.2.0 dc2bc1663786 4b4cbdf0c640
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (5m) 7s ago 5m 182M - 18.2.0 dc2bc1663786 4c8e86b2b8cd
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (5m) 88s ago 5m 145M - 18.2.0 dc2bc1663786 aa1f0430b448
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (5m) 88s ago 5m 17.5M - 18.2.0 dc2bc1663786 8dc8a0159213
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (2m) 7s ago 8m 622M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (2m) 88s ago 7m 487M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (2m) 7s ago 8m 58.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (106s) 88s ago 7m 48.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (7m) 7s ago 7m 14.6M - 1.5.0 0da6a335fe13 ea650be5ff39
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (7m) 88s ago 7m 15.9M - 1.5.0 0da6a335fe13 364ad5f4aa86
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (78s) 7s ago 6m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (8s) 7s ago 6m 13.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (6m) 7s ago 6m 359M 4096M 18.2.0 dc2bc1663786 df6c7067c3e4
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (6m) 88s ago 6m 427M 4096M 18.2.0 dc2bc1663786 48a594500ef1
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (6m) 88s ago 6m 410M 4096M 18.2.0 dc2bc1663786 a47c39052541
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (6m) 88s ago 6m 334M 4096M 18.2.0 dc2bc1663786 89f436540a49
2026-03-09T17:32:51.741 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (2m) 7s ago 7m 58.3M - 2.43.0 a07b618ecd1d f6ece95f2fd5
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 msgr2=0x7f3cfc079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 0x7f3cfc079e90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3cf8006fd0 tx=0x7f3cf8008040 comp rx=0 tx=0).stop
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 msgr2=0x7f3d10198060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f3d00004a30 tx=0x7f3d00004b10 comp rx=0 tx=0).stop
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 shutdown_connections
2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3cfc0779e0 0x7f3cfc079e90 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:51.743
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d10102760 0x7f3d10198060 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 --2- 192.168.123.106:0/3689017787 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d10103a00 0x7f3d101985a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 >> 192.168.123.106:0/3689017787 conn(0x7f3d100fddb0 msgr2=0x7f3d10100170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 shutdown_connections 2026-03-09T17:32:51.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.741+0000 7f3d163cc700 1 -- 192.168.123.106:0/3689017787 wait complete. 
2026-03-09T17:32:51.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 -- 192.168.123.106:0/1027201594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 msgr2=0x7f0088105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:51.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 --2- 192.168.123.106:0/1027201594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 0x7f0088105ac0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f0078009a60 tx=0x7f0078009d70 comp rx=0 tx=0).stop 2026-03-09T17:32:51.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 -- 192.168.123.106:0/1027201594 shutdown_connections 2026-03-09T17:32:51.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 --2- 192.168.123.106:0/1027201594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 0x7f0088105ac0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 --2- 192.168.123.106:0/1027201594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.811+0000 7f008e894700 1 -- 192.168.123.106:0/1027201594 >> 192.168.123.106:0/1027201594 conn(0x7f00880faa70 msgr2=0x7f00880fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 -- 192.168.123.106:0/1027201594 shutdown_connections 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 -- 192.168.123.106:0/1027201594 
wait complete. 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 Processor -- start 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 -- start start 2026-03-09T17:32:51.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 0x7f0088196330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0088196950 con 0x7f0088069180 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.812+0000 7f008e894700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0088196a90 con 0x7f0088103680 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:41728/0 (socket says 192.168.123.106:41728) 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 -- 192.168.123.106:0/4042446996 learned_addr learned my addr 192.168.123.106:0/4042446996 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 -- 192.168.123.106:0/4042446996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 msgr2=0x7f0088196330 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:32:51.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 0x7f0088196330 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 -- 192.168.123.106:0/4042446996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00700097e0 con 0x7f0088069180 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f0087fff700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f0070009a30 tx=0x7f007000c660 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f00857fa700 1 -- 192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0070004830 con 0x7f0088069180 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f00857fa700 1 -- 
192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0070004e70 con 0x7f0088069180 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f00857fa700 1 -- 192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0070005430 con 0x7f0088069180 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0078009710 con 0x7f0088069180 2026-03-09T17:32:51.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.813+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00880ff340 con 0x7f0088069180 2026-03-09T17:32:51.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.815+0000 7f00857fa700 1 -- 192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0070005590 con 0x7f0088069180 2026-03-09T17:32:51.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.815+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f008818ffc0 con 0x7f0088069180 2026-03-09T17:32:51.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.815+0000 7f00857fa700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 0x7f0074079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:51.819 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.815+0000 7f00857fa700 1 -- 
192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f007009e4c0 con 0x7f0088069180 2026-03-09T17:32:51.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.818+0000 7f00877fe700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 0x7f0074079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:51.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.818+0000 7f00857fa700 1 -- 192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f00700674a0 con 0x7f0088069180 2026-03-09T17:32:51.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.818+0000 7f00877fe700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 0x7f0074079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f007800b5c0 tx=0x7f00780058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:51.984 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:51 vm06.local ceph-mon[109831]: pgmap v71: 65 pgs: 22 active+undersized, 2 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 216 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 37 KiB/s wr, 5 op/s; 188/234 objects degraded (80.342%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:51.984 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:51 vm06.local ceph-mon[109831]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T17:32:51.985 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.982+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f00881007a0 con 0x7f0088069180
2026-03-09T17:32:51.985 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.983+0000 7f00857fa700 1 -- 192.168.123.106:0/4042446996 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f0070066bf0 con 0x7f0088069180
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "mon": {
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": {
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "osd": {
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4,
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "mds": {
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "overall": {
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 8,
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout: }
2026-03-09T17:32:51.987 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:32:51.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 msgr2=0x7f0074079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:51.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 0x7f0074079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f007800b5c0 tx=0x7f00780058e0 comp rx=0 tx=0).stop
2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 msgr2=0x7f0088195df0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f0070009a30 tx=0x7f007000c660 comp rx=0 tx=0).stop
2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 shutdown_connections
2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000
7f008e894700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f00740778c0 0x7f0074079d70 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088069180 0x7f0088195df0 secure :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f0070009a30 tx=0x7f007000c660 comp rx=0 tx=0).stop 2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 --2- 192.168.123.106:0/4042446996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0088103680 0x7f0088196330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.988+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 >> 192.168.123.106:0/4042446996 conn(0x7f00880faa70 msgr2=0x7f00880fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:51.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.989+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 shutdown_connections 2026-03-09T17:32:51.992 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:51.989+0000 7f008e894700 1 -- 192.168.123.106:0/4042446996 wait complete. 
2026-03-09T17:32:52.062 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 -- 192.168.123.106:0/1633469221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 msgr2=0x7effe0101bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 --2- 192.168.123.106:0/1633469221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe0101bb0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7effd0009b00 tx=0x7effd0009e10 comp rx=0 tx=0).stop 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 -- 192.168.123.106:0/1633469221 shutdown_connections 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 --2- 192.168.123.106:0/1633469221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe0101bb0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 --2- 192.168.123.106:0/1633469221 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0100970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.060+0000 7effe5749700 1 -- 192.168.123.106:0/1633469221 >> 192.168.123.106:0/1633469221 conn(0x7effe00fbb10 msgr2=0x7effe00fdf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.061+0000 7effe5749700 1 -- 192.168.123.106:0/1633469221 shutdown_connections 2026-03-09T17:32:52.067 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.061+0000 7effe5749700 1 -- 192.168.123.106:0/1633469221 
wait complete. 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.061+0000 7effe5749700 1 Processor -- start 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.061+0000 7effe5749700 1 -- start start 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effe5749700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0195da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effe5749700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effe5749700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effe0196900 con 0x7effe0101760 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effde7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effde7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:41738/0 (socket says 192.168.123.106:41738) 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effde7fc700 1 -- 192.168.123.106:0/3163934302 learned_addr learned 
my addr 192.168.123.106:0/3163934302 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effe0196a40 con 0x7effe0100560 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.062+0000 7effde7fc700 1 -- 192.168.123.106:0/3163934302 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 msgr2=0x7effe0195da0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effdeffd700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0195da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effde7fc700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0195da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effde7fc700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7effd00097e0 con 0x7effe0101760 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effdeffd700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0195da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effde7fc700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7effd0004a00 tx=0x7effd0004ae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effd001d070 con 0x7effe0101760 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7effd000bcd0 con 0x7effe0101760 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7effe019b490 con 0x7effe0101760 2026-03-09T17:32:52.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.063+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effd0022620 con 0x7effe0101760 2026-03-09T17:32:52.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.064+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7effe019b980 con 0x7effe0101760 2026-03-09T17:32:52.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.065+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7effd000f770 con 
0x7effe0101760 2026-03-09T17:32:52.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.065+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7effe0066e40 con 0x7effe0101760 2026-03-09T17:32:52.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.065+0000 7effd7fff700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 0x7effcc079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.070 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.066+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7effd0067c70 con 0x7effe0101760 2026-03-09T17:32:52.071 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.069+0000 7effdeffd700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 0x7effcc079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.069+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7effd00a0050 con 0x7effe0101760 2026-03-09T17:32:52.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.069+0000 7effdeffd700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 0x7effcc079d20 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7effc8005fd0 tx=0x7effc8005dc0 comp rx=0 tx=0).ready 
entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:52.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:51 vm09.local ceph-mon[97995]: pgmap v71: 65 pgs: 22 active+undersized, 2 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 27 active+clean; 216 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 37 KiB/s wr, 5 op/s; 188/234 objects degraded (80.342%); 0 B/s, 14 objects/s recovering 2026-03-09T17:32:52.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:51 vm09.local ceph-mon[97995]: osdmap e55: 6 total, 6 up, 6 in 2026-03-09T17:32:52.221 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.218+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7effe019bcc0 con 0x7effe0101760 2026-03-09T17:32:52.221 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.219+0000 7effd7fff700 1 -- 192.168.123.106:0/3163934302 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7effd0063690 con 0x7effe0101760 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout: 
2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:32:52.223 
INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:52.223 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:32:52.225 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 msgr2=0x7effcc079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.225 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 0x7effcc079d20 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7effc8005fd0 tx=0x7effc8005dc0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.225 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 msgr2=0x7effe01962e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.225 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7effd0004a00 tx=0x7effd0004ae0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.225 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 shutdown_connections 2026-03-09T17:32:52.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7effcc077870 0x7effcc079d20 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.226 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7effe0100560 0x7effe0195da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 --2- 192.168.123.106:0/3163934302 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7effe0101760 0x7effe01962e0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.223+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 >> 192.168.123.106:0/3163934302 conn(0x7effe00fbb10 msgr2=0x7effe0104990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:52.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.224+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 shutdown_connections 2026-03-09T17:32:52.226 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.224+0000 7effe5749700 1 -- 192.168.123.106:0/3163934302 wait complete. 
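The `fs dump` output above lists each MDS daemon with its state (`up:active`, `up:standby-replay`, `up:standby`). A minimal Python sketch, not part of the test framework, tallies those states from abridged sample lines copied from the dump above (the trailing `...` stands in for the address/compat fields):

```python
import re

# Abridged sample lines from the fsmap dump above.
DUMP = """
[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 ...]
[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 ...]
[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 ...]
[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 ...]
"""

def mds_states(dump: str) -> dict:
    """Tally MDS daemon states (the token after 'up:') from an fsmap dump."""
    counts = {}
    for state in re.findall(r"state up:([\w-]+)", dump):
        counts[state] = counts.get(state, 0) + 1
    return counts

print(mds_states(DUMP))  # {'active': 1, 'standby-replay': 1, 'standby': 2}
```

This matches the cluster state the test expects at this point: one active rank, one standby-replay daemon (allow_standby_replay is enabled), and two plain standbys.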
2026-03-09T17:32:52.227 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:32:52.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1527754683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 msgr2=0x7fe504105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1527754683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504105cb0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe500009b00 tx=0x7fe500009e10 comp rx=0 tx=0).stop 2026-03-09T17:32:52.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1527754683 shutdown_connections 2026-03-09T17:32:52.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1527754683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504105cb0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1527754683 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.298+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1527754683 >> 192.168.123.106:0/1527754683 conn(0x7fe5040fa9b0 msgr2=0x7fe5040fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:52.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.299+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1527754683 shutdown_connections 2026-03-09T17:32:52.301 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.299+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1527754683 wait complete. 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.299+0000 7fe50c1ce700 1 Processor -- start 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.300+0000 7fe50c1ce700 1 -- start start 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.300+0000 7fe50c1ce700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504071cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.300+0000 7fe50c1ce700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.300+0000 7fe50c1ce700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe504072830 con 0x7fe5041038d0 2026-03-09T17:32:52.302 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.300+0000 7fe50c1ce700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe5041a3820 con 0x7fe504100fb0 2026-03-09T17:32:52.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.301+0000 7fe509769700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.301+0000 7fe509f6a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504071cd0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.301+0000 7fe509769700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:41764/0 (socket says 192.168.123.106:41764) 2026-03-09T17:32:52.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.301+0000 7fe509769700 1 -- 192.168.123.106:0/1991002172 learned_addr learned my addr 192.168.123.106:0/1991002172 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:32:52.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.301+0000 7fe509f6a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504071cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:43770/0 (socket says 192.168.123.106:43770) 2026-03-09T17:32:52.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.302+0000 7fe509769700 1 -- 192.168.123.106:0/1991002172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 msgr2=0x7fe504071cd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.302+0000 7fe509769700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504071cd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.302+0000 7fe509769700 1 -- 192.168.123.106:0/1991002172 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe5000097e0 con 0x7fe5041038d0 2026-03-09T17:32:52.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.302+0000 7fe509769700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fe500004990 tx=0x7fe500004a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:52.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.303+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe50001d070 con 0x7fe5041038d0 2026-03-09T17:32:52.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.303+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe50000bcd0 con 0x7fe5041038d0 2026-03-09T17:32:52.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.303+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe50000f850 con 0x7fe5041038d0 2026-03-09T17:32:52.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.303+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe5041a3aa0 con 0x7fe5041038d0 2026-03-09T17:32:52.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.303+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe5041a3f90 con 0x7fe5041038d0 2026-03-09T17:32:52.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.304+0000 7fe50c1ce700 1 -- 
192.168.123.106:0/1991002172 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe50406bf10 con 0x7fe5041038d0 2026-03-09T17:32:52.310 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.307+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe500022bc0 con 0x7fe5041038d0 2026-03-09T17:32:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.308+0000 7fe4faffd700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe4f0077990 0x7fe4f0079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:32:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.308+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fe50009b9d0 con 0x7fe5041038d0 2026-03-09T17:32:52.311 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.308+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe5000cba90 con 0x7fe5041038d0 2026-03-09T17:32:52.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.309+0000 7fe509f6a700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe4f0077990 0x7fe4f0079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:32:52.312 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.310+0000 7fe509f6a700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
conn(0x7fe4f0077990 0x7fe4f0079e40 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4005950 tx=0x7fe4f4005ed0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:32:52.451 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.449+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe504061190 con 0x7fe4f0077990 2026-03-09T17:32:52.452 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.450+0000 7fe4faffd700 1 -- 192.168.123.106:0/1991002172 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe504061190 con 0x7fe4f0077990 2026-03-09T17:32:52.452 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:32:52.452 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "crash", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "8/23 daemons upgraded", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 
2026-03-09T17:32:52.453 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe4f0077990 msgr2=0x7fe4f0079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe4f0077990 0x7fe4f0079e40 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4005950 tx=0x7fe4f4005ed0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 msgr2=0x7fe504072210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fe500004990 tx=0x7fe500004a70 comp rx=0 tx=0).stop 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 shutdown_connections 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe4f0077990 0x7fe4f0079e40 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.453+0000 7fe50c1ce700 1 --2- 
192.168.123.106:0/1991002172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe504100fb0 0x7fe504071cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.454+0000 7fe50c1ce700 1 --2- 192.168.123.106:0/1991002172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe5041038d0 0x7fe504072210 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:32:52.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.454+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 >> 192.168.123.106:0/1991002172 conn(0x7fe5040fa9b0 msgr2=0x7fe5040fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:32:52.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.454+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 shutdown_connections 2026-03-09T17:32:52.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.454+0000 7fe50c1ce700 1 -- 192.168.123.106:0/1991002172 wait complete. 
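The `orch upgrade status` reply above is a JSON document reporting upgrade progress. A minimal Python sketch, not part of the test framework, derives the remaining daemon count from the `progress` field; `SAMPLE` abridges the payload shown above (the image digest is truncated here for brevity) and `daemons_remaining` is an illustrative helper name:

```python
import json

# Abridged copy of the JSON reply printed above.
SAMPLE = """{
  "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260a",
  "in_progress": true,
  "which": "Upgrading all daemon types on all hosts",
  "services_complete": ["mon", "crash", "mgr"],
  "progress": "8/23 daemons upgraded",
  "message": "Currently upgrading osd daemons",
  "is_paused": false
}"""

def daemons_remaining(status: dict) -> int:
    """Parse 'progress' strings like '8/23 daemons upgraded' into a remaining count."""
    done, total = status["progress"].split()[0].split("/")
    return int(total) - int(done)

status = json.loads(SAMPLE)
assert status["in_progress"] and not status["is_paused"]
print(daemons_remaining(status))  # 15
```

At this point in the run the mon, crash, and mgr services are complete and the orchestrator is working through the OSDs, consistent with the staggered-upgrade sequence this job exercises.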
2026-03-09T17:32:52.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 -- 192.168.123.106:0/1839540080 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 msgr2=0x7f86a0103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:52.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/1839540080 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0103e70 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f8694009b00 tx=0x7f8694009e10 comp rx=0 tx=0).stop
2026-03-09T17:32:52.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 -- 192.168.123.106:0/1839540080 shutdown_connections
2026-03-09T17:32:52.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/1839540080 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0103e70 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.532 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/1839540080 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 -- 192.168.123.106:0/1839540080 >> 192.168.123.106:0/1839540080 conn(0x7f86a00fddb0 msgr2=0x7f86a01001e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.530+0000 7f86a6cbb700 1 -- 192.168.123.106:0/1839540080 shutdown_connections
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 -- 192.168.123.106:0/1839540080 wait complete.
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 Processor -- start
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 -- start start
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:52.533 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.531+0000 7f86a6cbb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86a0198b20 con 0x7f86a0102760
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f86a6cbb700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86a0198c60 con 0x7f86a0103a00
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f869ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f869ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:43790/0 (socket says 192.168.123.106:43790)
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f869ffff700 1 -- 192.168.123.106:0/704088064 learned_addr learned my addr 192.168.123.106:0/704088064 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f86a4a57700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f869ffff700 1 -- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 msgr2=0x7f86a0197fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:52.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.532+0000 7f869ffff700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0197fc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f869ffff700 1 -- 192.168.123.106:0/704088064 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f868c009710 con 0x7f86a0103a00
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f86a4a57700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f869ffff700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f869400bb50 tx=0x7f869400bb80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f869401d070 con 0x7f86a0103a00
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f86940097e0 con 0x7f86a0103a00
2026-03-09T17:32:52.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.533+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f86a019da10 con 0x7f86a0103a00
2026-03-09T17:32:52.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.534+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8694022470 con 0x7f86a0103a00
2026-03-09T17:32:52.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.534+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f869400f670 con 0x7f86a0103a00
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.535+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8694004b90 con 0x7f86a0103a00
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.535+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8684005320 con 0x7f86a0103a00
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.535+0000 7f869dffb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 0x7f8690079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.535+0000 7f86a4a57700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 0x7f8690079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.536+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f869409aef0 con 0x7f86a0103a00
2026-03-09T17:32:52.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.536+0000 7f86a4a57700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 0x7f8690079d20 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f868c00f790 tx=0x7f868c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:32:52.540 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.538+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f86940637d0 con 0x7f86a0103a00
2026-03-09T17:32:52.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.719+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f8684005190 con 0x7f86a0103a00
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.720+0000 7f869dffb700 1 -- 192.168.123.106:0/704088064 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+909 (secure 0 0 0) 0x7f8694062f20 con 0x7f86a0103a00
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_WARN Degraded data redundancy: 111/234 objects degraded (47.436%), 13 pgs degraded
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 111/234 objects degraded (47.436%), 13 pgs degraded
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 1.0 is active+undersized+degraded, acting [3,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.0 is active+undersized+degraded, acting [3,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.1 is active+undersized+degraded, acting [2,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.2 is active+undersized+degraded, acting [5,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.3 is active+undersized+degraded, acting [5,2]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.4 is active+undersized+degraded, acting [0,4]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.6 is active+undersized+degraded, acting [3,4]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.9 is active+undersized+degraded, acting [4,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.a is active+undersized+degraded, acting [4,3]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.d is active+undersized+degraded, acting [3,2]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.10 is active+undersized+degraded, acting [2,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 2.15 is active+undersized+degraded, acting [3,0]
2026-03-09T17:32:52.722 INFO:teuthology.orchestra.run.vm06.stdout:    pg 3.f is active+recovery_wait+degraded, acting [5,3,0]
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.722+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 msgr2=0x7f8690079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.722+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 0x7f8690079d20 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f868c00f790 tx=0x7f868c009450 comp rx=0 tx=0).stop
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 msgr2=0x7f86a0198500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f869400bb50 tx=0x7f869400bb80 comp rx=0 tx=0).stop
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 shutdown_connections
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8690077870 0x7f8690079d20 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f86a0102760 0x7f86a0197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 --2- 192.168.123.106:0/704088064 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f86a0103a00 0x7f86a0198500 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 >> 192.168.123.106:0/704088064 conn(0x7f86a00fddb0 msgr2=0x7f86a0106c30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 shutdown_connections
2026-03-09T17:32:52.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:32:52.723+0000 7f86a6cbb700 1 -- 192.168.123.106:0/704088064 wait complete.
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: from='client.44181 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 188/234 objects degraded (80.342%), 14 pgs degraded (PG_DEGRADED)
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4042446996' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:32:52.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:52 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/3163934302' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: from='client.44181 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 188/234 objects degraded (80.342%), 14 pgs degraded (PG_DEGRADED)
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4042446996' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:32:52.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:52 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3163934302' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T17:32:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:53 vm06.local ceph-mon[109831]: pgmap v73: 65 pgs: 2 peering, 20 active+undersized, 1 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 28 active+clean; 216 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s; 111/234 objects degraded (47.436%); 0 B/s, 12 objects/s recovering
2026-03-09T17:32:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:53 vm06.local ceph-mon[109831]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:53 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/704088064' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T17:32:54.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:53 vm09.local ceph-mon[97995]: pgmap v73: 65 pgs: 2 peering, 20 active+undersized, 1 active+recovery_wait+degraded, 2 active+recovering, 12 active+undersized+degraded, 28 active+clean; 216 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s; 111/234 objects degraded (47.436%); 0 B/s, 12 objects/s recovering
2026-03-09T17:32:54.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:53 vm09.local ceph-mon[97995]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:32:54.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:53 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/704088064' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T17:32:56.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:55 vm06.local ceph-mon[109831]: pgmap v74: 65 pgs: 2 peering, 14 active+undersized, 1 active+recovery_wait+degraded, 2 active+recovering, 6 active+undersized+degraded, 40 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s; 96/234 objects degraded (41.026%); 14 B/s, 13 objects/s recovering
2026-03-09T17:32:56.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:55 vm09.local ceph-mon[97995]: pgmap v74: 65 pgs: 2 peering, 14 active+undersized, 1 active+recovery_wait+degraded, 2 active+recovering, 6 active+undersized+degraded, 40 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s; 96/234 objects degraded (41.026%); 14 B/s, 13 objects/s recovering
2026-03-09T17:32:57.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:57 vm06.local ceph-mon[109831]: pgmap v75: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.6 KiB/s rd, 529 B/s wr, 3 op/s; 82/234 objects degraded (35.043%); 24 B/s, 16 objects/s recovering
2026-03-09T17:32:57.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:57 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 82/234 objects degraded (35.043%), 1 pg degraded (PG_DEGRADED)
2026-03-09T17:32:57.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:57 vm09.local ceph-mon[97995]: pgmap v75: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1.6 KiB/s rd, 529 B/s wr, 3 op/s; 82/234 objects degraded (35.043%); 24 B/s, 16 objects/s recovering
2026-03-09T17:32:57.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:57 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 82/234 objects degraded (35.043%), 1 pg degraded (PG_DEGRADED)
2026-03-09T17:32:59.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:59 vm06.local ceph-mon[109831]: pgmap v76: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 586 B/s rd, 0 op/s; 82/234 objects degraded (35.043%); 21 B/s, 10 objects/s recovering
2026-03-09T17:32:59.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:32:59.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:32:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:32:59.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:59 vm09.local ceph-mon[97995]: pgmap v76: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 586 B/s rd, 0 op/s; 82/234 objects degraded (35.043%); 21 B/s, 10 objects/s recovering
2026-03-09T17:32:59.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:32:59.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:32:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:33:00.792 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:00 vm06.local ceph-mon[109831]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 82/234 objects degraded (35.043%), 1 pg degraded)
2026-03-09T17:33:00.793 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:00 vm06.local ceph-mon[109831]: Cluster is now healthy
2026-03-09T17:33:00.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:00 vm09.local ceph-mon[97995]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 82/234 objects degraded (35.043%), 1 pg degraded)
2026-03-09T17:33:00.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:00 vm09.local ceph-mon[97995]: Cluster is now healthy
2026-03-09T17:33:01.796 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:01 vm09.local ceph-mon[97995]: pgmap v77: 65 pgs: 2 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 19 B/s, 10 objects/s recovering
2026-03-09T17:33:01.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:01 vm06.local ceph-mon[109831]: pgmap v77: 65 pgs: 2 active+recovering, 63 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 19 B/s, 10 objects/s recovering
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: Upgrade: osd.2 is safe to restart
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: Upgrade: Updating osd.2
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:33:02.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-mon[109831]: Deploying daemon osd.2 on vm06
2026-03-09T17:33:02.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T17:33:02.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T17:33:02.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: Upgrade: osd.2 is safe to restart
2026-03-09T17:33:02.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: Upgrade: Updating osd.2
2026-03-09T17:33:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:33:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T17:33:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:33:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:02 vm09.local ceph-mon[97995]: Deploying daemon osd.2 on vm06
2026-03-09T17:33:03.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:02 vm06.local systemd[1]: Stopping Ceph osd.2 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:33:03.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:33:02.936+0000 7eff2b057700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T17:33:03.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:33:02.936+0000 7eff2b057700 -1 osd.2 55 *** Got signal Terminated ***
2026-03-09T17:33:03.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:02 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[86232]: 2026-03-09T17:33:02.936+0000 7eff2b057700 -1 osd.2 55 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-09T17:33:03.746 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:03 vm06.local ceph-mon[109831]: pgmap v78: 65 pgs: 1 active+recovering, 64 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 874 B/s rd, 1 op/s; 16 B/s, 7 objects/s recovering
2026-03-09T17:33:03.746 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:03 vm06.local ceph-mon[109831]: osd.2 marked itself down and dead
2026-03-09T17:33:03.746 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123700]: 2026-03-09 17:33:03.551801726 +0000 UTC m=+0.631905759 container died df6c7067c3e4ee006acb7a74afa9b533dfa44103ec6daf14208db53b7e645822 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, org.label-schema.build-date=20231212)
2026-03-09T17:33:03.746 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123700]: 2026-03-09 17:33:03.572358152 +0000 UTC m=+0.652462185 container remove df6c7067c3e4ee006acb7a74afa9b533dfa44103ec6daf14208db53b7e645822 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, ceph=True, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS)
2026-03-09T17:33:03.746 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local bash[123700]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2
2026-03-09T17:33:03.746 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123769]: 2026-03-09 17:33:03.714641494 +0000 UTC m=+0.016602957 container create 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
2026-03-09T17:33:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:03 vm09.local ceph-mon[97995]: pgmap v78: 65 pgs: 1 active+recovering, 64 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 874 B/s rd, 1 op/s; 16 B/s, 7 objects/s recovering
2026-03-09T17:33:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:03 vm09.local ceph-mon[97995]: osd.2 marked itself down and dead
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123769]: 2026-03-09 17:33:03.76400582 +0000 UTC m=+0.065967283 container init 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123769]: 2026-03-09 17:33:03.767753444 +0000 UTC m=+0.069714907 container start 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123769]: 2026-03-09 17:33:03.769434381 +0000 UTC m=+0.071395844 container attach 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123769]: 2026-03-09 17:33:03.707892582 +0000 UTC m=+0.009854056 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123788]: 2026-03-09 17:33:03.922554115 +0000 UTC m=+0.010532926 container died 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local podman[123788]: 2026-03-09 17:33:03.939385989 +0000 UTC m=+0.027364800 container remove 5c95fbed0ecb9bb3ad24b7d2a5296727c278367d12c7ac0e5841a930eae33c83 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.2.service: Deactivated successfully.
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local systemd[1]: Stopped Ceph osd.2 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048.
2026-03-09T17:33:04.037 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:03 vm06.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.2.service: Consumed 41.120s CPU time.
2026-03-09T17:33:04.484 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-mon[109831]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T17:33:04.484 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-mon[109831]: osdmap e56: 6 total, 5 up, 6 in
2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local systemd[1]: Starting Ceph osd.2 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local podman[123872]: 2026-03-09 17:33:04.246062195 +0000 UTC m=+0.016422368 container create 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local podman[123872]: 2026-03-09 17:33:04.287947953 +0000 UTC m=+0.058308137 container init 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, 
org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True) 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local podman[123872]: 2026-03-09 17:33:04.290894448 +0000 UTC m=+0.061254621 container start 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local podman[123872]: 2026-03-09 17:33:04.297149856 +0000 UTC m=+0.067510029 container attach 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, 
org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local podman[123872]: 2026-03-09 17:33:04.239430884 +0000 UTC m=+0.009791067 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:04.484 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:04.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:04 vm09.local ceph-mon[97995]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T17:33:04.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:04 vm09.local ceph-mon[97995]: osdmap e56: 6 total, 5 up, 6 in 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: 
/usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-f9012d63-47aa-4b7a-8a53-4643dde8ebe0/osd-block-4aa1d45d-b786-45ab-97d1-aef76daa15f5 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T17:33:05.142 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:04 vm06.local bash[123872]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-f9012d63-47aa-4b7a-8a53-4643dde8ebe0/osd-block-4aa1d45d-b786-45ab-97d1-aef76daa15f5 --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-09T17:33:05.504 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-mon[109831]: pgmap v80: 65 pgs: 7 peering, 5 
stale+active+clean, 1 active+recovering, 52 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 0 B/s, 8 objects/s recovering 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/ln -snf /dev/ceph-f9012d63-47aa-4b7a-8a53-4643dde8ebe0/osd-block-4aa1d45d-b786-45ab-97d1-aef76daa15f5 /var/lib/ceph/osd/ceph-2/block 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[123872]: Running command: /usr/bin/ln -snf /dev/ceph-f9012d63-47aa-4b7a-8a53-4643dde8ebe0/osd-block-4aa1d45d-b786-45ab-97d1-aef76daa15f5 /var/lib/ceph/osd/ceph-2/block 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[123872]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[123872]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[123872]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 
17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate[123883]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[123872]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124090]: 2026-03-09 17:33:05.267114952 +0000 UTC m=+0.011705361 container died 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124090]: 2026-03-09 17:33:05.286574404 +0000 UTC m=+0.031164812 container remove 62a10eeb9a2516610123c662c1c606e00af07da9c365cfee62a21199774bc6cd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-activate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2) 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124129]: 2026-03-09 17:33:05.383135168 +0000 UTC m=+0.017220673 container create a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124129]: 2026-03-09 17:33:05.419763434 +0000 UTC m=+0.053848930 container init a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True) 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124129]: 2026-03-09 17:33:05.422516768 +0000 UTC m=+0.056602264 container start a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local bash[124129]: a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local podman[124129]: 2026-03-09 17:33:05.375871843 +0000 UTC m=+0.009957358 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:05.505 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local systemd[1]: Started Ceph osd.2 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:05.765 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-mon[109831]: osdmap e57: 6 total, 5 up, 6 in 2026-03-09T17:33:05.765 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:05.765 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:05.765 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:05.765 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:05 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:33:05.760+0000 7fec32313740 -1 Falling back to public interface 2026-03-09T17:33:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:05 vm09.local ceph-mon[97995]: pgmap v80: 65 pgs: 7 peering, 5 stale+active+clean, 1 active+recovering, 52 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s; 0 B/s, 8 objects/s recovering 2026-03-09T17:33:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:05 vm09.local ceph-mon[97995]: osdmap e57: 6 total, 5 up, 6 in 2026-03-09T17:33:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:05 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:05 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:05.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:05 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:06.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:06 vm06.local ceph-mon[109831]: Health check failed: Degraded data redundancy: 19/234 objects degraded (8.120%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:06.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:06 vm09.local ceph-mon[97995]: Health check failed: Degraded data redundancy: 19/234 objects degraded (8.120%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:08.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:08 vm06.local ceph-mon[109831]: pgmap v82: 65 pgs: 9 active+undersized, 4 activating+undersized, 7 peering, 1 active+recovering, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 19/234 objects degraded (8.120%); 0 B/s, 10 objects/s recovering 2026-03-09T17:33:08.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:08 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:08 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:08 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:08 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:08 vm09.local ceph-mon[97995]: pgmap v82: 65 pgs: 9 active+undersized, 
4 activating+undersized, 7 peering, 1 active+recovering, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 19/234 objects degraded (8.120%); 0 B/s, 10 objects/s recovering 2026-03-09T17:33:08.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:08 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:08 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:08 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:08.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:08 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:09.642 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:33:09.373+0000 7fec32313740 -1 osd.2 0 read_superblock omap replica is missing. 
2026-03-09T17:33:09.642 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:33:09.599+0000 7fec32313740 -1 osd.2 55 log_to_monitors true 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: pgmap v83: 65 pgs: 9 active+undersized, 4 activating+undersized, 7 peering, 1 active+recovering, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 19/234 objects degraded (8.120%); 0 B/s, 8 objects/s recovering 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T17:33:10.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:09 vm06.local ceph-mon[109831]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: pgmap v83: 65 pgs: 9 active+undersized, 4 activating+undersized, 7 peering, 1 active+recovering, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 0 op/s; 19/234 objects degraded (8.120%); 0 B/s, 8 objects/s recovering 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T17:33:10.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:09 vm09.local ceph-mon[97995]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T17:33:11.575 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:11 vm06.local ceph-mon[109831]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T17:33:11.575 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:11 vm06.local ceph-mon[109831]: osdmap e58: 6 total, 5 up, 6 in 2026-03-09T17:33:11.575 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:11 vm06.local ceph-mon[109831]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:33:11.575 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:33:11 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:33:11.193+0000 7fec298ac640 -1 osd.2 55 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:33:11.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:11 vm09.local ceph-mon[97995]: from='osd.2 
[v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T17:33:11.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:11 vm09.local ceph-mon[97995]: osdmap e58: 6 total, 5 up, 6 in 2026-03-09T17:33:11.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:11 vm09.local ceph-mon[97995]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: pgmap v85: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 30/234 objects degraded (12.821%); 0 B/s, 9 objects/s recovering 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744] boot 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: osdmap e59: 6 total, 6 up, 6 in 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:33:12.589 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:33:12 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 30/234 objects degraded (12.821%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: pgmap v85: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 30/234 objects degraded (12.821%); 0 B/s, 9 objects/s recovering 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: from='osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744]' entity='osd.2' 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: osd.2 [v2:192.168.123.106:6818/1621627744,v1:192.168.123.106:6819/1621627744] boot 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: osdmap e59: 6 total, 6 up, 6 in 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T17:33:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:12 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 30/234 objects degraded (12.821%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:13.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:13 vm06.local ceph-mon[109831]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T17:33:13.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:13 vm06.local ceph-mon[109831]: pgmap v88: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 216 
MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 30/234 objects degraded (12.821%); 0 B/s, 2 objects/s recovering 2026-03-09T17:33:13.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:13 vm09.local ceph-mon[97995]: osdmap e60: 6 total, 6 up, 6 in 2026-03-09T17:33:13.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:13 vm09.local ceph-mon[97995]: pgmap v88: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 30/234 objects degraded (12.821%); 0 B/s, 2 objects/s recovering 2026-03-09T17:33:14.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:14.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:15.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:15 vm06.local ceph-mon[109831]: pgmap v89: 65 pgs: 13 active+undersized, 11 active+undersized+degraded, 41 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 27/234 objects degraded (11.538%); 0 B/s, 2 objects/s recovering 2026-03-09T17:33:15.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:15 vm09.local ceph-mon[97995]: pgmap v89: 65 pgs: 13 active+undersized, 11 active+undersized+degraded, 41 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 27/234 objects degraded (11.538%); 0 B/s, 2 objects/s recovering 2026-03-09T17:33:16.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:16 vm06.local ceph-mon[109831]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 27/234 objects degraded (11.538%), 11 pgs 
degraded) 2026-03-09T17:33:16.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:16 vm06.local ceph-mon[109831]: Cluster is now healthy 2026-03-09T17:33:16.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:16 vm09.local ceph-mon[97995]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 27/234 objects degraded (11.538%), 11 pgs degraded) 2026-03-09T17:33:16.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:16 vm09.local ceph-mon[97995]: Cluster is now healthy 2026-03-09T17:33:17.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:17 vm06.local ceph-mon[109831]: pgmap v90: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 661 B/s rd, 1 op/s 2026-03-09T17:33:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:17 vm09.local ceph-mon[97995]: pgmap v90: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 661 B/s rd, 1 op/s 2026-03-09T17:33:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:19 vm06.local ceph-mon[109831]: pgmap v91: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 511 B/s rd, 1 op/s 2026-03-09T17:33:19.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:19 vm09.local ceph-mon[97995]: pgmap v91: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 511 B/s rd, 1 op/s 2026-03-09T17:33:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:21 vm06.local ceph-mon[109831]: pgmap v92: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 1009 B/s rd, 2 op/s 2026-03-09T17:33:21.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:21 vm09.local ceph-mon[97995]: pgmap v92: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 1009 B/s rd, 2 op/s 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 -- 192.168.123.106:0/4023898429 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 msgr2=0x7ff63010be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 --2- 192.168.123.106:0/4023898429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63010be90 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ff624009b00 tx=0x7ff624009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 -- 192.168.123.106:0/4023898429 shutdown_connections 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 --2- 192.168.123.106:0/4023898429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63010be90 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 --2- 192.168.123.106:0/4023898429 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff630071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.809+0000 7ff62ffff700 1 -- 192.168.123.106:0/4023898429 >> 192.168.123.106:0/4023898429 conn(0x7ff63006d1a0 msgr2=0x7ff63006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.810+0000 7ff62ffff700 1 -- 192.168.123.106:0/4023898429 shutdown_connections 2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.810+0000 7ff62ffff700 1 -- 192.168.123.106:0/4023898429 wait complete. 
2026-03-09T17:33:22.812 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.810+0000 7ff62ffff700 1 Processor -- start 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62ffff700 1 -- start start 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63019c6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62ffff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff63019ccc0 con 0x7ff630072440 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62ffff700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff63019ce00 con 0x7ff630071a60 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:56124/0 (socket says 192.168.123.106:56124) 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 -- 192.168.123.106:0/1268203357 learned_addr learned my addr 192.168.123.106:0/1268203357 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:22.813 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62e7fc700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63019c6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 -- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 msgr2=0x7ff63019c6a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63019c6a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.811+0000 7ff62effd700 1 -- 192.168.123.106:0/1268203357 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6240097e0 con 0x7ff630071a60 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff62effd700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7ff62001fd10 tx=0x7ff62001d5c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff62001a990 con 0x7ff630071a60 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6301a18b0 con 0x7ff630071a60 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6301a1e00 con 0x7ff630071a60 2026-03-09T17:33:22.814 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff620004500 con 0x7ff630071a60 2026-03-09T17:33:22.815 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.812+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff620021690 con 0x7ff630071a60 2026-03-09T17:33:22.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.814+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff620003680 con 0x7ff630071a60 2026-03-09T17:33:22.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.814+0000 7ff617fff700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 0x7ff618079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:22.816 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.814+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff620025070 con 0x7ff630071a60 2026-03-09T17:33:22.816 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.814+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff61c005320 con 0x7ff630071a60 2026-03-09T17:33:22.817 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.815+0000 7ff62e7fc700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 0x7ff618079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:22.818 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.816+0000 7ff62e7fc700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 0x7ff618079dd0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7ff62400b5c0 tx=0x7ff62401a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:22.820 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.818+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff620073970 con 0x7ff630071a60 2026-03-09T17:33:22.948 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.946+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7ff61c000bf0 con 0x7ff618077920 2026-03-09T17:33:22.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.947+0000 7ff617fff700 1 -- 192.168.123.106:0/1268203357 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff61c000bf0 con 0x7ff618077920 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 msgr2=0x7ff618079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 0x7ff618079dd0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7ff62400b5c0 tx=0x7ff62401a040 comp rx=0 tx=0).stop 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 msgr2=0x7ff63019c160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7ff62001fd10 tx=0x7ff62001d5c0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 shutdown_connections 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 --2- 192.168.123.106:0/1268203357 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff618077920 0x7ff618079dd0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff630071a60 0x7ff63019c160 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 --2- 192.168.123.106:0/1268203357 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff630072440 0x7ff63019c6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 >> 192.168.123.106:0/1268203357 conn(0x7ff63006d1a0 msgr2=0x7ff63010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:22.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.950+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 shutdown_connections 2026-03-09T17:33:22.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:22.951+0000 7ff62ffff700 1 -- 192.168.123.106:0/1268203357 wait complete. 
2026-03-09T17:33:22.962 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.021+0000 7f868f50c700 1 -- 192.168.123.106:0/1274459298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688103940 msgr2=0x7f8688103d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.021+0000 7f868f50c700 1 --2- 192.168.123.106:0/1274459298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688103940 0x7f8688103d90 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f868400b600 tx=0x7f868400b910 comp rx=0 tx=0).stop 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 -- 192.168.123.106:0/1274459298 shutdown_connections 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 --2- 192.168.123.106:0/1274459298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688103940 0x7f8688103d90 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 --2- 192.168.123.106:0/1274459298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688102740 0x7f8688102b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 -- 192.168.123.106:0/1274459298 >> 192.168.123.106:0/1274459298 conn(0x7f86880fdcf0 msgr2=0x7f8688100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 -- 192.168.123.106:0/1274459298 shutdown_connections 2026-03-09T17:33:23.024 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 -- 192.168.123.106:0/1274459298 wait complete. 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 Processor -- start 2026-03-09T17:33:23.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.022+0000 7f868f50c700 1 -- start start 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868f50c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 0x7f86881980f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868f50c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 0x7f8688198630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868f50c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8688198c50 con 0x7f8688102740 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868f50c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8688198d90 con 0x7f8688103940 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 0x7f86881980f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 0x7f86881980f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35016/0 (socket says 192.168.123.106:35016) 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 -- 192.168.123.106:0/4104423996 learned_addr learned my addr 192.168.123.106:0/4104423996 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868caa7700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 0x7f8688198630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 -- 192.168.123.106:0/4104423996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 msgr2=0x7f8688198630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 0x7f8688198630 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 -- 192.168.123.106:0/4104423996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f868400b050 con 0x7f8688102740 2026-03-09T17:33:23.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868caa7700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 0x7f8688198630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:33:23.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.023+0000 7f868d2a8700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 0x7f86881980f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f867800b700 tx=0x7f867800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.024+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8678010820 con 0x7f8688102740 2026-03-09T17:33:23.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.024+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8678010e60 con 0x7f8688102740 2026-03-09T17:33:23.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.024+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8678017570 con 0x7f8688102740 2026-03-09T17:33:23.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.024+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f868819d840 con 0x7f8688102740 2026-03-09T17:33:23.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.024+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8688075470 con 0x7f8688102740 2026-03-09T17:33:23.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.026+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8688066e40 con 0x7f8688102740 2026-03-09T17:33:23.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.026+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f867800f3c0 con 0x7f8688102740 2026-03-09T17:33:23.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.026+0000 7f867e7fc700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 0x7f8674079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.026+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8678099350 con 0x7f8688102740 2026-03-09T17:33:23.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.029+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8678061b80 con 0x7f8688102740 2026-03-09T17:33:23.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.029+0000 7f868caa7700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 0x7f8674079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.030+0000 7f868caa7700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 0x7f8674079e40 secure 
:-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f868400bd90 tx=0x7f8684009150 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.163 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.161+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8688108290 con 0x7f8674077990 2026-03-09T17:33:23.165 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.162+0000 7f867e7fc700 1 -- 192.168.123.106:0/4104423996 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8688108290 con 0x7f8674077990 2026-03-09T17:33:23.167 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.165+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 msgr2=0x7f8674079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.165+0000 7f868f50c700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 0x7f8674079e40 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f868400bd90 tx=0x7f8684009150 comp rx=0 tx=0).stop 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.165+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 msgr2=0x7f86881980f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.165+0000 7f868f50c700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 
0x7f86881980f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f867800b700 tx=0x7f867800bac0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 shutdown_connections 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8674077990 0x7f8674079e40 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8688102740 0x7f86881980f0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 --2- 192.168.123.106:0/4104423996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8688103940 0x7f8688198630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 >> 192.168.123.106:0/4104423996 conn(0x7f86880fdcf0 msgr2=0x7f8688106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 shutdown_connections 2026-03-09T17:33:23.168 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.166+0000 7f868f50c700 1 -- 192.168.123.106:0/4104423996 wait complete. 
2026-03-09T17:33:23.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.245+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2449670995 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 msgr2=0x7fa774103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.247 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.245+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2449670995 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa774103db0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fa770009b00 tx=0x7fa770009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.245+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2449670995 shutdown_connections 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.245+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2449670995 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa774103db0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.246+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2449670995 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa774102760 0x7fa774102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.246+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2449670995 >> 192.168.123.106:0/2449670995 conn(0x7fa7740fdcf0 msgr2=0x7fa774100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.246+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2449670995 shutdown_connections 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.246+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2449670995 
wait complete. 2026-03-09T17:33:23.248 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.246+0000 7fa77c0c8700 1 Processor -- start 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa77c0c8700 1 -- start start 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa77c0c8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa7741981a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa77c0c8700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa77c0c8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa774198be0 con 0x7fa774103960 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa77c0c8700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa774198d50 con 0x7fa7741986e0 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa779663700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa779663700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:56150/0 (socket says 192.168.123.106:56150) 2026-03-09T17:33:23.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa779663700 1 -- 192.168.123.106:0/2538297032 learned_addr learned my addr 192.168.123.106:0/2538297032 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.247+0000 7fa779663700 1 -- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 msgr2=0x7fa7741981a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa779e64700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa7741981a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa779663700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa7741981a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa779663700 1 -- 192.168.123.106:0/2538297032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7700097e0 con 0x7fa7741986e0 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa779e64700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa7741981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa779663700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fa770005f50 tx=0x7fa770004a60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa77001d070 con 0x7fa7741986e0 2026-03-09T17:33:23.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa77419dc90 con 0x7fa7741986e0 2026-03-09T17:33:23.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.248+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa77419e150 con 0x7fa7741986e0 2026-03-09T17:33:23.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.249+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa77000bc50 con 0x7fa7741986e0 2026-03-09T17:33:23.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.249+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa770017810 con 0x7fa7741986e0 2026-03-09T17:33:23.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.250+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fa774066e40 con 0x7fa7741986e0 2026-03-09T17:33:23.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.250+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa77000fb90 con 0x7fa7741986e0 2026-03-09T17:33:23.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.251+0000 7fa76affd700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 0x7fa760079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.251+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa77009b420 con 0x7fa7741986e0 2026-03-09T17:33:23.254 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.251+0000 7fa779e64700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 0x7fa760079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.254 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.252+0000 7fa779e64700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 0x7fa760079d70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa764006fd0 tx=0x7fa764008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.255 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.253+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa770063c50 con 0x7fa7741986e0 2026-03-09T17:33:23.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.374+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa7741082b0 con 0x7fa7600778c0 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.381+0000 7fa76affd700 1 -- 192.168.123.106:0/2538297032 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fa7741082b0 con 0x7fa7600778c0 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (7m) 16s ago 8m 26.0M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (8m) 16s ago 8m 9353k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (7m) 119s ago 7m 11.1M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (2m) 16s ago 8m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (2m) 119s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (7m) 16s ago 8m 95.1M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (6m) 16s ago 6m 16.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640 
2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (6m) 16s ago 6m 177M - 18.2.0 dc2bc1663786 4c8e86b2b8cd 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (6m) 119s ago 6m 145M - 18.2.0 dc2bc1663786 aa1f0430b448 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (6m) 119s ago 6m 17.5M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (3m) 16s ago 9m 627M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (2m) 119s ago 7m 487M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (2m) 16s ago 9m 60.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (2m) 119s ago 7m 48.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (8m) 16s ago 8m 14.6M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (7m) 119s ago 7m 15.9M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (110s) 16s ago 7m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (40s) 16s ago 7m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (18s) 16s ago 7m 12.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22 2026-03-09T17:33:23.384 
INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (6m) 119s ago 6m 427M 4096M 18.2.0 dc2bc1663786 48a594500ef1 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (6m) 119s ago 6m 410M 4096M 18.2.0 dc2bc1663786 a47c39052541 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (6m) 119s ago 6m 334M 4096M 18.2.0 dc2bc1663786 89f436540a49 2026-03-09T17:33:23.384 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (2m) 16s ago 8m 58.7M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.384+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 msgr2=0x7fa760079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 0x7fa760079d70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa764006fd0 tx=0x7fa764008040 comp rx=0 tx=0).stop 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 msgr2=0x7fa77419d750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fa770005f50 tx=0x7fa770004a60 comp rx=0 tx=0).stop 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 -- 
192.168.123.106:0/2538297032 shutdown_connections 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa7600778c0 0x7fa760079d70 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa774103960 0x7fa7741981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 --2- 192.168.123.106:0/2538297032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa7741986e0 0x7fa77419d750 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.387 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.385+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 >> 192.168.123.106:0/2538297032 conn(0x7fa7740fdcf0 msgr2=0x7fa774106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.386+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 shutdown_connections 2026-03-09T17:33:23.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.386+0000 7fa77c0c8700 1 -- 192.168.123.106:0/2538297032 wait complete. 
2026-03-09T17:33:23.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.464+0000 7f55df757700 1 -- 192.168.123.106:0/2893214004 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d8101090 msgr2=0x7f55d8103470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.464+0000 7f55df757700 1 --2- 192.168.123.106:0/2893214004 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d8101090 0x7f55d8103470 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f55cc009b00 tx=0x7f55cc009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.467+0000 7f55df757700 1 -- 192.168.123.106:0/2893214004 shutdown_connections 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.467+0000 7f55df757700 1 --2- 192.168.123.106:0/2893214004 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d81039b0 0x7f55d8105d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.467+0000 7f55df757700 1 --2- 192.168.123.106:0/2893214004 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d8101090 0x7f55d8103470 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.467+0000 7f55df757700 1 -- 192.168.123.106:0/2893214004 >> 192.168.123.106:0/2893214004 conn(0x7f55d80faa70 msgr2=0x7f55d80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.468+0000 7f55df757700 1 -- 192.168.123.106:0/2893214004 shutdown_connections 2026-03-09T17:33:23.470 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.468+0000 7f55df757700 1 -- 192.168.123.106:0/2893214004 
wait complete. 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 Processor -- start 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 -- start start 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d81039b0 0x7f55d81984e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55d8198b00 con 0x7f55d8101090 2026-03-09T17:33:23.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55df757700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55d8198c40 con 0x7f55d81039b0 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55dd4f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55dd4f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:35058/0 (socket says 192.168.123.106:35058) 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55dd4f3700 1 -- 192.168.123.106:0/869271728 learned_addr learned my addr 192.168.123.106:0/869271728 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.469+0000 7f55dd4f3700 1 -- 192.168.123.106:0/869271728 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d81039b0 msgr2=0x7f55d81984e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55dccf2700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d81039b0 0x7f55d81984e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55dd4f3700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d81039b0 0x7f55d81984e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55dd4f3700 1 -- 192.168.123.106:0/869271728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55d4009710 con 0x7f55d8101090 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55dd4f3700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f55cc00b5c0 tx=0x7f55cc004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55cc01d070 con 0x7f55d8101090 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f55cc00bb40 con 0x7f55d8101090 2026-03-09T17:33:23.472 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55cc0097e0 con 0x7f55d8101090 2026-03-09T17:33:23.473 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.470+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55d819d9f0 con 0x7f55d8101090 2026-03-09T17:33:23.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.472+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55cc021620 con 0x7f55d8101090 2026-03-09T17:33:23.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.472+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f55cc02b430 con 0x7f55d8101090 2026-03-09T17:33:23.474 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.472+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55bc005320 con 0x7f55d8101090 2026-03-09T17:33:23.475 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.473+0000 
7f55ca7fc700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 0x7f55c4079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.475 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.473+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f55cc09bf30 con 0x7f55d8101090 2026-03-09T17:33:23.477 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.475+0000 7f55dccf2700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 0x7f55c4079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.476+0000 7f55dccf2700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 0x7f55c4079bb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f55d4009e90 tx=0x7f55d4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.475+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f55cc0647e0 con 0x7f55d8101090 2026-03-09T17:33:23.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.640+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f55bc006200 con 0x7f55d8101090 2026-03-09T17:33:23.643 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.640+0000 7f55ca7fc700 1 -- 192.168.123.106:0/869271728 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f55cc063f30 con 0x7f55d8101090 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 3, 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 7, 2026-03-09T17:33:23.643 
INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:33:23.643 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:33:23.645 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 msgr2=0x7f55c4079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 0x7f55c4079bb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f55d4009e90 tx=0x7f55d4009450 comp rx=0 tx=0).stop 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 msgr2=0x7f55d8197fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f55cc00b5c0 tx=0x7f55cc004970 comp rx=0 tx=0).stop 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 shutdown_connections 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55c4077700 0x7f55c4079bb0 
unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d8101090 0x7f55d8197fa0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.643+0000 7f55df757700 1 --2- 192.168.123.106:0/869271728 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d81039b0 0x7f55d81984e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.644+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 >> 192.168.123.106:0/869271728 conn(0x7f55d80faa70 msgr2=0x7f55d8104610 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.644+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 shutdown_connections 2026-03-09T17:33:23.646 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.644+0000 7f55df757700 1 -- 192.168.123.106:0/869271728 wait complete. 
2026-03-09T17:33:23.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.716+0000 7fb863bce700 1 -- 192.168.123.106:0/2347910327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c100fb0 msgr2=0x7fb85c103390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.716+0000 7fb863bce700 1 --2- 192.168.123.106:0/2347910327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c100fb0 0x7fb85c103390 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fb84c009b00 tx=0x7fb84c009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:23.719 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:23 vm06.local ceph-mon[109831]: pgmap v93: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 909 B/s rd, 1 op/s 2026-03-09T17:33:23.719 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:23 vm06.local ceph-mon[109831]: from='client.44199 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:23.719 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:23 vm06.local ceph-mon[109831]: from='client.34226 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:23.719 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.717+0000 7fb863bce700 1 -- 192.168.123.106:0/2347910327 shutdown_connections 2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.717+0000 7fb863bce700 1 --2- 192.168.123.106:0/2347910327 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb85c1038d0 0x7fb85c105cb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.717+0000 7fb863bce700 1 --2- 192.168.123.106:0/2347910327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c100fb0 0x7fb85c103390 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.717+0000 7fb863bce700 1 -- 192.168.123.106:0/2347910327 >> 192.168.123.106:0/2347910327 conn(0x7fb85c0fa990 msgr2=0x7fb85c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.719+0000 7fb863bce700 1 -- 192.168.123.106:0/2347910327 shutdown_connections 2026-03-09T17:33:23.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.719+0000 7fb863bce700 1 -- 192.168.123.106:0/2347910327 wait complete. 2026-03-09T17:33:23.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 Processor -- start 2026-03-09T17:33:23.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 -- start start 2026-03-09T17:33:23.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb85c100fb0 0x7fb85c193bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.722 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 0x7fb85c194130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb85c1946c0 con 0x7fb85c1038d0 2026-03-09T17:33:23.723 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.720+0000 7fb863bce700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb85c194830 con 0x7fb85c100fb0 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 0x7fb85c194130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 0x7fb85c194130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35082/0 (socket says 192.168.123.106:35082) 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 -- 192.168.123.106:0/1247943542 learned_addr learned my addr 192.168.123.106:0/1247943542 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 -- 192.168.123.106:0/1247943542 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb85c100fb0 msgr2=0x7fb85c193bf0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb85c100fb0 0x7fb85c193bf0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.723 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.721+0000 7fb861169700 1 -- 192.168.123.106:0/1247943542 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb84c0097e0 con 0x7fb85c1038d0 2026-03-09T17:33:23.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.722+0000 7fb861169700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 0x7fb85c194130 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb85800cc60 tx=0x7fb8580074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.722+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb858007af0 con 0x7fb85c1038d0 2026-03-09T17:33:23.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.722+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb858007c50 con 0x7fb85c1038d0 2026-03-09T17:33:23.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.722+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb85c06a890 con 0x7fb85c1038d0 2026-03-09T17:33:23.724 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.722+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb85c06ade0 con 0x7fb85c1038d0 2026-03-09T17:33:23.725 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.723+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb858018770 con 0x7fb85c1038d0 2026-03-09T17:33:23.726 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.724+0000 7fb852ffd700 1 -- 
192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb85800f450 con 0x7fb85c1038d0 2026-03-09T17:33:23.726 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.724+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb85c18ddd0 con 0x7fb85c1038d0 2026-03-09T17:33:23.729 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.724+0000 7fb852ffd700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb848077870 0x7fb848079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.729 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.724+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb858099db0 con 0x7fb85c1038d0 2026-03-09T17:33:23.729 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.727+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb858062560 con 0x7fb85c1038d0 2026-03-09T17:33:23.729 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.727+0000 7fb86196a700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb848077870 0x7fb848079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.730 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.727+0000 7fb86196a700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
conn(0x7fb848077870 0x7fb848079d20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb84c006010 tx=0x7fb84c009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.869+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb85c06aa20 con 0x7fb85c1038d0 2026-03-09T17:33:23.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.870+0000 7fb852ffd700 1 -- 192.168.123.106:0/1247943542 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7fb85c06aa20 con 0x7fb85c1038d0 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:33:23.873 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:33:23.874 
INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 
2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:23.874 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:23.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb848077870 msgr2=0x7fb848079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.875 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb848077870 0x7fb848079d20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb84c006010 tx=0x7fb84c009f90 comp rx=0 tx=0).stop 2026-03-09T17:33:23.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 msgr2=0x7fb85c194130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb85c1038d0 0x7fb85c194130 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb85800cc60 tx=0x7fb8580074a0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 shutdown_connections 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.873+0000 7fb863bce700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fb848077870 0x7fb848079d20 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.874+0000 7fb863bce700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fb85c100fb0 0x7fb85c193bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.874+0000 7fb863bce700 1 --2- 192.168.123.106:0/1247943542 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fb85c1038d0 0x7fb85c194130 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.874+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 >> 192.168.123.106:0/1247943542 conn(0x7fb85c0fa990 msgr2=0x7fb85c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.874+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 shutdown_connections 2026-03-09T17:33:23.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.874+0000 7fb863bce700 1 -- 192.168.123.106:0/1247943542 wait complete. 2026-03-09T17:33:23.877 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:33:23.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:23 vm09.local ceph-mon[97995]: pgmap v93: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 909 B/s rd, 1 op/s 2026-03-09T17:33:23.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:23 vm09.local ceph-mon[97995]: from='client.44199 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:23.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:23 vm09.local ceph-mon[97995]: from='client.34226 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:23.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:23.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 -- 192.168.123.106:0/2372319943 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 msgr2=0x7f6f80105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.951 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 --2- 192.168.123.106:0/2372319943 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80105520 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f6f70009b50 tx=0x7f6f70009e60 comp rx=0 tx=0).stop 2026-03-09T17:33:23.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 -- 192.168.123.106:0/2372319943 shutdown_connections 2026-03-09T17:33:23.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 --2- 192.168.123.106:0/2372319943 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f80107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 --2- 192.168.123.106:0/2372319943 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80105520 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 -- 192.168.123.106:0/2372319943 >> 192.168.123.106:0/2372319943 conn(0x7f6f800faa70 msgr2=0x7f6f800fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 -- 192.168.123.106:0/2372319943 shutdown_connections 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.949+0000 7f6f87d71700 1 -- 192.168.123.106:0/2372319943 wait complete. 
2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 Processor -- start 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 -- start start 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80198080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f80198be0 con 0x7f6f800691a0 2026-03-09T17:33:23.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.950+0000 7f6f87d71700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f80198d20 con 0x7f6f80105a60 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am 
v2:192.168.123.106:56186/0 (socket says 192.168.123.106:56186) 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 -- 192.168.123.106:0/922269378 learned_addr learned my addr 192.168.123.106:0/922269378 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f85b0d700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80198080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 -- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 msgr2=0x7f6f80198080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80198080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 -- 192.168.123.106:0/922269378 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f700097e0 con 0x7f6f80105a60 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f85b0d700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f8530c700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f6f7c00eb10 tx=0x7f6f7c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f7c00cca0 con 0x7f6f80105a60 2026-03-09T17:33:23.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f8019d7d0 con 0x7f6f80105a60 2026-03-09T17:33:23.954 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.951+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f8019dd20 con 0x7f6f80105a60 2026-03-09T17:33:23.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.952+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6f7c00ce00 con 0x7f6f80105a60 2026-03-09T17:33:23.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.952+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f7c018960 con 0x7f6f80105a60 2026-03-09T17:33:23.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.953+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6f7c018ac0 con 
0x7f6f80105a60 2026-03-09T17:33:23.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.953+0000 7f6f76ffd700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 0x7f6f6c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:23.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.954+0000 7f6f85b0d700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 0x7f6f6c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:23.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.954+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6f7c014070 con 0x7f6f80105a60 2026-03-09T17:33:23.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.954+0000 7f6f85b0d700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 0x7f6f6c079d70 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f6f700053b0 tx=0x7f6f7000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:23.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.954+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f64005320 con 0x7f6f80105a60 2026-03-09T17:33:23.959 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:23.957+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f6f7c062c00 con 0x7f6f80105a60 2026-03-09T17:33:24.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.085+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f64000bf0 con 0x7f6f6c0778c0 2026-03-09T17:33:24.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.087+0000 7f6f76ffd700 1 -- 192.168.123.106:0/922269378 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f6f64000bf0 con 0x7f6f6c0778c0 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "crash", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "9/23 daemons upgraded", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:33:24.093 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:33:24.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.093+0000 
7f6f87d71700 1 -- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 msgr2=0x7f6f6c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:24.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.093+0000 7f6f87d71700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 0x7f6f6c079d70 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f6f700053b0 tx=0x7f6f7000b540 comp rx=0 tx=0).stop 2026-03-09T17:33:24.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 msgr2=0x7f6f801985c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:24.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f6f7c00eb10 tx=0x7f6f7c00eed0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 shutdown_connections 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f6c0778c0 0x7f6f6c079d70 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f800691a0 0x7f6f80198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.094+0000 7f6f87d71700 1 --2- 192.168.123.106:0/922269378 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f80105a60 0x7f6f801985c0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.095+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 >> 192.168.123.106:0/922269378 conn(0x7f6f800faa70 msgr2=0x7f6f800fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.095+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 shutdown_connections 2026-03-09T17:33:24.097 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.095+0000 7f6f87d71700 1 -- 192.168.123.106:0/922269378 wait complete. 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.168+0000 7f4c5f983700 1 -- 192.168.123.106:0/2672620465 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 msgr2=0x7f4c58100b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.168+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2672620465 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58100b70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4c54009b00 tx=0x7f4c54009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 -- 192.168.123.106:0/2672620465 shutdown_connections 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2672620465 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58100b70 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2672620465 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c580ff870 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 -- 192.168.123.106:0/2672620465 >> 192.168.123.106:0/2672620465 conn(0x7f4c580faa70 msgr2=0x7f4c580fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 -- 192.168.123.106:0/2672620465 shutdown_connections 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 -- 192.168.123.106:0/2672620465 wait complete. 2026-03-09T17:33:24.171 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.169+0000 7f4c5f983700 1 Processor -- start 2026-03-09T17:33:24.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5f983700 1 -- start start 2026-03-09T17:33:24.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5f983700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c58195e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:24.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5f983700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:24.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5f983700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c581968e0 con 0x7f4c58100700 
2026-03-09T17:33:24.172 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5f983700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c58196a20 con 0x7f4c580ff460 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5cf1e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5cf1e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:35136/0 (socket says 192.168.123.106:35136) 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5cf1e700 1 -- 192.168.123.106:0/2255188291 learned_addr learned my addr 192.168.123.106:0/2255188291 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.170+0000 7f4c5cf1e700 1 -- 192.168.123.106:0/2255188291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 msgr2=0x7f4c58195e10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5d71f700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c58195e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5cf1e700 
1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c58195e10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5cf1e700 1 -- 192.168.123.106:0/2255188291 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c540097e0 con 0x7f4c58100700 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5d71f700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c58195e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5cf1e700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f4c540049c0 tx=0x7f4c54004aa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:24.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c5401d070 con 0x7f4c58100700 2026-03-09T17:33:24.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c5400bd10 con 0x7f4c58100700 2026-03-09T17:33:24.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 
327+0+0 (secure 0 0 0) 0x7f4c5400f940 con 0x7f4c58100700 2026-03-09T17:33:24.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c5819b480 con 0x7f4c58100700 2026-03-09T17:33:24.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.171+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c5819b940 con 0x7f4c58100700 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.173+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c58066e40 con 0x7f4c58100700 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.173+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4c54022cc0 con 0x7f4c58100700 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.174+0000 7f4c4e7fc700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 0x7f4c44079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.174+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(60..60 src has 1..60) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4c5409b430 con 0x7f4c58100700 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.176+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4c54063c60 con 0x7f4c58100700 2026-03-09T17:33:24.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.176+0000 7f4c5d71f700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 0x7f4c44079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:24.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.177+0000 7f4c5d71f700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 0x7f4c44079d70 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4c48005fd0 tx=0x7f4c48005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:24.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.344+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f4c5819bd10 con 0x7f4c58100700 2026-03-09T17:33:24.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.345+0000 7f4c4e7fc700 1 -- 192.168.123.106:0/2255188291 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f4c540633b0 con 0x7f4c58100700 2026-03-09T17:33:24.348 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:33:24.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 msgr2=0x7f4c44079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:24.352 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 0x7f4c44079d70 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4c48005fd0 tx=0x7f4c48005e20 comp rx=0 tx=0).stop 2026-03-09T17:33:24.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 msgr2=0x7f4c58196350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:24.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f4c540049c0 tx=0x7f4c54004aa0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 shutdown_connections 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4c440778c0 0x7f4c44079d70 secure :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4c48005fd0 tx=0x7f4c48005e20 comp rx=0 tx=0).stop 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2255188291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f4c580ff460 0x7f4c58195e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.349+0000 7f4c5f983700 1 --2- 192.168.123.106:0/2255188291 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58100700 0x7f4c58196350 secure :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f4c540049c0 tx=0x7f4c54004aa0 comp rx=0 tx=0).stop 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.350+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 >> 192.168.123.106:0/2255188291 conn(0x7f4c580faa70 msgr2=0x7f4c58103930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.350+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 shutdown_connections 2026-03-09T17:33:24.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:24.350+0000 7f4c5f983700 1 -- 192.168.123.106:0/2255188291 wait complete. 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/869271728' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: Upgrade: osd.3 is safe to restart 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/1247943542' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='client.44209 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: Upgrade: Updating osd.3 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: Deploying daemon osd.3 on vm09 2026-03-09T17:33:24.592 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/2255188291' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:33:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/869271728' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: Upgrade: osd.3 is safe to restart 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1247943542' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='client.44209 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: Upgrade: Updating osd.3 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:24.642 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: Deploying daemon osd.3 on vm09 2026-03-09T17:33:24.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:24 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2255188291' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:33:24.862 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:24 vm09.local systemd[1]: Stopping Ceph osd.3 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:33:25.145 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:33:24.931+0000 7f7681c35700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:33:25.145 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:33:24.932+0000 7f7681c35700 -1 osd.3 60 *** Got signal Terminated *** 2026-03-09T17:33:25.145 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:24 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[67939]: 2026-03-09T17:33:24.932+0000 7f7681c35700 -1 osd.3 60 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:33:25.799 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:25 vm09.local ceph-mon[97995]: pgmap v94: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-09T17:33:25.799 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:25 vm09.local ceph-mon[97995]: osd.3 marked itself down and dead 2026-03-09T17:33:25.800 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103882]: 2026-03-09 17:33:25.620314934 +0000 UTC m=+0.700245230 container died 
48a594500ef19cf503d26c06ac371a074bad76d5670be03b24a976342dd10184 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.license=GPLv2, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308) 2026-03-09T17:33:25.800 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103882]: 2026-03-09 17:33:25.64693182 +0000 UTC m=+0.726862117 container remove 48a594500ef19cf503d26c06ac371a074bad76d5670be03b24a976342dd10184 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0) 2026-03-09T17:33:25.800 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local bash[103882]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3 2026-03-09T17:33:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:25 vm06.local ceph-mon[109831]: pgmap v94: 65 pgs: 65 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 767 B/s rd, 1 op/s 2026-03-09T17:33:25.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:25 vm06.local ceph-mon[109831]: osd.3 marked itself down and 
dead 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103949]: 2026-03-09 17:33:25.798622638 +0000 UTC m=+0.016699627 container create 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103949]: 2026-03-09 17:33:25.841909113 +0000 UTC m=+0.059986102 container init 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, 
org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103949]: 2026-03-09 17:33:25.845383105 +0000 UTC m=+0.063460094 container start 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103949]: 2026-03-09 17:33:25.846603018 +0000 UTC m=+0.064680007 container attach 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:25 vm09.local podman[103949]: 2026-03-09 17:33:25.792114637 +0000 UTC m=+0.010191636 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[103969]: 2026-03-09 17:33:26.018584141 +0000 UTC m=+0.012099898 container died 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) 2026-03-09T17:33:26.052 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[103969]: 2026-03-09 17:33:26.040770204 +0000 UTC m=+0.034285961 container remove 0c92d0ccc349f2809e7d3380f1c78095ced7bd57280a5d7b6ec424240c18f0d7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-09T17:33:26.380 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.3.service: Deactivated successfully. 2026-03-09T17:33:26.380 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local systemd[1]: Stopped Ceph osd.3 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:26.381 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.3.service: Consumed 1min 1.533s CPU time. 2026-03-09T17:33:26.381 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local systemd[1]: Starting Ceph osd.3 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:33:26.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:26 vm09.local ceph-mon[97995]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T17:33:26.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:26 vm09.local ceph-mon[97995]: osdmap e61: 6 total, 5 up, 6 in 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[104054]: 2026-03-09 17:33:26.378831004 +0000 UTC m=+0.023621983 container create 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[104054]: 2026-03-09 17:33:26.43560403 +0000 UTC m=+0.080395009 container init 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223) 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[104054]: 2026-03-09 17:33:26.445315687 +0000 UTC m=+0.090106657 container start 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True) 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[104054]: 2026-03-09 17:33:26.44920684 +0000 UTC m=+0.093997828 container attach 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local podman[104054]: 2026-03-09 17:33:26.370655972 +0000 UTC m=+0.015446962 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local bash[104054]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:26.646 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:26 vm09.local bash[104054]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:26.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:26 vm06.local ceph-mon[109831]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T17:33:26.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:26 vm06.local ceph-mon[109831]: osdmap e61: 6 total, 5 up, 6 in 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-ee44fecd-82bf-478a-9cb3-aaa4b00ae26d/osd-block-733bc53e-8727-4119-a70f-00c09a625789 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T17:33:27.395 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/ceph-bluestore-tool 
--cluster=ceph prime-osd-dir --dev /dev/ceph-ee44fecd-82bf-478a-9cb3-aaa4b00ae26d/osd-block-733bc53e-8727-4119-a70f-00c09a625789 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T17:33:27.826 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:27 vm06.local ceph-mon[109831]: pgmap v96: 65 pgs: 16 stale+active+clean, 49 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-09T17:33:27.826 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:27 vm06.local ceph-mon[109831]: osdmap e62: 6 total, 5 up, 6 in 2026-03-09T17:33:27.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-mon[97995]: pgmap v96: 65 pgs: 16 stale+active+clean, 49 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 1023 B/s rd, 1 op/s 2026-03-09T17:33:27.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-mon[97995]: osdmap e62: 6 total, 5 up, 6 in 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/ln -snf /dev/ceph-ee44fecd-82bf-478a-9cb3-aaa4b00ae26d/osd-block-733bc53e-8727-4119-a70f-00c09a625789 /var/lib/ceph/osd/ceph-3/block 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/ln -snf /dev/ceph-ee44fecd-82bf-478a-9cb3-aaa4b00ae26d/osd-block-733bc53e-8727-4119-a70f-00c09a625789 /var/lib/ceph/osd/ceph-3/block 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T17:33:27.895 
INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate[104066]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104054]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local conmon[104066]: conmon 7e5f87e2e7372aa92f20 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95.scope/container/memory.events 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104054]: 2026-03-09 17:33:27.517374306 +0000 UTC m=+1.162165285 container died 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS) 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104054]: 2026-03-09 17:33:27.538202278 +0000 UTC m=+1.182993257 container remove 7e5f87e2e7372aa92f20e67b29f2c77d74ccd6ec2f9f21841615e50623979d95 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104315]: 2026-03-09 17:33:27.64134406 +0000 UTC m=+0.018707194 container create 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, org.label-schema.vendor=CentOS, CEPH_REF=squid, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104315]: 2026-03-09 17:33:27.685334913 +0000 UTC m=+0.062698057 container init 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104315]: 2026-03-09 17:33:27.694060153 +0000 UTC m=+0.071423287 container start 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local bash[104315]: 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local podman[104315]: 2026-03-09 17:33:27.633473339 +0000 UTC m=+0.010836483 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:27.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local systemd[1]: Started Ceph osd.3 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 
2026-03-09T17:33:27.896 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:27 vm09.local ceph-osd[104330]: -- 192.168.123.109:0/4113328600 <== mon.1 v2:192.168.123.109:3300/0 4 ==== auth_reply(proto 2 0 (0) Success) ==== 194+0+0 (secure 0 0 0) 0x55e93414c960 con 0x55e9342fc000 2026-03-09T17:33:28.645 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:28 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:33:28.549+0000 7faf33602740 -1 Falling back to public interface 2026-03-09T17:33:29.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:29.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:29.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:28 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:29.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:29.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:30.043 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:29 vm06.local ceph-mon[109831]: pgmap v98: 65 pgs: 16 stale+active+clean, 49 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 639 B/s rd, 1 op/s 2026-03-09T17:33:30.043 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.043 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.043 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.043 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.061 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:29 vm09.local ceph-mon[97995]: pgmap v98: 65 pgs: 16 stale+active+clean, 49 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 639 B/s rd, 1 op/s 2026-03-09T17:33:30.061 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.061 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.061 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:30.061 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: Upgrade: osd.4 is safe to restart 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:33:31.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: 
from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: Upgrade: osd.4 is safe to restart 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T17:33:31.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:31 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:31.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:31 vm09.local systemd[1]: Stopping Ceph osd.4 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:33:31.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:33:31.537+0000 7fcf994ad700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:33:31.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:33:31.537+0000 7fcf994ad700 -1 osd.4 62 *** Got signal Terminated *** 2026-03-09T17:33:31.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:31 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[73591]: 2026-03-09T17:33:31.537+0000 7fcf994ad700 -1 osd.4 62 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:33:32.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-mon[97995]: pgmap v99: 65 pgs: 16 active+undersized, 3 stale+active+clean, 15 active+undersized+degraded, 31 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 48/234 objects degraded (20.513%) 2026-03-09T17:33:32.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-mon[97995]: Upgrade: Updating osd.4 2026-03-09T17:33:32.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-mon[97995]: Deploying daemon osd.4 on vm09 2026-03-09T17:33:32.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-mon[97995]: Health check failed: Degraded data redundancy: 48/234 objects degraded (20.513%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:32.399 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-mon[97995]: osd.4 marked itself down and dead 2026-03-09T17:33:32.399 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107359]: 2026-03-09 17:33:32.221412196 +0000 UTC 
m=+0.705864246 container died a47c3905254158874fca86df7ae7d9b38938d987b15074a2c8e90b87c2c40ab7 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1) 2026-03-09T17:33:32.399 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107359]: 2026-03-09 17:33:32.250493144 +0000 UTC m=+0.734945194 container remove a47c3905254158874fca86df7ae7d9b38938d987b15074a2c8e90b87c2c40ab7 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, ceph=True, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, io.buildah.version=1.29.1) 2026-03-09T17:33:32.399 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local bash[107359]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4 2026-03-09T17:33:32.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:32 vm06.local ceph-mon[109831]: pgmap v99: 65 pgs: 16 active+undersized, 3 stale+active+clean, 15 active+undersized+degraded, 31 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 48/234 objects degraded (20.513%) 
2026-03-09T17:33:32.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:32 vm06.local ceph-mon[109831]: Upgrade: Updating osd.4 2026-03-09T17:33:32.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:32 vm06.local ceph-mon[109831]: Deploying daemon osd.4 on vm09 2026-03-09T17:33:32.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:32 vm06.local ceph-mon[109831]: Health check failed: Degraded data redundancy: 48/234 objects degraded (20.513%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:32.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:32 vm06.local ceph-mon[109831]: osd.4 marked itself down and dead 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:32 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:33:32.685+0000 7faf33602740 -1 osd.3 0 read_superblock omap replica is missing. 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.427132834 +0000 UTC m=+0.019594534 container create f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 
09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.482024811 +0000 UTC m=+0.074486520 container init f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.485791351 +0000 UTC m=+0.078253049 container start f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.490491127 +0000 UTC m=+0.082952836 container attach f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.418838932 +0000 UTC m=+0.011300641 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local conmon[107618]: conmon f4c06521afb0d11d21eb : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79.scope/container/memory.events 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.618326476 +0000 UTC m=+0.210788175 container died 
f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local podman[107607]: 2026-03-09 17:33:32.653925974 +0000 UTC m=+0.246387684 container remove f4c06521afb0d11d21eb826259dbba7fc7025752d35f756648d5c80d454d3f79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 
vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service: Deactivated successfully. 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service: Unit process 107618 (conmon) remains running after unit stopped. 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service: Unit process 107626 (podman) remains running after unit stopped. 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local systemd[1]: Stopped Ceph osd.4 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:32.690 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service: Consumed 50.326s CPU time, 883.9M memory peak. 2026-03-09T17:33:33.014 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:32 vm09.local systemd[1]: Starting Ceph osd.4 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:33:33.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-mon[97995]: Health check update: 2 osds down (OSD_DOWN) 2026-03-09T17:33:33.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-mon[97995]: osdmap e63: 6 total, 4 up, 6 in 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local podman[107710]: 2026-03-09 17:33:33.01291328 +0000 UTC m=+0.023929909 container create 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local podman[107710]: 2026-03-09 17:33:33.052949962 +0000 UTC m=+0.063966591 container init 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local podman[107710]: 2026-03-09 17:33:33.056947295 +0000 UTC m=+0.067963924 container start 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0) 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local podman[107710]: 2026-03-09 17:33:33.064229924 +0000 UTC m=+0.075246553 container attach 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local podman[107710]: 2026-03-09 17:33:33.004343269 +0000 UTC m=+0.015359909 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:33 vm06.local ceph-mon[109831]: Health check update: 2 osds down (OSD_DOWN) 2026-03-09T17:33:33.505 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:33 vm06.local ceph-mon[109831]: osdmap e63: 6 total, 4 up, 6 in 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:33 vm09.local 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:33:33.495+0000 7faf33602740 -1 osd.3 60 log_to_monitors true 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T17:33:33.785 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T17:33:34.072 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev 
/dev/ceph-d1019072-40ec-4d5d-8ece-c2d7d5c0f686/osd-block-24911c64-9b6a-4862-9972-34f73f6f3c13 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T17:33:34.072 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:33 vm09.local bash[107710]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-d1019072-40ec-4d5d-8ece-c2d7d5c0f686/osd-block-24911c64-9b6a-4862-9972-34f73f6f3c13 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-09T17:33:34.072 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/ln -snf /dev/ceph-d1019072-40ec-4d5d-8ece-c2d7d5c0f686/osd-block-24911c64-9b6a-4862-9972-34f73f6f3c13 /var/lib/ceph/osd/ceph-4/block 2026-03-09T17:33:34.072 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[107710]: Running command: /usr/bin/ln -snf /dev/ceph-d1019072-40ec-4d5d-8ece-c2d7d5c0f686/osd-block-24911c64-9b6a-4862-9972-34f73f6f3c13 /var/lib/ceph/osd/ceph-4/block 2026-03-09T17:33:34.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-mon[97995]: pgmap v101: 65 pgs: 5 stale+active+undersized, 15 active+undersized, 4 stale+active+undersized+degraded, 4 stale+active+clean, 14 active+undersized+degraded, 23 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 56/234 objects degraded (23.932%) 2026-03-09T17:33:34.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-mon[97995]: osdmap e64: 6 total, 4 up, 6 in 2026-03-09T17:33:34.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-mon[97995]: from='osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:33:34.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-mon[97995]: from='osd.3 ' entity='osd.3' 
cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:33:34.182+0000 7faf2b39c640 -1 osd.3 60 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[107710]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[107710]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[107710]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate[107721]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[107710]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T17:33:34.396 
INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local conmon[107721]: conmon 0968cda49d21753729a7 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2.scope/container/memory.events 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[107710]: 2026-03-09 17:33:34.099674726 +0000 UTC m=+1.110691355 container died 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[107710]: 2026-03-09 17:33:34.118297792 +0000 UTC m=+1.129314421 container remove 0968cda49d21753729a751af21e2f8861fd389de143f7ffee268f67ad8e44bd2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-activate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[108000]: 2026-03-09 17:33:34.229976811 +0000 UTC m=+0.017562242 container create cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[108000]: 2026-03-09 17:33:34.276124212 +0000 UTC m=+0.063709642 container init cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True) 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[108000]: 2026-03-09 17:33:34.279459865 +0000 UTC m=+0.067045295 container start cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local bash[108000]: cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local podman[108000]: 2026-03-09 17:33:34.22316911 +0000 UTC m=+0.010754530 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:34.396 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local systemd[1]: Started Ceph osd.4 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:34.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:34 vm06.local ceph-mon[109831]: pgmap v101: 65 pgs: 5 stale+active+undersized, 15 active+undersized, 4 stale+active+undersized+degraded, 4 stale+active+clean, 14 active+undersized+degraded, 23 active+clean; 216 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 56/234 objects degraded (23.932%) 2026-03-09T17:33:34.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:34 vm06.local ceph-mon[109831]: osdmap e64: 6 total, 4 up, 6 in 2026-03-09T17:33:34.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:34 vm06.local ceph-mon[109831]: from='osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:33:34.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:34 vm06.local ceph-mon[109831]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T17:33:34.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:34 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:33:34.609+0000 7f9de38f0740 -1 Falling back to public interface 2026-03-09T17:33:35.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: osdmap e65: 6 total, 4 up, 6 in 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local 
ceph-mon[97995]: from='osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: osdmap e65: 6 total, 4 up, 6 in 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, 
"weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:35.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:36.329 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: pgmap v104: 65 pgs: 3 stale+active+undersized, 3 undersized+peered, 17 active+undersized, 4 stale+active+undersized+degraded, 3 stale+active+clean, 2 undersized+degraded+peered, 14 active+undersized+degraded, 19 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 62/234 objects degraded (26.496%) 2026-03-09T17:33:36.329 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196] boot 2026-03-09T17:33:36.329 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: osdmap e66: 6 total, 5 up, 6 in 2026-03-09T17:33:36.329 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:33:36.330 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.330 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.330 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.330 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: pgmap v104: 65 pgs: 3 stale+active+undersized, 3 undersized+peered, 17 active+undersized, 4 stale+active+undersized+degraded, 3 stale+active+clean, 2 undersized+degraded+peered, 14 active+undersized+degraded, 19 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 62/234 objects degraded (26.496%) 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: osd.3 [v2:192.168.123.109:6800/3261105196,v1:192.168.123.109:6801/3261105196] boot 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: osdmap e66: 6 total, 5 up, 6 in 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:36.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
17:33:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: osdmap e67: 6 total, 5 up, 6 in 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.642 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:37 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 62/234 objects degraded (26.496%), 20 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: osdmap e67: 6 total, 5 up, 6 in 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:37 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 62/234 objects degraded (26.496%), 20 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:38.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:38 vm06.local ceph-mon[109831]: pgmap v107: 65 pgs: 8 peering, 1 stale+active+undersized, 4 undersized+peered, 20 active+undersized, 1 stale+active+undersized+degraded, 6 undersized+degraded+peered, 14 active+undersized+degraded, 11 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 82/234 objects degraded (35.043%) 2026-03-09T17:33:38.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:38 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:38.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:38 vm06.local ceph-mon[109831]: Upgrade: unsafe to stop osd(s) at this time (20 PGs are or would become offline) 2026-03-09T17:33:38.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:38 vm09.local ceph-mon[97995]: pgmap v107: 65 pgs: 8 peering, 1 stale+active+undersized, 4 undersized+peered, 20 active+undersized, 1 stale+active+undersized+degraded, 6 undersized+degraded+peered, 14 active+undersized+degraded, 11 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 82/234 objects degraded (35.043%) 2026-03-09T17:33:38.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:38 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:38.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:38 vm09.local ceph-mon[97995]: Upgrade: unsafe to stop osd(s) at this time (20 PGs are or would become offline) 2026-03-09T17:33:39.333 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:39 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:33:39.075+0000 7f9de38f0740 -1 osd.4 0 read_superblock omap replica is missing. 
2026-03-09T17:33:39.645 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:39 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:33:39.331+0000 7f9de38f0740 -1 osd.4 62 log_to_monitors true 2026-03-09T17:33:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:39 vm09.local ceph-mon[97995]: pgmap v108: 65 pgs: 8 peering, 1 stale+active+undersized, 4 undersized+peered, 20 active+undersized, 1 stale+active+undersized+degraded, 6 undersized+degraded+peered, 14 active+undersized+degraded, 11 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 82/234 objects degraded (35.043%) 2026-03-09T17:33:39.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:39 vm06.local ceph-mon[109831]: pgmap v108: 65 pgs: 8 peering, 1 stale+active+undersized, 4 undersized+peered, 20 active+undersized, 1 stale+active+undersized+degraded, 6 undersized+degraded+peered, 14 active+undersized+degraded, 11 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 82/234 objects degraded (35.043%) 2026-03-09T17:33:40.825 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:40 vm06.local ceph-mon[109831]: from='osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:33:40.825 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:40 vm06.local ceph-mon[109831]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:33:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:40 vm09.local ceph-mon[97995]: from='osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:33:40.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:40 vm09.local ceph-mon[97995]: from='osd.4 ' 
entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T17:33:40.895 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:33:40 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:33:40.580+0000 7f9ddb68a640 -1 osd.4 62 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:33:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:41 vm06.local ceph-mon[109831]: pgmap v109: 65 pgs: 5 peering, 17 active+undersized, 14 active+undersized+degraded, 29 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 38/234 objects degraded (16.239%) 2026-03-09T17:33:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:41 vm06.local ceph-mon[109831]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T17:33:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:41 vm06.local ceph-mon[109831]: osdmap e68: 6 total, 5 up, 6 in 2026-03-09T17:33:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:41 vm06.local ceph-mon[109831]: from='osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:41.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:41 vm06.local ceph-mon[109831]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:41 vm09.local ceph-mon[97995]: pgmap v109: 65 pgs: 5 peering, 17 active+undersized, 14 active+undersized+degraded, 29 active+clean; 216 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 38/234 objects degraded (16.239%) 2026-03-09T17:33:41.895 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:41 vm09.local ceph-mon[97995]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T17:33:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:41 vm09.local ceph-mon[97995]: osdmap e68: 6 total, 5 up, 6 in 2026-03-09T17:33:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:41 vm09.local ceph-mon[97995]: from='osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:41.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:41 vm09.local ceph-mon[97995]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:33:42.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:42 vm06.local ceph-mon[109831]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:33:42.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:42 vm06.local ceph-mon[109831]: osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795] boot 2026-03-09T17:33:42.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:42 vm06.local ceph-mon[109831]: osdmap e69: 6 total, 6 up, 6 in 2026-03-09T17:33:42.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:33:42.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:42 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 38/234 objects degraded (16.239%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:42 vm09.local ceph-mon[97995]: 
Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:33:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:42 vm09.local ceph-mon[97995]: osd.4 [v2:192.168.123.109:6808/473430795,v1:192.168.123.109:6809/473430795] boot 2026-03-09T17:33:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:42 vm09.local ceph-mon[97995]: osdmap e69: 6 total, 6 up, 6 in 2026-03-09T17:33:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T17:33:42.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:42 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 38/234 objects degraded (16.239%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:43 vm06.local ceph-mon[109831]: pgmap v112: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 497 B/s rd, 0 op/s; 41/234 objects degraded (17.521%) 2026-03-09T17:33:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:43 vm06.local ceph-mon[109831]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T17:33:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:43 vm09.local ceph-mon[97995]: pgmap v112: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 497 B/s rd, 0 op/s; 41/234 
objects degraded (17.521%) 2026-03-09T17:33:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:43 vm09.local ceph-mon[97995]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T17:33:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:45.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:45 vm06.local ceph-mon[109831]: pgmap v114: 65 pgs: 11 active+undersized, 6 active+undersized+degraded, 48 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 1 op/s; 16/234 objects degraded (6.838%) 2026-03-09T17:33:45.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:45 vm09.local ceph-mon[97995]: pgmap v114: 65 pgs: 11 active+undersized, 6 active+undersized+degraded, 48 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 1 op/s; 16/234 objects degraded (6.838%) 2026-03-09T17:33:47.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:46 vm06.local ceph-mon[109831]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 16/234 objects degraded (6.838%), 6 pgs degraded) 2026-03-09T17:33:47.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:46 vm06.local ceph-mon[109831]: Cluster is now healthy 2026-03-09T17:33:47.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:46 vm09.local ceph-mon[97995]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 16/234 objects degraded (6.838%), 6 pgs degraded) 2026-03-09T17:33:47.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:46 vm09.local ceph-mon[97995]: Cluster is now healthy 2026-03-09T17:33:47.958 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 17:33:47 vm06.local ceph-mon[109831]: pgmap v115: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 682 B/s rd, 1 op/s 2026-03-09T17:33:48.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:47 vm09.local ceph-mon[97995]: pgmap v115: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 682 B/s rd, 1 op/s 2026-03-09T17:33:50.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:49 vm06.local ceph-mon[109831]: pgmap v116: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 523 B/s rd, 1 op/s 2026-03-09T17:33:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:49 vm09.local ceph-mon[97995]: pgmap v116: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 523 B/s rd, 1 op/s 2026-03-09T17:33:52.136 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:51 vm09.local ceph-mon[97995]: pgmap v117: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.0 KiB/s rd, 2 op/s 2026-03-09T17:33:52.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:51 vm06.local ceph-mon[109831]: pgmap v117: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.0 KiB/s rd, 2 op/s 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: Upgrade: osd.5 is safe to restart 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:33:52.738 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: Upgrade: osd.5 is safe to restart 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T17:33:52.990 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:53.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local systemd[1]: Stopping Ceph osd.5 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:33:53.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:33:53.171+0000 7fad3a6d0700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:33:53.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:33:53.171+0000 7fad3a6d0700 -1 osd.5 70 *** Got signal Terminated *** 2026-03-09T17:33:53.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[79160]: 2026-03-09T17:33:53.171+0000 7fad3a6d0700 -1 osd.5 70 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:33:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:53 vm06.local ceph-mon[109831]: Upgrade: Updating osd.5 2026-03-09T17:33:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:53 vm06.local ceph-mon[109831]: Deploying daemon osd.5 on vm09 2026-03-09T17:33:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:53 vm06.local ceph-mon[109831]: pgmap v118: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-09T17:33:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:53 vm06.local ceph-mon[109831]: osd.5 marked itself down and dead 2026-03-09T17:33:53.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:53.955 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-mon[97995]: Upgrade: Updating osd.5 2026-03-09T17:33:53.955 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-mon[97995]: Deploying daemon osd.5 on vm09 
2026-03-09T17:33:53.955 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-mon[97995]: pgmap v118: 65 pgs: 65 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 921 B/s rd, 1 op/s 2026-03-09T17:33:53.955 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-mon[97995]: osd.5 marked itself down and dead 2026-03-09T17:33:53.955 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:53.955 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local podman[111165]: 2026-03-09 17:33:53.757094004 +0000 UTC m=+0.599128127 container died 89f436540a49f883e43e162098791e14537ce6db03a7c9a991c113e1247fb613 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, GIT_BRANCH=HEAD, GIT_CLEAN=True, RELEASE=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-09T17:33:53.955 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local podman[111165]: 2026-03-09 17:33:53.784681366 +0000 UTC m=+0.626715489 container remove 89f436540a49f883e43e162098791e14537ce6db03a7c9a991c113e1247fb613 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, 
GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD, ceph=True) 2026-03-09T17:33:53.955 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local bash[111165]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local podman[111233]: 2026-03-09 17:33:53.953841389 +0000 UTC m=+0.019588884 container create 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local podman[111233]: 2026-03-09 17:33:53.994629598 +0000 UTC m=+0.060377103 container init 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, io.buildah.version=1.41.3, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:53 vm09.local podman[111233]: 2026-03-09 17:33:53.997427215 +0000 UTC m=+0.063174710 container start 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111233]: 2026-03-09 17:33:54.001238949 +0000 UTC m=+0.066986453 container attach 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111233]: 2026-03-09 17:33:53.946518904 +0000 UTC m=+0.012266409 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111233]: 2026-03-09 17:33:54.129726818 +0000 UTC m=+0.195474313 container died 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, 
org.label-schema.vendor=CentOS) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111233]: 2026-03-09 17:33:54.147994379 +0000 UTC m=+0.213741863 container remove 54aff6cc510866ecf165eec5740df9c03ffe2780953655c5f504a519b6922afb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.5.service: Deactivated successfully. 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local systemd[1]: Stopped Ceph osd.5 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:54.236 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local systemd[1]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.5.service: Consumed 48.293s CPU time. 
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.430+0000 7fd657fde700 1 -- 192.168.123.106:0/3537959983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 msgr2=0x7fd650103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.430+0000 7fd657fde700 1 --2- 192.168.123.106:0/3537959983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650103dd0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fd64c009b50 tx=0x7fd64c009e60 comp rx=0 tx=0).stop
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 -- 192.168.123.106:0/3537959983 shutdown_connections
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 --2- 192.168.123.106:0/3537959983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650103dd0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 --2- 192.168.123.106:0/3537959983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 0x7fd650102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 -- 192.168.123.106:0/3537959983 >> 192.168.123.106:0/3537959983 conn(0x7fd6500fdd10 msgr2=0x7fd650100160 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:54.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 -- 192.168.123.106:0/3537959983 shutdown_connections
2026-03-09T17:33:54.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.431+0000 7fd657fde700 1 -- 192.168.123.106:0/3537959983 wait complete.
2026-03-09T17:33:54.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.432+0000 7fd657fde700 1 Processor -- start
2026-03-09T17:33:54.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.432+0000 7fd657fde700 1 -- start start
2026-03-09T17:33:54.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.432+0000 7fd657fde700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 0x7fd650198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd657fde700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd657fde700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd650198b80 con 0x7fd650103980
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd657fde700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd650198cc0 con 0x7fd650102780
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59478/0 (socket says 192.168.123.106:59478)
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 -- 192.168.123.106:0/3818578390 learned_addr learned my addr 192.168.123.106:0/3818578390 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655d7a700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 0x7fd650198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 -- 192.168.123.106:0/3818578390 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 msgr2=0x7fd650198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 0x7fd650198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.433+0000 7fd655579700 1 -- 192.168.123.106:0/3818578390 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd64c0097e0 con 0x7fd650103980
2026-03-09T17:33:54.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd655579700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fd64c005950 tx=0x7fd64c004f80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd64c01d070 con 0x7fd650103980
2026-03-09T17:33:54.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd64c00bb70 con 0x7fd650103980
2026-03-09T17:33:54.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd64c00f780 con 0x7fd650103980
2026-03-09T17:33:54.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd65019d710 con 0x7fd650103980
2026-03-09T17:33:54.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.434+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6500754f0 con 0x7fd650103980
2026-03-09T17:33:54.438 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.436+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd64c00f8e0 con 0x7fd650103980
2026-03-09T17:33:54.439 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.437+0000 7fd646ffd700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 0x7fd63c079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.439 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.437+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd64c00bd50 con 0x7fd650103980
2026-03-09T17:33:54.439 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.437+0000 7fd655d7a700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 0x7fd63c079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.440 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.438+0000 7fd655d7a700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 0x7fd63c079dc0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd6501037e0 tx=0x7fd640009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.440 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.438+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd650066e40 con 0x7fd650103980
2026-03-09T17:33:54.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.442+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd64c064ac0 con 0x7fd650103980
2026-03-09T17:33:54.490 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local systemd[1]: Starting Ceph osd.5 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:33:54.490 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111334]: 2026-03-09 17:33:54.44120206 +0000 UTC m=+0.020635593 container create 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T17:33:54.490 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111334]: 2026-03-09 17:33:54.483893652 +0000 UTC m=+0.063327185 container init 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3)
2026-03-09T17:33:54.490 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111334]: 2026-03-09 17:33:54.487564672 +0000 UTC m=+0.066998195 container start 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T17:33:54.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.573+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd6501082d0 con 0x7fd63c077910
2026-03-09T17:33:54.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.575+0000 7fd646ffd700 1 -- 192.168.123.106:0/3818578390 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7fd6501082d0 con 0x7fd63c077910
2026-03-09T17:33:54.579 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.577+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 msgr2=0x7fd63c079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 0x7fd63c079dc0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd6501037e0 tx=0x7fd640009450 comp rx=0 tx=0).stop
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 msgr2=0x7fd650198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fd64c005950 tx=0x7fd64c004f80 comp rx=0 tx=0).stop
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 shutdown_connections
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd63c077910 0x7fd63c079dc0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd650102780 0x7fd650198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.578+0000 7fd657fde700 1 --2- 192.168.123.106:0/3818578390 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd650103980 0x7fd650198560 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.579+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 >> 192.168.123.106:0/3818578390 conn(0x7fd6500fdd10 msgr2=0x7fd650106bb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:54.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.579+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 shutdown_connections
2026-03-09T17:33:54.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.579+0000 7fd657fde700 1 -- 192.168.123.106:0/3818578390 wait complete.
2026-03-09T17:33:54.592 INFO:teuthology.orchestra.run.vm06.stdout:true
2026-03-09T17:33:54.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 -- 192.168.123.106:0/4252043704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c101710 msgr2=0x7f603c103b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 --2- 192.168.123.106:0/4252043704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c101710 0x7f603c103b90 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f6038009b00 tx=0x7f6038009e10 comp rx=0 tx=0).stop
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 -- 192.168.123.106:0/4252043704 shutdown_connections
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 --2- 192.168.123.106:0/4252043704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c101710 0x7f603c103b90 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 --2- 192.168.123.106:0/4252043704 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c0fedb0 0x7f603c1011d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.663+0000 7f604381e700 1 -- 192.168.123.106:0/4252043704 >> 192.168.123.106:0/4252043704 conn(0x7f603c0fa9c0 msgr2=0x7f603c0fce10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.664+0000 7f604381e700 1 -- 192.168.123.106:0/4252043704 shutdown_connections
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.664+0000 7f604381e700 1 -- 192.168.123.106:0/4252043704 wait complete.
2026-03-09T17:33:54.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.664+0000 7f604381e700 1 Processor -- start
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f604381e700 1 -- start start
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f604381e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 0x7f603c19c330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f604381e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f604381e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f603c19ce90 con 0x7f603c0fedb0
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f604381e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f603c19cfd0 con 0x7f603c101710
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f6040db9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f6040db9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:41526/0 (socket says 192.168.123.106:41526)
2026-03-09T17:33:54.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f6040db9700 1 -- 192.168.123.106:0/1891973100 learned_addr learned my addr 192.168.123.106:0/1891973100 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.665+0000 7f6040db9700 1 -- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 msgr2=0x7f603c19c330 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f60415ba700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 0x7f603c19c330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f6040db9700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 0x7f603c19c330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f6040db9700 1 -- 192.168.123.106:0/1891973100 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60380097e0 con 0x7f603c101710
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f60415ba700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 0x7f603c19c330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f6040db9700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6038004990 tx=0x7f6038004a70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f603801d070 con 0x7f603c101710
2026-03-09T17:33:54.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f603800bd90 con 0x7f603c101710
2026-03-09T17:33:54.670 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f603800f950 con 0x7f603c101710
2026-03-09T17:33:54.670 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f603c1a1a20 con 0x7f603c101710
2026-03-09T17:33:54.670 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.666+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f603c1a1f10 con 0x7f603c101710
2026-03-09T17:33:54.670 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.668+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f603c1965f0 con 0x7f603c101710
2026-03-09T17:33:54.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.669+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6038022c70 con 0x7f603c101710
2026-03-09T17:33:54.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.670+0000 7f602e7fc700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 0x7f6028079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.670+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f603809bb30 con 0x7f603c101710
2026-03-09T17:33:54.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.670+0000 7f60415ba700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 0x7f6028079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.670+0000 7f60415ba700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 0x7f6028079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6030005fd0 tx=0x7f6030005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.672+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f60380643e0 con 0x7f603c101710
2026-03-09T17:33:54.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:54 vm06.local ceph-mon[109831]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T17:33:54.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:54 vm06.local ceph-mon[109831]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED)
2026-03-09T17:33:54.692 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:54 vm06.local ceph-mon[109831]: osdmap e71: 6 total, 5 up, 6 in
2026-03-09T17:33:54.823 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.821+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f603c061190 con 0x7f60280778c0
2026-03-09T17:33:54.825 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.822+0000 7f602e7fc700 1 -- 192.168.123.106:0/1891973100 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f603c061190 con 0x7f60280778c0
2026-03-09T17:33:54.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.824+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 msgr2=0x7f6028079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.824+0000 7f604381e700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 0x7f6028079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6030005fd0 tx=0x7f6030005dc0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.824+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 msgr2=0x7f603c19c870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.824+0000 7f604381e700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6038004990 tx=0x7f6038004a70 comp rx=0 tx=0).stop
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 shutdown_connections
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f60280778c0 0x7f6028079d70 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f603c0fedb0 0x7f603c19c330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 --2- 192.168.123.106:0/1891973100 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f603c101710 0x7f603c19c870 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 >> 192.168.123.106:0/1891973100 conn(0x7f603c0fa9c0 msgr2=0x7f603c0fce10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 shutdown_connections
2026-03-09T17:33:54.827 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.825+0000 7f604381e700 1 -- 192.168.123.106:0/1891973100 wait complete.
2026-03-09T17:33:54.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:54 vm09.local ceph-mon[97995]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T17:33:54.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:54 vm09.local ceph-mon[97995]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED)
2026-03-09T17:33:54.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:54 vm09.local ceph-mon[97995]: osdmap e71: 6 total, 5 up, 6 in
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111334]: 2026-03-09 17:33:54.490032502 +0000 UTC m=+0.069466035 container attach 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local podman[111334]: 2026-03-09 17:33:54.432483341 +0000 UTC m=+0.011916874 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local bash[111334]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:33:54.895 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:54 vm09.local bash[111334]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 -- 192.168.123.106:0/831525915 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d01018d0 msgr2=0x7f55d0101d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 --2- 192.168.123.106:0/831525915 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d01018d0 0x7f55d0101d20 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f55cc009b00 tx=0x7f55cc009e10 comp rx=0 tx=0).stop
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 -- 192.168.123.106:0/831525915 shutdown_connections
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 --2- 192.168.123.106:0/831525915 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d01018d0 0x7f55d0101d20 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 --2- 192.168.123.106:0/831525915 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d00fe850 0x7f55d00fec60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 -- 192.168.123.106:0/831525915 >> 192.168.123.106:0/831525915 conn(0x7f55d00fa1a0 msgr2=0x7f55d00fc5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:54.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 -- 192.168.123.106:0/831525915 shutdown_connections
2026-03-09T17:33:54.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.906+0000 7f55d785a700 1 -- 192.168.123.106:0/831525915 wait complete.
2026-03-09T17:33:54.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 Processor -- start
2026-03-09T17:33:54.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 -- start start
2026-03-09T17:33:54.909 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 0x7f55d0197f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55d01989f0 con 0x7f55d01018d0
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.907+0000 7f55d785a700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55d0198b30 con 0x7f55d00fe850
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59504/0 (socket says 192.168.123.106:59504)
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 -- 192.168.123.106:0/2349831637 learned_addr learned my addr 192.168.123.106:0/2349831637 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d55f6700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 0x7f55d0197f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 -- 192.168.123.106:0/2349831637 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 msgr2=0x7f55d0197f20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 0x7f55d0197f20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 -- 192.168.123.106:0/2349831637 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55cc0097e0 con 0x7f55d01018d0
2026-03-09T17:33:54.910 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d55f6700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 0x7f55d0197f20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T17:33:54.911 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.908+0000 7f55d4df5700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f55cc004990 tx=0x7f55cc004a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.909+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55cc01d070 con 0x7f55d01018d0
2026-03-09T17:33:54.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.909+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f55cc00bcd0 con 0x7f55d01018d0
2026-03-09T17:33:54.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.909+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55cc00f850 con 0x7f55d01018d0
2026-03-09T17:33:54.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.909+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55d019d590 con 0x7f55d01018d0
2026-03-09T17:33:54.912 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.909+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55d019da50 con 0x7f55d01018d0
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.910+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55d01059b0 con 0x7f55d01018d0
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.911+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f55cc022bc0 con 0x7f55d01018d0
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.912+0000 7f55c27fc700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 0x7f55bc079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.912+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(72..72 src has 1..72) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f55cc09bd90 con 0x7f55d01018d0
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.914+0000 7f55d55f6700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 0x7f55bc079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:54.916 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.914+0000 7f55d55f6700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 0x7f55bc079e40 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f55c4005fd0 tx=0x7f55c4005e40 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:54.917 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:54.914+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f55cc064640 con 0x7f55d01018d0
2026-03-09T17:33:55.040 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.037+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f55d0061190 con 0x7f55bc077990
2026-03-09T17:33:55.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.046+0000 7f55c27fc700 1 -- 192.168.123.106:0/2349831637 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f55d0061190 con 0x7f55bc077990
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (8m) 47s ago 8m 26.0M - 0.25.0 c8568f914cd2 b5fa36858876
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (9m) 47s ago 9m 9353k - 18.2.0 dc2bc1663786 518b33d98521
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (8m) 19s ago 8m 11.3M - 18.2.0 dc2bc1663786 4486b60e6311
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (2m) 47s ago 9m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (2m) 19s ago 8m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (8m) 47s ago 8m 95.1M - 9.4.7 954c08fa6188 d808369f1a53
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (6m) 47s ago 6m 16.8M - 18.2.0 dc2bc1663786 4b4cbdf0c640
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (6m) 47s ago 6m 177M - 18.2.0 dc2bc1663786 4c8e86b2b8cd
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (6m) 19s ago 6m 93.1M - 18.2.0 dc2bc1663786 aa1f0430b448
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (6m) 19s ago 6m 18.5M - 18.2.0 dc2bc1663786 8dc8a0159213
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (3m) 47s ago 9m 627M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (3m) 19s ago 8m 490M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (3m) 47s ago 9m 60.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (2m) 19s ago 8m 51.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (8m) 47s ago 8m 14.6M - 1.5.0 0da6a335fe13 ea650be5ff39
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (8m) 19s ago 8m 16.0M - 1.5.0 0da6a335fe13 364ad5f4aa86
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (2m) 47s ago 8m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (71s) 47s ago 7m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (49s) 47s ago 7m 12.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (27s) 19s ago 7m 143M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 40d834360933
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (20s) 19s ago 7m 15.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e cb6e9cd4fe30
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (7m) 19s ago 7m 396M 4096M 18.2.0 dc2bc1663786 89f436540a49
2026-03-09T17:33:55.049 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (3m) 47s ago 8m 58.7M - 2.43.0 a07b618ecd1d f6ece95f2fd5
2026-03-09T17:33:55.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 msgr2=0x7f55bc079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 0x7f55bc079e40 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f55c4005fd0 tx=0x7f55c4005e40 comp rx=0 tx=0).stop
2026-03-09T17:33:55.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 msgr2=0x7f55d0198460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f55cc004990 tx=0x7f55cc004a70 comp rx=0 tx=0).stop
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 shutdown_connections
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f55bc077990 0x7f55bc079e40 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f55d00fe850 0x7f55d0197f20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 --2- 192.168.123.106:0/2349831637 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f55d01018d0 0x7f55d0198460 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.050+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 >> 192.168.123.106:0/2349831637 conn(0x7f55d00fa1a0 msgr2=0x7f55d0102af0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.051+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 shutdown_connections
2026-03-09T17:33:55.053 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.051+0000 7f55d785a700 1 -- 192.168.123.106:0/2349831637 wait complete.
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 -- 192.168.123.106:0/2394766888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec103970 msgr2=0x7f90ec105d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 --2- 192.168.123.106:0/2394766888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec103970 0x7f90ec105d50 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f90e0009b50 tx=0x7f90e0009e60 comp rx=0 tx=0).stop
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 -- 192.168.123.106:0/2394766888 shutdown_connections
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 --2- 192.168.123.106:0/2394766888 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec103970 0x7f90ec105d50 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 --2- 192.168.123.106:0/2394766888 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec101050 0x7f90ec103430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 -- 192.168.123.106:0/2394766888 >> 192.168.123.106:0/2394766888 conn(0x7f90ec0fa9b0 msgr2=0x7f90ec0fce00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 -- 192.168.123.106:0/2394766888 shutdown_connections
2026-03-09T17:33:55.125 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.123+0000 7f90f4910700 1 -- 192.168.123.106:0/2394766888 wait complete.
2026-03-09T17:33:55.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.124+0000 7f90f4910700 1 Processor -- start
2026-03-09T17:33:55.126 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.124+0000 7f90f4910700 1 -- start start
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f4910700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 0x7f90ec195e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f4910700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f4910700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90ec196960 con 0x7f90ec101050
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f4910700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90ec196aa0 con 0x7f90ec103970
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:41568/0 (socket says 192.168.123.106:41568)
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 -- 192.168.123.106:0/1455088776 learned_addr learned my addr 192.168.123.106:0/1455088776 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f26ac700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 0x7f90ec195e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:55.127 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 -- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 msgr2=0x7f90ec195e00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 0x7f90ec195e00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.125+0000 7f90f1eab700 1 -- 192.168.123.106:0/1455088776 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f90e00097e0 con 0x7f90ec103970
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90f26ac700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 0x7f90ec195e00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90f1eab700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f90e0004ce0 tx=0x7f90e00057f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90e001d070 con 0x7f90ec103970
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f90e000bc30 con 0x7f90ec103970
2026-03-09T17:33:55.128 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f90ec19b4f0 con 0x7f90ec103970
2026-03-09T17:33:55.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90e000f830 con 0x7f90ec103970
2026-03-09T17:33:55.129 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.126+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90ec19b9e0 con 0x7f90ec103970
2026-03-09T17:33:55.130 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.128+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f90ec18fff0 con 0x7f90ec103970
2026-03-09T17:33:55.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.129+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f90e0022b70 con 0x7f90ec103970
2026-03-09T17:33:55.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.129+0000 7f90e77fe700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 0x7f90dc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:33:55.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.130+0000 7f90f26ac700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 0x7f90dc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:33:55.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.130+0000 7f90f26ac700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 0x7f90dc079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f90d8005fd0 tx=0x7f90d8005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:33:55.132 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.130+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(72..72 src has 1..72) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f90e009bd50 con 0x7f90ec103970
2026-03-09T17:33:55.133 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.131+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f90e00646c0 con 0x7f90ec103970
2026-03-09T17:33:55.298 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.296+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f90ec066e40 con 0x7f90ec103970
2026-03-09T17:33:55.300 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.297+0000 7f90e77fe700 1 -- 192.168.123.106:0/1455088776 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+708 (secure 0 0 0) 0x7f90e0063e10 con 0x7f90ec103970
2026-03-09T17:33:55.302 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:33:55.302 INFO:teuthology.orchestra.run.vm06.stdout: "mon": {
2026-03-09T17:33:55.302 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T17:33:55.302 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:33:55.302 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": {
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "osd": {
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "mds": {
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: },
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "overall": {
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4,
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 9
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout: }
2026-03-09T17:33:55.303 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 msgr2=0x7f90dc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 0x7f90dc079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f90d8005fd0 tx=0x7f90d8005ee0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 msgr2=0x7f90ec196340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f90e0004ce0 tx=0x7f90e00057f0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 shutdown_connections
2026-03-09T17:33:55.305 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f90dc0778c0 0x7f90dc079d70 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90ec101050 0x7f90ec195e00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 --2- 192.168.123.106:0/1455088776 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f90ec103970 0x7f90ec196340 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:33:55.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 >> 192.168.123.106:0/1455088776 conn(0x7f90ec0fa9b0 msgr2=0x7f90ec0fce00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:33:55.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 shutdown_connections
2026-03-09T17:33:55.306 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.303+0000 7f90f4910700 1 -- 192.168.123.106:0/1455088776 wait complete.
2026-03-09T17:33:55.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 -- 192.168.123.106:0/3457734449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 msgr2=0x7fa084102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 --2- 192.168.123.106:0/3457734449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084102b50 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fa074009b00 tx=0x7fa074009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 -- 192.168.123.106:0/3457734449 shutdown_connections 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 --2- 192.168.123.106:0/3457734449 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 0x7fa084103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 --2- 192.168.123.106:0/3457734449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084102b50 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.378+0000 7fa08b451700 1 -- 192.168.123.106:0/3457734449 >> 192.168.123.106:0/3457734449 conn(0x7fa0840fdcf0 msgr2=0x7fa084100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.379+0000 7fa08b451700 1 -- 192.168.123.106:0/3457734449 shutdown_connections 2026-03-09T17:33:55.381 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.379+0000 7fa08b451700 1 -- 192.168.123.106:0/3457734449 
wait complete. 2026-03-09T17:33:55.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.379+0000 7fa08b451700 1 Processor -- start 2026-03-09T17:33:55.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa08b451700 1 -- start start 2026-03-09T17:33:55.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa08b451700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa08b451700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 0x7fa084198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.382 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa08b451700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa084198b60 con 0x7fa084102740 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa08b451700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa084198ca0 con 0x7fa084103940 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:59534/0 (socket says 192.168.123.106:59534) 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 -- 192.168.123.106:0/241105362 learned_addr learned my addr 192.168.123.106:0/241105362 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0889ec700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 0x7fa084198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 -- 192.168.123.106:0/241105362 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 msgr2=0x7fa084198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 0x7fa084198540 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.380+0000 7fa0891ed700 1 -- 192.168.123.106:0/241105362 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0740097e0 con 0x7fa084102740 2026-03-09T17:33:55.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa0891ed700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fa074006010 tx=0x7fa074004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:33:55.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa07401d070 con 0x7fa084102740 2026-03-09T17:33:55.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa07400bc50 con 0x7fa084102740 2026-03-09T17:33:55.384 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa08419d6f0 con 0x7fa084102740 2026-03-09T17:33:55.385 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa08419dbe0 con 0x7fa084102740 2026-03-09T17:33:55.385 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.381+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa07400f780 con 0x7fa084102740 2026-03-09T17:33:55.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.383+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa07400f8e0 con 0x7fa084102740 2026-03-09T17:33:55.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.383+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa068005320 con 0x7fa084102740 2026-03-09T17:33:55.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.384+0000 
7fa07a7fc700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 0x7fa070082650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.384+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(72..72 src has 1..72) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa07409af90 con 0x7fa084102740 2026-03-09T17:33:55.388 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.386+0000 7fa0889ec700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 0x7fa070082650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.389 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.387+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa0740638f0 con 0x7fa084102740 2026-03-09T17:33:55.389 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.387+0000 7fa0889ec700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 0x7fa070082650 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fa080005950 tx=0x7fa08000b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-efabfcee-c89d-45b0-9aa2-10506df8d5e6/osd-block-4a80decf-2a05-4525-b2be-269b4a9ba65c --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T17:33:55.395 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-efabfcee-c89d-45b0-9aa2-10506df8d5e6/osd-block-4a80decf-2a05-4525-b2be-269b4a9ba65c --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T17:33:55.533 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.531+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fa068006200 con 0x7fa084102740 2026-03-09T17:33:55.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.531+0000 7fa07a7fc700 1 -- 192.168.123.106:0/241105362 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1942 (secure 0 0 0) 0x7fa074063040 con 0x7fa084102740 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:e13 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:epoch 11 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:27:16.605001+0000 2026-03-09T17:33:55.535 
INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 0 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24291} 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:33:55.535 
INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 0 members: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:24291} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.106:6826/649840868,v1:192.168.123.106:6827/649840868] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{0:14476} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.109:6824/791757990,v1:192.168.123.109:6825/791757990] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:14500} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:55.535 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:33:55.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.535+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 msgr2=0x7fa070082650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.535+0000 7fa08b451700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 0x7fa070082650 secure :-1 s=READY 
pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fa080005950 tx=0x7fa08000b410 comp rx=0 tx=0).stop 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.535+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 msgr2=0x7fa084198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.535+0000 7fa08b451700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fa074006010 tx=0x7fa074004930 comp rx=0 tx=0).stop 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 shutdown_connections 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa0700801a0 0x7fa070082650 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa084102740 0x7fa084198000 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 --2- 192.168.123.106:0/241105362 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa084103940 0x7fa084198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 >> 
192.168.123.106:0/241105362 conn(0x7fa0840fdcf0 msgr2=0x7fa084106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 shutdown_connections 2026-03-09T17:33:55.538 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.536+0000 7fa08b451700 1 -- 192.168.123.106:0/241105362 wait complete. 2026-03-09T17:33:55.539 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:33:55.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.607+0000 7f6fce093700 1 -- 192.168.123.106:0/3899928947 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 msgr2=0x7f6fc8102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.607+0000 7f6fce093700 1 --2- 192.168.123.106:0/3899928947 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8102b50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f6fb8009b50 tx=0x7f6fb8009e60 comp rx=0 tx=0).stop 2026-03-09T17:33:55.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 -- 192.168.123.106:0/3899928947 shutdown_connections 2026-03-09T17:33:55.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 --2- 192.168.123.106:0/3899928947 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc8103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 --2- 192.168.123.106:0/3899928947 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8102b50 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.613 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 -- 192.168.123.106:0/3899928947 >> 192.168.123.106:0/3899928947 conn(0x7f6fc80fdcf0 msgr2=0x7f6fc8100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:55.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 -- 192.168.123.106:0/3899928947 shutdown_connections 2026-03-09T17:33:55.613 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.611+0000 7f6fce093700 1 -- 192.168.123.106:0/3899928947 wait complete. 2026-03-09T17:33:55.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 Processor -- start 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 -- start start 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8193c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fc8194780 con 0x7f6fc8102740 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.612+0000 7f6fce093700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fc81948c0 con 0x7f6fc8103940 2026-03-09T17:33:55.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.613+0000 7f6fc77fe700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8193c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:41608/0 (socket says 192.168.123.106:41608) 2026-03-09T17:33:55.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 -- 192.168.123.106:0/52529638 learned_addr learned my addr 192.168.123.106:0/52529638 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:55.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 -- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 msgr2=0x7f6fc8193c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8193c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.614+0000 7f6fbffff700 1 -- 192.168.123.106:0/52529638 --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6fb80097e0 con 0x7f6fc8103940 2026-03-09T17:33:55.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.615+0000 7f6fc77fe700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8193c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:33:55.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.615+0000 7f6fbffff700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f6fb000eb10 tx=0x7f6fb000ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:55.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.616+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fb000cc40 con 0x7f6fc8103940 2026-03-09T17:33:55.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.616+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6fb000cda0 con 0x7f6fc8103940 2026-03-09T17:33:55.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.616+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fb0018870 con 0x7f6fc8103940 2026-03-09T17:33:55.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.616+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6fc806a8c0 con 0x7f6fc8103940 2026-03-09T17:33:55.619 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.616+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6fc806ae10 con 0x7f6fc8103940 2026-03-09T17:33:55.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.617+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6fc8066e40 con 0x7f6fc8103940 2026-03-09T17:33:55.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.618+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6fb00189d0 con 0x7f6fc8103940 2026-03-09T17:33:55.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.619+0000 7f6fc57fa700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 0x7f6fa8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.619+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(72..72 src has 1..72) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6fb0014070 con 0x7f6fc8103940 2026-03-09T17:33:55.621 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.619+0000 7f6fc77fe700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 0x7f6fa8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.619+0000 7f6fc77fe700 1 --2- 192.168.123.106:0/52529638 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 0x7f6fa8079d70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f6fb80053b0 tx=0x7f6fb8005a90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:55.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.621+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6fb009e050 con 0x7f6fc8103940 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: pgmap v120: 65 pgs: 10 peering, 10 stale+active+clean, 45 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 614 B/s rd, 1 op/s 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: osdmap e72: 6 total, 5 up, 6 in 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: from='client.44223 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: from='client.34260 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/1455088776' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:55.713 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/241105362' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/ln -snf /dev/ceph-efabfcee-c89d-45b0-9aa2-10506df8d5e6/osd-block-4a80decf-2a05-4525-b2be-269b4a9ba65c /var/lib/ceph/osd/ceph-5/block 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/ln -snf /dev/ceph-efabfcee-c89d-45b0-9aa2-10506df8d5e6/osd-block-4a80decf-2a05-4525-b2be-269b4a9ba65c /var/lib/ceph/osd/ceph-5/block 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: 
Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate[111345]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111334]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T17:33:55.713 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111334]: 2026-03-09 17:33:55.484556154 +0000 UTC m=+1.063989687 container died 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T17:33:55.714 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111334]: 2026-03-09 17:33:55.520777868 +0000 UTC m=+1.100211401 container remove 0c5f45326d6be7efdad016b3d317afecb056bbdb0caa14700f4e77876252a218 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-activate, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-09T17:33:55.714 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111588]: 2026-03-09 17:33:55.674191429 +0000 UTC m=+0.025458821 container create b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-09T17:33:55.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.759+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6fc8108290 con 0x7f6fa80778c0 2026-03-09T17:33:55.764 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.761+0000 7f6fc57fa700 1 -- 192.168.123.106:0/52529638 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f6fc8108290 con 0x7f6fa80778c0 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "crash", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "11/23 daemons upgraded", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:33:55.764 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:33:55.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 msgr2=0x7f6fa8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 --2- 192.168.123.106:0/52529638 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 0x7f6fa8079d70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f6fb80053b0 tx=0x7f6fb8005a90 comp rx=0 tx=0).stop 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 msgr2=0x7f6fc81941b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f6fb000eb10 tx=0x7f6fb000ee20 comp rx=0 tx=0).stop 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 shutdown_connections 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6fa80778c0 0x7f6fa8079d70 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6fc8102740 0x7f6fc8193c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 --2- 192.168.123.106:0/52529638 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6fc8103940 0x7f6fc81941b0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.768 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.765+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 >> 192.168.123.106:0/52529638 conn(0x7f6fc80fdcf0 msgr2=0x7f6fc8106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.766+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 shutdown_connections 2026-03-09T17:33:55.768 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.766+0000 7f6fce093700 1 -- 192.168.123.106:0/52529638 wait complete. 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.842+0000 7fa4a2da7700 1 -- 192.168.123.106:0/1265639463 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 msgr2=0x7fa49c105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.842+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/1265639463 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c105ac0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fa490009b00 tx=0x7fa490009e10 comp rx=0 tx=0).stop 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 -- 192.168.123.106:0/1265639463 shutdown_connections 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/1265639463 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c105ac0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/1265639463 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 -- 192.168.123.106:0/1265639463 >> 192.168.123.106:0/1265639463 conn(0x7fa49c0faa70 msgr2=0x7fa49c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 -- 192.168.123.106:0/1265639463 shutdown_connections 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 -- 192.168.123.106:0/1265639463 wait complete. 2026-03-09T17:33:55.845 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.843+0000 7fa4a2da7700 1 Processor -- start 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa4a2da7700 1 -- start start 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa4a2da7700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa4a2da7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa4a2da7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa49c198bb0 con 0x7fa49c103680 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa4a2da7700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa49c198cf0 con 0x7fa49c069180 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa49bfff700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa49bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:59554/0 (socket says 192.168.123.106:59554) 2026-03-09T17:33:55.846 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa49bfff700 1 -- 192.168.123.106:0/907131319 learned_addr learned my addr 192.168.123.106:0/907131319 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.844+0000 7fa49bfff700 1 -- 192.168.123.106:0/907131319 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 msgr2=0x7fa49c198050 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa4a0b43700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa49bfff700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c198050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa49bfff700 1 -- 
192.168.123.106:0/907131319 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa4900097e0 con 0x7fa49c103680 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa4a0b43700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c198050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa49bfff700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fa490005850 tx=0x7fa490004a40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa49001d070 con 0x7fa49c103680 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa49000bc50 con 0x7fa49c103680 2026-03-09T17:33:55.847 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa49c19d740 con 0x7fa49c103680 2026-03-09T17:33:55.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.845+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa49c19dc30 con 0x7fa49c103680 
2026-03-09T17:33:55.848 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.846+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa490022620 con 0x7fa49c103680 2026-03-09T17:33:55.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.846+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa49c0fc670 con 0x7fa49c103680 2026-03-09T17:33:55.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.847+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa490022b50 con 0x7fa49c103680 2026-03-09T17:33:55.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.848+0000 7fa499ffb700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 0x7fa484079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:33:55.850 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.848+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(72..72 src has 1..72) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa49009b400 con 0x7fa49c103680 2026-03-09T17:33:55.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.849+0000 7fa4a0b43700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 0x7fa484079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:33:55.851 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.849+0000 7fa4a0b43700 1 --2- 192.168.123.106:0/907131319 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 0x7fa484079d70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fa48c005fd0 tx=0x7fa48c005f00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:33:55.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:55.851+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa490063bb0 con 0x7fa49c103680 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: pgmap v120: 65 pgs: 10 peering, 10 stale+active+clean, 45 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 614 B/s rd, 1 op/s 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: osdmap e72: 6 total, 5 up, 6 in 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: from='client.44223 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: from='client.34260 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/1455088776' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:55 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/241105362' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:33:56.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.020+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fa49c066e40 con 0x7fa49c103680 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.021+0000 7fa499ffb700 1 -- 192.168.123.106:0/907131319 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+675 (secure 0 0 0) 0x7fa490063300 con 0x7fa49c103680 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_WARN 1 osds down; all OSDs are running squid or later but require_osd_release < squid; Reduced data availability: 3 pgs peering 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout: osd.5 (root=default,host=vm09) is down 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] OSD_UPGRADE_FINISHED: all OSDs are running squid or later but require_osd_release < squid 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout: all OSDs are running squid or later but require_osd_release < squid 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout:[WRN] PG_AVAILABILITY: Reduced data availability: 3 pgs peering 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout: pg 2.16 is stuck peering for 109s, current state peering, last acting [3,2] 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.f is stuck peering for 75s, current 
state peering, last acting [3,0] 2026-03-09T17:33:56.023 INFO:teuthology.orchestra.run.vm06.stdout: pg 3.16 is stuck peering for 75s, current state peering, last acting [3,1] 2026-03-09T17:33:56.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.024+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 msgr2=0x7fa484079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:56.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.024+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 0x7fa484079d70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fa48c005fd0 tx=0x7fa48c005f00 comp rx=0 tx=0).stop 2026-03-09T17:33:56.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.024+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 msgr2=0x7fa49c198590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:33:56.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.024+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fa490005850 tx=0x7fa490004a40 comp rx=0 tx=0).stop 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 shutdown_connections 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa4840778c0 0x7fa484079d70 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:56.027 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa49c069180 0x7fa49c198050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 --2- 192.168.123.106:0/907131319 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa49c103680 0x7fa49c198590 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 >> 192.168.123.106:0/907131319 conn(0x7fa49c0faa70 msgr2=0x7fa49c0ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 shutdown_connections 2026-03-09T17:33:56.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:33:56.025+0000 7fa4a2da7700 1 -- 192.168.123.106:0/907131319 wait complete. 
2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111588]: 2026-03-09 17:33:55.716993478 +0000 UTC m=+0.068260880 container init b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111588]: 2026-03-09 17:33:55.732304054 +0000 UTC m=+0.083571435 container start b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local bash[111588]: b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local podman[111588]: 2026-03-09 17:33:55.665274569 +0000 UTC m=+0.016541961 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:55 vm09.local systemd[1]: Started Ceph osd.5 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048. 2026-03-09T17:33:56.066 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:33:56.063+0000 7f03770df740 -1 Falling back to public interface 2026-03-09T17:33:57.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:56 vm06.local ceph-mon[109831]: from='client.44237 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:57.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:56 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:57.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:56 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:57.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:56 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:57.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:56 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/907131319' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:33:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-mon[97995]: from='client.44237 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:33:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:57.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:56 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/907131319' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:33:58.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: pgmap v122: 65 pgs: 10 active+undersized, 10 peering, 1 stale+active+clean, 5 active+undersized+degraded, 39 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s; 12/234 objects degraded (5.128%) 2026-03-09T17:33:58.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: Health check failed: Degraded data redundancy: 12/234 objects degraded (5.128%), 5 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:58.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:57 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: pgmap v122: 65 pgs: 10 active+undersized, 10 peering, 1 stale+active+clean, 5 active+undersized+degraded, 39 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 767 B/s rd, 1 op/s; 12/234 objects degraded (5.128%) 2026-03-09T17:33:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: Health check failed: Degraded data redundancy: 12/234 objects degraded (5.128%), 5 pgs degraded (PG_DEGRADED) 2026-03-09T17:33:58.145 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:58.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:57 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:59.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:59.642 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all osd 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: pgmap v123: 65 pgs: 10 active+undersized, 10 peering, 1 stale+active+clean, 5 active+undersized+degraded, 39 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 127 B/s rd, 0 op/s; 12/234 objects degraded (5.128%) 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": 
"config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T17:33:59.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:33:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all osd 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: pgmap v123: 65 pgs: 10 active+undersized, 10 peering, 1 stale+active+clean, 5 active+undersized+degraded, 39 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 127 B/s rd, 0 op/s; 12/234 objects degraded (5.128%) 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 
2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': 
finished 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T17:33:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T17:33:59.985 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:33:59.688+0000 7f03770df740 -1 osd.5 0 read_superblock omap replica is missing. 2026-03-09T17:33:59.985 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:33:59 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:33:59.983+0000 7f03770df740 -1 osd.5 70 log_to_monitors true 2026-03-09T17:34:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: osdmap e73: 6 total, 5 up, 6 in 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: from='osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:34:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:00 vm09.local ceph-mon[97995]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: osdmap e73: 6 total, 5 up, 6 in 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-09T17:34:00.718 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: from='osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:34:00.718 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:00 vm06.local ceph-mon[109831]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[109827]: 2026-03-09T17:34:01.365+0000 7f3ee5e0f640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: pgmap v125: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 917 B/s rd, 2 op/s; 35/234 objects degraded (14.957%) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: osdmap e74: 6 total, 5 up, 6 in 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]': finished 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 2 up:standby 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.642 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm06.vmzmbb"]}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Upgrade: It appears safe to stop mds.cephfs.vm06.vmzmbb 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Upgrade: Updating mds.cephfs.vm06.vmzmbb 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Deploying daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush 
set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: osdmap e75: 6 total, 5 up, 6 in 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.109:6824/3053844977,v1:192.168.123.109:6825/3053844977] up:boot 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Standby daemon mds.cephfs.vm06.gzymac assigned to filesystem cephfs as rank 0 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:01.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:01 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:replay} 2 up:standby 2026-03-09T17:34:01.895 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: pgmap v125: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 216 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 917 B/s rd, 2 op/s; 35/234 objects degraded (14.957%) 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: osdmap e74: 6 total, 5 up, 6 in 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]': finished 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 2 up:standby 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm06.vmzmbb"]}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Upgrade: It appears safe to stop mds.cephfs.vm06.vmzmbb 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Upgrade: Updating mds.cephfs.vm06.vmzmbb 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vmzmbb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Deploying daemon mds.cephfs.vm06.vmzmbb on vm06 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: osdmap e75: 6 total, 5 up, 6 in 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.109:6824/3053844977,v1:192.168.123.109:6825/3053844977] up:boot 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Standby daemon mds.cephfs.vm06.gzymac assigned to filesystem cephfs as rank 0 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:34:01.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:34:01.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:01.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm09", "root=default"]}]: dispatch 2026-03-09T17:34:01.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:01.896 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:01 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:replay} 2 up:standby 2026-03-09T17:34:02.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:02 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 35/234 objects degraded (14.957%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:34:02.894 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:34:02 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 
2026-03-09T17:34:02.416+0000 7f036e678640 -1 osd.5 70 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T17:34:02.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:02 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 35/234 objects degraded (14.957%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T17:34:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:03 vm06.local ceph-mon[109831]: pgmap v128: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 216 MiB data, 917 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 35/234 objects degraded (14.957%) 2026-03-09T17:34:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:03 vm06.local ceph-mon[109831]: from='osd.5 ' entity='osd.5' 2026-03-09T17:34:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:03 vm06.local ceph-mon[109831]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:34:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:03 vm06.local ceph-mon[109831]: osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314] boot 2026-03-09T17:34:03.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:03 vm06.local ceph-mon[109831]: osdmap e76: 6 total, 6 up, 6 in 2026-03-09T17:34:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:03 vm09.local ceph-mon[97995]: pgmap v128: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 216 MiB data, 917 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 35/234 objects degraded (14.957%) 2026-03-09T17:34:03.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:03 vm09.local ceph-mon[97995]: from='osd.5 ' entity='osd.5' 2026-03-09T17:34:03.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:03 vm09.local ceph-mon[97995]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T17:34:03.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:34:03 vm09.local ceph-mon[97995]: osd.5 [v2:192.168.123.109:6816/3252066314,v1:192.168.123.109:6817/3252066314] boot 2026-03-09T17:34:03.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:03 vm09.local ceph-mon[97995]: osdmap e76: 6 total, 6 up, 6 in 2026-03-09T17:34:04.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:34:04.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T17:34:06.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:06 vm06.local ceph-mon[109831]: pgmap v130: 65 pgs: 9 peering, 12 active+undersized, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 917 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 6 op/s; 17/234 objects degraded (7.265%) 2026-03-09T17:34:06.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:06 vm06.local ceph-mon[109831]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T17:34:06.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:06 vm09.local ceph-mon[97995]: pgmap v130: 65 pgs: 9 peering, 12 active+undersized, 7 active+undersized+degraded, 37 active+clean; 216 MiB data, 917 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 6 op/s; 17/234 objects degraded (7.265%) 2026-03-09T17:34:06.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:06 vm09.local ceph-mon[97995]: osdmap e77: 6 total, 6 up, 6 in 2026-03-09T17:34:07.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:07 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:reconnect 2026-03-09T17:34:07.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:07 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:reconnect} 2 up:standby 2026-03-09T17:34:07.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:07 vm06.local ceph-mon[109831]: reconnect by client.14520 192.168.144.1:0/677291695 after 0 2026-03-09T17:34:07.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:07 vm06.local ceph-mon[109831]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0 2026-03-09T17:34:07.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:07 vm06.local ceph-mon[109831]: Health check update: Degraded data redundancy: 17/234 objects degraded (7.265%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T17:34:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:07 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:reconnect 2026-03-09T17:34:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:07 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:reconnect} 2 up:standby 2026-03-09T17:34:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:07 vm09.local ceph-mon[97995]: reconnect by client.14520 192.168.144.1:0/677291695 after 0 2026-03-09T17:34:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:07 vm09.local ceph-mon[97995]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0 2026-03-09T17:34:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:07 vm09.local ceph-mon[97995]: Health check update: Degraded data redundancy: 17/234 objects degraded (7.265%), 7 pgs degraded (PG_DEGRADED) 2026-03-09T17:34:08.388 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:08 vm06.local ceph-mon[109831]: pgmap v132: 65 pgs: 9 peering, 9 active+undersized, 4 active+undersized+degraded, 43 active+clean; 216 MiB data, 918 MiB 
used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 7 op/s; 11/234 objects degraded (4.701%) 2026-03-09T17:34:08.389 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:08 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:rejoin 2026-03-09T17:34:08.389 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:08 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:rejoin} 2 up:standby 2026-03-09T17:34:08.389 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:08 vm06.local ceph-mon[109831]: daemon mds.cephfs.vm06.gzymac is now active in filesystem cephfs as rank 0 2026-03-09T17:34:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:08 vm09.local ceph-mon[97995]: pgmap v132: 65 pgs: 9 peering, 9 active+undersized, 4 active+undersized+degraded, 43 active+clean; 216 MiB data, 918 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 7 op/s; 11/234 objects degraded (4.701%) 2026-03-09T17:34:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:08 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:rejoin 2026-03-09T17:34:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:08 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.gzymac=up:rejoin} 2 up:standby 2026-03-09T17:34:08.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:08 vm09.local ceph-mon[97995]: daemon mds.cephfs.vm06.gzymac is now active in filesystem cephfs as rank 0 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:active 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.gzymac=up:active} 2 up:standby 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:09.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.106:6828/4261949342,v1:192.168.123.106:6829/4261949342] up:active 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.gzymac=up:active} 2 up:standby 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: pgmap v133: 65 pgs: 9 peering, 56 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 7 op/s 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/234 objects degraded (4.701%), 4 pgs degraded) 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: Cluster is now healthy 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:boot 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.gzymac=up:active} 3 up:standby 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:10.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:10 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: pgmap v133: 65 pgs: 9 peering, 56 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 7 op/s 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/234 objects degraded (4.701%), 4 pgs degraded) 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: Cluster is now healthy 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:boot 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.gzymac=up:active} 3 up:standby 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:10.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:10 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:11 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:11.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:11 vm06.local ceph-mon[109831]: pgmap v134: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 255 B/s wr, 10 op/s 2026-03-09T17:34:11.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:11.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:11 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:11.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:11 vm09.local ceph-mon[97995]: pgmap v134: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB 
/ 120 GiB avail; 21 MiB/s rd, 255 B/s wr, 10 op/s 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[109827]: 2026-03-09T17:34:12.303+0000 7f3ee5e0f640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Detected new or changed devices on vm06 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm06.gzymac"]}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Upgrade: It appears safe to stop mds.cephfs.vm06.gzymac 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Upgrade: Updating mds.cephfs.vm06.gzymac 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: 
dispatch 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Deploying daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Standby daemon mds.cephfs.vm09.drzmdt assigned to filesystem cephfs as rank 0 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:12.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:12 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:replay} 2 up:standby 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Detected new or changed devices on vm06 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm06.gzymac"]}]: dispatch 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 
vm09.local ceph-mon[97995]: Upgrade: It appears safe to stop mds.cephfs.vm06.gzymac 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Upgrade: Updating mds.cephfs.vm06.gzymac 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.gzymac", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Deploying daemon mds.cephfs.vm06.gzymac on vm06 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: osdmap e78: 6 total, 6 up, 6 in 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Standby daemon mds.cephfs.vm09.drzmdt assigned to filesystem cephfs as rank 0 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is 
offline) 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:12.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:12 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:replay} 2 up:standby 2026-03-09T17:34:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:13 vm06.local ceph-mon[109831]: pgmap v136: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 255 B/s wr, 8 op/s 2026-03-09T17:34:13.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:13 vm09.local ceph-mon[97995]: pgmap v136: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 255 B/s wr, 8 op/s 2026-03-09T17:34:14.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:14.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:14.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:14.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:15 vm06.local ceph-mon[109831]: pgmap v137: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5.1 KiB/s wr, 12 op/s 2026-03-09T17:34:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:15 vm06.local ceph-mon[109831]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:15 vm09.local ceph-mon[97995]: pgmap v137: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 5.1 KiB/s wr, 12 op/s 2026-03-09T17:34:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:18.075 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:17 vm09.local ceph-mon[97995]: pgmap v138: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.0 KiB/s wr, 11 op/s 2026-03-09T17:34:18.075 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:17 vm09.local ceph-mon[97995]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0 2026-03-09T17:34:18.075 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:17 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:reconnect 2026-03-09T17:34:18.075 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:17 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:reconnect} 2 up:standby 2026-03-09T17:34:18.075 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:17 vm09.local ceph-mon[97995]: reconnect by client.14520 192.168.144.1:0/677291695 after 0.003 2026-03-09T17:34:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:17 vm06.local ceph-mon[109831]: pgmap v138: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.0 KiB/s wr, 11 op/s 2026-03-09T17:34:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:17 vm06.local ceph-mon[109831]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0 2026-03-09T17:34:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:17 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:reconnect 2026-03-09T17:34:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:17 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:reconnect} 2 up:standby 2026-03-09T17:34:18.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:17 vm06.local ceph-mon[109831]: reconnect by client.14520 192.168.144.1:0/677291695 after 0.003 2026-03-09T17:34:19.110 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:18 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:rejoin 2026-03-09T17:34:19.110 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:18 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:rejoin} 2 up:standby 2026-03-09T17:34:19.110 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:18 vm06.local ceph-mon[109831]: daemon mds.cephfs.vm09.drzmdt is now active in filesystem cephfs as rank 0 2026-03-09T17:34:19.110 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:19.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:18 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:rejoin 2026-03-09T17:34:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:18 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm09.drzmdt=up:rejoin} 2 up:standby 2026-03-09T17:34:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:18 vm09.local ceph-mon[97995]: daemon mds.cephfs.vm09.drzmdt is now active in filesystem cephfs as rank 0 2026-03-09T17:34:19.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: pgmap v139: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: Cluster is now healthy 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local 
ceph-mon[97995]: mds.? [v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:active 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 2 up:standby 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:19.777 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: pgmap v139: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: Cluster is now healthy 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] up:active 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 2 up:standby 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:20.009 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:20.010 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:20.910 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.106:6828/2160269265,v1:192.168.123.106:6829/2160269265] up:boot 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 3 up:standby 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:34:20.911 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.106:6828/2160269265,v1:192.168.123.106:6829/2160269265] up:boot 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 3 up:standby 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:21.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: pgmap v140: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.852 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.852 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.852 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:22 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm09.cjcawy"]}]: dispatch 2026-03-09T17:34:22.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: pgmap v140: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.892 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:22.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:22 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm09.cjcawy"]}]: dispatch 2026-03-09T17:34:23.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: Upgrade: It appears safe to stop mds.cephfs.vm09.cjcawy 2026-03-09T17:34:23.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: pgmap v141: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: Upgrade: Updating mds.cephfs.vm09.cjcawy 2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 
2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: Deploying daemon mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:34:23.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:23 vm06.local ceph-mon[109831]: osdmap e79: 6 total, 6 up, 6 in 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: Upgrade: It appears safe to stop mds.cephfs.vm09.cjcawy 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: pgmap v141: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.0 KiB/s wr, 12 op/s 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: Upgrade: Updating mds.cephfs.vm09.cjcawy 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.cjcawy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: Deploying daemon 
mds.cephfs.vm09.cjcawy on vm09 2026-03-09T17:34:23.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:23 vm09.local ceph-mon[97995]: osdmap e79: 6 total, 6 up, 6 in 2026-03-09T17:34:24.744 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:24 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 2 up:standby 2026-03-09T17:34:25.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:24 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 2 up:standby 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.098+0000 7f17567aa700 1 -- 192.168.123.106:0/1633104687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 msgr2=0x7f1750101bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.098+0000 7f17567aa700 1 --2- 192.168.123.106:0/1633104687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750101bd0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f1740009b00 tx=0x7f1740009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 -- 192.168.123.106:0/1633104687 shutdown_connections 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 --2- 192.168.123.106:0/1633104687 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750101bd0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 --2- 192.168.123.106:0/1633104687 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1750100580 0x7f1750100990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.100 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 -- 192.168.123.106:0/1633104687 >> 192.168.123.106:0/1633104687 conn(0x7f17500fbb50 msgr2=0x7f17500fdf60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 -- 192.168.123.106:0/1633104687 shutdown_connections 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 -- 192.168.123.106:0/1633104687 wait complete. 2026-03-09T17:34:26.100 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 Processor -- start 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.099+0000 7f17567aa700 1 -- start start 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f17567aa700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1750100580 0x7f1750197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f17567aa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f17567aa700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1750198b50 con 0x7f1750101780 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f17567aa700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1750198c90 con 0x7f1750100580 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:39962/0 (socket says 192.168.123.106:39962) 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 -- 192.168.123.106:0/3624186278 learned_addr learned my addr 192.168.123.106:0/3624186278 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 -- 192.168.123.106:0/3624186278 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1750100580 msgr2=0x7f1750197ff0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1750100580 0x7f1750197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.100+0000 7f174f7fe700 1 -- 192.168.123.106:0/3624186278 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17400097e0 con 0x7f1750101780 2026-03-09T17:34:26.102 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f174f7fe700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 
0x7f1750198530 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f17400048c0 tx=0x7f17400049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.102 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f174001d070 con 0x7f1750101780 2026-03-09T17:34:26.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f174000bc50 con 0x7f1750101780 2026-03-09T17:34:26.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f175019d6e0 con 0x7f1750101780 2026-03-09T17:34:26.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f175019dbd0 con 0x7f1750101780 2026-03-09T17:34:26.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.101+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f174000f830 con 0x7f1750101780 2026-03-09T17:34:26.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.102+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1750105860 con 0x7f1750101780 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.106+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1740022b50 con 0x7f1750101780 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.106+0000 7f174d7fa700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 0x7f173c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.106+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f174009bd10 con 0x7f1750101780 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.106+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f174009c190 con 0x7f1750101780 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.107+0000 7f174ffff700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 0x7f173c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.108 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.107+0000 7f174ffff700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 0x7f173c079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f1738005fd0 tx=0x7f1738005f60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: pgmap v143: 65 pgs: 65 
active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 5.0 KiB/s wr, 9 op/s 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:25 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.234+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1750061190 con 0x7f173c0778c0 2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: pgmap v143: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 5.0 KiB/s wr, 9 op/s 2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.237 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:25 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:26.237 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.236+0000 7f174d7fa700 1 -- 192.168.123.106:0/3624186278 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f1750061190 con 0x7f173c0778c0 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 msgr2=0x7f173c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 0x7f173c079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f1738005fd0 tx=0x7f1738005f60 comp rx=0 tx=0).stop 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 msgr2=0x7f1750198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 --2- 
192.168.123.106:0/3624186278 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750198530 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f17400048c0 tx=0x7f17400049a0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 shutdown_connections 2026-03-09T17:34:26.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f173c0778c0 0x7f173c079d70 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1750100580 0x7f1750197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 --2- 192.168.123.106:0/3624186278 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1750101780 0x7f1750198530 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 >> 192.168.123.106:0/3624186278 conn(0x7f17500fbb50 msgr2=0x7f17501029a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 shutdown_connections 2026-03-09T17:34:26.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.239+0000 7f17567aa700 1 -- 192.168.123.106:0/3624186278 wait complete. 
2026-03-09T17:34:26.250 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/3592401854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4103970 msgr2=0x7f7fd4105d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/3592401854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4103970 0x7f7fd4105d50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f7fd0009b00 tx=0x7f7fd0009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/3592401854 shutdown_connections 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/3592401854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4103970 0x7f7fd4105d50 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/3592401854 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7fd4101050 0x7f7fd4103430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/3592401854 >> 192.168.123.106:0/3592401854 conn(0x7f7fd40fa990 msgr2=0x7f7fd40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.313 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/3592401854 shutdown_connections 2026-03-09T17:34:26.314 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.312+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/3592401854 wait complete. 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 Processor -- start 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 -- start start 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7fd4103970 0x7f7fd41985d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fd4198bf0 con 0x7f7fd4101050 2026-03-09T17:34:26.314 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fdc3ae700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fd4198d30 con 0x7f7fd4103970 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.313+0000 7f7fda14a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:39968/0 (socket says 192.168.123.106:39968) 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 -- 192.168.123.106:0/73113605 learned_addr learned my addr 192.168.123.106:0/73113605 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 -- 192.168.123.106:0/73113605 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7fd4103970 msgr2=0x7f7fd41985d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7fd4103970 0x7f7fd41985d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 -- 192.168.123.106:0/73113605 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7fd00097e0 con 0x7f7fd4101050 2026-03-09T17:34:26.315 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fda14a700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f7fc400b700 tx=0x7f7fc400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.316 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.314+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7fc4010840 con 0x7f7fd4101050 
2026-03-09T17:34:26.316 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.315+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7fd419d7e0 con 0x7f7fd4101050 2026-03-09T17:34:26.316 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.315+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7fd419ddb0 con 0x7f7fd4101050 2026-03-09T17:34:26.317 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.315+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7fc4010e80 con 0x7f7fd4101050 2026-03-09T17:34:26.317 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.315+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7fc400d590 con 0x7f7fd4101050 2026-03-09T17:34:26.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.316+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7fc400d6f0 con 0x7f7fd4101050 2026-03-09T17:34:26.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.317+0000 7f7fcb7fe700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 0x7f7fc0079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.318 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.317+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f7fc409a590 con 0x7f7fd4101050 2026-03-09T17:34:26.318 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.317+0000 7f7fd9949700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 0x7f7fc0079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.318+0000 7f7fd9949700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 0x7f7fc0079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7fd0009ad0 tx=0x7f7fd0009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.319 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.318+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7fb8005320 con 0x7f7fd4101050 2026-03-09T17:34:26.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.321+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7fc4062d10 con 0x7f7fd4101050 2026-03-09T17:34:26.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.460+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7fb8000bf0 con 0x7f7fc0077990 2026-03-09T17:34:26.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.462+0000 7f7fcb7fe700 1 -- 192.168.123.106:0/73113605 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 
0x7f7fb8000bf0 con 0x7f7fc0077990 2026-03-09T17:34:26.465 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.464+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 msgr2=0x7f7fc0079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.464+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 0x7f7fc0079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7fd0009ad0 tx=0x7f7fd0009f90 comp rx=0 tx=0).stop 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.464+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 msgr2=0x7f7fd4198090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.464+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f7fc400b700 tx=0x7f7fc400bac0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 shutdown_connections 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7fc0077990 0x7f7fc0079e40 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/73113605 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7fd4101050 0x7f7fd4198090 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 --2- 192.168.123.106:0/73113605 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7fd4103970 0x7f7fd41985d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 >> 192.168.123.106:0/73113605 conn(0x7f7fd40fa990 msgr2=0x7f7fd41045d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 shutdown_connections 2026-03-09T17:34:26.466 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.465+0000 7f7fdc3ae700 1 -- 192.168.123.106:0/73113605 wait complete. 
2026-03-09T17:34:26.542 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.540+0000 7fa830e41700 1 -- 192.168.123.106:0/3836020172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c0691a0 msgr2=0x7fa82c105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.540+0000 7fa830e41700 1 --2- 192.168.123.106:0/3836020172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c0691a0 0x7fa82c105520 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fa814009b00 tx=0x7fa814009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 -- 192.168.123.106:0/3836020172 shutdown_connections 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 --2- 192.168.123.106:0/3836020172 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c105a60 0x7fa82c107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 --2- 192.168.123.106:0/3836020172 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c0691a0 0x7fa82c105520 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 -- 192.168.123.106:0/3836020172 >> 192.168.123.106:0/3836020172 conn(0x7fa82c0faa70 msgr2=0x7fa82c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 -- 192.168.123.106:0/3836020172 shutdown_connections 2026-03-09T17:34:26.544 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.543+0000 7fa830e41700 1 -- 192.168.123.106:0/3836020172 
wait complete. 2026-03-09T17:34:26.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 Processor -- start 2026-03-09T17:34:26.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 -- start start 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c0691a0 0x7fa82c071da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa82c072870 con 0x7fa82c105a60 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.544+0000 7fa830e41700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa82c0729e0 con 0x7fa82c0691a0 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:39978/0 (socket says 192.168.123.106:39978) 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 -- 192.168.123.106:0/4221661234 learned_addr learned my addr 192.168.123.106:0/4221661234 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa82a59c700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c0691a0 0x7fa82c071da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 -- 192.168.123.106:0/4221661234 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c0691a0 msgr2=0x7fa82c071da0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c0691a0 0x7fa82c071da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.545+0000 7fa829d9b700 1 -- 192.168.123.106:0/4221661234 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa8140097e0 con 0x7fa82c105a60 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa829d9b700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fa81c00cc60 tx=0x7fa81c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa81c007af0 con 0x7fa82c105a60 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa81c007c50 con 0x7fa82c105a60 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa82c1a5ed0 con 0x7fa82c105a60 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa81c0187b0 con 0x7fa82c105a60 2026-03-09T17:34:26.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.546+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa82c1a63f0 con 0x7fa82c105a60 2026-03-09T17:34:26.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.547+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa82c06bf10 con 0x7fa82c105a60 2026-03-09T17:34:26.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.549+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa81c00f450 con 0x7fa82c105a60 2026-03-09T17:34:26.552 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.549+0000 7fa8237fe700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa8180778c0 0x7fa818079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.550+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fa81c099b50 con 0x7fa82c105a60 2026-03-09T17:34:26.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.551+0000 7fa82a59c700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa8180778c0 0x7fa818079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.551+0000 7fa82a59c700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa8180778c0 0x7fa818079d70 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fa82c1a1d20 tx=0x7fa814005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.554 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.553+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa81c062350 con 0x7fa82c105a60 2026-03-09T17:34:26.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.676+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": 
["mon-mgr", ""]}) v1 -- 0x7fa82c061190 con 0x7fa8180778c0 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.681+0000 7fa8237fe700 1 -- 192.168.123.106:0/4221661234 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fa82c061190 con 0x7fa8180778c0 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (8m) 6s ago 9m 26.0M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (9m) 6s ago 9m 9717k - 18.2.0 dc2bc1663786 518b33d98521 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (8m) 0s ago 8m 11.9M - 18.2.0 dc2bc1663786 4486b60e6311 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (3m) 6s ago 9m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (3m) 0s ago 8m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (8m) 6s ago 9m 95.9M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (7s) 6s ago 7m 13.9M - 19.2.3-678-ge911bdeb 654f31e6858e cea38695f742 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (18s) 6s ago 7m 18.8M - 19.2.3-678-ge911bdeb 654f31e6858e 137634321f1d 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (1s) 0s ago 7m 16.2M - 19.2.3-678-ge911bdeb 654f31e6858e da0f8f8a04b1 2026-03-09T17:34:26.683 
INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (7m) 0s ago 7m 94.1M - 18.2.0 dc2bc1663786 8dc8a0159213 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (4m) 6s ago 10m 630M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (3m) 0s ago 8m 491M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (3m) 6s ago 10m 62.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (3m) 0s ago 8m 59.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (9m) 6s ago 9m 14.9M - 1.5.0 0da6a335fe13 ea650be5ff39 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (8m) 0s ago 8m 16.4M - 1.5.0 0da6a335fe13 364ad5f4aa86 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (2m) 6s ago 8m 209M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (103s) 6s ago 8m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (81s) 6s ago 8m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (59s) 0s ago 8m 161M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 40d834360933 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (52s) 0s ago 7m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e cb6e9cd4fe30 2026-03-09T17:34:26.683 
INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (31s) 0s ago 7m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b297663f757a 2026-03-09T17:34:26.683 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (3m) 6s ago 9m 65.0M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.684+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa8180778c0 msgr2=0x7fa818079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.684+0000 7fa830e41700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa8180778c0 0x7fa818079d70 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fa82c1a1d20 tx=0x7fa814005fb0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 msgr2=0x7fa82c0722e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fa81c00cc60 tx=0x7fa81c0074a0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 shutdown_connections 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
conn(0x7fa8180778c0 0x7fa818079d70 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa82c0691a0 0x7fa82c071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 --2- 192.168.123.106:0/4221661234 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa82c105a60 0x7fa82c0722e0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.685+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 >> 192.168.123.106:0/4221661234 conn(0x7fa82c0faa70 msgr2=0x7fa82c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.686+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 shutdown_connections 2026-03-09T17:34:26.687 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.686+0000 7fa830e41700 1 -- 192.168.123.106:0/4221661234 wait complete. 
2026-03-09T17:34:26.757 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 -- 192.168.123.106:0/2010098442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d40691a0 msgr2=0x7f12d4105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.757 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 --2- 192.168.123.106:0/2010098442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d40691a0 0x7f12d4105520 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f12c4009b50 tx=0x7f12c4009e60 comp rx=0 tx=0).stop 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 -- 192.168.123.106:0/2010098442 shutdown_connections 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 --2- 192.168.123.106:0/2010098442 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12d4105a60 0x7f12d4107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 --2- 192.168.123.106:0/2010098442 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d40691a0 0x7f12d4105520 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 -- 192.168.123.106:0/2010098442 >> 192.168.123.106:0/2010098442 conn(0x7f12d40faa70 msgr2=0x7f12d40fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 -- 192.168.123.106:0/2010098442 shutdown_connections 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.756+0000 7f12dac42700 1 -- 192.168.123.106:0/2010098442 
wait complete. 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 Processor -- start 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 -- start start 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12d40691a0 0x7f12d4198080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12d4198be0 con 0x7f12d4105a60 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12dac42700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12d4198d20 con 0x7f12d40691a0 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:39998/0 (socket says 192.168.123.106:39998) 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 -- 192.168.123.106:0/714402961 learned_addr learned my addr 192.168.123.106:0/714402961 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 -- 192.168.123.106:0/714402961 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12d40691a0 msgr2=0x7f12d4198080 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:34:26.758 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12d40691a0 0x7f12d4198080 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.757+0000 7f12d3fff700 1 -- 192.168.123.106:0/714402961 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f12c40097e0 con 0x7f12d4105a60 2026-03-09T17:34:26.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.758+0000 7f12d3fff700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f12cc0060b0 tx=0x7f12cc00d750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.758+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f12cc015400 con 0x7f12d4105a60 2026-03-09T17:34:26.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.758+0000 7f12d1ffb700 1 -- 
192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f12cc00f040 con 0x7f12d4105a60 2026-03-09T17:34:26.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.758+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f12d419d7d0 con 0x7f12d4105a60 2026-03-09T17:34:26.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.758+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f12d419dd20 con 0x7f12d4105a60 2026-03-09T17:34:26.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.759+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f12cc014a20 con 0x7f12d4105a60 2026-03-09T17:34:26.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.759+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f12cc014cc0 con 0x7f12d4105a60 2026-03-09T17:34:26.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.759+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f12c0005320 con 0x7f12d4105a60 2026-03-09T17:34:26.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.761+0000 7f12d1ffb700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 0x7f12bc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:26.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.761+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== 
mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f12cc09a950 con 0x7f12d4105a60 2026-03-09T17:34:26.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.761+0000 7f12d89de700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 0x7f12bc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:26.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.761+0000 7f12d89de700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 0x7f12bc079d70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f12c4009b20 tx=0x7f12c4005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:26.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.763+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f12cc0143a0 con 0x7f12d4105a60 2026-03-09T17:34:26.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.925+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f12c0006200 con 0x7f12d4105a60 2026-03-09T17:34:26.926 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.925+0000 7f12d1ffb700 1 -- 192.168.123.106:0/714402961 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+815 (secure 0 0 0) 0x7f12cc0628e0 con 0x7f12d4105a60 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 
2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1, 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 13 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:34:26.927 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:34:26.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.928+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 msgr2=0x7f12bc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.928+0000 7f12dac42700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 0x7f12bc079d70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f12c4009b20 tx=0x7f12c4005fb0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.928+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 msgr2=0x7f12d41985c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:26.929 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.928+0000 7f12dac42700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f12cc0060b0 tx=0x7f12cc00d750 comp rx=0 tx=0).stop 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 shutdown_connections 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f12bc0778c0 0x7f12bc079d70 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f12d40691a0 0x7f12d4198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.930 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 --2- 192.168.123.106:0/714402961 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f12d4105a60 0x7f12d41985c0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 >> 192.168.123.106:0/714402961 conn(0x7f12d40faa70 msgr2=0x7f12d40fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 shutdown_connections 2026-03-09T17:34:26.930 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:26.929+0000 7f12dac42700 1 -- 192.168.123.106:0/714402961 wait complete. 2026-03-09T17:34:27.005 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 -- 192.168.123.106:0/2496252759 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 msgr2=0x7f3d8809aba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 --2- 192.168.123.106:0/2496252759 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d8809aba0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c009b50 tx=0x7f3d7c009e60 comp rx=0 tx=0).stop 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 -- 192.168.123.106:0/2496252759 shutdown_connections 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 --2- 192.168.123.106:0/2496252759 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d8809aba0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 --2- 192.168.123.106:0/2496252759 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88098280 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 -- 192.168.123.106:0/2496252759 >> 192.168.123.106:0/2496252759 conn(0x7f3d8808f8a0 msgr2=0x7f3d88091cf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 -- 192.168.123.106:0/2496252759 shutdown_connections 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.004+0000 7f3d951ef700 1 -- 192.168.123.106:0/2496252759 wait complete. 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 Processor -- start 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 -- start start 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88095800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.006 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d88094390 con 0x7f3d880987c0 2026-03-09T17:34:27.007 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.005+0000 7f3d951ef700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d880944d0 con 0x7f3d88095ea0 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8ffff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88095800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40028/0 (socket says 192.168.123.106:40028) 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8f7fe700 1 -- 192.168.123.106:0/3673922968 learned_addr learned my addr 192.168.123.106:0/3673922968 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8ffff700 1 -- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 msgr2=0x7f3d88093e50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8ffff700 1 --2- 192.168.123.106:0/3673922968 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8ffff700 1 -- 192.168.123.106:0/3673922968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d7c0097e0 con 0x7f3d88095ea0 2026-03-09T17:34:27.007 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.006+0000 7f3d8ffff700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88095800 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f3d8400eab0 tx=0x7f3d8400ee70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:27.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.007+0000 7f3d8f7fe700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T17:34:27.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.008+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d8400cbe0 con 0x7f3d88095ea0 2026-03-09T17:34:27.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.008+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d8400cd40 con 0x7f3d88095ea0 2026-03-09T17:34:27.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.008+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d84018930 con 0x7f3d88095ea0 2026-03-09T17:34:27.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.008+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d880947b0 con 0x7f3d88095ea0 2026-03-09T17:34:27.009 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.008+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d88094cd0 con 0x7f3d88095ea0 2026-03-09T17:34:27.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.010+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d84018a90 con 0x7f3d88095ea0 2026-03-09T17:34:27.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.010+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d88004680 con 0x7f3d88095ea0 2026-03-09T17:34:27.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.010+0000 
7f3d8d7fa700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 0x7f3d80079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.010+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f3d84014070 con 0x7f3d88095ea0 2026-03-09T17:34:27.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.011+0000 7f3d8f7fe700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 0x7f3d80079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.012 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.011+0000 7f3d8f7fe700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 0x7f3d80079b70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c0054c0 tx=0x7f3d7c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:27.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.015+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d84062770 con 0x7f3d88095ea0 2026-03-09T17:34:27.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.150+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3d88132e50 con 0x7f3d88095ea0 2026-03-09T17:34:27.152 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.151+0000 7f3d8d7fa700 1 -- 192.168.123.106:0/3673922968 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 28 v28) v1 ==== 76+0+1932 (secure 0 0 0) 0x7f3d84061ec0 con 0x7f3d88095ea0 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:e28 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:btime 2026-03-09T17:34:25:957089+0000 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:epoch 25 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:34:18.757239+0000 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:34:27.154 
INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 78 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:up {0=24307} 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 24307 members: 24307 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{0:24307} state up:active seq 109 join_fscid=1 addr 
[v2:192.168.123.109:6826/3078962403,v1:192.168.123.109:6827/3078962403] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{-1:34284} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{-1:44251} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/2160269265,v1:192.168.123.106:6829/2160269265] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:34:27.154 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{-1:44253} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.109:6824/3810846472,v1:192.168.123.109:6825/3810846472] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:34:27.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.154+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 msgr2=0x7f3d80079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.154+0000 7f3d951ef700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 0x7f3d80079b70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c0054c0 tx=0x7f3d7c005fb0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.154+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f3d88095ea0 msgr2=0x7f3d88095800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.154+0000 7f3d951ef700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88095800 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f3d8400eab0 tx=0x7f3d8400ee70 comp rx=0 tx=0).stop 2026-03-09T17:34:27.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.155+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 shutdown_connections 2026-03-09T17:34:27.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.155+0000 7f3d951ef700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3d800776c0 0x7f3d80079b70 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.155+0000 7f3d951ef700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f3d88095ea0 0x7f3d88095800 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.155+0000 7f3d951ef700 1 --2- 192.168.123.106:0/3673922968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3d880987c0 0x7f3d88093e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.155+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 >> 192.168.123.106:0/3673922968 conn(0x7f3d8808f8a0 msgr2=0x7f3d88091c80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.156+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 shutdown_connections 
2026-03-09T17:34:27.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.156+0000 7f3d951ef700 1 -- 192.168.123.106:0/3673922968 wait complete. 2026-03-09T17:34:27.158 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 28 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.109:6824/3810846472,v1:192.168.123.109:6825/3810846472] up:boot 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 3 up:standby 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: from='client.34296 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:27.228 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:26 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/714402961' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:27.228 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.226+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/4086295666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 msgr2=0x7f1bf0103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.228 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.226+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/4086295666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf0103140 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f1be0009b00 tx=0x7f1be0009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:27.233 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/4086295666 shutdown_connections 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/4086295666 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 0x7f1bf0105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/4086295666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf0103140 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/4086295666 >> 192.168.123.106:0/4086295666 conn(0x7f1bf00faa70 msgr2=0x7f1bf00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/4086295666 shutdown_connections 2026-03-09T17:34:27.234 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/4086295666 wait complete. 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 Processor -- start 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- start start 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 0x7f1bf006ae90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1bf006b4b0 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf6d9b700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1bf006b5f0 con 0x7f1bf0103680 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf4b37700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf4b37700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:40046/0 (socket says 192.168.123.106:40046) 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1bf4b37700 1 -- 192.168.123.106:0/54292511 learned_addr learned my addr 192.168.123.106:0/54292511 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.230+0000 7f1beffff700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 0x7f1bf006ae90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.231+0000 7f1bf4b37700 1 -- 192.168.123.106:0/54292511 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 msgr2=0x7f1bf006ae90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.231+0000 7f1bf4b37700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 0x7f1bf006ae90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.231+0000 7f1bf4b37700 1 -- 192.168.123.106:0/54292511 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1be00097e0 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.231+0000 7f1bf4b37700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto 
rx=0x7f1be0000c00 tx=0x7f1be0004910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.232+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1be001d070 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.232+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1be000bb40 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.232+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1be000f670 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.233+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1bf0070040 con 0x7f1bf0069180 2026-03-09T17:34:27.234 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.233+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1bf0070530 con 0x7f1bf0069180 2026-03-09T17:34:27.235 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.233+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1bf00fc670 con 0x7f1bf0069180 2026-03-09T17:34:27.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.237+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1be0004d10 con 
0x7f1bf0069180 2026-03-09T17:34:27.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.237+0000 7f1bedffb700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 0x7f1bd8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.237+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f1be006d900 con 0x7f1bf0069180 2026-03-09T17:34:27.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.238+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1be0060c40 con 0x7f1bf0069180 2026-03-09T17:34:27.239 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.238+0000 7f1beffff700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 0x7f1bd8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.240 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.238+0000 7f1beffff700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 0x7f1bd8079d70 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f1be4009e30 tx=0x7f1be4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.109:6824/3810846472,v1:192.168.123.109:6825/3810846472] up:boot 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm09.drzmdt=up:active} 3 up:standby 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: from='client.34296 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:27.290 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:26 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/714402961' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:27.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.367+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1bf0061190 con 0x7f1bd80778c0 2026-03-09T17:34:27.369 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.368+0000 7f1bedffb700 1 -- 192.168.123.106:0/54292511 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f1bf0061190 con 0x7f1bd80778c0 2026-03-09T17:34:27.369 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": true, 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [ 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "mon", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "osd", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "crash", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "mgr" 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: ], 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "progress": "15/23 daemons upgraded", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "message": "Currently upgrading mds daemons", 2026-03-09T17:34:27.370 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:34:27.370 
INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:34:27.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.372+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 msgr2=0x7f1bd8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.372+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 0x7f1bd8079d70 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f1be4009e30 tx=0x7f1be4009450 comp rx=0 tx=0).stop 2026-03-09T17:34:27.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.372+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 msgr2=0x7f1bf006a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.373 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.372+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f1be0000c00 tx=0x7f1be0004910 comp rx=0 tx=0).stop 2026-03-09T17:34:27.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 shutdown_connections 2026-03-09T17:34:27.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f1bd80778c0 0x7f1bd8079d70 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/54292511 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1bf0069180 0x7f1bf006a950 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.375 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 --2- 192.168.123.106:0/54292511 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f1bf0103680 0x7f1bf006ae90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 >> 192.168.123.106:0/54292511 conn(0x7f1bf00faa70 msgr2=0x7f1bf00ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.374+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 shutdown_connections 2026-03-09T17:34:27.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.375+0000 7f1bf6d9b700 1 -- 192.168.123.106:0/54292511 wait complete. 
2026-03-09T17:34:27.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.454+0000 7f03b7c52700 1 -- 192.168.123.106:0/947879349 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 msgr2=0x7f03b0105280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.454+0000 7f03b7c52700 1 --2- 192.168.123.106:0/947879349 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0105280 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f03ac009b00 tx=0x7f03ac009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 -- 192.168.123.106:0/947879349 shutdown_connections 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 --2- 192.168.123.106:0/947879349 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0107ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 --2- 192.168.123.106:0/947879349 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0105280 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 -- 192.168.123.106:0/947879349 >> 192.168.123.106:0/947879349 conn(0x7f03b00fa7f0 msgr2=0x7f03b00fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 -- 192.168.123.106:0/947879349 shutdown_connections 2026-03-09T17:34:27.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.456+0000 7f03b7c52700 1 -- 192.168.123.106:0/947879349 wait 
complete. 2026-03-09T17:34:27.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.457+0000 7f03b7c52700 1 Processor -- start 2026-03-09T17:34:27.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.457+0000 7f03b7c52700 1 -- start start 2026-03-09T17:34:27.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.457+0000 7f03b7c52700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0197de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.457+0000 7f03b7c52700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b7c52700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03b0198940 con 0x7f03b00691a0 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b7c52700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03b0198a80 con 0x7f03b01057c0 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I 
am v2:192.168.123.106:41592/0 (socket says 192.168.123.106:41592) 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 -- 192.168.123.106:0/3417730553 learned_addr learned my addr 192.168.123.106:0/3417730553 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b59ee700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0197de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 -- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 msgr2=0x7f03b0197de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0197de0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 -- 192.168.123.106:0/3417730553 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03ac0097e0 con 0x7f03b01057c0 2026-03-09T17:34:27.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.458+0000 7f03b51ed700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f03a000d8d0 tx=0x7f03a000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:34:27.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.459+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03a0009940 con 0x7f03b01057c0 2026-03-09T17:34:27.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.459+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f03a0010460 con 0x7f03b01057c0 2026-03-09T17:34:27.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.459+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f03b019d530 con 0x7f03b01057c0 2026-03-09T17:34:27.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.459+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f03b019da20 con 0x7f03b01057c0 2026-03-09T17:34:27.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.459+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03a000f5d0 con 0x7f03b01057c0 2026-03-09T17:34:27.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.461+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f03a000f7c0 con 0x7f03b01057c0 2026-03-09T17:34:27.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.461+0000 7f03a6ffd700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 0x7f039c079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:27.462 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.461+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f03a009aa60 con 0x7f03b01057c0 2026-03-09T17:34:27.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.461+0000 7f03b59ee700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 0x7f039c079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:27.463 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.461+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0394005320 con 0x7f03b01057c0 2026-03-09T17:34:27.464 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.463+0000 7f03b59ee700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 0x7f039c079dc0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f03ac009fd0 tx=0x7f03ac005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:27.468 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.467+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f03a0062b10 con 0x7f03b01057c0 2026-03-09T17:34:27.627 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.625+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f0394005190 con 0x7f03b01057c0 
2026-03-09T17:34:27.629 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.628+0000 7f03a6ffd700 1 -- 192.168.123.106:0/3417730553 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f03a0016070 con 0x7f03b01057c0 2026-03-09T17:34:27.629 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:34:27.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.631+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 msgr2=0x7f039c079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 0x7f039c079dc0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f03ac009fd0 tx=0x7f03ac005c00 comp rx=0 tx=0).stop 2026-03-09T17:34:27.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 msgr2=0x7f03b0198320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:27.633 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f03a000d8d0 tx=0x7f03a000dc90 comp rx=0 tx=0).stop 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 shutdown_connections 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 --2- 
192.168.123.106:0/3417730553 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f039c077910 0x7f039c079dc0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f03b00691a0 0x7f03b0197de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 --2- 192.168.123.106:0/3417730553 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f03b01057c0 0x7f03b0198320 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.632+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 >> 192.168.123.106:0/3417730553 conn(0x7f03b00fa7f0 msgr2=0x7f03b00fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.633+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 shutdown_connections 2026-03-09T17:34:27.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:27.633+0000 7f03b7c52700 1 -- 192.168.123.106:0/3417730553 wait complete. 
2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: pgmap v144: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 8.5 MiB/s rd, 5.0 KiB/s wr, 8 op/s 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='client.34300 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='client.34304 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3673922968' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 
17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm09.drzmdt"]}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/3417730553' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:27 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:28.278 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: pgmap v144: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 8.5 MiB/s rd, 5.0 KiB/s wr, 8 op/s 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='client.34300 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='client.34304 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3673922968' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm09.drzmdt"]}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/3417730553' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm09.drzmdt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T17:34:28.279 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:27 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[109827]: 2026-03-09T17:34:28.277+0000 7f3ee5e0f640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:29.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Detected new or changed devices on vm09 2026-03-09T17:34:29.391 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Upgrade: It appears safe to stop mds.cephfs.vm09.drzmdt 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Upgrade: Updating mds.cephfs.vm09.drzmdt 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Deploying daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: osdmap e80: 6 total, 6 up, 6 in 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Standby daemon mds.cephfs.vm06.vmzmbb assigned to filesystem cephfs as rank 0 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:replay} 2 up:standby 2026-03-09T17:34:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:28 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Detected new or changed devices on vm09 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: from='client.34314 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Upgrade: It appears safe to stop mds.cephfs.vm09.drzmdt 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Upgrade: Updating mds.cephfs.vm09.drzmdt 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Deploying daemon mds.cephfs.vm09.drzmdt on vm09 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: osdmap e80: 6 total, 6 up, 6 in 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Standby daemon mds.cephfs.vm06.vmzmbb assigned to filesystem cephfs as rank 0 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T17:34:29.395 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:replay} 2 up:standby 2026-03-09T17:34:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:30.292 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:29 vm06.local ceph-mon[109831]: pgmap v146: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 6.0 KiB/s wr, 4 op/s 2026-03-09T17:34:30.292 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:34:30.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:29 vm09.local ceph-mon[97995]: pgmap v146: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 6.0 KiB/s wr, 4 op/s 2026-03-09T17:34:30.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:34:32.353 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:32 vm09.local ceph-mon[97995]: pgmap v147: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 9.0 MiB/s rd, 6.0 KiB/s wr, 6 op/s 2026-03-09T17:34:32.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:32 vm06.local ceph-mon[109831]: pgmap v147: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 9.0 MiB/s rd, 6.0 KiB/s wr, 6 op/s 2026-03-09T17:34:34.188 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:34 vm09.local ceph-mon[97995]: pgmap v148: 65 pgs: 65 active+clean; 216 MiB 
data, 922 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 5.3 KiB/s wr, 8 op/s 2026-03-09T17:34:34.188 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:34 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:reconnect 2026-03-09T17:34:34.188 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:34 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:reconnect} 2 up:standby 2026-03-09T17:34:34.188 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:34 vm09.local ceph-mon[97995]: reconnect by client.14520 192.168.144.1:0/677291695 after 0.002 2026-03-09T17:34:34.188 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:34 vm09.local ceph-mon[97995]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0.002 2026-03-09T17:34:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:34 vm06.local ceph-mon[109831]: pgmap v148: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 5.3 KiB/s wr, 8 op/s 2026-03-09T17:34:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:34 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:reconnect 2026-03-09T17:34:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:34 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:reconnect} 2 up:standby 2026-03-09T17:34:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:34 vm06.local ceph-mon[109831]: reconnect by client.14520 192.168.144.1:0/677291695 after 0.002 2026-03-09T17:34:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:34 vm06.local ceph-mon[109831]: reconnect by client.24343 192.168.144.1:0/3798307677 after 0.002 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:rejoin 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:rejoin} 2 up:standby 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: daemon mds.cephfs.vm06.vmzmbb is now active in filesystem cephfs as rank 0 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:35.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:35 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:rejoin 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: fsmap cephfs:1/1 {0=cephfs.vm06.vmzmbb=up:rejoin} 2 up:standby 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: daemon mds.cephfs.vm06.vmzmbb is now active in filesystem cephfs as rank 0 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:35 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: pgmap v149: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 6 op/s 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: Cluster is now healthy 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: mds.? [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:active 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: mds.? 
[v2:192.168.123.109:6826/3154236738,v1:192.168.123.109:6827/3154236738] up:boot 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 3 up:standby 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:36.564 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:36 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: pgmap v149: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 6 op/s 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: Cluster is now healthy 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: mds.? [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] up:active 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: mds.? 
[v2:192.168.123.109:6826/3154236738,v1:192.168.123.109:6827/3154236738] up:boot 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 3 up:standby 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:36.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:36 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: pgmap v150: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for 
all mds 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.gzymac"}]': finished 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vmzmbb"}]': finished 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.cjcawy"}]': finished 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.drzmdt"}]: dispatch 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.drzmdt"}]': finished 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-09T17:34:37.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:37 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-09T17:34:37.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: pgmap v150: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 
vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all mds 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.gzymac"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.gzymac"}]': finished 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vmzmbb"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vmzmbb"}]': finished 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.cjcawy"}]: dispatch 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.cjcawy"}]': finished 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.drzmdt"}]: dispatch 
2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm09.drzmdt"}]': finished 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-09T17:34:37.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:37 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]': finished 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 3 up:standby 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all rgw 2026-03-09T17:34:39.382 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.382 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:34:39.383 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:39 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:39.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": 
"1"}]': finished 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 3 up:standby 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: fsmap cephfs:1 {0=cephfs.vm06.vmzmbb=up:active} 1 up:standby-replay 2 up:standby 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all rgw 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:34:39.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:39 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: pgmap v151: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 202 B/s wr, 9 op/s 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: Upgrade: Updating ceph-exporter.vm06 (1/2) 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: Deploying daemon ceph-exporter.vm06 on vm06 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": 
"client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:34:40.238 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:40 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: pgmap v151: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 202 B/s wr, 9 op/s 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: Upgrade: Updating ceph-exporter.vm06 (1/2) 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: Deploying daemon ceph-exporter.vm06 on vm06 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm09", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T17:34:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:40 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:41 vm06.local ceph-mon[109831]: Upgrade: Updating ceph-exporter.vm09 (2/2) 2026-03-09T17:34:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:41 vm06.local ceph-mon[109831]: Deploying daemon ceph-exporter.vm09 on vm09 2026-03-09T17:34:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:41 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:41 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:41 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:41.595 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:41 vm09.local ceph-mon[97995]: Upgrade: Updating ceph-exporter.vm09 (2/2) 2026-03-09T17:34:41.595 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:41 vm09.local ceph-mon[97995]: Deploying daemon ceph-exporter.vm09 on vm09 2026-03-09T17:34:41.595 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:41 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:41.595 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:41 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:41.595 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:41 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:42.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:42 vm06.local 
ceph-mon[109831]: pgmap v152: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 12 op/s 2026-03-09T17:34:42.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:42 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:42 vm09.local ceph-mon[97995]: pgmap v152: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 12 op/s 2026-03-09T17:34:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:42.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:42 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: pgmap v153: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 
119 GiB / 120 GiB avail; 19 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: pgmap v153: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T17:34:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:43.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:45.643 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:45 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:45.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:45 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:45.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:45 vm06.local ceph-mon[109831]: pgmap v154: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T17:34:45.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 
09 17:34:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:45.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:45 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:45.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:45 vm09.local ceph-mon[97995]: pgmap v154: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T17:34:46.544 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm09"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm09"}]': finished 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all iscsi 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all nfs 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Upgrade: Setting container_image for all nvmeof 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Upgrade: Updating node-exporter.vm06 (1/2) 2026-03-09T17:34:46.544 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:46 vm06.local ceph-mon[109831]: Deploying daemon node-exporter.vm06 on vm06 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm09"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": 
"client.ceph-exporter.vm09"}]': finished 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all iscsi 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all nfs 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Upgrade: Setting container_image for all nvmeof 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Upgrade: Updating node-exporter.vm06 (1/2) 2026-03-09T17:34:46.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:46 vm09.local ceph-mon[97995]: Deploying daemon 
node-exporter.vm06 on vm06 2026-03-09T17:34:48.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:48 vm06.local ceph-mon[109831]: pgmap v155: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.0 KiB/s wr, 9 op/s 2026-03-09T17:34:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:48 vm09.local ceph-mon[97995]: pgmap v155: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.0 KiB/s wr, 9 op/s 2026-03-09T17:34:50.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:50 vm06.local ceph-mon[109831]: pgmap v156: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.0 KiB/s wr, 8 op/s 2026-03-09T17:34:50.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:50 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:50.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:50 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:50.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:50 vm06.local ceph-mon[109831]: Upgrade: Updating node-exporter.vm09 (2/2) 2026-03-09T17:34:50.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:50 vm06.local ceph-mon[109831]: Deploying daemon node-exporter.vm09 on vm09 2026-03-09T17:34:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:50 vm09.local ceph-mon[97995]: pgmap v156: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.0 KiB/s wr, 8 op/s 2026-03-09T17:34:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:50 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:50 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
2026-03-09T17:34:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:50 vm09.local ceph-mon[97995]: Upgrade: Updating node-exporter.vm09 (2/2) 2026-03-09T17:34:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:50 vm09.local ceph-mon[97995]: Deploying daemon node-exporter.vm09 on vm09 2026-03-09T17:34:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:52 vm06.local ceph-mon[109831]: pgmap v157: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.0 KiB/s wr, 8 op/s 2026-03-09T17:34:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:52.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:52 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:52 vm09.local ceph-mon[97995]: pgmap v157: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.0 KiB/s wr, 8 op/s 2026-03-09T17:34:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:52.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:52 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: pgmap v158: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4 op/s 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:53.860 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:53 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: pgmap v158: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4 op/s 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:54.063 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:53 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:55 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:55.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:55 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:56 vm06.local ceph-mon[109831]: pgmap v159: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 3 op/s 2026-03-09T17:34:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:56 vm06.local ceph-mon[109831]: Upgrade: Updating prometheus.vm06 2026-03-09T17:34:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:56 vm06.local ceph-mon[109831]: Deploying daemon prometheus.vm06 on vm06 2026-03-09T17:34:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:56 vm09.local ceph-mon[97995]: pgmap v159: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 3 op/s 2026-03-09T17:34:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:56 vm09.local ceph-mon[97995]: Upgrade: Updating prometheus.vm06 2026-03-09T17:34:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:56 vm09.local ceph-mon[97995]: Deploying daemon prometheus.vm06 on vm06 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 -- 192.168.123.106:0/3682364433 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8072330 msgr2=0x7f81a80770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 --2- 192.168.123.106:0/3682364433 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8072330 0x7f81a80770b0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f81a000b780 tx=0x7f81a000ba90 comp rx=0 tx=0).stop 
2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 -- 192.168.123.106:0/3682364433 shutdown_connections 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 --2- 192.168.123.106:0/3682364433 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8072330 0x7f81a80770b0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 --2- 192.168.123.106:0/3682364433 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8071950 0x7f81a8071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 -- 192.168.123.106:0/3682364433 >> 192.168.123.106:0/3682364433 conn(0x7f81a806d1a0 msgr2=0x7f81a806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 -- 192.168.123.106:0/3682364433 shutdown_connections 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.735+0000 7f81acfbd700 1 -- 192.168.123.106:0/3682364433 wait complete. 
2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 Processor -- start 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 -- start start 2026-03-09T17:34:57.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8071950 0x7f81a80824f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81a8082f70 con 0x7f81a8072330 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81acfbd700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81a80830b0 con 0x7f81a8071950 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:33276/0 (socket says 192.168.123.106:33276) 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 -- 192.168.123.106:0/826498259 learned_addr learned my addr 192.168.123.106:0/826498259 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 -- 192.168.123.106:0/826498259 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8071950 msgr2=0x7f81a80824f0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8071950 0x7f81a80824f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.736+0000 7f81a6ffd700 1 -- 192.168.123.106:0/826498259 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81a000b050 con 0x7f81a8072330 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.737+0000 7f81a6ffd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f81a000bde0 tx=0x7f81a0004a30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:57.740 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.738+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81a0007c70 con 0x7f81a8072330 2026-03-09T17:34:57.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.738+0000 7f81acfbd700 1 -- 
192.168.123.106:0/826498259 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81a8083330 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.738+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81a8108050 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.739+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f81a0007dd0 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.739+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81a005db30 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.739+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8194005320 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.740+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f81a005dc90 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.741+0000 7f81a4ff9700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 0x7f8190079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.741+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== 
mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f81a00e11f0 con 0x7f81a8072330 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.741+0000 7f81a77fe700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 0x7f8190079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.741+0000 7f81a77fe700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 0x7f8190079e90 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f819c009ce0 tx=0x7f819c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:57.745 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.744+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f81a00a99a0 con 0x7f81a8072330 2026-03-09T17:34:57.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.897+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8194000bf0 con 0x7f81900779e0 2026-03-09T17:34:57.899 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.898+0000 7f81a4ff9700 1 -- 192.168.123.106:0/826498259 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+463 (secure 0 0 0) 0x7f8194000bf0 con 0x7f81900779e0 2026-03-09T17:34:57.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 
7f81acfbd700 1 -- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 msgr2=0x7f8190079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:57.902 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 7f81acfbd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 0x7f8190079e90 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f819c009ce0 tx=0x7f819c009450 comp rx=0 tx=0).stop 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 msgr2=0x7f81a8082a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 7f81acfbd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f81a000bde0 tx=0x7f81a0004a30 comp rx=0 tx=0).stop 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 shutdown_connections 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.901+0000 7f81acfbd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81900779e0 0x7f8190079e90 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.902+0000 7f81acfbd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f81a8071950 0x7f81a80824f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.902+0000 7f81acfbd700 1 --2- 192.168.123.106:0/826498259 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f81a8072330 0x7f81a8082a30 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.902+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 >> 192.168.123.106:0/826498259 conn(0x7f81a806d1a0 msgr2=0x7f81a80765d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.902+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 shutdown_connections 2026-03-09T17:34:57.903 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:57.902+0000 7f81acfbd700 1 -- 192.168.123.106:0/826498259 wait complete. 2026-03-09T17:34:57.916 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:34:58.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3341881203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00a5830 msgr2=0x7fa5c00a5c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3341881203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00a5830 0x7fa5c00a5c40 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa5c4009900 tx=0x7fa5c4009c10 comp rx=0 tx=0).stop 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3341881203 shutdown_connections 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3341881203 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fa5c00a3e70 0x7fa5c00a42c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3341881203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00a5830 0x7fa5c00a5c40 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3341881203 >> 192.168.123.106:0/3341881203 conn(0x7fa5c009f7e0 msgr2=0x7fa5c00a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3341881203 shutdown_connections 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3341881203 wait complete. 
2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.019+0000 7fa5cf0a0700 1 Processor -- start 2026-03-09T17:34:58.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cf0a0700 1 -- start start 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cf0a0700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa5c00a3e70 0x7fa5c0014060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cf0a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cf0a0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5c0014aa0 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cf0a0700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5c0014c10 con 0x7fa5c00a3e70 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:33298/0 (socket says 192.168.123.106:33298) 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 -- 192.168.123.106:0/3477054337 learned_addr learned my addr 192.168.123.106:0/3477054337 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 -- 192.168.123.106:0/3477054337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa5c00a3e70 msgr2=0x7fa5c0014060 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa5c00a3e70 0x7fa5c0014060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.020+0000 7fa5cd89d700 1 -- 192.168.123.106:0/3477054337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5c40095e0 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.023+0000 7fa5cd89d700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fa5c8066720 tx=0x7fa5c80719c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.024+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5c80b9070 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.024+0000 7fa5cf0a0700 1 -- 
192.168.123.106:0/3477054337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5c0015b90 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.024+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5c00160b0 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.025+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa5c8067a90 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.025+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5c80bdac0 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.026+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5ac005320 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.026+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa5c80c63e0 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.027+0000 7fa5bf7fe700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 0x7fa5b4079f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.027+0000 7fa5bf7fe700 1 -- 
192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa5c8140010 con 0x7fa5c00145a0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.027+0000 7fa5ce09e700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 0x7fa5b4079f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.027+0000 7fa5ce09e700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 0x7fa5b4079f50 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa5c4009900 tx=0x7fa5c400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.030+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa5c81769f0 con 0x7fa5c00145a0 2026-03-09T17:34:58.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.172+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa5ac000bf0 con 0x7fa5b4077aa0 2026-03-09T17:34:58.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.174+0000 7fa5bf7fe700 1 -- 192.168.123.106:0/3477054337 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+463 (secure 0 0 0) 0x7fa5ac000bf0 con 0x7fa5b4077aa0 2026-03-09T17:34:58.178 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.176+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 msgr2=0x7fa5b4079f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.176+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 0x7fa5b4079f50 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fa5c4009900 tx=0x7fa5c400b540 comp rx=0 tx=0).stop 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 msgr2=0x7fa5c00155f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fa5c8066720 tx=0x7fa5c80719c0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 shutdown_connections 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa5b4077aa0 0x7fa5b4079f50 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7fa5c00a3e70 0x7fa5c0014060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 --2- 192.168.123.106:0/3477054337 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5c00145a0 0x7fa5c00155f0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 >> 192.168.123.106:0/3477054337 conn(0x7fa5c009f7e0 msgr2=0x7fa5c00a1980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 shutdown_connections 2026-03-09T17:34:58.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.177+0000 7fa5cf0a0700 1 -- 192.168.123.106:0/3477054337 wait complete. 
2026-03-09T17:34:58.260 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 -- 192.168.123.106:0/2531864163 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 msgr2=0x7f232c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 --2- 192.168.123.106:0/2531864163 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c071e70 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f231c009a60 tx=0x7f231c009d70 comp rx=0 tx=0).stop 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 -- 192.168.123.106:0/2531864163 shutdown_connections 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 --2- 192.168.123.106:0/2531864163 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c10be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 --2- 192.168.123.106:0/2531864163 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c071e70 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 -- 192.168.123.106:0/2531864163 >> 192.168.123.106:0/2531864163 conn(0x7f232c06d1a0 msgr2=0x7f232c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 -- 192.168.123.106:0/2531864163 shutdown_connections 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.259+0000 7f2330f44700 1 -- 192.168.123.106:0/2531864163 
wait complete. 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 Processor -- start 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 -- start start 2026-03-09T17:34:58.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c116c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c1171c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f232c1177e0 con 0x7f232c071a60 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.260+0000 7f2330f44700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f232c117920 con 0x7f232c072440 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c116c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c116c80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 
says I am v2:192.168.123.106:33318/0 (socket says 192.168.123.106:33318) 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232b7fe700 1 -- 192.168.123.106:0/796409689 learned_addr learned my addr 192.168.123.106:0/796409689 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232affd700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c1171c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232affd700 1 -- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 msgr2=0x7f232c116c80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232affd700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c116c80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.261+0000 7f232affd700 1 -- 192.168.123.106:0/796409689 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f231c009710 con 0x7f232c072440 2026-03-09T17:34:58.264 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.263+0000 7f232affd700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c1171c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f2314005fd0 tx=0x7f231400ec90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.265 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.264+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f231400cb50 con 0x7f232c072440 2026-03-09T17:34:58.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.264+0000 7f2330f44700 1 -- 192.168.123.106:0/796409689 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f232c1b2b50 con 0x7f232c072440 2026-03-09T17:34:58.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.264+0000 7f2330f44700 1 -- 192.168.123.106:0/796409689 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f232c1b30a0 con 0x7f232c072440 2026-03-09T17:34:58.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.264+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f231400eed0 con 0x7f232c072440 2026-03-09T17:34:58.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.265+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2314018700 con 0x7f232c072440 2026-03-09T17:34:58.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.265+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2314018970 con 0x7f232c072440 2026-03-09T17:34:58.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.266+0000 7f2328ff9700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 0x7f2318079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.267 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.266+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2314014070 con 0x7f232c072440 2026-03-09T17:34:58.267 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.266+0000 7f232b7fe700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 0x7f2318079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.267+0000 7f232b7fe700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 0x7f2318079dc0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f231c0096e0 tx=0x7f231c009670 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.268 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.267+0000 7f2330f44700 1 -- 192.168.123.106:0/796409689 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f230c005320 con 0x7f232c072440 2026-03-09T17:34:58.271 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.270+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2314063ae0 con 0x7f232c072440 2026-03-09T17:34:58.394 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:58 vm06.local ceph-mon[109831]: pgmap v160: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 3.3 MiB/s rd, 2 op/s 2026-03-09T17:34:58.418 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.416+0000 7f2330f44700 1 -- 192.168.123.106:0/796409689 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f230c000bf0 con 0x7f2318077910 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (9m) 5s ago 9m 26.0M - 0.25.0 c8568f914cd2 b5fa36858876 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (18s) 5s ago 10m 10.5M - 19.2.3-678-ge911bdeb 654f31e6858e 6ccb5363cc83 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (17s) 5s ago 9m 10.3M - 19.2.3-678-ge911bdeb 654f31e6858e ec8da6f92eb8 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (3m) 5s ago 10m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (3m) 5s ago 9m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (9m) 5s ago 9m 96.1M - 9.4.7 954c08fa6188 d808369f1a53 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (39s) 5s ago 7m 93.4M - 19.2.3-678-ge911bdeb 654f31e6858e cea38695f742 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (49s) 5s ago 7m 102M - 19.2.3-678-ge911bdeb 654f31e6858e 137634321f1d 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (33s) 5s ago 7m 18.5M - 19.2.3-678-ge911bdeb 654f31e6858e da0f8f8a04b1 2026-03-09T17:34:58.424 
INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (24s) 5s ago 7m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e 2a7cca6ff85f 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (4m) 5s ago 10m 634M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (4m) 5s ago 9m 491M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (4m) 5s ago 10m 63.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (3m) 5s ago 9m 53.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (9s) 5s ago 10m 7578k - 1.7.0 72c9c2088986 afbba45bb140 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (6s) 5s ago 9m 5528k - 1.7.0 72c9c2088986 11641531af25 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (3m) 5s ago 9m 211M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (2m) 5s ago 8m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (113s) 5s ago 8m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (90s) 5s ago 8m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 40d834360933 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (84s) 5s ago 8m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e cb6e9cd4fe30 2026-03-09T17:34:58.424 
INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (62s) 5s ago 8m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b297663f757a 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (4m) 5s ago 9m 70.0M - 2.43.0 a07b618ecd1d f6ece95f2fd5 2026-03-09T17:34:58.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.422+0000 7f2328ff9700 1 -- 192.168.123.106:0/796409689 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f230c000bf0 con 0x7f2318077910 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.424+0000 7f23227fc700 1 -- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 msgr2=0x7f2318079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.424+0000 7f23227fc700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 0x7f2318079dc0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f231c0096e0 tx=0x7f231c009670 comp rx=0 tx=0).stop 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.424+0000 7f23227fc700 1 -- 192.168.123.106:0/796409689 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 msgr2=0x7f232c1171c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.424+0000 7f23227fc700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c1171c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f2314005fd0 tx=0x7f231400ec90 comp rx=0 tx=0).stop 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 
-- 192.168.123.106:0/796409689 shutdown_connections 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2318077910 0x7f2318079dc0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f232c071a60 0x7f232c116c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 --2- 192.168.123.106:0/796409689 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f232c072440 0x7f232c1171c0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 -- 192.168.123.106:0/796409689 >> 192.168.123.106:0/796409689 conn(0x7f232c06d1a0 msgr2=0x7f232c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 -- 192.168.123.106:0/796409689 shutdown_connections 2026-03-09T17:34:58.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.425+0000 7f23227fc700 1 -- 192.168.123.106:0/796409689 wait complete. 
2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.502+0000 7ff1f3395700 1 -- 192.168.123.106:0/3312025409 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec072470 msgr2=0x7ff1ec10beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.502+0000 7ff1f3395700 1 --2- 192.168.123.106:0/3312025409 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec072470 0x7ff1ec10beb0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff1e400b3a0 tx=0x7ff1e400b6b0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- 192.168.123.106:0/3312025409 shutdown_connections 2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 --2- 192.168.123.106:0/3312025409 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec072470 0x7ff1ec10beb0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 --2- 192.168.123.106:0/3312025409 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1ec071a90 0x7ff1ec071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- 192.168.123.106:0/3312025409 >> 192.168.123.106:0/3312025409 conn(0x7ff1ec06d1a0 msgr2=0x7ff1ec06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- 192.168.123.106:0/3312025409 shutdown_connections 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- 192.168.123.106:0/3312025409 
wait complete. 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 Processor -- start 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- start start 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1ec071a90 0x7ff1ec116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1ec117490 con 0x7ff1ec071a90 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.503+0000 7ff1f3395700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1ec117600 con 0x7ff1ec116f90 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 
says I am v2:192.168.123.106:33976/0 (socket says 192.168.123.106:33976) 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 -- 192.168.123.106:0/1592446456 learned_addr learned my addr 192.168.123.106:0/1592446456 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 -- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1ec071a90 msgr2=0x7ff1ec116a50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1ec071a90 0x7ff1ec116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 -- 192.168.123.106:0/1592446456 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff1e400b050 con 0x7ff1ec116f90 2026-03-09T17:34:58.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.504+0000 7ff1f0930700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff1e40095d0 tx=0x7ff1e400bd00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1e400e050 con 0x7ff1ec116f90 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 
7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff1e4003dc0 con 0x7ff1ec116f90 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1e401bab0 con 0x7ff1ec116f90 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 7ff1f3395700 1 -- 192.168.123.106:0/1592446456 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff1ec1b2cc0 con 0x7ff1ec116f90 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 7ff1f3395700 1 -- 192.168.123.106:0/1592446456 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1ec1b31e0 con 0x7ff1ec116f90 2026-03-09T17:34:58.508 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.507+0000 7ff1f3395700 1 -- 192.168.123.106:0/1592446456 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff1ec110c20 con 0x7ff1ec116f90 2026-03-09T17:34:58.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.508+0000 7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff1e4019040 con 0x7ff1ec116f90 2026-03-09T17:34:58.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.509+0000 7ff1e27fc700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 0x7ff1d8079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.509+0000 7ff1e27fc700 1 -- 
192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff1e4029030 con 0x7ff1ec116f90 2026-03-09T17:34:58.510 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.509+0000 7ff1f1131700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 0x7ff1d8079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:58.511 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.510+0000 7ff1f1131700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 0x7ff1d8079e90 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff1dc005d90 tx=0x7ff1dc005d00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:58.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.511+0000 7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff1e4063fc0 con 0x7ff1ec116f90 2026-03-09T17:34:58.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:58 vm09.local ceph-mon[97995]: pgmap v160: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 3.3 MiB/s rd, 2 op/s 2026-03-09T17:34:58.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.697+0000 7ff1f3395700 1 -- 192.168.123.106:0/1592446456 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff1ec04ea50 con 0x7ff1ec116f90 2026-03-09T17:34:58.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.697+0000 7ff1e27fc700 1 -- 192.168.123.106:0/1592446456 <== mon.1 v2:192.168.123.109:3300/0 7 ==== 
mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7ff1e4063710 con 0x7ff1ec116f90 2026-03-09T17:34:58.702 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:34:58.703 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 msgr2=0x7ff1d8079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 0x7ff1d8079e90 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff1dc005d90 tx=0x7ff1dc005d00 comp rx=0 tx=0).stop 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 msgr2=0x7ff1ec1b2780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff1e40095d0 tx=0x7ff1e400bd00 comp rx=0 tx=0).stop 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 shutdown_connections 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff1d80779e0 0x7ff1d8079e90 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1ec071a90 0x7ff1ec116a50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 --2- 192.168.123.106:0/1592446456 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff1ec116f90 0x7ff1ec1b2780 secure :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff1e40095d0 tx=0x7ff1e400bd00 comp rx=0 tx=0).stop 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 >> 192.168.123.106:0/1592446456 conn(0x7ff1ec06d1a0 msgr2=0x7ff1ec10b1c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.706+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 shutdown_connections 2026-03-09T17:34:58.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.707+0000 7ff1d7fff700 1 -- 192.168.123.106:0/1592446456 wait complete. 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.789+0000 7f2955e91700 1 -- 192.168.123.106:0/2276089254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29500ff1a0 msgr2=0x7f29500ff5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.789+0000 7f2955e91700 1 --2- 192.168.123.106:0/2276089254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29500ff1a0 0x7f29500ff5b0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f2944009b00 tx=0x7f2944009e10 comp rx=0 tx=0).stop 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.789+0000 7f2955e91700 1 -- 192.168.123.106:0/2276089254 shutdown_connections 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.790+0000 7f2955e91700 1 --2- 192.168.123.106:0/2276089254 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2950100440 0x7f29501008b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.790+0000 7f2955e91700 1 --2- 192.168.123.106:0/2276089254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f29500ff1a0 0x7f29500ff5b0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:58.791 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.790+0000 7f2955e91700 1 -- 192.168.123.106:0/2276089254 >> 192.168.123.106:0/2276089254 conn(0x7f29500fa7b0 msgr2=0x7f29500fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:58.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.791+0000 7f2955e91700 1 -- 192.168.123.106:0/2276089254 shutdown_connections 2026-03-09T17:34:58.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.791+0000 7f2955e91700 1 -- 192.168.123.106:0/2276089254 wait complete. 2026-03-09T17:34:58.792 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.791+0000 7f2955e91700 1 Processor -- start 2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.791+0000 7f2955e91700 1 -- start start 2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2955e91700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2955e91700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 0x7f295019c690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2955e91700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f295019cc20 con 0x7f2950100440 
2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2955e91700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f295019cd60 con 0x7f29500ff1a0
2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2954e8f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2954e8f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:33996/0 (socket says 192.168.123.106:33996)
2026-03-09T17:34:58.793 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f2954e8f700 1 -- 192.168.123.106:0/1244876711 learned_addr learned my addr 192.168.123.106:0/1244876711 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.792+0000 7f294ffff700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 0x7f295019c690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f2954e8f700 1 -- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 msgr2=0x7f295019c690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f2954e8f700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 0x7f295019c690 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f2954e8f700 1 -- 192.168.123.106:0/1244876711 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29440097e0 con 0x7f29500ff1a0
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f294ffff700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 0x7f295019c690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed!
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f2954e8f700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f29440052d0 tx=0x7f2944004b10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:34:58.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f294401d070 con 0x7f29500ff1a0
2026-03-09T17:34:58.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.793+0000 7f2955e91700 1 -- 192.168.123.106:0/1244876711 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29501a17c0 con 0x7f29500ff1a0
2026-03-09T17:34:58.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.794+0000 7f2955e91700 1 -- 192.168.123.106:0/1244876711 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29501a1c80 con 0x7f29500ff1a0
2026-03-09T17:34:58.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.794+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f294400bd10 con 0x7f29500ff1a0
2026-03-09T17:34:58.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.794+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f294400f800 con 0x7f29500ff1a0
2026-03-09T17:34:58.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.796+0000 7f2955e91700 1 -- 192.168.123.106:0/1244876711 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f295004ea50 con 0x7f29500ff1a0
2026-03-09T17:34:58.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.796+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f294400f960 con 0x7f29500ff1a0
2026-03-09T17:34:58.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.796+0000 7f294dffb700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 0x7f2940079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:58.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.796+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f294409c3a0 con 0x7f29500ff1a0
2026-03-09T17:34:58.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.797+0000 7f294ffff700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 0x7f2940079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:58.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.797+0000 7f294ffff700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 0x7f2940079e90 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f293c005fd0 tx=0x7f293c005e30 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:34:58.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.799+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2944064c00 con 0x7f29500ff1a0
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:e35
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:btime 2026-03-09T17:34:38.194451+0000
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1)
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:epoch 35
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:34:38.194448+0000
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:root 0
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {}
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 80
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:in 0
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:up {0=34284}
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:failed
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:damaged
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:stopped
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3]
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:balancer
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 34284 members: 34284
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:34284} state up:active seq 9 join_fscid=1 addr [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{0:44251} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/2160269265,v1:192.168.123.106:6829/2160269265] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons:
2026-03-09T17:34:58.962 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{-1:44253} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.109:6824/3810846472,v1:192.168.123.109:6825/3810846472] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:44275} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.109:6826/3154236738,v1:192.168.123.109:6827/3154236738] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.955+0000 7f2955e91700 1 -- 192.168.123.106:0/1244876711 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f29501a1f60 con 0x7f29500ff1a0
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.956+0000 7f294dffb700 1 -- 192.168.123.106:0/1244876711 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 35 v35) v1 ==== 76+0+2002 (secure 0 0 0) 0x7f2944064350 con 0x7f29500ff1a0
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.959+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 msgr2=0x7f2940079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.959+0000 7f293b7fe700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 0x7f2940079e90 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f293c005fd0 tx=0x7f293c005e30 comp rx=0 tx=0).stop
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 msgr2=0x7f295019c150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f29440052d0 tx=0x7f2944004b10 comp rx=0 tx=0).stop
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 shutdown_connections
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f29400779e0 0x7f2940079e90 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f29500ff1a0 0x7f295019c150 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 --2- 192.168.123.106:0/1244876711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2950100440 0x7f295019c690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 >> 192.168.123.106:0/1244876711 conn(0x7f29500fa7b0 msgr2=0x7f2950103670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 shutdown_connections
2026-03-09T17:34:58.963 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:58.960+0000 7f293b7fe700 1 -- 192.168.123.106:0/1244876711 wait complete.
2026-03-09T17:34:58.966 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 35
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 -- 192.168.123.106:0/2723842672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c072360 msgr2=0x7fc51c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 --2- 192.168.123.106:0/2723842672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c072360 0x7fc51c0770e0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fc51800b600 tx=0x7fc51800b910 comp rx=0 tx=0).stop
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 -- 192.168.123.106:0/2723842672 shutdown_connections
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 --2- 192.168.123.106:0/2723842672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c072360 0x7fc51c0770e0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 --2- 192.168.123.106:0/2723842672 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc51c071980 0x7fc51c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 -- 192.168.123.106:0/2723842672 >> 192.168.123.106:0/2723842672 conn(0x7fc51c06d1a0 msgr2=0x7fc51c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 -- 192.168.123.106:0/2723842672 shutdown_connections
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.034+0000 7fc5249d6700 1 -- 192.168.123.106:0/2723842672 wait complete.
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 Processor -- start
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 -- start start
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc51c071980 0x7fc51c082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc51c1b2a90 con 0x7fc51c082b60
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc5249d6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc51c1b2bd0 con 0x7fc51c071980
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:33374/0 (socket says 192.168.123.106:33374)
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 -- 192.168.123.106:0/540109923 learned_addr learned my addr 192.168.123.106:0/540109923 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 -- 192.168.123.106:0/540109923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc51c071980 msgr2=0x7fc51c082620 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc51c071980 0x7fc51c082620 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 -- 192.168.123.106:0/540109923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc51800b050 con 0x7fc51c082b60
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.035+0000 7fc521f71700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc518003c30 tx=0x7fc518003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:34:59.039 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.036+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc51800e030 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.036+0000 7fc5249d6700 1 -- 192.168.123.106:0/540109923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc51c1b2dd0 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.036+0000 7fc5249d6700 1 -- 192.168.123.106:0/540109923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc51c1b3320 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.038+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc5180048e0 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.038+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc51801cdf0 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.039+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc518012430 con 0x7fc51c082b60
2026-03-09T17:34:59.041 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.039+0000 7fc50f7fe700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 0x7fc50807c180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:59.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.040+0000 7fc522772700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 0x7fc50807c180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:59.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.040+0000 7fc522772700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 0x7fc50807c180 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc51c083d50 tx=0x7fc510006c60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:34:59.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.040+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc51802a030 con 0x7fc51c082b60
2026-03-09T17:34:59.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.041+0000 7fc5249d6700 1 -- 192.168.123.106:0/540109923 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc500005320 con 0x7fc51c082b60
2026-03-09T17:34:59.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.044+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc518064b00 con 0x7fc51c082b60
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.171+0000 7fc5249d6700 1 -- 192.168.123.106:0/540109923 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc500000bf0 con 0x7fc508079cd0
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.172+0000 7fc50f7fe700 1 -- 192.168.123.106:0/540109923 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+463 (secure 0 0 0) 0x7fc500000bf0 con 0x7fc508079cd0
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "in_progress": true,
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "which": "Upgrading all daemon types on all hosts",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "services_complete": [
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "mon",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "ceph-exporter",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "mgr",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "mds",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "crash",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:        "osd"
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    ],
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "progress": "18/23 daemons upgraded",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "message": "Currently upgrading prometheus daemons",
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:    "is_paused": false
2026-03-09T17:34:59.174 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-09T17:34:59.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.175+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 msgr2=0x7fc50807c180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:59.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.175+0000 7fc50d7fa700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 0x7fc50807c180 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc51c083d50 tx=0x7fc510006c60 comp rx=0 tx=0).stop
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.175+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 msgr2=0x7fc51c082fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc518003c30 tx=0x7fc518003d10 comp rx=0 tx=0).stop
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 shutdown_connections
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc508079cd0 0x7fc50807c180 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc51c071980 0x7fc51c082620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 --2- 192.168.123.106:0/540109923 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc51c082b60 0x7fc51c082fd0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 >> 192.168.123.106:0/540109923 conn(0x7fc51c06d1a0 msgr2=0x7fc51c0764e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 shutdown_connections
2026-03-09T17:34:59.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.176+0000 7fc50d7fa700 1 -- 192.168.123.106:0/540109923 wait complete.
2026-03-09T17:34:59.259 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:59 vm06.local ceph-mon[109831]: from='client.34330 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:34:59.259 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:59 vm06.local ceph-mon[109831]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T17:34:59.259 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:34:59.259 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:59 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1592446456' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T17:34:59.259 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:34:59 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1244876711' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 -- 192.168.123.106:0/3168097894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94072470 msgr2=0x7f7d9410beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 --2- 192.168.123.106:0/3168097894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94072470 0x7f7d9410beb0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f7d8c01c580 tx=0x7f7d8c01c890 comp rx=0 tx=0).stop
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 -- 192.168.123.106:0/3168097894 shutdown_connections
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 --2- 192.168.123.106:0/3168097894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94072470 0x7f7d9410beb0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 --2- 192.168.123.106:0/3168097894 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d94071a90 0x7f7d94071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.259 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.257+0000 7f7d9bda3700 1 -- 192.168.123.106:0/3168097894 >> 192.168.123.106:0/3168097894 conn(0x7f7d9406d1a0 msgr2=0x7f7d9406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:34:59.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.260+0000 7f7d9bda3700 1 -- 192.168.123.106:0/3168097894 shutdown_connections
2026-03-09T17:34:59.262 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.260+0000 7f7d9bda3700 1 -- 192.168.123.106:0/3168097894 wait complete.
2026-03-09T17:34:59.263 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.261+0000 7f7d9bda3700 1 Processor -- start
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.262+0000 7f7d9bda3700 1 -- start start
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.262+0000 7f7d9bda3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.262+0000 7f7d9bda3700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d9419c870 0x7f7d941a18e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.262+0000 7f7d9bda3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d9419cd70 con 0x7f7d94071a90
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.262+0000 7f7d9bda3700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d9419cee0 con 0x7f7d9419c870
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:59.265 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:33390/0 (socket says 192.168.123.106:33390)
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 -- 192.168.123.106:0/1973225962 learned_addr learned my addr 192.168.123.106:0/1973225962 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d9933e700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d9419c870 0x7f7d941a18e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 -- 192.168.123.106:0/1973225962 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d9419c870 msgr2=0x7f7d941a18e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d9419c870 0x7f7d941a18e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.263+0000 7f7d99b3f700 1 -- 192.168.123.106:0/1973225962 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d8c01c060 con 0x7f7d94071a90
2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.264+0000 7f7d99b3f700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto
rx=0x7f7d9000ba70 tx=0x7f7d9000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.264+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d9000c760 con 0x7f7d94071a90 2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.264+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7d9000cda0 con 0x7f7d94071a90 2026-03-09T17:34:59.266 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.265+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d90012550 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.265+0000 7f7d9bda3700 1 -- 192.168.123.106:0/1973225962 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d941a1e80 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.265+0000 7f7d9bda3700 1 -- 192.168.123.106:0/1973225962 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d941a21e0 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.265+0000 7f7d9bda3700 1 -- 192.168.123.106:0/1973225962 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7d9404ea50 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.268+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f7d9000c8c0 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.268+0000 7f7d8affd700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 0x7f7d80079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.268+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f7d90098d10 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.269+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7d90061550 con 0x7f7d94071a90 2026-03-09T17:34:59.270 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.269+0000 7f7d9933e700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 0x7f7d80079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:34:59.276 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.275+0000 7f7d9933e700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 0x7f7d80079bb0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7d8c007fd0 tx=0x7f7d8c007f40 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:34:59.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.453+0000 7f7d9bda3700 1 -- 192.168.123.106:0/1973225962 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f7d9402cd00 con 0x7f7d94071a90 2026-03-09T17:34:59.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.454+0000 7f7d8affd700 1 -- 192.168.123.106:0/1973225962 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f7d90060ca0 con 0x7f7d94071a90 2026-03-09T17:34:59.455 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:34:59.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 msgr2=0x7f7d80079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:59.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 0x7f7d80079bb0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7d8c007fd0 tx=0x7f7d8c007f40 comp rx=0 tx=0).stop 2026-03-09T17:34:59.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 msgr2=0x7f7d9419c330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:34:59.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f7d9000ba70 tx=0x7f7d9000be30 comp rx=0 tx=0).stop 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 shutdown_connections 2026-03-09T17:34:59.459 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d80077700 0x7f7d80079bb0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d94071a90 0x7f7d9419c330 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 --2- 192.168.123.106:0/1973225962 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d9419c870 0x7f7d941a18e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.457+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 >> 192.168.123.106:0/1973225962 conn(0x7f7d9406d1a0 msgr2=0x7f7d9410b300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.458+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 shutdown_connections 2026-03-09T17:34:59.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:34:59.458+0000 7f7d88ff9700 1 -- 192.168.123.106:0/1973225962 wait complete. 
2026-03-09T17:34:59.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:59 vm09.local ceph-mon[97995]: from='client.34330 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:59 vm09.local ceph-mon[97995]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:34:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:34:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:59 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1592446456' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:34:59.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:34:59 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1244876711' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:35:00.291 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:00 vm06.local ceph-mon[109831]: pgmap v161: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 1 op/s 2026-03-09T17:35:00.291 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:00 vm06.local ceph-mon[109831]: from='client.44287 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:00.291 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:00 vm06.local ceph-mon[109831]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:00.291 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:00 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/1973225962' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:35:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:00 vm09.local ceph-mon[97995]: pgmap v161: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 1 op/s 2026-03-09T17:35:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:00 vm09.local ceph-mon[97995]: from='client.44287 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:00 vm09.local ceph-mon[97995]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:00.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:00 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1973225962' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:35:02.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:02 vm09.local ceph-mon[97995]: pgmap v162: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 346 KiB/s rd, 2 op/s 2026-03-09T17:35:02.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:02.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:02.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:02 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:02.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:02 vm06.local ceph-mon[109831]: pgmap v162: 65 pgs: 65 active+clean; 216 MiB data, 
922 MiB used, 119 GiB / 120 GiB avail; 346 KiB/s rd, 2 op/s 2026-03-09T17:35:02.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:02.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:02.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:02 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:03.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:03 vm06.local ceph-mon[109831]: pgmap v163: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:03.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.750 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:03 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:03 vm09.local ceph-mon[97995]: pgmap v163: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:03 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:03 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:03 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:03.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:03 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.059 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.060 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.060 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.060 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.060 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: Upgrade: Updating alertmanager.vm06 2026-03-09T17:35:05.060 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:04 vm06.local ceph-mon[109831]: Deploying daemon alertmanager.vm06 on vm06 2026-03-09T17:35:05.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: Upgrade: Updating alertmanager.vm06 2026-03-09T17:35:05.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:04 vm09.local ceph-mon[97995]: Deploying daemon alertmanager.vm06 on vm06 2026-03-09T17:35:06.341 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:06 vm06.local ceph-mon[109831]: pgmap v164: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:06.341 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:06 vm06.local 
ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:06.341 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:06 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:06.341 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:06 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:06 vm09.local ceph-mon[97995]: pgmap v164: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:06 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:06 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:06 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:07.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:07 vm06.local ceph-mon[109831]: pgmap v165: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:07 
vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:07.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:07 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:08.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:07 vm09.local ceph-mon[97995]: pgmap v165: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:08.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:08.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:08.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:08.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:07 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:09 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:09.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:09.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:09 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:10 vm06.local ceph-mon[109831]: pgmap v166: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:10.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:10 vm06.local ceph-mon[109831]: Upgrade: Updating grafana.vm06 2026-03-09T17:35:10.641 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:10 vm06.local ceph-mon[109831]: Deploying daemon grafana.vm06 on vm06 2026-03-09T17:35:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:10 vm09.local ceph-mon[97995]: pgmap v166: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:10 vm09.local ceph-mon[97995]: Upgrade: Updating grafana.vm06 2026-03-09T17:35:10.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:10 vm09.local ceph-mon[97995]: Deploying daemon grafana.vm06 on vm06 2026-03-09T17:35:12.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:12 vm06.local ceph-mon[109831]: pgmap v167: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:12.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:12 vm09.local ceph-mon[97995]: pgmap v167: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:14.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:14 vm06.local ceph-mon[109831]: pgmap v168: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:14.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:14.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:14 vm09.local ceph-mon[97995]: pgmap v168: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:14.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: 
dispatch 2026-03-09T17:35:16.248 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:15 vm06.local ceph-mon[109831]: pgmap v169: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:16.258 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:16.258 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:16.258 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:15 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:16.380 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:15 vm09.local ceph-mon[97995]: pgmap v169: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:16.380 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:16.380 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:16.380 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:15 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:18.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:18 vm06.local ceph-mon[109831]: pgmap v170: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:18.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:18 vm06.local ceph-mon[109831]: 
from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:18 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:18 vm09.local ceph-mon[97995]: pgmap v170: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:18 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: pgmap v171: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.142 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: Upgrade: Finalizing container_image settings 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T17:35:20.142 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' 
entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T17:35:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": 
"container_image", "who": "client.nfs"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 
vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: Upgrade: Complete! 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.143 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: pgmap v171: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.145 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 
09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: Upgrade: Finalizing container_image settings 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", 
"who": "client.rgw"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T17:35:20.146 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: Upgrade: Complete! 
2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local 
ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:35:20.146 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:22.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:22 vm06.local ceph-mon[109831]: pgmap v172: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:22.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:22 vm09.local ceph-mon[97995]: pgmap v172: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:24 vm06.local ceph-mon[109831]: pgmap v173: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:24 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:24 vm09.local ceph-mon[97995]: pgmap v173: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:24 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:35:26.533 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:26 vm06.local ceph-mon[109831]: pgmap v174: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:26.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:26 vm09.local ceph-mon[97995]: pgmap v174: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 
1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:28 vm06.local ceph-mon[109831]: pgmap v175: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:28.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:28 vm09.local ceph-mon[97995]: pgmap v175: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:29.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.531+0000 7f7d62917700 1 -- 192.168.123.106:0/4209333195 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0ff7b0 msgr2=0x7f7d5c0ffc20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.531+0000 7f7d62917700 1 --2- 192.168.123.106:0/4209333195 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0ff7b0 0x7f7d5c0ffc20 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f7d50009b50 tx=0x7f7d50009e60 comp rx=0 tx=0).stop 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 -- 192.168.123.106:0/4209333195 shutdown_connections 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 --2- 192.168.123.106:0/4209333195 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0ff7b0 0x7f7d5c0ffc20 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 --2- 192.168.123.106:0/4209333195 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c0ff270 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 -- 192.168.123.106:0/4209333195 >> 192.168.123.106:0/4209333195 conn(0x7f7d5c0fa9f0 msgr2=0x7f7d5c0fce40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 -- 192.168.123.106:0/4209333195 shutdown_connections 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 -- 192.168.123.106:0/4209333195 wait complete. 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 Processor -- start 2026-03-09T17:35:29.534 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.533+0000 7f7d62917700 1 -- start start 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d62917700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d62917700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d5c0ff7b0 0x7f7d5c198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d62917700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d5c198b80 con 0x7f7d5c0fee60 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d62917700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f7d5c198cc0 con 0x7f7d5c0ff7b0 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:60460/0 (socket says 192.168.123.106:60460) 2026-03-09T17:35:29.535 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 -- 192.168.123.106:0/1599846061 learned_addr learned my addr 192.168.123.106:0/1599846061 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:35:29.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 -- 192.168.123.106:0/1599846061 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d5c0ff7b0 msgr2=0x7f7d5c198560 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:35:29.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d5c0ff7b0 0x7f7d5c198560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.536 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.534+0000 7f7d5bfff700 1 -- 192.168.123.106:0/1599846061 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d500097e0 con 0x7f7d5c0fee60 2026-03-09T17:35:29.536 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d5bfff700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f7d4c00eb10 tx=0x7f7d4c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d4c00cca0 con 0x7f7d5c0fee60 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7d4c00ce00 con 0x7f7d5c0fee60 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d4c018910 con 0x7f7d5c0fee60 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d5c19d770 con 0x7f7d5c0fee60 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.535+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d5c101240 con 0x7f7d5c0fee60 2026-03-09T17:35:29.537 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.536+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f7d5c10b420 con 0x7f7d5c0fee60 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.538+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7d4c018a70 con 0x7f7d5c0fee60 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.539+0000 7f7d597fa700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 0x7f7d44079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.539+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f7d4c014070 con 0x7f7d5c0fee60 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.539+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7d4c0631e0 con 0x7f7d5c0fee60 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.540+0000 7f7d5b7fe700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 0x7f7d44079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:29.541 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.540+0000 7f7d5b7fe700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 0x7f7d44079e40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7d50009b20 tx=0x7f7d50005fb0 comp rx=0 tx=0).ready entity=mgr.34104 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:29.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.671+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7d5c19d900 con 0x7f7d44077990 2026-03-09T17:35:29.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.672+0000 7f7d597fa700 1 -- 192.168.123.106:0/1599846061 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f7d5c19d900 con 0x7f7d44077990 2026-03-09T17:35:29.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.674+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 msgr2=0x7f7d44079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:29.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.674+0000 7f7d62917700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 0x7f7d44079e40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7d50009b20 tx=0x7f7d50005fb0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 msgr2=0x7f7d5c198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 
7f7d62917700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f7d4c00eb10 tx=0x7f7d4c00eed0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 shutdown_connections 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f7d44077990 0x7f7d44079e40 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7d5c0fee60 0x7f7d5c198020 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 --2- 192.168.123.106:0/1599846061 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f7d5c0ff7b0 0x7f7d5c198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 >> 192.168.123.106:0/1599846061 conn(0x7f7d5c0fa9f0 msgr2=0x7f7d5c107490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 shutdown_connections 2026-03-09T17:35:29.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:29.675+0000 7f7d62917700 1 -- 192.168.123.106:0/1599846061 wait complete. 
2026-03-09T17:35:29.728 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T17:35:29.878 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.146+0000 7fc44c04a700 1 -- 192.168.123.106:0/3368806857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 msgr2=0x7fc444102ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.146+0000 7fc44c04a700 1 --2- 192.168.123.106:0/3368806857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc444102ae0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fc440009b00 tx=0x7fc440009e10 comp rx=0 tx=0).stop 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 -- 192.168.123.106:0/3368806857 shutdown_connections 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 --2- 192.168.123.106:0/3368806857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc444102ae0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 --2- 192.168.123.106:0/3368806857 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc444108a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.148 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 -- 192.168.123.106:0/3368806857 >> 192.168.123.106:0/3368806857 conn(0x7fc4440fe150 msgr2=0x7fc444100560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 -- 192.168.123.106:0/3368806857 shutdown_connections 2026-03-09T17:35:30.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.147+0000 7fc44c04a700 1 -- 192.168.123.106:0/3368806857 wait complete. 2026-03-09T17:35:30.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.148+0000 7fc44c04a700 1 Processor -- start 2026-03-09T17:35:30.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.148+0000 7fc44c04a700 1 -- start start 2026-03-09T17:35:30.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.148+0000 7fc44c04a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc4441982a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.148+0000 7fc44c04a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.148+0000 7fc44c04a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc444198e30 con 0x7fc444102670 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc44c04a700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc444198f70 con 0x7fc444108670 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:58200/0 (socket says 192.168.123.106:58200) 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 -- 192.168.123.106:0/86477032 learned_addr learned my addr 192.168.123.106:0/86477032 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc449de6700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc4441982a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 -- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 msgr2=0x7fc4441982a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc4441982a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc4495e5700 1 -- 
192.168.123.106:0/86477032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4400097e0 con 0x7fc444108670 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.149+0000 7fc449de6700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc4441982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc4495e5700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fc440009ad0 tx=0x7fc4400052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc44001d070 con 0x7fc444108670 2026-03-09T17:35:30.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc44419cd60 con 0x7fc444108670 2026-03-09T17:35:30.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc44000bc50 con 0x7fc444108670 2026-03-09T17:35:30.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc44000f790 con 0x7fc444108670 
2026-03-09T17:35:30.152 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.150+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc44419d250 con 0x7fc444108670 2026-03-09T17:35:30.155 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.151+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc44404f310 con 0x7fc444108670 2026-03-09T17:35:30.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.154+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc440022470 con 0x7fc444108670 2026-03-09T17:35:30.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.155+0000 7fc436ffd700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 0x7fc430079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.155+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc44009b500 con 0x7fc444108670 2026-03-09T17:35:30.156 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.155+0000 7fc449de6700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 0x7fc430079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.156+0000 7fc449de6700 1 --2- 192.168.123.106:0/86477032 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 0x7fc430079dc0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fc4441037b0 tx=0x7fc43800a680 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:30.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.156+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc440063bb0 con 0x7fc444108670 2026-03-09T17:35:30.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.281+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc44419d530 con 0x7fc430077910 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.287+0000 7fc436ffd700 1 -- 192.168.123.106:0/86477032 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fc44419d530 con 0x7fc430077910 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (25s) 13s ago 10m 16.2M - 0.25.0 c8568f914cd2 eaf8e59269b1 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (50s) 13s ago 10m 10.5M - 19.2.3-678-ge911bdeb 654f31e6858e 6ccb5363cc83 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (49s) 37s ago 10m 10.3M - 19.2.3-678-ge911bdeb 654f31e6858e ec8da6f92eb8 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (4m) 13s ago 10m 7834k - 
19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (4m) 37s ago 10m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:35:30.288 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (15s) 13s ago 10m 40.3M - 10.4.0 c8b91775d855 3f6028cb482d 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (71s) 13s ago 8m 93.1M - 19.2.3-678-ge911bdeb 654f31e6858e cea38695f742 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (81s) 13s ago 8m 101M - 19.2.3-678-ge911bdeb 654f31e6858e 137634321f1d 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (65s) 37s ago 8m 18.5M - 19.2.3-678-ge911bdeb 654f31e6858e da0f8f8a04b1 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (56s) 37s ago 8m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e 2a7cca6ff85f 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (5m) 13s ago 11m 643M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (4m) 37s ago 9m 491M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (4m) 13s ago 11m 64.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (4m) 37s ago 9m 53.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (41s) 13s ago 10m 8186k - 1.7.0 72c9c2088986 afbba45bb140 2026-03-09T17:35:30.289 
INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (38s) 37s ago 9m 5528k - 1.7.0 72c9c2088986 11641531af25 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (3m) 13s ago 9m 211M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (2m) 13s ago 9m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (2m) 13s ago 9m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (2m) 37s ago 9m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 40d834360933 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (116s) 37s ago 8m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e cb6e9cd4fe30 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (94s) 37s ago 8m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b297663f757a 2026-03-09T17:35:30.289 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (29s) 13s ago 10m 51.7M - 2.51.0 1d3b7f56885b ea61dadaf020 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 msgr2=0x7fc430079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 0x7fc430079dc0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fc4441037b0 tx=0x7fc43800a680 comp rx=0 tx=0).stop 2026-03-09T17:35:30.291 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 msgr2=0x7fc4441987e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fc440009ad0 tx=0x7fc4400052e0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 shutdown_connections 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc430077910 0x7fc430079dc0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc444102670 0x7fc4441982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 --2- 192.168.123.106:0/86477032 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc444108670 0x7fc4441987e0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 >> 192.168.123.106:0/86477032 conn(0x7fc4440fe150 msgr2=0x7fc4440ff9b0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 shutdown_connections 2026-03-09T17:35:30.291 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.290+0000 7fc44c04a700 1 -- 192.168.123.106:0/86477032 wait complete. 2026-03-09T17:35:30.359 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-09T17:35:30.414 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:30 vm06.local ceph-mon[109831]: pgmap v176: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:30.414 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:30 vm06.local ceph-mon[109831]: from='client.34356 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:30.520 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:30.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:30 vm09.local ceph-mon[97995]: pgmap v176: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:30.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:30 vm09.local ceph-mon[97995]: from='client.34356 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.759+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/155526034 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d8073a00 msgr2=0x7fa6d8110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.759+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/155526034 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d8073a00 0x7fa6d8110ff0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fa6c8009b00 tx=0x7fa6c8009e10 comp rx=0 tx=0).stop 2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.760+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/155526034 shutdown_connections 2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.760+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/155526034 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d8073a00 0x7fa6d8110ff0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.760+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/155526034 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d80730f0 0x7fa6d80734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.761 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.760+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/155526034 >> 192.168.123.106:0/155526034 conn(0x7fa6d80fc000 msgr2=0x7fa6d80fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:30.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.760+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/155526034 shutdown_connections 2026-03-09T17:35:30.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.761+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/155526034 wait complete. 
2026-03-09T17:35:30.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.761+0000 7fa6dd8ea700 1 Processor -- start 2026-03-09T17:35:30.762 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.761+0000 7fa6dd8ea700 1 -- start start 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6dd8ea700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6dd8ea700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d8073a00 0x7fa6d81a2b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6d6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6d6ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:60484/0 (socket says 192.168.123.106:60484) 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6d81a31c0 con 0x7fa6d80730f0 2026-03-09T17:35:30.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6d67fc700 1 --2- 192.168.123.106:0/3899862348 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d8073a00 0x7fa6d81a2b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6d6ffd700 1 -- 192.168.123.106:0/3899862348 learned_addr learned my addr 192.168.123.106:0/3899862348 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.762+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6d819c670 con 0x7fa6d8073a00 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.763+0000 7fa6d6ffd700 1 -- 192.168.123.106:0/3899862348 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d8073a00 msgr2=0x7fa6d81a2b30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.763+0000 7fa6d6ffd700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d8073a00 0x7fa6d81a2b30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.763+0000 7fa6d6ffd700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6c80097e0 con 0x7fa6d80730f0 2026-03-09T17:35:30.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.763+0000 7fa6d6ffd700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fa6c000ba70 tx=0x7fa6c000bd80 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:30.765 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.764+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6c000c700 con 0x7fa6d80730f0 2026-03-09T17:35:30.765 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.764+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa6c000cd40 con 0x7fa6d80730f0 2026-03-09T17:35:30.765 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.764+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa6d819c950 con 0x7fa6d80730f0 2026-03-09T17:35:30.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.764+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6c0012340 con 0x7fa6d80730f0 2026-03-09T17:35:30.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.764+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa6d819cea0 con 0x7fa6d80730f0 2026-03-09T17:35:30.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.765+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa6c00124e0 con 0x7fa6d80730f0 2026-03-09T17:35:30.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.765+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa6d810e770 con 0x7fa6d80730f0 2026-03-09T17:35:30.770 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.768+0000 7fa6dc8e8700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 0x7fa6c4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:30.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.768+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa6c0098970 con 0x7fa6d80730f0 2026-03-09T17:35:30.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.768+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa6c009c2a0 con 0x7fa6d80730f0 2026-03-09T17:35:30.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.769+0000 7fa6d67fc700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 0x7fa6c4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:30.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.769+0000 7fa6d67fc700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 0x7fa6c4079d70 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fa6d80fd790 tx=0x7fa6c8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:30.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.895+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fa6d8066eb0 con 0x7fa6c40778c0 2026-03-09T17:35:30.897 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.896+0000 7fa6dc8e8700 1 -- 192.168.123.106:0/3899862348 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7fa6d8066eb0 con 0x7fa6c40778c0 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "target_image": null, 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "in_progress": false, 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "which": "", 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "services_complete": [], 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "progress": null, 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "message": "", 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout: "is_paused": false 2026-03-09T17:35:30.898 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:35:30.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 msgr2=0x7fa6c4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 0x7fa6c4079d70 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fa6d80fd790 tx=0x7fa6c8005fb0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 
msgr2=0x7fa6d81a25f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:30.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fa6c000ba70 tx=0x7fa6c000bd80 comp rx=0 tx=0).stop 2026-03-09T17:35:30.900 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 shutdown_connections 2026-03-09T17:35:30.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fa6c40778c0 0x7fa6c4079d70 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa6d80730f0 0x7fa6d81a25f0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 --2- 192.168.123.106:0/3899862348 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fa6d8073a00 0x7fa6d81a2b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:30.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.899+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 >> 192.168.123.106:0/3899862348 conn(0x7fa6d80fc000 msgr2=0x7fa6d8102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:30.901 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.900+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 shutdown_connections 2026-03-09T17:35:30.901 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:30.900+0000 7fa6dd8ea700 1 -- 192.168.123.106:0/3899862348 wait complete. 2026-03-09T17:35:30.964 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-09T17:35:31.108 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 -- 192.168.123.106:0/2217094781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 msgr2=0x7f273c10c6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 --2- 192.168.123.106:0/2217094781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 0x7f273c10c6f0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f272c009b00 tx=0x7f272c009e10 comp rx=0 tx=0).stop 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 -- 192.168.123.106:0/2217094781 shutdown_connections 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 --2- 192.168.123.106:0/2217094781 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 0x7f273c10c6f0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 --2- 192.168.123.106:0/2217094781 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c0ff450 unknown :-1 s=CLOSED pgs=0 cs=0 
l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 -- 192.168.123.106:0/2217094781 >> 192.168.123.106:0/2217094781 conn(0x7f273c0facf0 msgr2=0x7f273c0fd100 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 -- 192.168.123.106:0/2217094781 shutdown_connections 2026-03-09T17:35:31.411 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.409+0000 7f274353f700 1 -- 192.168.123.106:0/2217094781 wait complete. 2026-03-09T17:35:31.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.410+0000 7f274353f700 1 Processor -- start 2026-03-09T17:35:31.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.410+0000 7f274353f700 1 -- start start 2026-03-09T17:35:31.412 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.411+0000 7f274353f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274353f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 0x7f273c1986b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274353f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f273c198d90 con 0x7f273c0ff990 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274353f700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f273c19c910 con 0x7f273c0ff080 2026-03-09T17:35:31.413 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:58220/0 (socket says 192.168.123.106:58220) 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 -- 192.168.123.106:0/755493 learned_addr learned my addr 192.168.123.106:0/755493 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 -- 192.168.123.106:0/755493 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 msgr2=0x7f273c1986b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 0x7f273c1986b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 -- 192.168.123.106:0/755493 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f272c0097e0 con 0x7f273c0ff080 2026-03-09T17:35:31.413 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.412+0000 7f274253d700 1 --2- 192.168.123.106:0/755493 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f273800eb10 tx=0x7f273800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:31.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.413+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f273800cca0 con 0x7f273c0ff080 2026-03-09T17:35:31.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.413+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f273800ce00 con 0x7f273c0ff080 2026-03-09T17:35:31.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.413+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2738018910 con 0x7f273c0ff080 2026-03-09T17:35:31.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.413+0000 7f274353f700 1 -- 192.168.123.106:0/755493 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f273c19cbf0 con 0x7f273c0ff080 2026-03-09T17:35:31.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.413+0000 7f274353f700 1 -- 192.168.123.106:0/755493 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f273c19d060 con 0x7f273c0ff080 2026-03-09T17:35:31.415 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:31 vm06.local ceph-mon[109831]: from='client.44299 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:31.415 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:31 vm06.local ceph-mon[109831]: pgmap v177: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB 
avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:31.415 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:31 vm06.local ceph-mon[109831]: from='client.34364 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:31.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.414+0000 7f274353f700 1 -- 192.168.123.106:0/755493 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f273c109e60 con 0x7f273c0ff080 2026-03-09T17:35:31.416 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.415+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2738018a70 con 0x7f273c0ff080 2026-03-09T17:35:31.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.416+0000 7f27337fe700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 0x7f2728079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:31.417 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.416+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2738014070 con 0x7f273c0ff080 2026-03-09T17:35:31.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.417+0000 7f2741d3c700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 0x7f2728079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:31.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.418+0000 7f2741d3c700 1 --2- 192.168.123.106:0/755493 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 0x7f2728079c40 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f272c006010 tx=0x7f272c00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:31.419 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.418+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2738062980 con 0x7f273c0ff080 2026-03-09T17:35:31.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.584+0000 7f274353f700 1 -- 192.168.123.106:0/755493 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f273c04f2a0 con 0x7f273c0ff080 2026-03-09T17:35:31.587 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.586+0000 7f27337fe700 1 -- 192.168.123.106:0/755493 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f27380620d0 con 0x7f273c0ff080 2026-03-09T17:35:31.587 INFO:teuthology.orchestra.run.vm06.stdout:HEALTH_OK 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 msgr2=0x7f2728079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 0x7f2728079c40 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f272c006010 tx=0x7f272c00b540 comp rx=0 tx=0).stop 2026-03-09T17:35:31.590 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 msgr2=0x7f273c198170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f273800eb10 tx=0x7f273800eed0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 shutdown_connections 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2728077790 0x7f2728079c40 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f273c0ff080 0x7f273c198170 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 --2- 192.168.123.106:0/755493 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f273c0ff990 0x7f273c1986b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:31.590 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 >> 192.168.123.106:0/755493 conn(0x7f273c0facf0 msgr2=0x7f273c0fc2a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:31.590 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 shutdown_connections 2026-03-09T17:35:31.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:31.589+0000 7f27317fa700 1 -- 192.168.123.106:0/755493 wait complete. 2026-03-09T17:35:31.632 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T17:35:31.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:31 vm09.local ceph-mon[97995]: from='client.44299 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:31.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:31 vm09.local ceph-mon[97995]: pgmap v177: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:31.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:31 vm09.local ceph-mon[97995]: from='client.34364 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:35:31.805 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:32.088 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 -- 192.168.123.106:0/2532675047 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 msgr2=0x7fc630102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 --2- 192.168.123.106:0/2532675047 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630102bf0 secure :-1 s=READY pgs=123 cs=0 l=1 
rev1=1 crypto rx=0x7fc624009b00 tx=0x7fc624009e10 comp rx=0 tx=0).stop 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 -- 192.168.123.106:0/2532675047 shutdown_connections 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 --2- 192.168.123.106:0/2532675047 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630102bf0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 --2- 192.168.123.106:0/2532675047 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 0x7fc630108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 -- 192.168.123.106:0/2532675047 >> 192.168.123.106:0/2532675047 conn(0x7fc6300fe280 msgr2=0x7fc630100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 -- 192.168.123.106:0/2532675047 shutdown_connections 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.087+0000 7fc636aba700 1 -- 192.168.123.106:0/2532675047 wait complete. 
2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 Processor -- start 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 -- start start 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630198460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:32.089 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc634856700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630198460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc634856700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630198460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:60520/0 (socket says 192.168.123.106:60520) 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 0x7fc6301989a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc630199080 con 0x7fc630102780 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc636aba700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fc63019ce10 con 0x7fc630108780 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.088+0000 7fc634856700 1 -- 192.168.123.106:0/3419003094 learned_addr learned my addr 192.168.123.106:0/3419003094 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc634856700 1 -- 192.168.123.106:0/3419003094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 msgr2=0x7fc6301989a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc62ffff700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 0x7fc6301989a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc634856700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 0x7fc6301989a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc634856700 1 -- 192.168.123.106:0/3419003094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6240097e0 con 0x7fc630102780 2026-03-09T17:35:32.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc634856700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630198460 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc62000ba70 tx=0x7fc62000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:35:32.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc62000c760 con 0x7fc630102780 2026-03-09T17:35:32.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc62000cda0 con 0x7fc630102780 2026-03-09T17:35:32.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc620012550 con 0x7fc630102780 2026-03-09T17:35:32.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc63019d0f0 con 0x7fc630102780 2026-03-09T17:35:32.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.089+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc63019d6c0 con 0x7fc630102780 2026-03-09T17:35:32.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.091+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc620014440 con 0x7fc630102780 2026-03-09T17:35:32.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.091+0000 7fc62dffb700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 0x7fc618079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:35:32.092 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.091+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc620098b70 con 0x7fc630102780 2026-03-09T17:35:32.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.091+0000 7fc62ffff700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 0x7fc618079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:35:32.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.091+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc63004ea50 con 0x7fc630102780 2026-03-09T17:35:32.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.092+0000 7fc62ffff700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 0x7fc618079d20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fc624000c00 tx=0x7fc624005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:35:32.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.094+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc620014e60 con 0x7fc630102780 2026-03-09T17:35:32.256 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.254+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc630199880 con 0x7fc630102780 2026-03-09T17:35:32.258 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.257+0000 7fc62dffb700 1 -- 192.168.123.106:0/3419003094 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fc620019070 con 0x7fc630102780 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:35:32.258 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:35:32.261 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 msgr2=0x7fc618079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 0x7fc618079d20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fc624000c00 tx=0x7fc624005dc0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 msgr2=0x7fc630198460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc630102780 0x7fc630198460 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc62000ba70 tx=0x7fc62000be30 comp rx=0 tx=0).stop 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 shutdown_connections 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc618077870 0x7fc618079d20 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fc630102780 0x7fc630198460 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 --2- 192.168.123.106:0/3419003094 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc630108780 0x7fc6301989a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 >> 192.168.123.106:0/3419003094 conn(0x7fc6300fe280 msgr2=0x7fc6300ffb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 shutdown_connections 2026-03-09T17:35:32.261 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:35:32.260+0000 7fc636aba700 1 -- 192.168.123.106:0/3419003094 wait complete. 2026-03-09T17:35:32.329 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-09T17:35:32.467 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:32.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:32 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/755493' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:35:32.508 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:32 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3419003094' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:32.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:32 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/755493' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T17:35:32.546 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:32 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3419003094' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:35:32.638 INFO:teuthology.orchestra.run.vm06.stdout:wait for servicemap items w/ changing names to refresh 2026-03-09T17:35:32.671 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-09T17:35:32.812 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:35:33.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:33 vm06.local ceph-mon[109831]: pgmap v178: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:33.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:33 vm09.local ceph-mon[97995]: pgmap v178: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:35.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:35 vm06.local ceph-mon[109831]: pgmap v179: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:35.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:35 vm09.local ceph-mon[97995]: pgmap v179: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 
op/s 2026-03-09T17:35:37.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:37 vm06.local ceph-mon[109831]: pgmap v180: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:37.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:37 vm09.local ceph-mon[97995]: pgmap v180: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:39.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:39 vm06.local ceph-mon[109831]: pgmap v181: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:39.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:39 vm09.local ceph-mon[97995]: pgmap v181: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:41.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:41 vm06.local ceph-mon[109831]: pgmap v182: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:41.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:41 vm09.local ceph-mon[97995]: pgmap v182: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:43 vm06.local ceph-mon[109831]: pgmap v183: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:43.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:43 vm09.local ceph-mon[97995]: pgmap v183: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s 
rd, 1 op/s 2026-03-09T17:35:43.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:45.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:45 vm06.local ceph-mon[109831]: pgmap v184: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:45.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:45 vm09.local ceph-mon[97995]: pgmap v184: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:47.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:47 vm06.local ceph-mon[109831]: pgmap v185: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:47.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:47 vm09.local ceph-mon[97995]: pgmap v185: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:49.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:49 vm06.local ceph-mon[109831]: pgmap v186: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:49.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:49 vm09.local ceph-mon[97995]: pgmap v186: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:52.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:51 vm06.local ceph-mon[109831]: pgmap v187: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:52.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:51 vm09.local ceph-mon[97995]: pgmap v187: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 
1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:54.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:53 vm06.local ceph-mon[109831]: pgmap v188: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:54.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:53 vm09.local ceph-mon[97995]: pgmap v188: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:56.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:55 vm06.local ceph-mon[109831]: pgmap v189: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:56.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:55 vm09.local ceph-mon[97995]: pgmap v189: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:35:58.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:57 vm06.local ceph-mon[109831]: pgmap v190: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:57 vm09.local ceph-mon[97995]: pgmap v190: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:35:59.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:35:59.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:00.042 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:35:59 vm06.local ceph-mon[109831]: pgmap v191: 65 pgs: 65 active+clean; 216 MiB data, 922 
MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:00.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:35:59 vm09.local ceph-mon[97995]: pgmap v191: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:02.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:01 vm06.local ceph-mon[109831]: pgmap v192: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:02.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:01 vm09.local ceph-mon[97995]: pgmap v192: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:03 vm06.local ceph-mon[109831]: pgmap v193: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:04.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:03 vm09.local ceph-mon[97995]: pgmap v193: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:06.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:05 vm06.local ceph-mon[109831]: pgmap v194: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:06.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:05 vm09.local ceph-mon[97995]: pgmap v194: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:08.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:07 vm06.local ceph-mon[109831]: pgmap v195: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:07 vm09.local ceph-mon[97995]: pgmap v195: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 
GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:10.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:09 vm06.local ceph-mon[109831]: pgmap v196: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:10.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:09 vm09.local ceph-mon[97995]: pgmap v196: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:12.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:11 vm06.local ceph-mon[109831]: pgmap v197: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:12.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:11 vm09.local ceph-mon[97995]: pgmap v197: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:14.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:13 vm06.local ceph-mon[109831]: pgmap v198: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:14.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:14.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:13 vm09.local ceph-mon[97995]: pgmap v198: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:14.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:16.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:15 vm06.local ceph-mon[109831]: pgmap v199: 65 pgs: 65 active+clean; 
216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:16.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:15 vm09.local ceph-mon[97995]: pgmap v199: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:17 vm06.local ceph-mon[109831]: pgmap v200: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:18.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:17 vm09.local ceph-mon[97995]: pgmap v200: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:20.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: pgmap v201: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:20.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:36:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:36:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth 
get", "entity": "client.admin"}]: dispatch 2026-03-09T17:36:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:19 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: pgmap v201: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:36:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:36:20.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:19 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:36:22.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:21 vm06.local ceph-mon[109831]: pgmap v202: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:22.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 
09 17:36:21 vm09.local ceph-mon[97995]: pgmap v202: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:24.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:24 vm06.local ceph-mon[109831]: pgmap v203: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:24.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:24 vm09.local ceph-mon[97995]: pgmap v203: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:26.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:26 vm06.local ceph-mon[109831]: pgmap v204: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:26 vm09.local ceph-mon[97995]: pgmap v204: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:28.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:28 vm06.local ceph-mon[109831]: pgmap v205: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:28.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:28 vm09.local ceph-mon[97995]: pgmap v205: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:29.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:29.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:30.391 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:30 vm06.local ceph-mon[109831]: pgmap v206: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:30.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:30 vm09.local ceph-mon[97995]: pgmap v206: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:32.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:32 vm06.local ceph-mon[109831]: pgmap v207: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:32 vm09.local ceph-mon[97995]: pgmap v207: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:33.058 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T17:36:33.216 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.480+0000 7f81544c9700 1 -- 192.168.123.106:0/128268449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 msgr2=0x7f814c108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.480+0000 7f81544c9700 1 --2- 192.168.123.106:0/128268449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c108be0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f813c009b00 tx=0x7f813c009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:33.482 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 -- 192.168.123.106:0/128268449 shutdown_connections 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 --2- 192.168.123.106:0/128268449 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f814c102810 0x7f814c102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 --2- 192.168.123.106:0/128268449 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c108be0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 -- 192.168.123.106:0/128268449 >> 192.168.123.106:0/128268449 conn(0x7f814c0fe330 msgr2=0x7f814c100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 -- 192.168.123.106:0/128268449 shutdown_connections 2026-03-09T17:36:33.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.481+0000 7f81544c9700 1 -- 192.168.123.106:0/128268449 wait complete. 
2026-03-09T17:36:33.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 Processor -- start 2026-03-09T17:36:33.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 -- start start 2026-03-09T17:36:33.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f814c102810 0x7f814c075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f814c0793f0 con 0x7f814c108810 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f81544c9700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f814c075ce0 con 0x7f814c102810 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f8151a64700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f8151a64700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:58270/0 (socket says 192.168.123.106:58270) 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.482+0000 7f8151a64700 1 -- 192.168.123.106:0/2199603711 learned_addr learned my addr 192.168.123.106:0/2199603711 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f8151a64700 1 -- 192.168.123.106:0/2199603711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f814c102810 msgr2=0x7f814c075260 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f8151a64700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f814c102810 0x7f814c075260 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f8151a64700 1 -- 192.168.123.106:0/2199603711 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f813c0097e0 con 0x7f814c108810 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f8151a64700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f814800eb10 tx=0x7f814800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f81437fe700 1 -- 192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f814800cca0 con 0x7f814c108810 2026-03-09T17:36:33.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f81437fe700 1 -- 
192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f814800ce00 con 0x7f814c108810 2026-03-09T17:36:33.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f814c075fc0 con 0x7f814c108810 2026-03-09T17:36:33.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f81437fe700 1 -- 192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8148018990 con 0x7f814c108810 2026-03-09T17:36:33.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.483+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f814c1a6c20 con 0x7f814c108810 2026-03-09T17:36:33.489 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.485+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f814c04ea50 con 0x7f814c108810 2026-03-09T17:36:33.489 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.485+0000 7f81437fe700 1 -- 192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8148018af0 con 0x7f814c108810 2026-03-09T17:36:33.490 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.489+0000 7f81437fe700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 0x7f8138079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:33.490 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.489+0000 7f81437fe700 1 -- 
192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f8148014070 con 0x7f814c108810 2026-03-09T17:36:33.490 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.489+0000 7f81437fe700 1 -- 192.168.123.106:0/2199603711 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f814809c890 con 0x7f814c108810 2026-03-09T17:36:33.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.489+0000 7f8152265700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 0x7f8138079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:33.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.490+0000 7f8152265700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 0x7f8138079d70 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f813c00b5c0 tx=0x7f813c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:33.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.611+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 --> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f814c0768a0 con 0x7f81380778c0 2026-03-09T17:36:33.622 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.621+0000 7f81437fe700 1 -- 192.168.123.106:0/2199603711 <== mgr.34104 v2:192.168.123.106:6800/431144776 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f814c0768a0 con 0x7f81380778c0 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:NAME HOST 
PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:alertmanager.vm06 vm06 *:9093,9094 running (88s) 76s ago 11m 16.2M - 0.25.0 c8568f914cd2 eaf8e59269b1 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm06 vm06 running (113s) 76s ago 11m 10.5M - 19.2.3-678-ge911bdeb 654f31e6858e 6ccb5363cc83 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:ceph-exporter.vm09 vm09 running (112s) 100s ago 11m 10.3M - 19.2.3-678-ge911bdeb 654f31e6858e ec8da6f92eb8 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm06 vm06 running (5m) 76s ago 11m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 3e47d040f792 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:crash.vm09 vm09 running (5m) 100s ago 11m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e a8538a05db57 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:grafana.vm06 vm06 *:3000 running (78s) 76s ago 11m 40.3M - 10.4.0 c8b91775d855 3f6028cb482d 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.gzymac vm06 running (2m) 76s ago 9m 93.1M - 19.2.3-678-ge911bdeb 654f31e6858e cea38695f742 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm06.vmzmbb vm06 running (2m) 76s ago 9m 101M - 19.2.3-678-ge911bdeb 654f31e6858e 137634321f1d 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.cjcawy vm09 running (2m) 100s ago 9m 18.5M - 19.2.3-678-ge911bdeb 654f31e6858e da0f8f8a04b1 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mds.cephfs.vm09.drzmdt vm09 running (119s) 100s ago 9m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e 2a7cca6ff85f 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm06.pbgzei vm06 *:8443,9283,8765 running (6m) 76s ago 12m 643M - 19.2.3-678-ge911bdeb 654f31e6858e a857a5a84e3a 2026-03-09T17:36:33.623 
INFO:teuthology.orchestra.run.vm06.stdout:mgr.vm09.lqzvkh vm09 *:8443,9283,8765 running (5m) 100s ago 10m 491M - 19.2.3-678-ge911bdeb 654f31e6858e ce41ce6b68d6 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm06 vm06 running (5m) 76s ago 12m 64.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 86c27c9946b5 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:mon.vm09 vm09 running (5m) 100s ago 10m 53.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 65d270c6a306 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm06 vm06 *:9100 running (104s) 76s ago 11m 8186k - 1.7.0 72c9c2088986 afbba45bb140 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:node-exporter.vm09 vm09 *:9100 running (101s) 100s ago 11m 5528k - 1.7.0 72c9c2088986 11641531af25 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.0 vm06 running (5m) 76s ago 10m 211M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3b19d9fcb067 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.1 vm06 running (3m) 76s ago 10m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b63df0190ed3 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.2 vm06 running (3m) 76s ago 10m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a5ccd85faf22 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.3 vm09 running (3m) 100s ago 10m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 40d834360933 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.4 vm09 running (2m) 100s ago 9m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e cb6e9cd4fe30 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:osd.5 vm09 running (2m) 100s ago 9m 126M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b297663f757a 2026-03-09T17:36:33.623 INFO:teuthology.orchestra.run.vm06.stdout:prometheus.vm06 vm06 *:9095 running (92s) 76s ago 11m 51.7M - 2.51.0 1d3b7f56885b ea61dadaf020 2026-03-09T17:36:33.625 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.624+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 msgr2=0x7f8138079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:33.625 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.624+0000 7f81544c9700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 0x7f8138079d70 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f813c00b5c0 tx=0x7f813c005fb0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.625 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.624+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 msgr2=0x7f814c0757a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:33.625 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.624+0000 7f81544c9700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f814800eb10 tx=0x7f814800eed0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.624+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 shutdown_connections 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f81380778c0 0x7f8138079d70 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f814c102810 0x7f814c075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 --2- 192.168.123.106:0/2199603711 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f814c108810 0x7f814c0757a0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 >> 192.168.123.106:0/2199603711 conn(0x7f814c0fe330 msgr2=0x7f814c0ffa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 shutdown_connections 2026-03-09T17:36:33.626 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:33.625+0000 7f81544c9700 1 -- 192.168.123.106:0/2199603711 wait complete. 
2026-03-09T17:36:33.670 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T17:36:33.810 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.070+0000 7f2583308700 1 -- 192.168.123.106:0/2081247902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c102780 msgr2=0x7f257c102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.070+0000 7f2583308700 1 --2- 192.168.123.106:0/2081247902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c102780 0x7f257c102bf0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f2578009b00 tx=0x7f2578009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.071+0000 7f2583308700 1 -- 192.168.123.106:0/2081247902 shutdown_connections 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.071+0000 7f2583308700 1 --2- 192.168.123.106:0/2081247902 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c102780 0x7f257c102bf0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.071+0000 7f2583308700 1 --2- 192.168.123.106:0/2081247902 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c108780 0x7f257c108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.073 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.071+0000 7f2583308700 1 -- 192.168.123.106:0/2081247902 >> 192.168.123.106:0/2081247902 conn(0x7f257c0fe280 msgr2=0x7f257c100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 -- 192.168.123.106:0/2081247902 shutdown_connections 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 -- 192.168.123.106:0/2081247902 wait complete. 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 Processor -- start 2026-03-09T17:36:34.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 -- start start 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 0x7f257c198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 0x7f257c1988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f257c198fb0 con 0x7f257c108780 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.072+0000 7f2583308700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f257c19ccf0 con 0x7f257c102780 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25808a3700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 0x7f257c1988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25810a4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 0x7f257c198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25810a4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 0x7f257c198390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34768/0 (socket says 192.168.123.106:34768) 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25810a4700 1 -- 192.168.123.106:0/626272750 learned_addr learned my addr 192.168.123.106:0/626272750 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25808a3700 1 -- 192.168.123.106:0/626272750 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 msgr2=0x7f257c198390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25808a3700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 0x7f257c198390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.074 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.073+0000 7f25808a3700 1 -- 192.168.123.106:0/626272750 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25780097e0 con 0x7f257c108780 2026-03-09T17:36:34.075 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f25808a3700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 0x7f257c1988d0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f2578009fd0 tx=0x7f25780048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:34.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f257801d070 con 0x7f257c108780 2026-03-09T17:36:34.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2578022470 con 0x7f257c108780 2026-03-09T17:36:34.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f257c19cf70 con 0x7f257c108780 2026-03-09T17:36:34.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f257c19d460 con 0x7f257c108780 2026-03-09T17:36:34.076 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.074+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f257800f670 con 0x7f257c108780 2026-03-09T17:36:34.079 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.075+0000 7f2583308700 1 -- 
192.168.123.106:0/626272750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f257c04ea50 con 0x7f257c108780 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.078+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f25780225e0 con 0x7f257c108780 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.079+0000 7f25727fc700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 0x7f2568079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.079+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f257809bc20 con 0x7f257c108780 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.079+0000 7f25810a4700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 0x7f2568079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.079+0000 7f25810a4700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 0x7f2568079e90 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f256c00bd70 tx=0x7f256c00b4a0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:34.080 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.079+0000 7f25727fc700 1 -- 
192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f25780cba90 con 0x7f257c108780 2026-03-09T17:36:34.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:34 vm06.local ceph-mon[109831]: pgmap v208: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.239+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f257c066e40 con 0x7f257c108780 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.240+0000 7f25727fc700 1 -- 192.168.123.106:0/626272750 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f25780644f0 con 0x7f257c108780 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout:{ 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "mon": { 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "mgr": { 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "osd": { 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T17:36:34.241 INFO:teuthology.orchestra.run.vm06.stdout: }, 
2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: "mds": { 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: }, 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: "overall": { 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout: } 2026-03-09T17:36:34.242 INFO:teuthology.orchestra.run.vm06.stdout:} 2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 msgr2=0x7f2568079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 0x7f2568079e90 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f256c00bd70 tx=0x7f256c00b4a0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 msgr2=0x7f257c1988d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 0x7f257c1988d0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f2578009fd0 tx=0x7f25780048c0 comp rx=0 tx=0).stop 
2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 shutdown_connections 2026-03-09T17:36:34.244 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f25680779e0 0x7f2568079e90 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.243+0000 7f2583308700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f257c102780 0x7f257c198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.244+0000 7f2583308700 1 --2- 192.168.123.106:0/626272750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f257c108780 0x7f257c1988d0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.244+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 >> 192.168.123.106:0/626272750 conn(0x7f257c0fe280 msgr2=0x7f257c0ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:34.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.244+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 shutdown_connections 2026-03-09T17:36:34.245 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.244+0000 7f2583308700 1 -- 192.168.123.106:0/626272750 wait complete. 
2026-03-09T17:36:34.292 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-09T17:36:34.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:34 vm09.local ceph-mon[97995]: pgmap v208: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:34.431 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 -- 192.168.123.106:0/843660209 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 msgr2=0x7f89d8101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 --2- 192.168.123.106:0/843660209 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8101770 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f89cc009b00 tx=0x7f89cc009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 -- 192.168.123.106:0/843660209 shutdown_connections 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 --2- 192.168.123.106:0/843660209 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d8068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 --2- 192.168.123.106:0/843660209 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8101770 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.662+0000 7f89df988700 1 -- 192.168.123.106:0/843660209 >> 192.168.123.106:0/843660209 conn(0x7f89d80754a0 msgr2=0x7f89d80758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.663+0000 7f89df988700 1 -- 192.168.123.106:0/843660209 shutdown_connections 2026-03-09T17:36:34.664 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.663+0000 7f89df988700 1 -- 192.168.123.106:0/843660209 wait complete. 2026-03-09T17:36:34.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.663+0000 7f89df988700 1 Processor -- start 2026-03-09T17:36:34.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89df988700 1 -- start start 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89df988700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d81982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89df988700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89df988700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89d8198f10 con 0x7f89d81013a0 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89df988700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f89d819cca0 con 0x7f89d8068490 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58322/0 (socket says 192.168.123.106:58322) 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 -- 192.168.123.106:0/1000941185 learned_addr learned my addr 192.168.123.106:0/1000941185 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 -- 192.168.123.106:0/1000941185 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 msgr2=0x7f89d81982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.664+0000 7f89dd724700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d81982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d81982f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-09T17:36:34.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 -- 192.168.123.106:0/1000941185 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89cc0097e0 con 0x7f89d81013a0 2026-03-09T17:36:34.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.665+0000 7f89dcf23700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f89d400d900 tx=0x7f89d400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:34.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89dd724700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d81982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T17:36:34.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89d40049e0 con 0x7f89d81013a0 2026-03-09T17:36:34.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f89d4005500 con 0x7f89d81013a0 2026-03-09T17:36:34.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89d4009de0 con 0x7f89d81013a0 2026-03-09T17:36:34.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89d819cf80 con 0x7f89d81013a0 2026-03-09T17:36:34.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.666+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89d819d4d0 con 0x7f89d81013a0 2026-03-09T17:36:34.671 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.668+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f89d804ea50 con 0x7f89d81013a0 2026-03-09T17:36:34.671 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.668+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f89d4004db0 con 0x7f89d81013a0 2026-03-09T17:36:34.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.671+0000 
7f89ca7fc700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 0x7f89c4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:34.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.671+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f89d4020030 con 0x7f89d81013a0 2026-03-09T17:36:34.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.671+0000 7f89dd724700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 0x7f89c4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:34.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.672+0000 7f89dd724700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 0x7f89c4079d70 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f89cc009fd0 tx=0x7f89cc005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:34.673 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.672+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f89d4061e40 con 0x7f89d81013a0 2026-03-09T17:36:34.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.830+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f89d8066e40 con 0x7f89d81013a0 2026-03-09T17:36:34.834 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.833+0000 7f89ca7fc700 1 -- 192.168.123.106:0/1000941185 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f89d4061590 con 0x7f89d81013a0 2026-03-09T17:36:34.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.835+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 msgr2=0x7f89c4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.836 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.835+0000 7f89df988700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 0x7f89c4079d70 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f89cc009fd0 tx=0x7f89cc005fb0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 msgr2=0x7f89d8198830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f89d400d900 tx=0x7f89d400dcc0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 shutdown_connections 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f89c40778c0 0x7f89c4079d70 
unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f89d8068490 0x7f89d81982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.836+0000 7f89df988700 1 --2- 192.168.123.106:0/1000941185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f89d81013a0 0x7f89d8198830 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:34.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.837+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 >> 192.168.123.106:0/1000941185 conn(0x7f89d80754a0 msgr2=0x7f89d80fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:34.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.837+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 shutdown_connections 2026-03-09T17:36:34.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:34.837+0000 7f89df988700 1 -- 192.168.123.106:0/1000941185 wait complete. 
2026-03-09T17:36:34.847 INFO:teuthology.orchestra.run.vm06.stdout:true 2026-03-09T17:36:34.899 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-09T17:36:35.043 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:35.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:35 vm06.local ceph-mon[109831]: from='client.34374 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T17:36:35.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:35 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/626272750' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:35.087 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:35 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/1000941185' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:35.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.324+0000 7f0bd882e700 1 -- 192.168.123.106:0/115714400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 msgr2=0x7f0bd410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:35.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.324+0000 7f0bd882e700 1 --2- 192.168.123.106:0/115714400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd410edb0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f0bc4009b00 tx=0x7f0bc4009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:35.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 -- 192.168.123.106:0/115714400 shutdown_connections 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 --2- 192.168.123.106:0/115714400 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 --2- 192.168.123.106:0/115714400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd410edb0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 -- 192.168.123.106:0/115714400 >> 192.168.123.106:0/115714400 conn(0x7f0bd406c6c0 msgr2=0x7f0bd406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 -- 192.168.123.106:0/115714400 shutdown_connections 2026-03-09T17:36:35.327 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.325+0000 7f0bd882e700 1 -- 192.168.123.106:0/115714400 wait complete. 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 Processor -- start 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 -- start start 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4119520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd4114520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bd4114a60 con 0x7f0bd410e9e0 2026-03-09T17:36:35.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.326+0000 7f0bd882e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bd4114bd0 con 0x7f0bd4071b60 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd2d9d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4119520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd2d9d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4119520 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34810/0 (socket says 192.168.123.106:34810) 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd2d9d700 1 -- 192.168.123.106:0/4114858801 learned_addr learned my addr 192.168.123.106:0/4114858801 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd259c700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd4114520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd259c700 1 -- 192.168.123.106:0/4114858801 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 msgr2=0x7f0bd4119520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd259c700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4119520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd259c700 1 -- 192.168.123.106:0/4114858801 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0bc40097e0 con 0x7f0bd410e9e0 2026-03-09T17:36:35.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.327+0000 7f0bd259c700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd4114520 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto 
rx=0x7f0bc800b6d0 tx=0x7f0bc800b9e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:35.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.328+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bc8011630 con 0x7f0bd410e9e0 2026-03-09T17:36:35.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.328+0000 7f0bd882e700 1 -- 192.168.123.106:0/4114858801 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0bd4114eb0 con 0x7f0bd410e9e0 2026-03-09T17:36:35.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.328+0000 7f0bd882e700 1 -- 192.168.123.106:0/4114858801 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0bd41b7b20 con 0x7f0bd410e9e0 2026-03-09T17:36:35.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.329+0000 7f0bd882e700 1 -- 192.168.123.106:0/4114858801 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bd404f2a0 con 0x7f0bd410e9e0 2026-03-09T17:36:35.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.329+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0bc8011c70 con 0x7f0bd410e9e0 2026-03-09T17:36:35.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.329+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bc8010e80 con 0x7f0bd410e9e0 2026-03-09T17:36:35.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.330+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f0bc8011790 con 0x7f0bd410e9e0 2026-03-09T17:36:35.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.331+0000 7f0bbbfff700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 0x7f0bbc079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:35.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.331+0000 7f0bd2d9d700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 0x7f0bbc079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:35.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.331+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f0bc8099800 con 0x7f0bd410e9e0 2026-03-09T17:36:35.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.331+0000 7f0bd2d9d700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 0x7f0bbc079e90 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f0bc4005f50 tx=0x7f0bc4005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:35.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.333+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0bc8061fb0 con 0x7f0bd410e9e0 2026-03-09T17:36:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:35 vm09.local ceph-mon[97995]: from='client.34374 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: 
dispatch 2026-03-09T17:36:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:35 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/626272750' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:35.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:35 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1000941185' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:35.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.503+0000 7f0bd882e700 1 -- 192.168.123.106:0/4114858801 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0bd4115040 con 0x7f0bd410e9e0 2026-03-09T17:36:35.505 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.503+0000 7f0bbbfff700 1 -- 192.168.123.106:0/4114858801 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f0bd4115040 con 0x7f0bd410e9e0 2026-03-09T17:36:35.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.508+0000 7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 msgr2=0x7f0bbc079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:35.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.508+0000 7f0bb9ffb700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 0x7f0bbc079e90 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f0bc4005f50 tx=0x7f0bc4005dc0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.508+0000 7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 msgr2=0x7f0bd4114520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-09T17:36:35.509 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.508+0000 7f0bb9ffb700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd4114520 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f0bc800b6d0 tx=0x7f0bc800b9e0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 shutdown_connections 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f0bbc0779e0 0x7f0bbc079e90 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f0bd4071b60 0x7f0bd4119520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 --2- 192.168.123.106:0/4114858801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bd410e9e0 0x7f0bd4114520 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 >> 192.168.123.106:0/4114858801 conn(0x7f0bd406c6c0 msgr2=0x7f0bd406cf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 shutdown_connections 2026-03-09T17:36:35.512 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:35.511+0000 
7f0bb9ffb700 1 -- 192.168.123.106:0/4114858801 wait complete. 2026-03-09T17:36:35.521 INFO:teuthology.orchestra.run.vm06.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-09T17:36:35.563 DEBUG:teuthology.parallel:result is None 2026-03-09T17:36:35.563 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T17:36:35.566 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm06.local 2026-03-09T17:36:35.567 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- bash -c 'ceph fs dump' 2026-03-09T17:36:35.753 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.039+0000 7fdce1b9d700 1 -- 192.168.123.106:0/3451419655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 msgr2=0x7fdcdc108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.039+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/3451419655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc108b60 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fdcc4009b00 tx=0x7fdcc4009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.040+0000 7fdce1b9d700 1 -- 192.168.123.106:0/3451419655 shutdown_connections 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.040+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/3451419655 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.040+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/3451419655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc108b60 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.040+0000 7fdce1b9d700 1 -- 192.168.123.106:0/3451419655 >> 192.168.123.106:0/3451419655 conn(0x7fdcdc0fe2b0 msgr2=0x7fdcdc1006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:36.044 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.043+0000 7fdce1b9d700 1 -- 192.168.123.106:0/3451419655 shutdown_connections 2026-03-09T17:36:36.044 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.043+0000 7fdce1b9d700 1 -- 192.168.123.106:0/3451419655 wait complete. 2026-03-09T17:36:36.044 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.043+0000 7fdce1b9d700 1 Processor -- start 2026-03-09T17:36:36.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdce1b9d700 1 -- start start 2026-03-09T17:36:36.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdce1b9d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdce1b9d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc198880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdce1b9d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcdc198f60 con 0x7fdcdc108790 2026-03-09T17:36:36.045 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdce1b9d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcdc19ccf0 con 0x7fdcdc102790 2026-03-09T17:36:36.045 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdcdaffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc198880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdcdb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdcdb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34830/0 (socket says 192.168.123.106:34830) 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.044+0000 7fdcdb7fe700 1 -- 192.168.123.106:0/2485300377 learned_addr learned my addr 192.168.123.106:0/2485300377 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcdb7fe700 1 -- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 msgr2=0x7fdcdc198880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcdb7fe700 1 --2- 192.168.123.106:0/2485300377 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc198880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcdb7fe700 1 -- 192.168.123.106:0/2485300377 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdcc40097e0 con 0x7fdcdc102790 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcdaffd700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc198880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcdb7fe700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdcc4009fd0 tx=0x7fdcc4004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:36.046 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcc401d070 con 0x7fdcdc102790 2026-03-09T17:36:36.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdcdc19cf70 con 0x7fdcdc102790 2026-03-09T17:36:36.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdcdc19d460 
con 0x7fdcdc102790 2026-03-09T17:36:36.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdcc4004b80 con 0x7fdcdc102790 2026-03-09T17:36:36.048 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.045+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcc400f700 con 0x7fdcdc102790 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.047+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdcbc005320 con 0x7fdcdc102790 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.047+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdcc400bc50 con 0x7fdcdc102790 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.047+0000 7fdcd8ff9700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 0x7fdcc8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.047+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fdcc409b1e0 con 0x7fdcdc102790 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.048+0000 7fdcdaffd700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 0x7fdcc8079d70 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:36.049 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.048+0000 7fdcdaffd700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 0x7fdcc8079d70 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fdcdc199960 tx=0x7fdccc00a300 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:36.051 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.050+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdcc4063910 con 0x7fdcdc102790 2026-03-09T17:36:36.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.204+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fdcbc006200 con 0x7fdcdc102790 2026-03-09T17:36:36.206 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.204+0000 7fdcd8ff9700 1 -- 192.168.123.106:0/2485300377 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 35 v35) v1 ==== 76+0+2002 (secure 0 0 0) 0x7fdcc4063060 con 0x7fdcdc102790 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:e35 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:btime 2026-03-09T17:34:38:194451+0000 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned 
encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:legacy client fscid: 1 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:Filesystem 'cephfs' (1) 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:fs_name cephfs 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:epoch 35 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:created 2026-03-09T17:27:09.795351+0000 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:modified 2026-03-09T17:34:38.194448+0000 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:tableserver 0 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:root 0 2026-03-09T17:36:36.207 INFO:teuthology.orchestra.run.vm06.stdout:session_timeout 60 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:session_autoclose 300 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:max_file_size 1099511627776 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:max_xattr_size 65536 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:required_client_features {} 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:last_failure 0 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:last_failure_osd_epoch 80 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 
2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:max_mds 1 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:in 0 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:up {0=34284} 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:failed 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:damaged 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:stopped 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:data_pools [3] 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:metadata_pool 2 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:inline_data disabled 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:balancer 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:bal_rank_mask -1 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:standby_count_wanted 1 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:qdb_cluster leader: 34284 members: 34284 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.vmzmbb{0:34284} state up:active seq 9 join_fscid=1 addr [v2:192.168.123.106:6826/571707287,v1:192.168.123.106:6827/571707287] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm06.gzymac{0:44251} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.106:6828/2160269265,v1:192.168.123.106:6829/2160269265] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:Standby daemons: 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.cjcawy{-1:44253} state up:standby seq 1 join_fscid=1 addr 
[v2:192.168.123.109:6824/3810846472,v1:192.168.123.109:6825/3810846472] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:36:36.208 INFO:teuthology.orchestra.run.vm06.stdout:[mds.cephfs.vm09.drzmdt{-1:44275} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.109:6826/3154236738,v1:192.168.123.109:6827/3154236738] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.208+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 msgr2=0x7fdcc8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.208+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 0x7fdcc8079d70 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fdcdc199960 tx=0x7fdccc00a300 comp rx=0 tx=0).stop 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 msgr2=0x7fdcdc198340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdcc4009fd0 tx=0x7fdcc4004930 comp rx=0 tx=0).stop 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 shutdown_connections 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/2485300377 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdcc80778c0 0x7fdcc8079d70 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdcdc102790 0x7fdcdc198340 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 --2- 192.168.123.106:0/2485300377 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdcdc108790 0x7fdcdc198880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 >> 192.168.123.106:0/2485300377 conn(0x7fdcdc0fe2b0 msgr2=0x7fdcdc0ff9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 shutdown_connections 2026-03-09T17:36:36.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.209+0000 7fdce1b9d700 1 -- 192.168.123.106:0/2485300377 wait complete. 2026-03-09T17:36:36.211 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 35 2026-03-09T17:36:36.276 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 
2026-03-09T17:36:36.279 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 2026-03-09T17:36:36.303 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:36 vm06.local ceph-mon[109831]: pgmap v209: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:36.303 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:36 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4114858801' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:36.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:36 vm09.local ceph-mon[97995]: pgmap v209: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:36.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:36 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4114858801' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T17:36:36.429 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.672+0000 7fcb88ef5700 1 -- 192.168.123.106:0/2944121479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb84101ee0 msgr2=0x7fcb8410a5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.672+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/2944121479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb84101ee0 0x7fcb8410a5b0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fcb74009b00 tx=0x7fcb74009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 -- 
192.168.123.106:0/2944121479 shutdown_connections 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/2944121479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb84101ee0 0x7fcb8410a5b0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/2944121479 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb841015d0 0x7fcb841019a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 -- 192.168.123.106:0/2944121479 >> 192.168.123.106:0/2944121479 conn(0x7fcb840faf00 msgr2=0x7fcb840fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 -- 192.168.123.106:0/2944121479 shutdown_connections 2026-03-09T17:36:36.674 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 -- 192.168.123.106:0/2944121479 wait complete. 
2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 Processor -- start 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.673+0000 7fcb88ef5700 1 -- start start 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb88ef5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 0x7fcb84072900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb88ef5700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb88ef5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb8406ded0 con 0x7fcb841015d0 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb88ef5700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb8406e010 con 0x7fcb84101ee0 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb8259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 0x7fcb84072900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34842/0 (socket says 192.168.123.106:34842) 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 -- 192.168.123.106:0/3110380914 learned_addr learned my addr 192.168.123.106:0/3110380914 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:36.675 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 -- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 msgr2=0x7fcb84072900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 0x7fcb84072900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb81d9b700 1 -- 192.168.123.106:0/3110380914 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb740097e0 con 0x7fcb84101ee0 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.674+0000 7fcb8259c700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 0x7fcb84072900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb81d9b700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcb7400bba0 tx=0x7fcb7400bbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb7401d070 con 0x7fcb84101ee0 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb8406e290 con 0x7fcb84101ee0 2026-03-09T17:36:36.676 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb8406e780 con 0x7fcb84101ee0 2026-03-09T17:36:36.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcb74003b80 con 0x7fcb84101ee0 2026-03-09T17:36:36.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.675+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb74021620 con 0x7fcb84101ee0 2026-03-09T17:36:36.677 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.676+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fcb84107cb0 con 0x7fcb84101ee0 2026-03-09T17:36:36.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.677+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcb74003cf0 con 0x7fcb84101ee0 2026-03-09T17:36:36.678 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.677+0000 7fcb7b7fe700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 0x7fcb70079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:36.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.678+0000 7fcb8259c700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 0x7fcb70079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:36.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.678+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fcb7409af20 con 0x7fcb84101ee0 2026-03-09T17:36:36.679 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.678+0000 7fcb8259c700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 0x7fcb70079d20 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fcb6c009fd0 tx=0x7fcb6c009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:36.680 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.679+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcb74063780 con 0x7fcb84101ee0 2026-03-09T17:36:36.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.822+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fcb840689d0 con 0x7fcb84101ee0 2026-03-09T17:36:36.826 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.825+0000 7fcb7b7fe700 1 -- 192.168.123.106:0/3110380914 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 35 v35) v1 ==== 94+0+5269 (secure 0 0 0) 0x7fcb840689d0 con 0x7fcb84101ee0 2026-03-09T17:36:36.826 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:36.826 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":35,"btime":"2026-03-09T17:34:38:194451+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28},{"gid":44275,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6827/3154236738","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3154236738},{"type":"v1","addr":"192.168.123.109:6827","nonce":3154236738}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":35,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:38.194448+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds 
uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:active","state_seq":9,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_44251":{"gid":44251,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34284,"qdb_cluster":[34284]},"id":1}]} 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.827+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 msgr2=0x7fcb70079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.827+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 0x7fcb70079d20 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fcb6c009fd0 tx=0x7fcb6c009380 comp rx=0 tx=0).stop 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 msgr2=0x7fcb8406d900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcb7400bba0 tx=0x7fcb7400bbd0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 shutdown_connections 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcb70077870 0x7fcb70079d20 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcb841015d0 0x7fcb84072900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 --2- 192.168.123.106:0/3110380914 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcb84101ee0 0x7fcb8406d900 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 >> 192.168.123.106:0/3110380914 conn(0x7fcb840faf00 msgr2=0x7fcb84104df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 shutdown_connections 2026-03-09T17:36:36.829 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:36.828+0000 7fcb88ef5700 1 -- 192.168.123.106:0/3110380914 wait complete. 2026-03-09T17:36:36.830 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 35 2026-03-09T17:36:36.891 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 11, 'max_mds': 1, 'flags': 50} 2026-03-09T17:36:36.892 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 12 2026-03-09T17:36:37.077 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:37 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2485300377' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:36:37.078 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:37 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3110380914' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:36:37.087 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:37.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.324+0000 7f8d7197e700 1 -- 192.168.123.106:0/2251317625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 msgr2=0x7f8d6c10c990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.324+0000 7f8d7197e700 1 --2- 192.168.123.106:0/2251317625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c10c990 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f8d54009b00 tx=0x7f8d54009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:37.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.325+0000 7f8d7197e700 1 -- 192.168.123.106:0/2251317625 shutdown_connections 2026-03-09T17:36:37.326 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.325+0000 7f8d7197e700 1 --2- 192.168.123.106:0/2251317625 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c10c990 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.325+0000 7f8d7197e700 1 --2- 192.168.123.106:0/2251317625 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c0ff5f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.325+0000 7f8d7197e700 1 -- 192.168.123.106:0/2251317625 >> 192.168.123.106:0/2251317625 conn(0x7f8d6c0747e0 msgr2=0x7f8d6c074be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:37.328 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.327+0000 7f8d7197e700 1 -- 192.168.123.106:0/2251317625 shutdown_connections 2026-03-09T17:36:37.328 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.327+0000 7f8d7197e700 1 -- 192.168.123.106:0/2251317625 wait complete. 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 Processor -- start 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 -- start start 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c10c540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d6c10cc40 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.329 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d7197e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d6c1039d0 con 0x7f8d6c0ff220 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.328+0000 7f8d63fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.330 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58392/0 (socket says 192.168.123.106:58392) 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 -- 192.168.123.106:0/1863693816 learned_addr learned my addr 192.168.123.106:0/1863693816 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d6affd700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c10c540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 -- 192.168.123.106:0/1863693816 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 msgr2=0x7f8d6c10c540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c10c540 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.330 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 -- 192.168.123.106:0/1863693816 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d540097e0 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.330 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d6affd700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c10c540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:36:37.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.329+0000 7f8d63fff700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f8d540048c0 tx=0x7f8d540048f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:37.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.330+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d5401d070 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.330+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8d6c103c50 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.330+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8d6c104140 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.332+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8d54004b80 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.332+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d5400f650 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.332+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8d5400f870 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.333+0000 7f8d7097c700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 0x7f8d58079c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.333+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f8d5409c1e0 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.333+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8d4c005320 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.336+0000 7f8d6affd700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 0x7f8d58079c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.338 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.336+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8d54064a40 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.342 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.341+0000 7f8d6affd700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 0x7f8d58079c00 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f8d5c005950 tx=0x7f8d5c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:37.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:37 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/2485300377' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T17:36:37.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:37 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3110380914' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T17:36:37.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.476+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7f8d4c0059f0 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.478 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.477+0000 7f8d7097c700 1 -- 192.168.123.106:0/1863693816 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v35) v1 ==== 107+0+4915 (secure 0 0 0) 0x7f8d54027020 con 0x7f8d6c0ffb30 2026-03-09T17:36:37.478 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:37.478 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":11,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:27:16.605001+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24291},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm09.cjcawy","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.109:6825/791757990","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":791757990},{"type":"v1","addr":"192.168.123.109:6825","nonce":791757990}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24291":{"gid":24291,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.106:6827/649840868","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":649840868},{"type":"v1","addr":"192.168.123.106:6827","nonce":649840868}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:37.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.479+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 msgr2=0x7f8d58079c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.479+0000 7f8d7197e700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 0x7f8d58079c00 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f8d5c005950 tx=0x7f8d5c0058e0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.479+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 msgr2=0x7f8d6c103490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.481 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.479+0000 7f8d7197e700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f8d540048c0 tx=0x7f8d540048f0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 shutdown_connections 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f8d58077750 0x7f8d58079c00 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f8d6c0ff220 0x7f8d6c10c540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 --2- 192.168.123.106:0/1863693816 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d6c0ffb30 0x7f8d6c103490 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 >> 192.168.123.106:0/1863693816 conn(0x7f8d6c0747e0 msgr2=0x7f8d6c0fe230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 -- 192.168.123.106:0/1863693816 shutdown_connections 2026-03-09T17:36:37.481 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.480+0000 7f8d7197e700 1 -- 
192.168.123.106:0/1863693816 wait complete. 2026-03-09T17:36:37.482 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 12 2026-03-09T17:36:37.522 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 13 2026-03-09T17:36:37.672 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:37.919 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 -- 192.168.123.106:0/3324691221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c101ec0 msgr2=0x7fab0c10a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 --2- 192.168.123.106:0/3324691221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c101ec0 0x7fab0c10a590 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fab00009b00 tx=0x7fab00009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 -- 192.168.123.106:0/3324691221 shutdown_connections 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 --2- 192.168.123.106:0/3324691221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c101ec0 0x7fab0c10a590 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 --2- 192.168.123.106:0/3324691221 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c1015b0 0x7fab0c101980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.920 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 -- 192.168.123.106:0/3324691221 >> 192.168.123.106:0/3324691221 conn(0x7fab0c0faf00 msgr2=0x7fab0c0fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 -- 192.168.123.106:0/3324691221 shutdown_connections 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.918+0000 7fab12c4d700 1 -- 192.168.123.106:0/3324691221 wait complete. 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 Processor -- start 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 -- start start 2026-03-09T17:36:37.920 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 0x7fab0c196170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab0c196d90 con 0x7fab0c1015b0 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.919+0000 7fab12c4d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab0c19ab20 con 0x7fab0c101ec0 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab0bfff700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab0bfff700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34886/0 (socket says 192.168.123.106:34886) 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab0bfff700 1 -- 192.168.123.106:0/4098737948 learned_addr learned my addr 192.168.123.106:0/4098737948 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab109e9700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 0x7fab0c196170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab109e9700 1 -- 192.168.123.106:0/4098737948 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 msgr2=0x7fab0c1966b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab109e9700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab109e9700 1 -- 
192.168.123.106:0/4098737948 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab000097e0 con 0x7fab0c1015b0 2026-03-09T17:36:37.921 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab0bfff700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:36:37.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab109e9700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 0x7fab0c196170 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7faafc00eab0 tx=0x7faafc00edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:37.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.920+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faafc00cb20 con 0x7fab0c1015b0 2026-03-09T17:36:37.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.921+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7faafc00cc80 con 0x7fab0c1015b0 2026-03-09T17:36:37.922 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.921+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab0c19ada0 con 0x7fab0c1015b0 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.921+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faafc018860 con 0x7fab0c1015b0 
2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.921+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab0c19b2f0 con 0x7fab0c1015b0 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.922+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faafc0189c0 con 0x7fab0c1015b0 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.922+0000 7fab09ffb700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 0x7faaf4079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.922+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7faafc014070 con 0x7fab0c1015b0 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.923+0000 7fab0bfff700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 0x7faaf4079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:37.924 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.923+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab0c19af30 con 0x7fab0c1015b0 2026-03-09T17:36:37.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.926+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fab0c19af30 con 0x7fab0c1015b0 2026-03-09T17:36:37.927 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:37.926+0000 7fab0bfff700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 0x7faaf4079e90 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fab0c197790 tx=0x7fab0000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:38.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.065+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7fab0c19af30 con 0x7fab0c1015b0 2026-03-09T17:36:38.068 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.066+0000 7fab09ffb700 1 -- 192.168.123.106:0/4098737948 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v35) v1 ==== 107+0+4915 (secure 0 0 0) 0x7fab0c19af30 con 0x7fab0c1015b0 2026-03-09T17:36:38.069 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:38.069 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":13,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":11,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:27:16.605001+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24291},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14476":{"gid":14476,"name":"cephfs.vm09.cjcawy","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.109:6825/791757990","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":791757990},{"type":"v1","addr":"192.168.123.109:6825","nonce":791757990}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24291":{"gid":24291,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.106:6827/649840868","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":649840868},{"type":"v1","addr":"192.168.123.106:6827","nonce":649840868}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:38.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.070+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 msgr2=0x7faaf4079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.070+0000 7fab12c4d700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 0x7faaf4079e90 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fab0c197790 tx=0x7fab0000b540 comp rx=0 tx=0).stop 2026-03-09T17:36:38.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.070+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 msgr2=0x7fab0c196170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.072 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.070+0000 7fab12c4d700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 0x7fab0c196170 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7faafc00eab0 tx=0x7faafc00edc0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 shutdown_connections 2026-03-09T17:36:38.072 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faaf40779e0 0x7faaf4079e90 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fab0c1015b0 0x7fab0c196170 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 --2- 192.168.123.106:0/4098737948 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fab0c101ec0 0x7fab0c1966b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 >> 192.168.123.106:0/4098737948 conn(0x7fab0c0faf00 msgr2=0x7fab0c104d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:38.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 -- 192.168.123.106:0/4098737948 shutdown_connections 2026-03-09T17:36:38.073 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.071+0000 7fab12c4d700 1 -- 
192.168.123.106:0/4098737948 wait complete. 2026-03-09T17:36:38.074 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 13 2026-03-09T17:36:38.143 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 14 2026-03-09T17:36:38.300 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:38.326 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:38 vm06.local ceph-mon[109831]: pgmap v210: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:38.326 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:38 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1863693816' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T17:36:38.327 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:38 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4098737948' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T17:36:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:38 vm09.local ceph-mon[97995]: pgmap v210: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:38 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1863693816' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T17:36:38.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:38 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/4098737948' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.542+0000 7f633847b700 1 -- 192.168.123.106:0/2419183632 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 msgr2=0x7f6330102c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.542+0000 7f633847b700 1 --2- 192.168.123.106:0/2419183632 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330102c00 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f632c009b50 tx=0x7f632c009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.543+0000 7f633847b700 1 -- 192.168.123.106:0/2419183632 shutdown_connections 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.543+0000 7f633847b700 1 --2- 192.168.123.106:0/2419183632 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330102c00 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.543+0000 7f633847b700 1 --2- 192.168.123.106:0/2419183632 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f6330108b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.543+0000 7f633847b700 1 -- 192.168.123.106:0/2419183632 >> 192.168.123.106:0/2419183632 conn(0x7f63300fe2b0 msgr2=0x7f63301006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.544+0000 7f633847b700 1 -- 192.168.123.106:0/2419183632 shutdown_connections 
2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.544+0000 7f633847b700 1 -- 192.168.123.106:0/2419183632 wait complete. 2026-03-09T17:36:38.545 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.544+0000 7f633847b700 1 Processor -- start 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.544+0000 7f633847b700 1 -- start start 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f633847b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330198370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f633847b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f63301988b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f633847b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6330198f90 con 0x7f6330102790 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f633847b700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f633019cd20 con 0x7f6330108790 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f6335a16700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f63301988b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f6335a16700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 
0x7f63301988b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:34900/0 (socket says 192.168.123.106:34900) 2026-03-09T17:36:38.546 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.545+0000 7f6335a16700 1 -- 192.168.123.106:0/274526124 learned_addr learned my addr 192.168.123.106:0/274526124 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6335a16700 1 -- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 msgr2=0x7f6330198370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6336217700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330198370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6335a16700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330198370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6335a16700 1 -- 192.168.123.106:0/274526124 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f632c0097e0 con 0x7f6330108790 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6336217700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330198370 unknown :-1 s=CLOSED pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f6335a16700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f63301988b0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f632c0094d0 tx=0x7f632c004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:38.547 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f632c01d070 con 0x7f6330108790 2026-03-09T17:36:38.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.546+0000 7f633847b700 1 -- 192.168.123.106:0/274526124 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f633019cfa0 con 0x7f6330108790 2026-03-09T17:36:38.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.547+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f632c00bc50 con 0x7f6330108790 2026-03-09T17:36:38.548 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.547+0000 7f633847b700 1 -- 192.168.123.106:0/274526124 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f633019d490 con 0x7f6330108790 2026-03-09T17:36:38.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.547+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f632c00bdc0 con 0x7f6330108790 2026-03-09T17:36:38.549 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.548+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f63180052f0 con 0x7f6330108790 2026-03-09T17:36:38.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.549+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f632c022470 con 0x7f6330108790 2026-03-09T17:36:38.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.549+0000 7f63277fe700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 0x7f631c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:38.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.549+0000 7f6336217700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 0x7f631c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:38.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.549+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f632c09bb40 con 0x7f6330108790 2026-03-09T17:36:38.551 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.550+0000 7f6336217700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 0x7f631c079d70 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f63301038d0 tx=0x7f6320009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:38.552 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.551+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 
v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f632c0642d0 con 0x7f6330108790 2026-03-09T17:36:38.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.705+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f6318005160 con 0x7f6330108790 2026-03-09T17:36:38.708 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.706+0000 7f63277fe700 1 -- 192.168.123.106:0/274526124 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v35) v1 ==== 107+0+4138 (secure 0 0 0) 0x7f632c063a80 con 0x7f6330108790 2026-03-09T17:36:38.708 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:38.708 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":14,"btime":"2026-03-09T17:34:00:437715+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts 
on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:00.437714+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24291},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24291":{"gid":24291,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.106:6827/649840868","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":649840868},{"type":"v1","addr":"192.168.123.106:6827","nonce":649840868}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24291,"qdb_cluster":[24291]},"id":1}]} 2026-03-09T17:36:38.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.709+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 msgr2=0x7f631c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.709+0000 7f63257fa700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 0x7f631c079d70 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f63301038d0 tx=0x7f6320009450 comp rx=0 tx=0).stop 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.709+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f6330108790 msgr2=0x7f63301988b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.709+0000 7f63257fa700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f63301988b0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f632c0094d0 tx=0x7f632c004970 comp rx=0 tx=0).stop 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 shutdown_connections 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f631c0778c0 0x7f631c079d70 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6330102790 0x7f6330198370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 --2- 192.168.123.106:0/274526124 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6330108790 0x7f63301988b0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 >> 192.168.123.106:0/274526124 conn(0x7f63300fe2b0 msgr2=0x7f63300ffb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 shutdown_connections 
2026-03-09T17:36:38.711 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:38.710+0000 7f63257fa700 1 -- 192.168.123.106:0/274526124 wait complete. 2026-03-09T17:36:38.712 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 14 2026-03-09T17:36:38.776 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 14 2026-03-09T17:36:38.776 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 15 2026-03-09T17:36:38.921 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:39.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.174+0000 7fc01a4cc700 1 -- 192.168.123.106:0/486280027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 msgr2=0x7fc01410a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.174+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/486280027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc01410a590 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fbffc009b00 tx=0x7fbffc009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 -- 192.168.123.106:0/486280027 shutdown_connections 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/486280027 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc01410a590 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/486280027 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 0x7fc014101980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 -- 192.168.123.106:0/486280027 >> 192.168.123.106:0/486280027 conn(0x7fc0140faf00 msgr2=0x7fc0140fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 -- 192.168.123.106:0/486280027 shutdown_connections 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.175+0000 7fc01a4cc700 1 -- 192.168.123.106:0/486280027 wait complete. 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 Processor -- start 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 -- start start 2026-03-09T17:36:39.177 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 0x7fc014196150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc014196ce0 con 0x7fc014101ec0 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.176+0000 7fc01a4cc700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7fc014196e20 con 0x7fc0141015b0 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58470/0 (socket says 192.168.123.106:58470) 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 -- 192.168.123.106:0/3555106070 learned_addr learned my addr 192.168.123.106:0/3555106070 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:39.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc013fff700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 0x7fc014196150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:39.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 -- 192.168.123.106:0/3555106070 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 msgr2=0x7fc014196150 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 0x7fc014196150 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 -- 192.168.123.106:0/3555106070 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbffc0097e0 con 0x7fc014101ec0 2026-03-09T17:36:39.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.177+0000 7fc0137fe700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fbffc004930 tx=0x7fbffc004960 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:39.179 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.178+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbffc01d070 con 0x7fc014101ec0 2026-03-09T17:36:39.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.178+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbffc00bc50 con 0x7fc014101ec0 2026-03-09T17:36:39.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.178+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbffc00f770 con 0x7fc014101ec0 2026-03-09T17:36:39.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.178+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc01419ac10 con 0x7fc014101ec0 2026-03-09T17:36:39.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.178+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fc01419b100 con 0x7fc014101ec0 2026-03-09T17:36:39.180 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.179+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc014107c30 con 0x7fc014101ec0 2026-03-09T17:36:39.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.183+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbffc022470 con 0x7fc014101ec0 2026-03-09T17:36:39.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.183+0000 7fc0117fa700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 0x7fc000079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.183+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbffc09bd30 con 0x7fc014101ec0 2026-03-09T17:36:39.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.184+0000 7fc013fff700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 0x7fc000079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:39.185 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.184+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbffc09c1c0 con 0x7fc014101ec0 2026-03-09T17:36:39.185 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.184+0000 7fc013fff700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 0x7fc000079d20 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fc00400ac50 tx=0x7fc00400a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:39.329 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:39 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/274526124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T17:36:39.331 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.327+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7fc01404ea50 con 0x7fc014101ec0 2026-03-09T17:36:39.332 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.331+0000 7fc0117fa700 1 -- 192.168.123.106:0/3555106070 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v35) v1 ==== 107+0+4123 (secure 0 0 0) 0x7fbffc0644e0 con 0x7fc014101ec0 2026-03-09T17:36:39.332 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:39.332 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":15,"btime":"2026-03-09T17:34:01:367017+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14500,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":12},{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:01.367016+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:39.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.333+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 msgr2=0x7fc000079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.334 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.333+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 0x7fc000079d20 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fc00400ac50 tx=0x7fc00400a380 comp rx=0 tx=0).stop 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 msgr2=0x7fc014196690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fbffc004930 tx=0x7fbffc004960 comp rx=0 tx=0).stop 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 shutdown_connections 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc000077870 0x7fc000079d20 unknown :-1 
s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc0141015b0 0x7fc014196150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 --2- 192.168.123.106:0/3555106070 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc014101ec0 0x7fc014196690 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 >> 192.168.123.106:0/3555106070 conn(0x7fc0140faf00 msgr2=0x7fc014104d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 shutdown_connections 2026-03-09T17:36:39.335 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.334+0000 7fc01a4cc700 1 -- 192.168.123.106:0/3555106070 wait complete. 2026-03-09T17:36:39.336 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 15 2026-03-09T17:36:39.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:39 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/274526124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T17:36:39.397 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 15 2026-03-09T17:36:39.397 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 16 2026-03-09T17:36:39.541 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.801+0000 7f2a69b1d700 1 -- 192.168.123.106:0/642418908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 msgr2=0x7f2a64110ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.801+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/642418908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a64110ef0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f2a54009b00 tx=0x7f2a54009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 -- 192.168.123.106:0/642418908 shutdown_connections 2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/642418908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a64110ef0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/642418908 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2a64075280 0x7f2a64075650 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:39.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 -- 192.168.123.106:0/642418908 >> 192.168.123.106:0/642418908 conn(0x7f2a640fe2b0 msgr2=0x7f2a641006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:39.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 -- 192.168.123.106:0/642418908 shutdown_connections 2026-03-09T17:36:39.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.802+0000 7f2a69b1d700 1 -- 192.168.123.106:0/642418908 wait complete. 2026-03-09T17:36:39.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 Processor -- start 2026-03-09T17:36:39.804 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 -- start start 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2a64075280 0x7f2a64110cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a6410bcd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a6410c210 con 0x7f2a64075b90 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a69b1d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a6410c380 con 0x7f2a64075280 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a6410bcd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a6410bcd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58494/0 (socket says 192.168.123.106:58494) 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 -- 192.168.123.106:0/2744915298 learned_addr learned my addr 192.168.123.106:0/2744915298 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 -- 192.168.123.106:0/2744915298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2a64075280 msgr2=0x7f2a64110cd0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2a64075280 0x7f2a64110cd0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.803+0000 7f2a62ffd700 1 -- 192.168.123.106:0/2744915298 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a540097e0 con 0x7f2a64075b90 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.804+0000 7f2a62ffd700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 
0x7f2a6410bcd0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f2a54005950 tx=0x7f2a5400bc50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.804+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a5401d070 con 0x7f2a64075b90 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.804+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a6410c580 con 0x7f2a64075b90 2026-03-09T17:36:39.805 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.804+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a6410ca80 con 0x7f2a64075b90 2026-03-09T17:36:39.807 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.805+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2a54003bf0 con 0x7f2a64075b90 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.806+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a64105e50 con 0x7f2a64075b90 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.806+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a54021700 con 0x7f2a64075b90 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.806+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2a5402b430 con 0x7f2a64075b90 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.807+0000 7f2a60ff9700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 0x7f2a50079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.807+0000 7f2a637fe700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 0x7f2a50079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.807+0000 7f2a637fe700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 0x7f2a50079f60 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f2a4c009780 tx=0x7f2a4c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:39.809 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.807+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2a5409c070 con 0x7f2a64075b90 2026-03-09T17:36:39.810 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.809+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2a540d29f0 con 0x7f2a64075b90 2026-03-09T17:36:39.952 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.951+0000 7f2a69b1d700 1 -- 
192.168.123.106:0/2744915298 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f2a6404ea50 con 0x7f2a64075b90 2026-03-09T17:36:39.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.952+0000 7f2a60ff9700 1 -- 192.168.123.106:0/2744915298 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v35) v1 ==== 107+0+4134 (secure 0 0 0) 0x7f2a54064150 con 0x7f2a64075b90 2026-03-09T17:36:39.953 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:39.953 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":16,"btime":"2026-03-09T17:34:01:375767+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:01.375762+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14500},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14500":{"gid":14500,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":16,"state":"up:replay","state_seq":1,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:39.955 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.954+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 msgr2=0x7f2a50079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.954+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 0x7f2a50079f60 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f2a4c009780 tx=0x7f2a4c006cb0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.954+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 msgr2=0x7f2a6410bcd0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.954+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a6410bcd0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f2a54005950 tx=0x7f2a5400bc50 comp rx=0 tx=0).stop 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 shutdown_connections 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f2a50077ab0 0x7f2a50079f60 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f2a64075280 0x7f2a64110cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 --2- 192.168.123.106:0/2744915298 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2a64075b90 0x7f2a6410bcd0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 >> 192.168.123.106:0/2744915298 conn(0x7f2a640fe2b0 msgr2=0x7f2a64100330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:39.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 shutdown_connections 2026-03-09T17:36:39.956 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:39.955+0000 7f2a69b1d700 1 -- 192.168.123.106:0/2744915298 wait complete. 2026-03-09T17:36:39.957 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 16 2026-03-09T17:36:40.019 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 16 2026-03-09T17:36:40.019 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 17 2026-03-09T17:36:40.164 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:40.208 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:40 vm06.local ceph-mon[109831]: pgmap v211: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:40.208 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:40 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/3555106070' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T17:36:40.208 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:40 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2744915298' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T17:36:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:40 vm09.local ceph-mon[97995]: pgmap v211: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:40 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3555106070' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T17:36:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:40 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/2744915298' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.432+0000 7fd268514700 1 -- 192.168.123.106:0/873096550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 msgr2=0x7fd260100370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.432+0000 7fd268514700 1 --2- 192.168.123.106:0/873096550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260100370 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fd25c009b50 tx=0x7fd25c009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 -- 192.168.123.106:0/873096550 shutdown_connections 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 --2- 192.168.123.106:0/873096550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260100370 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 --2- 192.168.123.106:0/873096550 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd260104520 0x7fd2601048f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 -- 192.168.123.106:0/873096550 >> 192.168.123.106:0/873096550 conn(0x7fd2600754a0 msgr2=0x7fd2600758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 -- 192.168.123.106:0/873096550 shutdown_connections 2026-03-09T17:36:40.434 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 -- 192.168.123.106:0/873096550 wait complete. 2026-03-09T17:36:40.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 Processor -- start 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.433+0000 7fd268514700 1 -- start start 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd268514700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd268514700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd260104520 0x7fd260105750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd268514700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd2601093a0 con 0x7fd2600fff00 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd268514700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd260105c90 con 0x7fd260104520 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58514/0 (socket says 192.168.123.106:58514) 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 -- 192.168.123.106:0/3084838809 learned_addr learned my addr 192.168.123.106:0/3084838809 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 -- 192.168.123.106:0/3084838809 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd260104520 msgr2=0x7fd260105750 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:40.435 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd260104520 0x7fd260105750 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 -- 192.168.123.106:0/3084838809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd25c0097e0 con 0x7fd2600fff00 2026-03-09T17:36:40.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.434+0000 7fd2662b0700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fd25000d8d0 tx=0x7fd25000dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:40.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.435+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd250009940 con 0x7fd2600fff00 
2026-03-09T17:36:40.436 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.435+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd260105f70 con 0x7fd2600fff00 2026-03-09T17:36:40.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.435+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd2601a6ce0 con 0x7fd2600fff00 2026-03-09T17:36:40.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.435+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd250010460 con 0x7fd2600fff00 2026-03-09T17:36:40.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.435+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd25000f5d0 con 0x7fd2600fff00 2026-03-09T17:36:40.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.436+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd25000f7e0 con 0x7fd2600fff00 2026-03-09T17:36:40.437 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.436+0000 7fd2577fe700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 0x7fd24c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:40.438 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.436+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fd25009aaf0 con 0x7fd2600fff00 2026-03-09T17:36:40.438 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.437+0000 7fd265aaf700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 0x7fd24c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:40.440 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.437+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd244005320 con 0x7fd2600fff00 2026-03-09T17:36:40.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.439+0000 7fd265aaf700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 0x7fd24c079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fd25c000c00 tx=0x7fd25c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:40.441 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.440+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd250063220 con 0x7fd2600fff00 2026-03-09T17:36:40.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.578+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7fd244005190 con 0x7fd2600fff00 2026-03-09T17:36:40.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.580+0000 7fd2577fe700 1 -- 192.168.123.106:0/3084838809 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 
17 v35) v1 ==== 107+0+4139 (secure 0 0 0) 0x7fd250020020 con 0x7fd2600fff00 2026-03-09T17:36:40.582 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:40.582 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":17,"btime":"2026-03-09T17:34:06:141750+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:06.091589+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14500},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14500":{"gid":14500,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":16,"state":"up:reconnect","state_seq":104,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds 
uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:40.584 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.583+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 msgr2=0x7fd24c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.583+0000 7fd268514700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 0x7fd24c079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fd25c000c00 tx=0x7fd25c005fb0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 msgr2=0x7fd260105210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fd25000d8d0 tx=0x7fd25000dc90 comp rx=0 tx=0).stop 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 shutdown_connections 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 --2- 192.168.123.106:0/3084838809 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd24c0778c0 0x7fd24c079d70 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2600fff00 0x7fd260105210 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.585 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.584+0000 7fd268514700 1 --2- 192.168.123.106:0/3084838809 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd260104520 0x7fd260105750 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:40.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.585+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 >> 192.168.123.106:0/3084838809 conn(0x7fd2600754a0 msgr2=0x7fd260068d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:40.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.585+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 shutdown_connections 2026-03-09T17:36:40.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:40.585+0000 7fd268514700 1 -- 192.168.123.106:0/3084838809 wait complete. 
2026-03-09T17:36:40.587 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 17 2026-03-09T17:36:40.645 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 17 2026-03-09T17:36:40.646 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 18 2026-03-09T17:36:40.796 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:41.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 -- 192.168.123.106:0/4195886646 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073070 msgr2=0x7face0073440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.019 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 --2- 192.168.123.106:0/4195886646 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073070 0x7face0073440 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7facd8009b30 tx=0x7facd8009e40 comp rx=0 tx=0).stop 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 -- 192.168.123.106:0/4195886646 shutdown_connections 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 --2- 192.168.123.106:0/4195886646 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073980 0x7face010c8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 --2- 192.168.123.106:0/4195886646 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073070 0x7face0073440 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.020 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.018+0000 7face88e4700 1 -- 192.168.123.106:0/4195886646 >> 192.168.123.106:0/4195886646 conn(0x7face00fbfc0 msgr2=0x7face00fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.019+0000 7face88e4700 1 -- 192.168.123.106:0/4195886646 shutdown_connections 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.019+0000 7face88e4700 1 -- 192.168.123.106:0/4195886646 wait complete. 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.019+0000 7face88e4700 1 Processor -- start 2026-03-09T17:36:41.020 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.019+0000 7face88e4700 1 -- start start 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face88e4700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 0x7face0198360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face88e4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face88e4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7face0198ef0 con 0x7face0073980 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face88e4700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7face0199030 con 0x7face0073070 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58544/0 (socket says 192.168.123.106:58544) 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 -- 192.168.123.106:0/2098957955 learned_addr learned my addr 192.168.123.106:0/2098957955 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face6680700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 0x7face0198360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 -- 192.168.123.106:0/2098957955 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 msgr2=0x7face0198360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 0x7face0198360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 -- 
192.168.123.106:0/2098957955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7facd80097e0 con 0x7face0073980 2026-03-09T17:36:41.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face6680700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 0x7face0198360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:36:41.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.020+0000 7face5e7f700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7facd400c930 tx=0x7facd400ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:41.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7facd4007ab0 con 0x7face0073980 2026-03-09T17:36:41.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7facd4007c10 con 0x7face0073980 2026-03-09T17:36:41.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7facd4018730 con 0x7face0073980 2026-03-09T17:36:41.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7face019ce80 con 0x7face0073980 
2026-03-09T17:36:41.023 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7face019d3d0 con 0x7face0073980 2026-03-09T17:36:41.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.021+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7face0109fa0 con 0x7face0073980 2026-03-09T17:36:41.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.025+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7facd401f030 con 0x7face0073980 2026-03-09T17:36:41.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.026+0000 7facd37fe700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 0x7faccc079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.026+0000 7face6680700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 0x7faccc079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.026+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7facd4099fe0 con 0x7face0073980 2026-03-09T17:36:41.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.026+0000 7face6680700 1 --2- 192.168.123.106:0/2098957955 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 0x7faccc079dc0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7facd800b580 tx=0x7facd8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:41.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.027+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7facd409a400 con 0x7face0073980 2026-03-09T17:36:41.174 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.172+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7face0199760 con 0x7face0073980 2026-03-09T17:36:41.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.173+0000 7facd37fe700 1 -- 192.168.123.106:0/2098957955 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v35) v1 ==== 107+0+4136 (secure 0 0 0) 0x7facd4062790 con 0x7face0073980 2026-03-09T17:36:41.175 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:41.175 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":18,"btime":"2026-03-09T17:34:07:155529+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:06.150544+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14500},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14500":{"gid":14500,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":16,"state":"up:rejoin","state_seq":105,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:41.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.181+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 msgr2=0x7faccc079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.181+0000 7face88e4700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 0x7faccc079dc0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7facd800b580 tx=0x7facd8005fb0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.181+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 msgr2=0x7face01988a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.181+0000 7face88e4700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7facd400c930 tx=0x7facd400ccf0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 shutdown_connections 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7faccc077910 0x7faccc079dc0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7face0073070 0x7face0198360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 --2- 192.168.123.106:0/2098957955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7face0073980 0x7face01988a0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 >> 192.168.123.106:0/2098957955 conn(0x7face00fbfc0 msgr2=0x7face01070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 shutdown_connections 2026-03-09T17:36:41.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.183+0000 7face88e4700 1 -- 192.168.123.106:0/2098957955 wait complete. 2026-03-09T17:36:41.185 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 18 2026-03-09T17:36:41.237 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 18 2026-03-09T17:36:41.238 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 19 2026-03-09T17:36:41.391 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:41.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:41 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3084838809' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T17:36:41.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:41 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3084838809' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T17:36:41.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 -- 192.168.123.106:0/3400572279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 msgr2=0x7febac101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.634 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 --2- 192.168.123.106:0/3400572279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac101770 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7feba8009b00 tx=0x7feba8009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 -- 192.168.123.106:0/3400572279 shutdown_connections 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 --2- 192.168.123.106:0/3400572279 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 0x7febac068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 --2- 192.168.123.106:0/3400572279 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac101770 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.633+0000 7febb4604700 1 -- 192.168.123.106:0/3400572279 >> 192.168.123.106:0/3400572279 conn(0x7febac0754a0 
msgr2=0x7febac0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.634+0000 7febb4604700 1 -- 192.168.123.106:0/3400572279 shutdown_connections 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.634+0000 7febb4604700 1 -- 192.168.123.106:0/3400572279 wait complete. 2026-03-09T17:36:41.635 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.634+0000 7febb4604700 1 Processor -- start 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb4604700 1 -- start start 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb4604700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 0x7febac198330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb4604700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb4604700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febac198f50 con 0x7febac1013a0 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb4604700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febac19cce0 con 0x7febac068490 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb1b9f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb1b9f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:58568/0 (socket says 192.168.123.106:58568) 2026-03-09T17:36:41.636 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb1b9f700 1 -- 192.168.123.106:0/2024688205 learned_addr learned my addr 192.168.123.106:0/2024688205 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:41.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.635+0000 7febb23a0700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 0x7febac198330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7febb1b9f700 1 -- 192.168.123.106:0/2024688205 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 msgr2=0x7febac198330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7febb1b9f700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 0x7febac198330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7febb1b9f700 1 -- 192.168.123.106:0/2024688205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feba80097e0 con 0x7febac1013a0 
2026-03-09T17:36:41.637 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7febb1b9f700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7feb9c00cc60 tx=0x7feb9c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:41.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb9c007af0 con 0x7febac1013a0 2026-03-09T17:36:41.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feb9c007c50 con 0x7febac1013a0 2026-03-09T17:36:41.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.636+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb9c00e600 con 0x7febac1013a0 2026-03-09T17:36:41.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.637+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7febac19cfc0 con 0x7febac1013a0 2026-03-09T17:36:41.639 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.637+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7febac19d4e0 con 0x7febac1013a0 2026-03-09T17:36:41.640 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.638+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7febac04ea50 con 0x7febac1013a0 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.641+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feb9c00e760 con 0x7febac1013a0 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.641+0000 7feba37fe700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 0x7feb98079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.641+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7feb9c099790 con 0x7febac1013a0 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.641+0000 7febb23a0700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 0x7feb98079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.642+0000 7febb23a0700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 0x7feb98079e90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7feba800b5c0 tx=0x7feba8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:41.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.642+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feb9c061f40 con 0x7febac1013a0 2026-03-09T17:36:41.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.782+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7febac066e40 con 0x7febac1013a0 2026-03-09T17:36:41.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.784+0000 7feba37fe700 1 -- 192.168.123.106:0/2024688205 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v35) v1 ==== 107+0+4145 (secure 0 0 0) 0x7feb9c061690 con 0x7febac1013a0 2026-03-09T17:36:41.786 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:41.786 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":19,"btime":"2026-03-09T17:34:08:163976+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:08.163973+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14500},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14500":{"gid":14500,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":16,"state":"up:active","state_seq":106,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14500,"qdb_cluster":[14500]},"id":1}]} 2026-03-09T17:36:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 msgr2=0x7feb98079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 0x7feb98079e90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7feba800b5c0 tx=0x7feba8005fb0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 
msgr2=0x7febac198870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:41.789 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7feb9c00cc60 tx=0x7feb9c0074a0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 shutdown_connections 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7feb980779e0 0x7feb98079e90 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7febac068490 0x7febac198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 --2- 192.168.123.106:0/2024688205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7febac1013a0 0x7febac198870 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 >> 192.168.123.106:0/2024688205 conn(0x7febac0754a0 msgr2=0x7febac0fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:41.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.788+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 shutdown_connections 2026-03-09T17:36:41.790 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:41.789+0000 7febb4604700 1 -- 192.168.123.106:0/2024688205 wait complete. 2026-03-09T17:36:41.791 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 19 2026-03-09T17:36:41.901 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 19 2026-03-09T17:36:41.901 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 20 2026-03-09T17:36:42.045 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.279+0000 7f6eb4e30700 1 -- 192.168.123.106:0/2506444489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01024e0 msgr2=0x7f6eb0102950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.279+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/2506444489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01024e0 0x7f6eb0102950 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6ea0009b00 tx=0x7f6ea0009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 -- 192.168.123.106:0/2506444489 shutdown_connections 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/2506444489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01024e0 0x7f6eb0102950 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/2506444489 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] 
conn(0x7f6eb01084e0 0x7f6eb01088b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 -- 192.168.123.106:0/2506444489 >> 192.168.123.106:0/2506444489 conn(0x7f6eb00fe000 msgr2=0x7f6eb0100410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 -- 192.168.123.106:0/2506444489 shutdown_connections 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 -- 192.168.123.106:0/2506444489 wait complete. 2026-03-09T17:36:42.281 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.280+0000 7f6eb4e30700 1 Processor -- start 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eb4e30700 1 -- start start 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eb4e30700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 0x7f6eb01981c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eb4e30700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 0x7f6eb0198700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eb4e30700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6eb0198de0 con 0x7f6eb01084e0 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eb4e30700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6eb019cb70 con 0x7f6eb01024e0 
2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaeffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 0x7f6eb0198700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaf7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 0x7f6eb01981c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaf7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 0x7f6eb01981c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:35020/0 (socket says 192.168.123.106:35020) 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaf7fe700 1 -- 192.168.123.106:0/828012063 learned_addr learned my addr 192.168.123.106:0/828012063 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:42.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaeffd700 1 -- 192.168.123.106:0/828012063 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 msgr2=0x7f6eb01981c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaeffd700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 0x7f6eb01981c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.283 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.281+0000 7f6eaeffd700 1 -- 192.168.123.106:0/828012063 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e98009710 con 0x7f6eb01084e0 2026-03-09T17:36:42.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eaeffd700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 0x7f6eb0198700 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f6ea0000c00 tx=0x7f6ea000ba00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:42.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ea001d070 con 0x7f6eb01084e0 2026-03-09T17:36:42.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6ea000f460 con 0x7f6eb01084e0 2026-03-09T17:36:42.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ea0021600 con 0x7f6eb01084e0 2026-03-09T17:36:42.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ea00097e0 con 0x7f6eb01084e0 2026-03-09T17:36:42.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.282+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6eb019d070 con 0x7f6eb01084e0 
2026-03-09T17:36:42.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.283+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6eb004f2a0 con 0x7f6eb01084e0 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.286+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6ea000f5d0 con 0x7f6eb01084e0 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.286+0000 7f6eacff9700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 0x7f6e9c079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.286+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f6ea009ae40 con 0x7f6eb01084e0 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.287+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6ea009b260 con 0x7f6eb01084e0 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.287+0000 7f6eaf7fe700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 0x7f6e9c079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:42.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.287+0000 7f6eaf7fe700 1 --2- 
192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 0x7f6e9c079dc0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f6e98009fd0 tx=0x7f6e98009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:42.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:42 vm06.local ceph-mon[109831]: pgmap v212: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:42.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:42 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2098957955' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T17:36:42.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:42 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2024688205' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T17:36:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:42 vm09.local ceph-mon[97995]: pgmap v212: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:42 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/2098957955' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-09T17:36:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:42 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/2024688205' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-09T17:36:42.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.424+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f6eb004ea50 con 0x7f6eb01084e0 2026-03-09T17:36:42.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.425+0000 7f6eacff9700 1 -- 192.168.123.106:0/828012063 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v35) v1 ==== 107+0+4993 (secure 0 0 0) 0x7f6ea00635f0 con 0x7f6eb01084e0 2026-03-09T17:36:42.426 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:42.426 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":20,"btime":"2026-03-09T17:34:09:566892+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:08.163973+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":75,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14500},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14500":{"gid":14500,"name":"cephfs.vm06.gzymac","rank":0,"incarnation":16,"state":"up:active","state_seq":106,"addr":"192.168.123.106:6829/4261949342","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":4261949342},{"type":"v1","addr":"192.168.123.106:6829","nonce":4261949342}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14500,"qdb_cluster":[14500]},"id":1}]} 2026-03-09T17:36:42.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 msgr2=0x7f6e9c079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 0x7f6e9c079dc0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f6e98009fd0 tx=0x7f6e98009450 comp rx=0 tx=0).stop 2026-03-09T17:36:42.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 msgr2=0x7f6eb0198700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 0x7f6eb0198700 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f6ea0000c00 tx=0x7f6ea000ba00 comp rx=0 tx=0).stop 2026-03-09T17:36:42.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 shutdown_connections 2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e9c077910 0x7f6e9c079dc0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6eb01024e0 0x7f6eb01981c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.427+0000 7f6eb4e30700 1 --2- 192.168.123.106:0/828012063 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6eb01084e0 0x7f6eb0198700 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.428+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 >> 192.168.123.106:0/828012063 conn(0x7f6eb00fe000 msgr2=0x7f6eb00fea50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.428+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 shutdown_connections 2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.428+0000 7f6eb4e30700 1 -- 192.168.123.106:0/828012063 wait complete. 
2026-03-09T17:36:42.429 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 20 2026-03-09T17:36:42.471 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 20 2026-03-09T17:36:42.471 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 21 2026-03-09T17:36:42.600 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.828+0000 7f35ba5cf700 1 -- 192.168.123.106:0/864397429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 msgr2=0x7f35b4110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.828+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/864397429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b4110ff0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f35a4009b00 tx=0x7f35a4009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 -- 192.168.123.106:0/864397429 shutdown_connections 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/864397429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b4110ff0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/864397429 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f35b40730f0 0x7f35b40734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.830 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 -- 192.168.123.106:0/864397429 >> 192.168.123.106:0/864397429 conn(0x7f35b40fc000 msgr2=0x7f35b40fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:42.830 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 -- 192.168.123.106:0/864397429 shutdown_connections 2026-03-09T17:36:42.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.829+0000 7f35ba5cf700 1 -- 192.168.123.106:0/864397429 wait complete. 2026-03-09T17:36:42.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 Processor -- start 2026-03-09T17:36:42.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 -- start start 2026-03-09T17:36:42.831 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f35b40730f0 0x7f35b419c6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b419cc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35b419d220 con 0x7f35b4073a00 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.830+0000 7f35ba5cf700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35b419d390 con 0x7f35b40730f0 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b419cc20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b419cc20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55796/0 (socket says 192.168.123.106:55796) 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 -- 192.168.123.106:0/2193083681 learned_addr learned my addr 192.168.123.106:0/2193083681 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 -- 192.168.123.106:0/2193083681 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f35b40730f0 msgr2=0x7f35b419c6e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f35b40730f0 0x7f35b419c6e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 -- 192.168.123.106:0/2193083681 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35a40097e0 con 0x7f35b4073a00 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b37fe700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 
0x7f35b419cc20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f35a40094d0 tx=0x7f35a40049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:42.832 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35a401d070 con 0x7f35b4073a00 2026-03-09T17:36:42.833 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.831+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f35a400bc50 con 0x7f35b4073a00 2026-03-09T17:36:42.833 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.832+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f35b41a0130 con 0x7f35b4073a00 2026-03-09T17:36:42.833 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.832+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f35a4022620 con 0x7f35b4073a00 2026-03-09T17:36:42.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.832+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f35b41a0650 con 0x7f35b4073a00 2026-03-09T17:36:42.834 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.833+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f35b410e770 con 0x7f35b4073a00 2026-03-09T17:36:42.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.833+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 
v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f35a400f660 con 0x7f35b4073a00 2026-03-09T17:36:42.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.834+0000 7f35b17fa700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 0x7f35a0079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:42.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.834+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f35a4067a20 con 0x7f35b4073a00 2026-03-09T17:36:42.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.836+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f35a40a0050 con 0x7f35b4073a00 2026-03-09T17:36:42.837 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.836+0000 7f35b3fff700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 0x7f35a0079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:42.838 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.837+0000 7f35b3fff700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 0x7f35a0079d20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f359c00aa00 tx=0x7f359c005c90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:42.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.973+0000 7f35ba5cf700 1 -- 
192.168.123.106:0/2193083681 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f35b404ea50 con 0x7f35b4073a00 2026-03-09T17:36:42.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.974+0000 7f35b17fa700 1 -- 192.168.123.106:0/2193083681 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v35) v1 ==== 107+0+4188 (secure 0 0 0) 0x7f35a4063440 con 0x7f35b4073a00 2026-03-09T17:36:42.976 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:42.976 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":21,"btime":"2026-03-09T17:34:12:303494+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":13},{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:12.303493+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:42.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.977+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 msgr2=0x7f35a0079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.977+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 0x7f35a0079d20 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f359c00aa00 tx=0x7f359c005c90 comp rx=0 tx=0).stop 2026-03-09T17:36:42.978 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.977+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 msgr2=0x7f35b419cc20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:42.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.977+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b419cc20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f35a40094d0 tx=0x7f35a40049e0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 shutdown_connections 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f35a0077870 0x7f35a0079d20 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f35b40730f0 0x7f35b419c6e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 --2- 192.168.123.106:0/2193083681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f35b4073a00 0x7f35b419cc20 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 >> 192.168.123.106:0/2193083681 conn(0x7f35b40fc000 msgr2=0x7f35b4102b00 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 shutdown_connections 2026-03-09T17:36:42.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:42.978+0000 7f35ba5cf700 1 -- 192.168.123.106:0/2193083681 wait complete. 2026-03-09T17:36:42.980 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 21 2026-03-09T17:36:43.038 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 21 2026-03-09T17:36:43.038 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 22 2026-03-09T17:36:43.184 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:43.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:43 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/828012063' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T17:36:43.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:43 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/2193083681' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T17:36:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:43 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/828012063' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-09T17:36:43.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:43 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/2193083681' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-09T17:36:43.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.452+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/3464103996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec0686f0 msgr2=0x7f5fec068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.452+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/3464103996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec0686f0 0x7f5fec068ac0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f5fd4009b00 tx=0x7f5fd4009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/3464103996 shutdown_connections 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/3464103996 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec069000 0x7f5fec1051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/3464103996 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec0686f0 0x7f5fec068ac0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/3464103996 >> 192.168.123.106:0/3464103996 conn(0x7f5fec0754a0 msgr2=0x7f5fec0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/3464103996 shutdown_connections 
2026-03-09T17:36:43.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/3464103996 wait complete. 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.453+0000 7f5ff1a6a700 1 Processor -- start 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5ff1a6a700 1 -- start start 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5ff1a6a700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 0x7f5fec1a2610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5ff1a6a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 0x7f5fec1a2b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5ff1a6a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fec1a3170 con 0x7f5fec069000 2026-03-09T17:36:43.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5ff1a6a700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fec19c690 con 0x7f5fec0686f0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5feb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 0x7f5fec1a2610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5feb7fe700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 
0x7f5fec1a2610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46714/0 (socket says 192.168.123.106:46714) 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5feb7fe700 1 -- 192.168.123.106:0/936748664 learned_addr learned my addr 192.168.123.106:0/936748664 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.454+0000 7f5feb7fe700 1 -- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 msgr2=0x7f5fec1a2b50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5feaffd700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 0x7f5fec1a2b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5feb7fe700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 0x7f5fec1a2b50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5feb7fe700 1 -- 192.168.123.106:0/936748664 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fd40097e0 con 0x7f5fec0686f0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5feaffd700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 0x7f5fec1a2b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5feb7fe700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 0x7f5fec1a2610 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f5fd4005b40 tx=0x7f5fd400bab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5fd401d070 con 0x7f5fec0686f0 2026-03-09T17:36:43.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5fec19c910 con 0x7f5fec0686f0 2026-03-09T17:36:43.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5fec19ce00 con 0x7f5fec0686f0 2026-03-09T17:36:43.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5fd400f460 con 0x7f5fec0686f0 2026-03-09T17:36:43.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.455+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5fd4021620 con 0x7f5fec0686f0 2026-03-09T17:36:43.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.457+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 
38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5fd4003ac0 con 0x7f5fec0686f0 2026-03-09T17:36:43.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.457+0000 7f5fe8ff9700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 0x7f5fd8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:43.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.457+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f5fd409ae10 con 0x7f5fec0686f0 2026-03-09T17:36:43.458 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.457+0000 7f5feaffd700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 0x7f5fd8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:43.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.458+0000 7f5feaffd700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 0x7f5fd8079d80 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5fec19de60 tx=0x7f5fdc009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:43.459 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.458+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5fcc005320 con 0x7f5fec0686f0 2026-03-09T17:36:43.462 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.461+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5fd4063540 con 0x7f5fec0686f0 2026-03-09T17:36:43.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.598+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7f5fcc005190 con 0x7f5fec0686f0 2026-03-09T17:36:43.600 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.599+0000 7f5fe8ff9700 1 -- 192.168.123.106:0/936748664 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v35) v1 ==== 107+0+4199 (secure 0 0 0) 0x7f5fd4062c90 con 0x7f5fec0686f0 2026-03-09T17:36:43.603 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:43.603 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":22,"btime":"2026-03-09T17:34:12:314812+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:12.314792+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline 
data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:replay","state_seq":2,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:43.604 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.603+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 msgr2=0x7f5fd8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.603+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 0x7f5fd8079d80 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5fec19de60 tx=0x7f5fdc009450 comp rx=0 tx=0).stop 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 msgr2=0x7f5fec1a2610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 0x7f5fec1a2610 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f5fd4005b40 tx=0x7f5fd400bab0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 shutdown_connections 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f5fd80778d0 0x7f5fd8079d80 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f5fec0686f0 0x7f5fec1a2610 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 --2- 192.168.123.106:0/936748664 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5fec069000 0x7f5fec1a2b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:43.605 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.604+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 >> 192.168.123.106:0/936748664 conn(0x7f5fec0754a0 msgr2=0x7f5fec0ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:43.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.605+0000 7f5ff1a6a700 1 -- 
192.168.123.106:0/936748664 shutdown_connections 2026-03-09T17:36:43.606 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:43.605+0000 7f5ff1a6a700 1 -- 192.168.123.106:0/936748664 wait complete. 2026-03-09T17:36:43.607 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 22 2026-03-09T17:36:43.664 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 22 2026-03-09T17:36:43.664 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 23 2026-03-09T17:36:43.796 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:44.013 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.012+0000 7f9c7ec17700 1 -- 192.168.123.106:0/1340064667 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c780686f0 msgr2=0x7f9c78068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.013 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.012+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/1340064667 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c780686f0 0x7f9c78068ac0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f9c6c009b50 tx=0x7f9c6c009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 -- 192.168.123.106:0/1340064667 shutdown_connections 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/1340064667 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c78069000 0x7f9c781051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 --2- 
192.168.123.106:0/1340064667 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c780686f0 0x7f9c78068ac0 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 -- 192.168.123.106:0/1340064667 >> 192.168.123.106:0/1340064667 conn(0x7f9c780754a0 msgr2=0x7f9c780758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 -- 192.168.123.106:0/1340064667 shutdown_connections 2026-03-09T17:36:44.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.013+0000 7f9c7ec17700 1 -- 192.168.123.106:0/1340064667 wait complete. 2026-03-09T17:36:44.015 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7ec17700 1 Processor -- start 2026-03-09T17:36:44.015 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7ec17700 1 -- start start 2026-03-09T17:36:44.015 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7ec17700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 0x7f9c781961b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.015 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7ec17700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7c9b3700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 0x7f9c781961b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:44.016 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.014+0000 7f9c7c9b3700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 0x7f9c781961b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46730/0 (socket says 192.168.123.106:46730) 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c7ec17700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c78196dd0 con 0x7f9c78069000 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c7ec17700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c7819ab60 con 0x7f9c780686f0 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55840/0 (socket says 192.168.123.106:55840) 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 -- 192.168.123.106:0/4063555331 learned_addr learned my addr 192.168.123.106:0/4063555331 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 -- 192.168.123.106:0/4063555331 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 msgr2=0x7f9c781961b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 0x7f9c781961b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 -- 192.168.123.106:0/4063555331 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c6c0097e0 con 0x7f9c78069000 2026-03-09T17:36:44.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.015+0000 7f9c77fff700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f9c6400dc40 tx=0x7f9c6400be10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:44.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.016+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c640099a0 con 0x7f9c78069000 2026-03-09T17:36:44.017 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.016+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9c64010460 con 0x7f9c78069000 2026-03-09T17:36:44.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.016+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c6400f6f0 con 0x7f9c78069000 2026-03-09T17:36:44.018 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.016+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c7819ae40 con 0x7f9c78069000 2026-03-09T17:36:44.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.016+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c7819b390 con 0x7f9c78069000 2026-03-09T17:36:44.018 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.017+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c78108b30 con 0x7f9c78069000 2026-03-09T17:36:44.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.019+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9c640105d0 con 0x7f9c78069000 2026-03-09T17:36:44.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.020+0000 7f9c75ffb700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 0x7f9c68079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.020+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f9c64099e90 con 0x7f9c78069000 2026-03-09T17:36:44.021 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.020+0000 7f9c7c9b3700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 0x7f9c68079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:44.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.020+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9c64062640 con 0x7f9c78069000 2026-03-09T17:36:44.022 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.021+0000 7f9c7c9b3700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 0x7f9c68079e90 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9c6c00b5c0 tx=0x7f9c6c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:44.125 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:44 vm06.local ceph-mon[109831]: pgmap v213: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:44.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.159+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7f9c7804ea50 con 0x7f9c78069000 2026-03-09T17:36:44.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.160+0000 7f9c75ffb700 1 -- 192.168.123.106:0/4063555331 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v35) v1 ==== 107+0+4204 (secure 0 0 0) 0x7f9c64010880 con 0x7f9c78069000 2026-03-09T17:36:44.161 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:44.161 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":23,"btime":"2026-03-09T17:34:16:597195+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:16.547087+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:reconnect","state_seq":107,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:44.163 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 msgr2=0x7f9c68079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.163 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 0x7f9c68079e90 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9c6c00b5c0 tx=0x7f9c6c0058e0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 msgr2=0x7f9c781966f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f9c6400dc40 tx=0x7f9c6400be10 comp rx=0 tx=0).stop 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 shutdown_connections 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f9c680779e0 0x7f9c68079e90 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9c780686f0 0x7f9c781961b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 --2- 192.168.123.106:0/4063555331 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c78069000 0x7f9c781966f0 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.162+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 >> 192.168.123.106:0/4063555331 conn(0x7f9c780754a0 msgr2=0x7f9c780ff780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.163+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 shutdown_connections 2026-03-09T17:36:44.164 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.163+0000 7f9c7ec17700 1 -- 192.168.123.106:0/4063555331 wait complete. 
2026-03-09T17:36:44.165 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 23 2026-03-09T17:36:44.226 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 23 2026-03-09T17:36:44.226 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 24 2026-03-09T17:36:44.363 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:44.388 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:44.388 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:44 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/936748664' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T17:36:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:44 vm09.local ceph-mon[97995]: pgmap v213: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:36:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:44 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/936748664' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.614+0000 7fd5e009d700 1 -- 192.168.123.106:0/2750186121 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 msgr2=0x7fd5d8108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.614+0000 7fd5e009d700 1 --2- 192.168.123.106:0/2750186121 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 0x7fd5d8108b50 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fd5cc009b00 tx=0x7fd5cc009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 -- 192.168.123.106:0/2750186121 shutdown_connections 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 --2- 192.168.123.106:0/2750186121 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 --2- 192.168.123.106:0/2750186121 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 0x7fd5d8108b50 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 -- 192.168.123.106:0/2750186121 >> 192.168.123.106:0/2750186121 conn(0x7fd5d80fe280 msgr2=0x7fd5d8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:44.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 -- 192.168.123.106:0/2750186121 shutdown_connections 2026-03-09T17:36:44.616 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.615+0000 7fd5e009d700 1 -- 192.168.123.106:0/2750186121 wait complete. 2026-03-09T17:36:44.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 Processor -- start 2026-03-09T17:36:44.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 -- start start 2026-03-09T17:36:44.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 0x7fd5d8198860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5d8198f40 con 0x7fd5d8102780 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5e009d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5d819ccd0 con 0x7fd5d8108780 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5dde39700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5dde39700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55864/0 (socket says 192.168.123.106:55864) 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.616+0000 7fd5dde39700 1 -- 192.168.123.106:0/1547119822 learned_addr learned my addr 192.168.123.106:0/1547119822 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.617+0000 7fd5dde39700 1 -- 192.168.123.106:0/1547119822 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 msgr2=0x7fd5d8198860 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.617+0000 7fd5dde39700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 0x7fd5d8198860 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.617+0000 7fd5dde39700 1 -- 192.168.123.106:0/1547119822 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5d4009710 con 0x7fd5d8102780 2026-03-09T17:36:44.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.617+0000 7fd5dde39700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7fd5cc0094d0 tx=0x7fd5cc00bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:44.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.618+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5cc01d070 con 0x7fd5d8102780 
2026-03-09T17:36:44.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.618+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd5cc00f460 con 0x7fd5d8102780 2026-03-09T17:36:44.619 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.618+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5cc0097e0 con 0x7fd5d8102780 2026-03-09T17:36:44.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.618+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5d819d2b0 con 0x7fd5d8102780 2026-03-09T17:36:44.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.619+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5cc021620 con 0x7fd5d8102780 2026-03-09T17:36:44.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.619+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd5cc02b430 con 0x7fd5d8102780 2026-03-09T17:36:44.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.619+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd5d804ea50 con 0x7fd5d8102780 2026-03-09T17:36:44.623 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.622+0000 7fd5caffd700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 0x7fd5c4079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:44.624 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.623+0000 7fd5dd638700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 0x7fd5c4079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:44.624 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.623+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fd5cc09bf20 con 0x7fd5d8102780 2026-03-09T17:36:44.625 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.624+0000 7fd5dd638700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 0x7fd5c4079f10 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fd5d8199940 tx=0x7fd5d4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:44.625 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.624+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd5cc00fa70 con 0x7fd5d8102780 2026-03-09T17:36:44.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.770+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7fd5d8066e40 con 0x7fd5d8102780 2026-03-09T17:36:44.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.771+0000 7fd5caffd700 1 -- 192.168.123.106:0/1547119822 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v35) v1 ==== 
107+0+4201 (secure 0 0 0) 0x7fd5cc0646d0 con 0x7fd5d8102780 2026-03-09T17:36:44.772 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:44.773 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":24,"btime":"2026-03-09T17:34:17:742558+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default 
file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":24,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:16.746998+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:rejoin","state_seq":108,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode 
in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:44.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.774+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 msgr2=0x7fd5c4079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.774+0000 7fd5e009d700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 0x7fd5c4079f10 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fd5d8199940 tx=0x7fd5d4009450 comp rx=0 tx=0).stop 2026-03-09T17:36:44.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.774+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 msgr2=0x7fd5d8198320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:44.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.774+0000 7fd5e009d700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7fd5cc0094d0 tx=0x7fd5cc00bab0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 shutdown_connections 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 --2- 
192.168.123.106:0/1547119822 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fd5c4077a60 0x7fd5c4079f10 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd5d8102780 0x7fd5d8198320 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 --2- 192.168.123.106:0/1547119822 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fd5d8108780 0x7fd5d8198860 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 >> 192.168.123.106:0/1547119822 conn(0x7fd5d80fe280 msgr2=0x7fd5d80ffa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 shutdown_connections 2026-03-09T17:36:44.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:44.775+0000 7fd5e009d700 1 -- 192.168.123.106:0/1547119822 wait complete. 
2026-03-09T17:36:44.777 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 24 2026-03-09T17:36:44.819 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 24 2026-03-09T17:36:44.819 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 25 2026-03-09T17:36:44.956 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.198+0000 7fcff3fc6700 1 -- 192.168.123.106:0/2059530594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ff370 msgr2=0x7fcfec0ff740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.198+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/2059530594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ff370 0x7fcfec0ff740 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fcfe8009b50 tx=0x7fcfe8009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 -- 192.168.123.106:0/2059530594 shutdown_connections 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/2059530594 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ffc80 0x7fcfec10c9e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/2059530594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ff370 0x7fcfec0ff740 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.200 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 -- 192.168.123.106:0/2059530594 >> 192.168.123.106:0/2059530594 conn(0x7fcfec0762d0 msgr2=0x7fcfec0766d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 -- 192.168.123.106:0/2059530594 shutdown_connections 2026-03-09T17:36:45.200 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.199+0000 7fcff3fc6700 1 -- 192.168.123.106:0/2059530594 wait complete. 2026-03-09T17:36:45.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 Processor -- start 2026-03-09T17:36:45.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 -- start start 2026-03-09T17:36:45.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 0x7fcfec198380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcfec198fa0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff3fc6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcfec19cd30 con 0x7fcfec0ff370 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff1561700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff1561700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55882/0 (socket says 192.168.123.106:55882) 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.200+0000 7fcff1561700 1 -- 192.168.123.106:0/1633506877 learned_addr learned my addr 192.168.123.106:0/1633506877 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1561700 1 -- 192.168.123.106:0/1633506877 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 msgr2=0x7fcfec198380 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T17:36:45.202 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1d62700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 0x7fcfec198380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1561700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 0x7fcfec198380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1561700 1 -- 
192.168.123.106:0/1633506877 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcfe80097e0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1d62700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 0x7fcfec198380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcff1561700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7fcfdc00d8d0 tx=0x7fcfdc00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.201+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcfdc009940 con 0x7fcfec0ffc80 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.202+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcfdc010460 con 0x7fcfec0ffc80 2026-03-09T17:36:45.203 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.202+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcfec19d010 con 0x7fcfec0ffc80 2026-03-09T17:36:45.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.202+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcfec19d560 con 0x7fcfec0ffc80 
2026-03-09T17:36:45.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.203+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcfdc009c50 con 0x7fcfec0ffc80 2026-03-09T17:36:45.205 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.203+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcfec10a150 con 0x7fcfec0ffc80 2026-03-09T17:36:45.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.207+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcfdc0105d0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.208 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.207+0000 7fcfe2ffd700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcfd8077870 0x7fcfd8079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.208+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fcfdc09a0f0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.208+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcfdc09a580 con 0x7fcfec0ffc80 2026-03-09T17:36:45.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.208+0000 7fcff1d62700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] 
conn(0x7fcfd8077870 0x7fcfd8079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.209 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.208+0000 7fcff1d62700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcfd8077870 0x7fcfd8079d20 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fcfe800b5c0 tx=0x7fcfe80058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:45.347 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:45 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/4063555331' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T17:36:45.347 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:45 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1547119822' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T17:36:45.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.346+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7fcfec19d1a0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.349+0000 7fcfe2ffd700 1 -- 192.168.123.106:0/1633506877 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v35) v1 ==== 107+0+4210 (secure 0 0 0) 0x7fcfec19d1a0 con 0x7fcfec0ffc80 2026-03-09T17:36:45.351 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:45.351 
INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":25,"btime":"2026-03-09T17:34:18:757240+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:18.757239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:active","state_seq":109,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline 
data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcfd8077870 msgr2=0x7fcfd8079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcfd8077870 0x7fcfd8079d20 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fcfe800b5c0 tx=0x7fcfe80058e0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 msgr2=0x7fcfec1988c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7fcfdc00d8d0 tx=0x7fcfdc00dc90 comp rx=0 tx=0).stop 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 shutdown_connections 2026-03-09T17:36:45.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.352+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fcfd8077870 
0x7fcfd8079d20 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.353+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fcfec0ff370 0x7fcfec198380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.353+0000 7fcff3fc6700 1 --2- 192.168.123.106:0/1633506877 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcfec0ffc80 0x7fcfec1988c0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.353+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 >> 192.168.123.106:0/1633506877 conn(0x7fcfec0762d0 msgr2=0x7fcfec0fd990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:45.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.353+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 shutdown_connections 2026-03-09T17:36:45.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.353+0000 7fcff3fc6700 1 -- 192.168.123.106:0/1633506877 wait complete. 2026-03-09T17:36:45.355 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 25 2026-03-09T17:36:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:45 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4063555331' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-09T17:36:45.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:45 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/1547119822' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T17:36:45.395 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 25 2026-03-09T17:36:45.395 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 26 2026-03-09T17:36:45.530 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.774+0000 7f6e1c27c700 1 -- 192.168.123.106:0/887191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14102790 msgr2=0x7f6e14102c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.774+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/887191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14102790 0x7f6e14102c00 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f6e10009b00 tx=0x7f6e10009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 -- 192.168.123.106:0/887191 shutdown_connections 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/887191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14102790 0x7f6e14102c00 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/887191 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14108790 0x7f6e14108b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 -- 192.168.123.106:0/887191 >> 192.168.123.106:0/887191 conn(0x7f6e140fe2b0 msgr2=0x7f6e141006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 -- 192.168.123.106:0/887191 shutdown_connections 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 -- 192.168.123.106:0/887191 wait complete. 2026-03-09T17:36:45.776 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.775+0000 7f6e1c27c700 1 Processor -- start 2026-03-09T17:36:45.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1c27c700 1 -- start start 2026-03-09T17:36:45.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1c27c700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 0x7f6e14198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1c27c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1c27c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e14198fb0 con 0x7f6e14108790 2026-03-09T17:36:45.777 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1c27c700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e1419cd40 con 0x7f6e14102790 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e19817700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e19817700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55890/0 (socket says 192.168.123.106:55890) 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e19817700 1 -- 192.168.123.106:0/3619262241 learned_addr learned my addr 192.168.123.106:0/3619262241 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e19817700 1 -- 192.168.123.106:0/3619262241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 msgr2=0x7f6e14198390 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.776+0000 7f6e1a018700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 0x7f6e14198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e19817700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 0x7f6e14198390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e19817700 1 -- 
192.168.123.106:0/3619262241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e100097e0 con 0x7f6e14108790 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e1a018700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 0x7f6e14198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e19817700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f6e1000b5c0 tx=0x7f6e100049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e1001d070 con 0x7f6e14108790 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6e1000bc50 con 0x7f6e14108790 2026-03-09T17:36:45.778 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e1419cfc0 con 0x7f6e14108790 2026-03-09T17:36:45.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.777+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e1419d530 con 0x7f6e14108790 
2026-03-09T17:36:45.780 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.778+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e10022620 con 0x7f6e14108790 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.779+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e1404ea50 con 0x7f6e14108790 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.779+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6e10022a50 con 0x7f6e14108790 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.779+0000 7f6e0b7fe700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 0x7f6e00079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.779+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f6e1009b580 con 0x7f6e14108790 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.780+0000 7f6e1a018700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 0x7f6e00079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:45.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.780+0000 7f6e1a018700 1 --2- 192.168.123.106:0/3619262241 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 0x7f6e00079d70 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f6e141038d0 tx=0x7f6e04008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:45.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.782+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6e10063c30 con 0x7f6e14108790 2026-03-09T17:36:45.931 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.930+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f6e14066e40 con 0x7f6e14108790 2026-03-09T17:36:45.932 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.931+0000 7f6e0b7fe700 1 -- 192.168.123.106:0/3619262241 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v35) v1 ==== 107+0+5061 (secure 0 0 0) 0x7f6e10022890 con 0x7f6e14108790 2026-03-09T17:36:45.932 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:45.932 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":26,"btime":"2026-03-09T17:34:19:817880+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34280,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3053844977","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3053844977},{"type":"v1","addr":"192.168.123.109:6825","nonce":3053844977}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":15},{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20},{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:18.757239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:active","state_seq":109,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.933+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 msgr2=0x7f6e00079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.933+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 0x7f6e00079d70 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f6e141038d0 tx=0x7f6e04008040 comp rx=0 tx=0).stop 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 msgr2=0x7f6e141988d0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f6e1000b5c0 tx=0x7f6e100049e0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 shutdown_connections 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6e000778c0 0x7f6e00079d70 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6e14102790 0x7f6e14198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 --2- 192.168.123.106:0/3619262241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e14108790 0x7f6e141988d0 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 >> 192.168.123.106:0/3619262241 conn(0x7f6e140fe2b0 msgr2=0x7f6e140ff9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:45.935 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 shutdown_connections 2026-03-09T17:36:45.935 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:45.934+0000 7f6e1c27c700 1 -- 192.168.123.106:0/3619262241 wait complete. 2026-03-09T17:36:45.936 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 26 2026-03-09T17:36:45.989 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 26 2026-03-09T17:36:45.990 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 27 2026-03-09T17:36:46.123 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:46.160 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:46 vm06.local ceph-mon[109831]: pgmap v214: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:36:46.160 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:46 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1633506877' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T17:36:46.160 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:46 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3619262241' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 -- 192.168.123.106:0/4135638291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 msgr2=0x7ff44c1048f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 --2- 192.168.123.106:0/4135638291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1048f0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7ff434009b50 tx=0x7ff434009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 -- 192.168.123.106:0/4135638291 shutdown_connections 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 --2- 192.168.123.106:0/4135638291 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 0x7ff44c100370 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 --2- 192.168.123.106:0/4135638291 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1048f0 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.374+0000 7ff452a57700 1 -- 192.168.123.106:0/4135638291 >> 192.168.123.106:0/4135638291 conn(0x7ff44c0754a0 msgr2=0x7ff44c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.375+0000 7ff452a57700 1 -- 192.168.123.106:0/4135638291 shutdown_connections 
2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.375+0000 7ff452a57700 1 -- 192.168.123.106:0/4135638291 wait complete. 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.375+0000 7ff452a57700 1 Processor -- start 2026-03-09T17:36:46.376 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.375+0000 7ff452a57700 1 -- start start 2026-03-09T17:36:46.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff452a57700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 0x7ff44c114040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff452a57700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1145a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff452a57700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff44c10f0f0 con 0x7ff44c104520 2026-03-09T17:36:46.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff452a57700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff44c10f260 con 0x7ff44c0fff00 2026-03-09T17:36:46.377 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1145a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 
0x7ff44c1145a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55896/0 (socket says 192.168.123.106:55896) 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 -- 192.168.123.106:0/662131507 learned_addr learned my addr 192.168.123.106:0/662131507 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44bfff700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 0x7ff44c114040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 -- 192.168.123.106:0/662131507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 msgr2=0x7ff44c114040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 0x7ff44c114040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 -- 192.168.123.106:0/662131507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff4340097e0 con 0x7ff44c104520 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff44b7fe700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1145a0 secure :-1 s=READY pgs=163 cs=0 
l=1 rev1=1 crypto rx=0x7ff43c00eb10 tx=0x7ff43c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff43c00cca0 con 0x7ff44c104520 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.376+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff43c00ce00 con 0x7ff44c104520 2026-03-09T17:36:46.378 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.377+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff43c0105e0 con 0x7ff44c104520 2026-03-09T17:36:46.379 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.377+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff44c10f540 con 0x7ff44c104520 2026-03-09T17:36:46.379 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.378+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff44c10fa10 con 0x7ff44c104520 2026-03-09T17:36:46.380 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.379+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff44c04ea50 con 0x7ff44c104520 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.379+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 
0 0) 0x7ff43c018700 con 0x7ff44c104520 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.380+0000 7ff4497fa700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 0x7ff438079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.380+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff43c014070 con 0x7ff44c104520 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.380+0000 7ff44bfff700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 0x7ff438079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.380+0000 7ff44bfff700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 0x7ff438079d20 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff434006010 tx=0x7ff43400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:46.383 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.382+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff43c062270 con 0x7ff44c104520 2026-03-09T17:36:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:46 vm09.local ceph-mon[97995]: pgmap v214: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 
op/s 2026-03-09T17:36:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:46 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1633506877' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T17:36:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:46 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3619262241' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T17:36:46.526 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.524+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7ff44c066e40 con 0x7ff44c104520 2026-03-09T17:36:46.526 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.525+0000 7ff4497fa700 1 -- 192.168.123.106:0/662131507 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v35) v1 ==== 107+0+4278 (secure 0 0 0) 0x7ff43c0619c0 con 0x7ff44c104520 2026-03-09T17:36:46.528 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:46.528 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":27,"btime":"2026-03-09T17:34:23:673502+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:18.757239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:active","state_seq":109,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-09T17:36:46.530 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 msgr2=0x7ff438079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.530 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 0x7ff438079d20 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7ff434006010 tx=0x7ff43400b540 comp rx=0 tx=0).stop 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 msgr2=0x7ff44c1145a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1145a0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7ff43c00eb10 tx=0x7ff43c00eed0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 shutdown_connections 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7ff438077870 0x7ff438079d20 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7ff44c0fff00 0x7ff44c114040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 --2- 192.168.123.106:0/662131507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff44c104520 0x7ff44c1145a0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 >> 192.168.123.106:0/662131507 conn(0x7ff44c0754a0 msgr2=0x7ff44c0feca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.529+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 shutdown_connections 2026-03-09T17:36:46.531 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.530+0000 7ff452a57700 1 -- 192.168.123.106:0/662131507 wait complete. 
2026-03-09T17:36:46.532 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 27 2026-03-09T17:36:46.596 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 27 2026-03-09T17:36:46.596 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 28 2026-03-09T17:36:46.743 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.975+0000 7fe27f636700 1 -- 192.168.123.106:0/531220843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 msgr2=0x7fe278102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.975+0000 7fe27f636700 1 --2- 192.168.123.106:0/531220843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe278102c80 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7fe274009b50 tx=0x7fe274009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 -- 192.168.123.106:0/531220843 shutdown_connections 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 --2- 192.168.123.106:0/531220843 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe278102c80 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 --2- 192.168.123.106:0/531220843 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.977 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 -- 192.168.123.106:0/531220843 >> 192.168.123.106:0/531220843 conn(0x7fe2780fe330 msgr2=0x7fe278100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 -- 192.168.123.106:0/531220843 shutdown_connections 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.976+0000 7fe27f636700 1 -- 192.168.123.106:0/531220843 wait complete. 2026-03-09T17:36:46.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 Processor -- start 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 -- start start 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe2781983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe278199000 con 0x7fe278102810 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27f636700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe27819cd90 con 0x7fe278108810 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27cbd1700 1 --2- >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27cbd1700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46818/0 (socket says 192.168.123.106:46818) 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27cbd1700 1 -- 192.168.123.106:0/1121270686 learned_addr learned my addr 192.168.123.106:0/1121270686 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27d3d2700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe2781983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.977+0000 7fe27cbd1700 1 -- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 msgr2=0x7fe2781983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27cbd1700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe2781983e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27cbd1700 1 -- 
192.168.123.106:0/1121270686 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2740097e0 con 0x7fe278108810 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27d3d2700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe2781983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27cbd1700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fe274005f50 tx=0x7fe2740049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe27401d070 con 0x7fe278108810 2026-03-09T17:36:46.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe27819d010 con 0x7fe278108810 2026-03-09T17:36:46.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe274022470 con 0x7fe278108810 2026-03-09T17:36:46.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe27400f650 con 0x7fe278108810 
2026-03-09T17:36:46.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.978+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe27819d500 con 0x7fe278108810 2026-03-09T17:36:46.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.979+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe27804ea50 con 0x7fe278108810 2026-03-09T17:36:46.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.980+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe274044b60 con 0x7fe278108810 2026-03-09T17:36:46.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.980+0000 7fe26e7fc700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 0x7fe260079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:46.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.980+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe27409b300 con 0x7fe278108810 2026-03-09T17:36:46.981 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.980+0000 7fe27d3d2700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 0x7fe260079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:46.982 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.981+0000 7fe27d3d2700 1 --2- 192.168.123.106:0/1121270686 >> 
[v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 0x7fe260079dd0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fe278103950 tx=0x7fe268009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:46.984 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:46.983+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe274063a30 con 0x7fe278108810 2026-03-09T17:36:47.119 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.117+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7fe2781997e0 con 0x7fe278108810 2026-03-09T17:36:47.121 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.119+0000 7fe26e7fc700 1 -- 192.168.123.106:0/1121270686 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v35) v1 ==== 107+0+5129 (secure 0 0 0) 0x7fe274063180 con 0x7fe278108810 2026-03-09T17:36:47.121 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:47.121 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":28,"btime":"2026-03-09T17:34:25:957089+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:18.757239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":78,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm09.drzmdt","rank":0,"incarnation":22,"state":"up:active","state_seq":109,"addr":"192.168.123.109:6827/3078962403","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3078962403},{"type":"v1","addr":"192.168.123.109:6827","nonce":3078962403}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-09T17:36:47.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 msgr2=0x7fe260079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 0x7fe260079dd0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fe278103950 tx=0x7fe268009450 comp rx=0 tx=0).stop 2026-03-09T17:36:47.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 msgr2=0x7fe278198920 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.123 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fe274005f50 tx=0x7fe2740049e0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 shutdown_connections 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fe260077920 0x7fe260079dd0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe278102810 0x7fe2781983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.122+0000 7fe27f636700 1 --2- 192.168.123.106:0/1121270686 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fe278108810 0x7fe278198920 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.123+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 >> 192.168.123.106:0/1121270686 conn(0x7fe2780fe330 msgr2=0x7fe2780ffb70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:47.124 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.123+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 shutdown_connections 2026-03-09T17:36:47.124 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.123+0000 7fe27f636700 1 -- 192.168.123.106:0/1121270686 wait complete. 2026-03-09T17:36:47.125 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 28 2026-03-09T17:36:47.183 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 28 2026-03-09T17:36:47.183 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 29 2026-03-09T17:36:47.319 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:47.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:47 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/662131507' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T17:36:47.371 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:47 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/1121270686' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T17:36:47.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:47 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/662131507' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T17:36:47.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:47 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/1121270686' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.563+0000 7fc6757ea700 1 -- 192.168.123.106:0/1761797309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670108810 msgr2=0x7fc670108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.563+0000 7fc6757ea700 1 --2- 192.168.123.106:0/1761797309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670108810 0x7fc670108be0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7fc658009b00 tx=0x7fc658009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 -- 192.168.123.106:0/1761797309 shutdown_connections 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 --2- 192.168.123.106:0/1761797309 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670102810 0x7fc670102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 --2- 192.168.123.106:0/1761797309 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670108810 0x7fc670108be0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 -- 192.168.123.106:0/1761797309 >> 192.168.123.106:0/1761797309 conn(0x7fc6700fe330 msgr2=0x7fc670100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 -- 192.168.123.106:0/1761797309 shutdown_connections 
2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 -- 192.168.123.106:0/1761797309 wait complete. 2026-03-09T17:36:47.565 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.564+0000 7fc6757ea700 1 Processor -- start 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc6757ea700 1 -- start start 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc6757ea700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 0x7fc6701983f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc6757ea700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 0x7fc670198930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc6757ea700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc670199010 con 0x7fc670102810 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc66e7fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 0x7fc670198930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc6757ea700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc67019cda0 con 0x7fc670108810 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc66e7fc700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 
0x7fc670198930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46830/0 (socket says 192.168.123.106:46830) 2026-03-09T17:36:47.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc66e7fc700 1 -- 192.168.123.106:0/4224855675 learned_addr learned my addr 192.168.123.106:0/4224855675 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.565+0000 7fc66effd700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 0x7fc6701983f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc66e7fc700 1 -- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 msgr2=0x7fc6701983f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc66e7fc700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 0x7fc6701983f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc66e7fc700 1 -- 192.168.123.106:0/4224855675 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6580097e0 con 0x7fc670108810 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc66effd700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 0x7fc6701983f0 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc66e7fc700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 0x7fc670198930 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fc66000d8d0 tx=0x7fc66000dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc660009880 con 0x7fc670108810 2026-03-09T17:36:47.567 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc660010460 con 0x7fc670108810 2026-03-09T17:36:47.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc67019d080 con 0x7fc670108810 2026-03-09T17:36:47.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc66000f5d0 con 0x7fc670108810 2026-03-09T17:36:47.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.566+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc67019d5d0 con 0x7fc670108810 2026-03-09T17:36:47.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.567+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc67004ea50 con 0x7fc670108810 2026-03-09T17:36:47.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.568+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc6600105d0 con 0x7fc670108810 2026-03-09T17:36:47.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.569+0000 7fc667fff700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 0x7fc65c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:47.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.569+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc660020030 con 0x7fc670108810 2026-03-09T17:36:47.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.569+0000 7fc66effd700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 0x7fc65c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:47.570 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.569+0000 7fc66effd700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 0x7fc65c079d70 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fc658006010 tx=0x7fc658005e80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:47.571 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.570+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== 
mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc660062570 con 0x7fc670108810 2026-03-09T17:36:47.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.705+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7fc670066e40 con 0x7fc670108810 2026-03-09T17:36:47.707 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.706+0000 7fc667fff700 1 -- 192.168.123.106:0/4224855675 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v35) v1 ==== 107+0+4324 (secure 0 0 0) 0x7fc660061cc0 con 0x7fc670108810 2026-03-09T17:36:47.707 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:47.707 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":29,"btime":"2026-03-09T17:34:28:277179+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file 
layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"epoch":28}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:28.277171+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:47.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 msgr2=0x7fc65c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 0x7fc65c079d70 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fc658006010 tx=0x7fc658005e80 comp rx=0 tx=0).stop 
2026-03-09T17:36:47.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 msgr2=0x7fc670198930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:47.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 0x7fc670198930 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fc66000d8d0 tx=0x7fc66000dbe0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 shutdown_connections 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fc65c0778c0 0x7fc65c079d70 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc670102810 0x7fc6701983f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 --2- 192.168.123.106:0/4224855675 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fc670108810 0x7fc670198930 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.708+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 >> 192.168.123.106:0/4224855675 conn(0x7fc6700fe330 msgr2=0x7fc6700ffa90 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.709+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 shutdown_connections 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:47.709+0000 7fc6757ea700 1 -- 192.168.123.106:0/4224855675 wait complete. 2026-03-09T17:36:47.710 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 29 2026-03-09T17:36:47.754 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 29 2026-03-09T17:36:47.755 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 30 2026-03-09T17:36:47.892 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.145+0000 7f93597f9700 1 -- 192.168.123.106:0/4248168617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 msgr2=0x7f935410c9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.145+0000 7f93597f9700 1 --2- 192.168.123.106:0/4248168617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f935410c9f0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f9344009ae0 tx=0x7f9344009df0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 -- 192.168.123.106:0/4248168617 shutdown_connections 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 --2- 192.168.123.106:0/4248168617 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f935410c9f0 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 --2- 192.168.123.106:0/4248168617 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 0x7f9354073420 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 -- 192.168.123.106:0/4248168617 >> 192.168.123.106:0/4248168617 conn(0x7f9354078580 msgr2=0x7f9354078980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 -- 192.168.123.106:0/4248168617 shutdown_connections 2026-03-09T17:36:48.147 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.146+0000 7f93597f9700 1 -- 192.168.123.106:0/4248168617 wait complete. 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 Processor -- start 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 -- start start 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 0x7f93541984d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93541990f0 con 0x7f9354073960 2026-03-09T17:36:48.148 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93597f9700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f935419ce80 con 0x7f9354073050 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93527fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93527fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55966/0 (socket says 192.168.123.106:55966) 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f93527fc700 1 -- 192.168.123.106:0/1164518400 learned_addr learned my addr 192.168.123.106:0/1164518400 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:48.148 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.147+0000 7f9352ffd700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 0x7f93541984d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:48.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93527fc700 1 -- 192.168.123.106:0/1164518400 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 msgr2=0x7f93541984d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93527fc700 1 --2- 
192.168.123.106:0/1164518400 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 0x7f93541984d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93527fc700 1 -- 192.168.123.106:0/1164518400 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9344009710 con 0x7f9354073960 2026-03-09T17:36:48.149 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93527fc700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f934400f690 tx=0x7f934400f6c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:48.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f934401d070 con 0x7f9354073960 2026-03-09T17:36:48.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f934400bbc0 con 0x7f9354073960 2026-03-09T17:36:48.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f93440176d0 con 0x7f9354073960 2026-03-09T17:36:48.150 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f935419d100 con 0x7f9354073960 2026-03-09T17:36:48.150 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.148+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f935419d510 con 0x7f9354073960 2026-03-09T17:36:48.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.150+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9344017830 con 0x7f9354073960 2026-03-09T17:36:48.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.150+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f935410a0f0 con 0x7f9354073960 2026-03-09T17:36:48.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.150+0000 7f934bfff700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 0x7f9340079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.151 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.150+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f934409aef0 con 0x7f9354073960 2026-03-09T17:36:48.154 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.153+0000 7f9352ffd700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 0x7f9340079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:48.154 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.153+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9344063750 con 0x7f9354073960 2026-03-09T17:36:48.154 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.153+0000 7f9352ffd700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 0x7f9340079d70 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f933c005950 tx=0x7f933c00b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:48.299 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.298+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7f9354199830 con 0x7f9354073960 2026-03-09T17:36:48.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:48 vm06.local ceph-mon[109831]: pgmap v215: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:48.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:48 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/4224855675' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T17:36:48.301 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.300+0000 7f934bfff700 1 -- 192.168.123.106:0/1164518400 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v35) v1 ==== 107+0+4403 (secure 0 0 0) 0x7f9344062ea0 con 0x7f9354073960 2026-03-09T17:36:48.301 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:48.301 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":30,"btime":"2026-03-09T17:34:28:285785+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:28.285781+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:replay","state_seq":1,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:48.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.302+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 msgr2=0x7f9340079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.303 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.302+0000 7f93597f9700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 0x7f9340079d70 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f933c005950 tx=0x7f933c00b410 comp rx=0 tx=0).stop 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f9354073960 msgr2=0x7f9354198a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f934400f690 tx=0x7f934400f6c0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 shutdown_connections 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f93400778c0 0x7f9340079d70 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f9354073050 0x7f93541984d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 --2- 192.168.123.106:0/1164518400 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9354073960 0x7f9354198a10 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 >> 192.168.123.106:0/1164518400 conn(0x7f9354078580 msgr2=0x7f9354107230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 shutdown_connections 
2026-03-09T17:36:48.304 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.303+0000 7f93597f9700 1 -- 192.168.123.106:0/1164518400 wait complete. 2026-03-09T17:36:48.305 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 30 2026-03-09T17:36:48.366 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 30 2026-03-09T17:36:48.366 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 31 2026-03-09T17:36:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:48 vm09.local ceph-mon[97995]: pgmap v215: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:48 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/4224855675' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T17:36:48.505 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.734+0000 7f30b097f700 1 -- 192.168.123.106:0/185221875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ff1e0 msgr2=0x7f30a80ff5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.734+0000 7f30b097f700 1 --2- 192.168.123.106:0/185221875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ff1e0 0x7f30a80ff5b0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f30a0009b00 tx=0x7f30a0009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 -- 192.168.123.106:0/185221875 shutdown_connections 2026-03-09T17:36:48.736 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 --2- 192.168.123.106:0/185221875 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f30a80ffaf0 0x7f30a810c850 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 --2- 192.168.123.106:0/185221875 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ff1e0 0x7f30a80ff5b0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 -- 192.168.123.106:0/185221875 >> 192.168.123.106:0/185221875 conn(0x7f30a8076060 msgr2=0x7f30a8076460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 -- 192.168.123.106:0/185221875 shutdown_connections 2026-03-09T17:36:48.736 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.735+0000 7f30b097f700 1 -- 192.168.123.106:0/185221875 wait complete. 
2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 Processor -- start 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 -- start start 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f30a80ff1e0 0x7f30a8198210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30a8198da0 con 0x7f30a80ffaf0 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30b097f700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30a8198ee0 con 0x7f30a80ff1e0 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30adf1a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:48.737 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30adf1a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:55980/0 (socket says 192.168.123.106:55980) 2026-03-09T17:36:48.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.736+0000 7f30adf1a700 1 -- 192.168.123.106:0/580093766 learned_addr learned my addr 192.168.123.106:0/580093766 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:48.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30adf1a700 1 -- 192.168.123.106:0/580093766 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f30a80ff1e0 msgr2=0x7f30a8198210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30adf1a700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f30a80ff1e0 0x7f30a8198210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30adf1a700 1 -- 192.168.123.106:0/580093766 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f30a00097e0 con 0x7f30a80ffaf0 2026-03-09T17:36:48.738 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30adf1a700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f309c00cc60 tx=0x7f309c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f309c007af0 con 0x7f30a80ffaf0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f309b7fe700 1 -- 
192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f309c007c50 con 0x7f30a80ffaf0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f309c0187b0 con 0x7f30a80ffaf0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f30a819cd30 con 0x7f30a80ffaf0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.737+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f30a819d280 con 0x7f30a80ffaf0 2026-03-09T17:36:48.739 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.738+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f30a8109fc0 con 0x7f30a80ffaf0 2026-03-09T17:36:48.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.741+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f309c01f030 con 0x7f30a80ffaf0 2026-03-09T17:36:48.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.742+0000 7f309b7fe700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 0x7f3094079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:48.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.742+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== 
mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f309c099e80 con 0x7f30a80ffaf0 2026-03-09T17:36:48.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.742+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f309c09a300 con 0x7f30a80ffaf0 2026-03-09T17:36:48.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.742+0000 7f30ae71b700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 0x7f3094079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:48.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.743+0000 7f30ae71b700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 0x7f3094079e40 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f30a000b5c0 tx=0x7f30a0009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:48.892 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.890+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f30a804f310 con 0x7f30a80ffaf0 2026-03-09T17:36:48.892 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.891+0000 7f309b7fe700 1 -- 192.168.123.106:0/580093766 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v35) v1 ==== 107+0+4406 (secure 0 0 0) 0x7f309c062630 con 0x7f30a80ffaf0 2026-03-09T17:36:48.893 INFO:teuthology.orchestra.run.vm06.stdout: 
2026-03-09T17:36:48.893 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":31,"btime":"2026-03-09T17:34:33:098654+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":28}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:32.378128+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:reconnect","state_seq":7,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:48.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.893+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 msgr2=0x7f3094079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 0x7f3094079e40 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f30a000b5c0 tx=0x7f30a0009f90 comp rx=0 tx=0).stop 2026-03-09T17:36:48.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 msgr2=0x7f30a8198750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:48.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f309c00cc60 tx=0x7f309c0074a0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 shutdown_connections 2026-03-09T17:36:48.895 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f3094077990 0x7f3094079e40 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.894+0000 7f30b097f700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f30a80ff1e0 0x7f30a8198210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.896 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.895+0000 7f30b097f700 1 --2- 192.168.123.106:0/580093766 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f30a80ffaf0 0x7f30a8198750 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:48.896 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.895+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 >> 192.168.123.106:0/580093766 conn(0x7f30a8076060 msgr2=0x7f30a80fd8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:48.896 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.895+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 shutdown_connections 2026-03-09T17:36:48.896 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:48.895+0000 7f30b097f700 1 -- 192.168.123.106:0/580093766 wait complete. 
2026-03-09T17:36:48.897 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 31 2026-03-09T17:36:48.956 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 31 2026-03-09T17:36:48.956 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 32 2026-03-09T17:36:49.100 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 -- 192.168.123.106:0/2628544262 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0108790 msgr2=0x7fdda0108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 --2- 192.168.123.106:0/2628544262 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0108790 0x7fdda0108b60 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7fdd90013e70 tx=0x7fdd900099e0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 -- 192.168.123.106:0/2628544262 shutdown_connections 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 --2- 192.168.123.106:0/2628544262 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdda0102790 0x7fdda0102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 --2- 192.168.123.106:0/2628544262 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0108790 0x7fdda0108b60 unknown :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.350 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.348+0000 7fdda8450700 1 -- 192.168.123.106:0/2628544262 >> 192.168.123.106:0/2628544262 conn(0x7fdda00fe2b0 msgr2=0x7fdda01006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.349+0000 7fdda8450700 1 -- 192.168.123.106:0/2628544262 shutdown_connections 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.349+0000 7fdda8450700 1 -- 192.168.123.106:0/2628544262 wait complete. 2026-03-09T17:36:49.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.349+0000 7fdda8450700 1 Processor -- start 2026-03-09T17:36:49.351 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda8450700 1 -- start start 2026-03-09T17:36:49.351 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda8450700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 0x7fdda01a4fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.351 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda8450700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdda0108790 0x7fdda01a5500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda8450700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdda01a5be0 con 0x7fdda0102790 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda8450700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdda01a9970 con 0x7fdda0108790 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 --2- >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 0x7fdda01a4fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 0x7fdda01a4fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:55988/0 (socket says 192.168.123.106:55988) 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 -- 192.168.123.106:0/3561249460 learned_addr learned my addr 192.168.123.106:0/3561249460 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 -- 192.168.123.106:0/3561249460 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdda0108790 msgr2=0x7fdda01a5500 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:49.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdda0108790 0x7fdda01a5500 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 -- 192.168.123.106:0/3561249460 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdd90013b50 con 0x7fdda0102790 2026-03-09T17:36:49.354 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.350+0000 7fdda61ec700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 
0x7fdda01a4fc0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7fdd90006010 tx=0x7fdd9001b640 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.351+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd90005280 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.351+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdd90013800 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.351+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd90003ea0 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.351+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdda01a9bf0 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.351+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdda01aa0e0 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.352+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdd90010670 con 0x7fdda0102790 2026-03-09T17:36:49.357 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.353+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdda004ea50 con 0x7fdda0102790 2026-03-09T17:36:49.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.355+0000 7fdd977fe700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 0x7fdd8c079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.355+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fdd9009c0d0 con 0x7fdda0102790 2026-03-09T17:36:49.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.355+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdd900d29f0 con 0x7fdda0102790 2026-03-09T17:36:49.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.356+0000 7fdda59eb700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 0x7fdd8c079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:49.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.357+0000 7fdda59eb700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 0x7fdd8c079e90 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fdda01a65e0 tx=0x7fdd9c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:49.497 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:49 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/1164518400' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T17:36:49.497 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:49 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/580093766' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T17:36:49.499 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.495+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7fdda0066e40 con 0x7fdda0102790 2026-03-09T17:36:49.499 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.498+0000 7fdd977fe700 1 -- 192.168.123.106:0/3561249460 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v35) v1 ==== 107+0+4403 (secure 0 0 0) 0x7fdd9006d020 con 0x7fdda0102790 2026-03-09T17:36:49.500 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:49.500 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":32,"btime":"2026-03-09T17:34:34:105743+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":28}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:33.110127+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:rejoin","state_seq":8,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T17:36:49.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 msgr2=0x7fdd8c079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:49.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 0x7fdd8c079e90 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fdda01a65e0 tx=0x7fdd9c006cb0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 msgr2=0x7fdda01a4fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:49.502 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 0x7fdda01a4fc0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7fdd90006010 tx=0x7fdd9001b640 comp rx=0 tx=0).stop 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 shutdown_connections 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7fdd8c0779e0 0x7fdd8c079e90 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdda0102790 0x7fdda01a4fc0 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 --2- 192.168.123.106:0/3561249460 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7fdda0108790 0x7fdda01a5500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 >> 192.168.123.106:0/3561249460 conn(0x7fdda00fe2b0 msgr2=0x7fdda00ffac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 shutdown_connections 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.501+0000 7fdda8450700 1 -- 192.168.123.106:0/3561249460 wait complete. 2026-03-09T17:36:49.503 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 32 2026-03-09T17:36:49.563 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 32 2026-03-09T17:36:49.563 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 33 2026-03-09T17:36:49.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:49 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/1164518400' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T17:36:49.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:49 vm09.local ceph-mon[97995]: from='client.? 
192.168.123.106:0/580093766' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T17:36:49.699 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:49.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 -- 192.168.123.106:0/3775626831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63600730f0 msgr2=0x7f63600734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:49.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3775626831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63600730f0 0x7f63600734c0 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f6354009b50 tx=0x7f6354009e60 comp rx=0 tx=0).stop 2026-03-09T17:36:49.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 -- 192.168.123.106:0/3775626831 shutdown_connections 2026-03-09T17:36:49.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3775626831 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6360073a00 0x7f6360110ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3775626831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f63600730f0 0x7f63600734c0 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 -- 192.168.123.106:0/3775626831 >> 192.168.123.106:0/3775626831 conn(0x7f63600fc000 msgr2=0x7f63600fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:49.946 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.944+0000 7f6367e2d700 1 -- 192.168.123.106:0/3775626831 shutdown_connections 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 -- 192.168.123.106:0/3775626831 wait complete. 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 Processor -- start 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 -- start start 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 0x7f63601a2560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63601a3130 con 0x7f6360073a00 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.945+0000 7f6367e2d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f636019c630 con 0x7f63600730f0 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:49.947 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56006/0 (socket says 192.168.123.106:56006) 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 -- 192.168.123.106:0/3467663446 learned_addr learned my addr 192.168.123.106:0/3467663446 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 -- 192.168.123.106:0/3467663446 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 msgr2=0x7f63601a2560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f6365bc9700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 0x7f63601a2560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 0x7f63601a2560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 -- 192.168.123.106:0/3467663446 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63540097e0 con 0x7f6360073a00 2026-03-09T17:36:49.947 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 
7f6365bc9700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 0x7f63601a2560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:49.948 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.946+0000 7f63653c8700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f635c00d8d0 tx=0x7f635c00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:49.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.947+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f635c009940 con 0x7f6360073a00 2026-03-09T17:36:49.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.947+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f635c010460 con 0x7f6360073a00 2026-03-09T17:36:49.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.947+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f635c00f5d0 con 0x7f6360073a00 2026-03-09T17:36:49.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.947+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f636019c910 con 0x7f6360073a00 2026-03-09T17:36:49.949 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.947+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f636019ce60 con 0x7f6360073a00 
2026-03-09T17:36:49.951 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.948+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f636010e770 con 0x7f6360073a00 2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.951+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f635c010a90 con 0x7f6360073a00 2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.952+0000 7f6352ffd700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 0x7f634c079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.952+0000 7f6365bc9700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 0x7f634c079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.952+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f635c099f40 con 0x7f6360073a00 2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.952+0000 7f6365bc9700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 0x7f634c079e40 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f635400b5c0 tx=0x7f63540058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:36:49.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:49.952+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f635c09a320 con 0x7f6360073a00 2026-03-09T17:36:50.101 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.097+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 33, "format": "json"} v 0) v1 -- 0x7f636004ea50 con 0x7f6360073a00 2026-03-09T17:36:50.102 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.101+0000 7f6352ffd700 1 -- 192.168.123.106:0/3467663446 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 33, "format": "json"}]=0 dumped fsmap epoch 33 v35) v1 ==== 107+0+5263 (secure 0 0 0) 0x7f635c067e60 con 0x7f6360073a00 2026-03-09T17:36:50.102 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:50.102 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":33,"btime":"2026-03-09T17:34:35:113342+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":28},{"gid":44275,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6827/3154236738","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3154236738},{"type":"v1","addr":"192.168.123.109:6827","nonce":3154236738}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:35.113340+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:active","state_seq":9,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34284,"qdb_cluster":[34284]},"id":1}]} 2026-03-09T17:36:50.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.103+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 msgr2=0x7f634c079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.103+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 0x7f634c079e40 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f635400b5c0 tx=0x7f63540058e0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.103+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 msgr2=0x7f63601a2aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.103+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f635c00d8d0 tx=0x7f635c00dc90 comp rx=0 tx=0).stop 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 shutdown_connections 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f634c077990 0x7f634c079e40 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f63600730f0 0x7f63601a2560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 --2- 192.168.123.106:0/3467663446 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6360073a00 0x7f63601a2aa0 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 >> 192.168.123.106:0/3467663446 conn(0x7f63600fc000 msgr2=0x7f6360102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 -- 
192.168.123.106:0/3467663446 shutdown_connections 2026-03-09T17:36:50.105 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.104+0000 7f6367e2d700 1 -- 192.168.123.106:0/3467663446 wait complete. 2026-03-09T17:36:50.106 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 33 2026-03-09T17:36:50.173 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 33 2026-03-09T17:36:50.173 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph fs dump --format=json 34 2026-03-09T17:36:50.320 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:50.362 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:50 vm06.local ceph-mon[109831]: pgmap v216: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:50.362 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:50 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/3561249460' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T17:36:50.362 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:50 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/3467663446' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.584+0000 7f49a8dd6700 1 -- 192.168.123.106:0/3290860521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 msgr2=0x7f49a4102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.584+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/3290860521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a4102bf0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f4994009b00 tx=0x7f4994009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.585+0000 7f49a8dd6700 1 -- 192.168.123.106:0/3290860521 shutdown_connections 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.585+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/3290860521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a4102bf0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.585+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/3290860521 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.586 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.585+0000 7f49a8dd6700 1 -- 192.168.123.106:0/3290860521 >> 192.168.123.106:0/3290860521 conn(0x7f49a40fe280 msgr2=0x7f49a4100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:50.587 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.586+0000 7f49a8dd6700 1 -- 192.168.123.106:0/3290860521 shutdown_connections 
2026-03-09T17:36:50.587 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.586+0000 7f49a8dd6700 1 -- 192.168.123.106:0/3290860521 wait complete. 2026-03-09T17:36:50.587 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.586+0000 7f49a8dd6700 1 Processor -- start 2026-03-09T17:36:50.587 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.586+0000 7f49a8dd6700 1 -- start start 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a8dd6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a41983c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a8dd6700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4198900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a8dd6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49a4198fe0 con 0x7f49a4102780 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a8dd6700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49a419cd70 con 0x7f49a4108780 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a1d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4198900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a1d9b700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 
0x7f49a4198900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.109:3300/0 says I am v2:192.168.123.106:46930/0 (socket says 192.168.123.106:46930) 2026-03-09T17:36:50.588 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.587+0000 7f49a1d9b700 1 -- 192.168.123.106:0/2104281983 learned_addr learned my addr 192.168.123.106:0/2104281983 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a259c700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a41983c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a1d9b700 1 -- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 msgr2=0x7f49a41983c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a1d9b700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a41983c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a1d9b700 1 -- 192.168.123.106:0/2104281983 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49940097e0 con 0x7f49a4108780 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a259c700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a41983c0 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T17:36:50.589 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.588+0000 7f49a1d9b700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4198900 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f49940094d0 tx=0x7f49940049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.589+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f499401d070 con 0x7f49a4108780 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.589+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f499400bc50 con 0x7f49a4108780 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.589+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f499400f7f0 con 0x7f49a4108780 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.589+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49a419cff0 con 0x7f49a4108780 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.589+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49a419d560 con 0x7f49a4108780 2026-03-09T17:36:50.591 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.590+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 --> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49a404ea50 con 0x7f49a4108780 2026-03-09T17:36:50.593 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.591+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4994022470 con 0x7f49a4108780 2026-03-09T17:36:50.593 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.591+0000 7f499b7fe700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 0x7f4990079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:50.593 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.591+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f499409bdf0 con 0x7f49a4108780 2026-03-09T17:36:50.593 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.592+0000 7f49a259c700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 0x7f4990079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:50.593 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.592+0000 7f49a259c700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 0x7f4990079e40 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f49a41038c0 tx=0x7f498c008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:50.594 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.593+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== 
mon.1 v2:192.168.123.109:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f49940645a0 con 0x7f49a4108780 2026-03-09T17:36:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:50 vm09.local ceph-mon[97995]: pgmap v216: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:36:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:50 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3561249460' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T17:36:50.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:50 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3467663446' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-09T17:36:50.743 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.741+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 34, "format": "json"} v 0) v1 -- 0x7f49a4066e40 con 0x7f49a4108780 2026-03-09T17:36:50.744 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.742+0000 7f499b7fe700 1 -- 192.168.123.106:0/2104281983 <== mon.1 v2:192.168.123.109:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 34, "format": "json"}]=0 dumped fsmap epoch 34 v35) v1 ==== 107+0+5262 (secure 0 0 0) 0x7f4994005c00 con 0x7f49a4108780 2026-03-09T17:36:50.744 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:36:50.744 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":34,"btime":"2026-03-09T17:34:38:189874+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44251,"name":"cephfs.vm06.gzymac","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6829/2160269265","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6828","nonce":2160269265},{"type":"v1","addr":"192.168.123.106:6829","nonce":2160269265}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":44253,"name":"cephfs.vm09.cjcawy","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6825/3810846472","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6824","nonce":3810846472},{"type":"v1","addr":"192.168.123.109:6825","nonce":3810846472}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":28},{"gid":44275,"name":"cephfs.vm09.drzmdt","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.109:6827/3154236738","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.109:6826","nonce":3154236738},{"type":"v1","addr":"192.168.123.109:6827","nonce":3154236738}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":34,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-09T17:27:09.795351+0000","modified":"2026-03-09T17:34:37.193037+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":80,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34284},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34284":{"gid":34284,"name":"cephfs.vm06.vmzmbb","rank":0,"incarnation":30,"state":"up:active","state_seq":9,"addr":"192.168.123.106:6827/571707287","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":571707287},{"type":"v1","addr":"192.168.123.106:6827","nonce":571707287}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34284,"qdb_cluster":[34284]},"id":1}]} 2026-03-09T17:36:50.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.745+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 msgr2=0x7f4990079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.746 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.745+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 0x7f4990079e40 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f49a41038c0 tx=0x7f498c008040 comp rx=0 tx=0).stop 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 >> 
[v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 msgr2=0x7f49a4198900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4198900 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f49940094d0 tx=0x7f49940049e0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 shutdown_connections 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f4990077990 0x7f4990079e40 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f49a4102780 0x7f49a41983c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 --2- 192.168.123.106:0/2104281983 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f49a4108780 0x7f49a4198900 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:50.747 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.746+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 >> 192.168.123.106:0/2104281983 conn(0x7f49a40fe280 msgr2=0x7f49a40ffa10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:50.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.747+0000 7f49a8dd6700 1 -- 
192.168.123.106:0/2104281983 shutdown_connections 2026-03-09T17:36:50.748 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:50.747+0000 7f49a8dd6700 1 -- 192.168.123.106:0/2104281983 wait complete. 2026-03-09T17:36:50.749 INFO:teuthology.orchestra.run.vm06.stderr:dumped fsmap epoch 34 2026-03-09T17:36:50.807 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-09T17:36:50.811 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 2026-03-09T17:36:50.812 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:36:50.812 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T17:36:50.828 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:36:50.829 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T17:36:50.887 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd blocklist ls 2026-03-09T17:36:51.078 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:51.311 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:51 vm06.local ceph-mon[109831]: from='client.? 
192.168.123.106:0/2104281983' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- 192.168.123.106:0/4103664204 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec00a4f30 msgr2=0x7f6ec00a5300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 --2- 192.168.123.106:0/4103664204 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec00a4f30 0x7f6ec00a5300 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f6eb80077e0 tx=0x7f6eb8007af0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- 192.168.123.106:0/4103664204 shutdown_connections 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 --2- 192.168.123.106:0/4103664204 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 0x7f6ec00a8e90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 --2- 192.168.123.106:0/4103664204 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec00a4f30 0x7f6ec00a5300 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- 192.168.123.106:0/4103664204 >> 192.168.123.106:0/4103664204 conn(0x7f6ec001a290 msgr2=0x7f6ec001a690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- 192.168.123.106:0/4103664204 shutdown_connections 
2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- 192.168.123.106:0/4103664204 wait complete. 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 Processor -- start 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- start start 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 0x7f6ec000f770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 0x7f6ec0010120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.343 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ec00142f0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.338+0000 7f6ec759e700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ec0014430 con 0x7f6ec00a58d0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 0x7f6ec0010120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 
0x7f6ec0010120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:56048/0 (socket says 192.168.123.106:56048) 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 -- 192.168.123.106:0/3194578090 learned_addr learned my addr 192.168.123.106:0/3194578090 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec659c700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 0x7f6ec000f770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 -- 192.168.123.106:0/3194578090 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 msgr2=0x7f6ec000f770 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 0x7f6ec000f770 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 -- 192.168.123.106:0/3194578090 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6eb8007430 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec5d9b700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 0x7f6ec0010120 secure :-1 s=READY 
pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f6ebc00d900 tx=0x7f6ebc00dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ebc0098e0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ec00146b0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.339+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ec0014c00 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.340+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6ebc010460 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.340+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ebc00f5d0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.340+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6ebc00f7b0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.345+0000 7f6eb77fe700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 
0x7f6eb007a030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.345+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f6ebc09a630 con 0x7f6ec000fcb0 2026-03-09T17:36:51.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.345+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ea8005320 con 0x7f6ec000fcb0 2026-03-09T17:36:51.349 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.348+0000 7f6ec659c700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 0x7f6eb007a030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:51.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.348+0000 7f6ec659c700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 0x7f6eb007a030 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f6eb8007ef0 tx=0x7f6eb8020040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T17:36:51.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.348+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6ebc062de0 con 0x7f6ec000fcb0 2026-03-09T17:36:51.479 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.478+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f6ea8005f70 con 0x7f6ec000fcb0 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.479+0000 7f6eb77fe700 1 -- 192.168.123.106:0/3194578090 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 37 entries v80) v1 ==== 81+0+2265 (secure 0 0 0) 0x7f6ebc020080 con 0x7f6ec000fcb0 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6826/3078962403 2026-03-10T17:34:28.276802+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6829/4261949342 2026-03-10T17:34:12.303307+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6828/4261949342 2026-03-10T17:34:12.303307+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6826/649840868 2026-03-10T17:34:01.366659+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6828/111652423 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/1672023524 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3405957808 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3891378720 2026-03-10T17:25:23.192748+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6829/111652423 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/1414971993 2026-03-10T17:25:23.192748+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/85092552 2026-03-10T17:30:02.509732+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3617020233 
2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2965444141 2026-03-10T17:24:32.871932+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/687723179 2026-03-10T17:24:47.307802+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/1363105795 2026-03-10T17:30:02.509732+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6801/2 2026-03-10T17:24:32.871932+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3698608392 2026-03-10T17:25:23.192748+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6800/2 2026-03-10T17:24:32.871932+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2798770433 2026-03-10T17:24:47.307802+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/3162557461 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3665301257 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/795371929 2026-03-10T17:24:47.307802+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/3211192742 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6800/1431796821 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/2078421168 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.480 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/290944557 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3292834485 2026-03-10T17:30:02.509732+0000 2026-03-09T17:36:51.481 
INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6827/649840868 2026-03-10T17:34:01.366659+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/908952367 2026-03-10T17:30:26.375971+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3673999554 2026-03-10T17:24:32.871932+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2700778697 2026-03-10T17:24:32.871932+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/4245576415 2026-03-10T17:30:02.509732+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6827/3078962403 2026-03-10T17:34:28.276802+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3709236964 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2987883560 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6801/1431796821 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.481 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3516141428 2026-03-10T17:30:58.278779+0000 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.481+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 msgr2=0x7f6eb007a030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.481+0000 7f6ec759e700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 0x7f6eb007a030 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f6eb8007ef0 tx=0x7f6eb8020040 comp rx=0 tx=0).stop 2026-03-09T17:36:51.483 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 msgr2=0x7f6ec0010120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 0x7f6ec0010120 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f6ebc00d900 tx=0x7f6ebc00dc10 comp rx=0 tx=0).stop 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 shutdown_connections 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6eb0077b80 0x7f6eb007a030 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6ec00a58d0 0x7f6ec000f770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 --2- 192.168.123.106:0/3194578090 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ec000fcb0 0x7f6ec0010120 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 >> 192.168.123.106:0/3194578090 conn(0x7f6ec001a290 msgr2=0x7f6ec00a2b50 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 shutdown_connections 2026-03-09T17:36:51.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.482+0000 7f6ec759e700 1 -- 192.168.123.106:0/3194578090 wait complete. 2026-03-09T17:36:51.484 INFO:teuthology.orchestra.run.vm06.stderr:listed 37 entries 2026-03-09T17:36:51.527 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T17:36:51.527 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T17:36:51.544 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph osd blocklist ls 2026-03-09T17:36:51.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:51 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/2104281983' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-09T17:36:51.731 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config 2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.974+0000 7f6f2132d700 1 -- 192.168.123.106:0/2132464209 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c108810 msgr2=0x7f6f1c108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 --2- 192.168.123.106:0/2132464209 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c108810 0x7f6f1c108be0 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7f6f04009b00 tx=0x7f6f04009e10 comp rx=0 tx=0).stop 2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 -- 192.168.123.106:0/2132464209 shutdown_connections 
2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 --2- 192.168.123.106:0/2132464209 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c102810 0x7f6f1c102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 --2- 192.168.123.106:0/2132464209 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c108810 0x7f6f1c108be0 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 -- 192.168.123.106:0/2132464209 >> 192.168.123.106:0/2132464209 conn(0x7f6f1c0fe330 msgr2=0x7f6f1c100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 -- 192.168.123.106:0/2132464209 shutdown_connections 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.975+0000 7f6f2132d700 1 -- 192.168.123.106:0/2132464209 wait complete. 
2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 Processor -- start 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 -- start start 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 --2- >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c108810 0x7f6f1c198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f1c198f70 con 0x7f6f1c102810 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f2132d700 1 -- --> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f1c1990b0 con 0x7f6f1c108810 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f1affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f1affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am 
v2:192.168.123.106:56060/0 (socket says 192.168.123.106:56060) 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f1affd700 1 -- 192.168.123.106:0/389949256 learned_addr learned my addr 192.168.123.106:0/389949256 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T17:36:51.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.976+0000 7f6f1affd700 1 -- 192.168.123.106:0/389949256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c108810 msgr2=0x7f6f1c198920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T17:36:51.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f1a7fc700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c108810 0x7f6f1c198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T17:36:51.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f1affd700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c108810 0x7f6f1c198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T17:36:51.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f1affd700 1 -- 192.168.123.106:0/389949256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f040097e0 con 0x7f6f1c102810 2026-03-09T17:36:51.978 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f1affd700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7f6f04000c00 tx=0x7f6f04005de0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-09T17:36:51.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f0401d070 con 0x7f6f1c102810 2026-03-09T17:36:51.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6f0400bc50 con 0x7f6f1c102810 2026-03-09T17:36:51.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f040217a0 con 0x7f6f1c102810 2026-03-09T17:36:51.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f1c19cea0 con 0x7f6f1c102810 2026-03-09T17:36:51.979 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.977+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f1c19d390 con 0x7f6f1c102810 2026-03-09T17:36:51.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.978+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6f0402b430 con 0x7f6f1c102810 2026-03-09T17:36:51.980 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.979+0000 7f6f13fff700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 0x7f6f08079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T17:36:51.980 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.979+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f6f0409bfa0 con 0x7f6f1c102810
2026-03-09T17:36:51.983 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.979+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f1c04ea50 con 0x7f6f1c102810
2026-03-09T17:36:51.983 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.981+0000 7f6f1a7fc700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 0x7f6f08079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T17:36:51.983 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.981+0000 7f6f1a7fc700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 0x7f6f08079dc0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f6f1c199a00 tx=0x7f6f0c005d50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T17:36:51.983 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:51.982+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6f04064810 con 0x7f6f1c102810
2026-03-09T17:36:52.103 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.102+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f6f1c066e40 con 0x7f6f1c102810
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.102+0000 7f6f13fff700 1 -- 192.168.123.106:0/389949256 <== mon.0 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 37 entries v80) v1 ==== 81+0+2265 (secure 0 0 0) 0x7f6f04026020 con 0x7f6f1c102810
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6826/3078962403 2026-03-10T17:34:28.276802+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6829/4261949342 2026-03-10T17:34:12.303307+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6828/4261949342 2026-03-10T17:34:12.303307+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6826/649840868 2026-03-10T17:34:01.366659+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6828/111652423 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/1672023524 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3405957808 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3891378720 2026-03-10T17:25:23.192748+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6829/111652423 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/1414971993 2026-03-10T17:25:23.192748+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/85092552 2026-03-10T17:30:02.509732+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3617020233 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2965444141 2026-03-10T17:24:32.871932+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/687723179 2026-03-10T17:24:47.307802+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/1363105795 2026-03-10T17:30:02.509732+0000
2026-03-09T17:36:52.104 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6801/2 2026-03-10T17:24:32.871932+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3698608392 2026-03-10T17:25:23.192748+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6800/2 2026-03-10T17:24:32.871932+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2798770433 2026-03-10T17:24:47.307802+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/3162557461 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3665301257 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/795371929 2026-03-10T17:24:47.307802+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/3211192742 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6800/1431796821 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/2078421168 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/290944557 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3292834485 2026-03-10T17:30:02.509732+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6827/649840868 2026-03-10T17:34:01.366659+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:0/908952367 2026-03-10T17:30:26.375971+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3673999554 2026-03-10T17:24:32.871932+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2700778697 2026-03-10T17:24:32.871932+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/4245576415 2026-03-10T17:30:02.509732+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.109:6827/3078962403 2026-03-10T17:34:28.276802+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3709236964 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/2987883560 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:6801/1431796821 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.105 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.106:0/3516141428 2026-03-10T17:30:58.278779+0000
2026-03-09T17:36:52.106 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.105+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 msgr2=0x7f6f08079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:36:52.106 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.105+0000 7f6f2132d700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 0x7f6f08079dc0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f6f1c199a00 tx=0x7f6f0c005d50 comp rx=0 tx=0).stop
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 msgr2=0x7f6f1c1983e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7f6f04000c00 tx=0x7f6f04005de0 comp rx=0 tx=0).stop
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 shutdown_connections
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:6800/431144776,v1:192.168.123.106:6801/431144776] conn(0x7f6f08077910 0x7f6f08079dc0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f1c102810 0x7f6f1c1983e0 unknown :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 --2- 192.168.123.106:0/389949256 >> [v2:192.168.123.109:3300/0,v1:192.168.123.109:6789/0] conn(0x7f6f1c108810 0x7f6f1c198920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 >> 192.168.123.106:0/389949256 conn(0x7f6f1c0fe330 msgr2=0x7f6f1c0ffb50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 shutdown_connections
2026-03-09T17:36:52.107 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T17:36:52.106+0000 7f6f2132d700 1 -- 192.168.123.106:0/389949256 wait complete.
2026-03-09T17:36:52.108 INFO:teuthology.orchestra.run.vm06.stderr:listed 37 entries
2026-03-09T17:36:52.176 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm06.local...
2026-03-09T17:36:52.177 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T17:36:52.177 DEBUG:teuthology.orchestra.run.vm06:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0
2026-03-09T17:36:52.203 INFO:teuthology.orchestra.run:waiting for 300
2026-03-09T17:36:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:52 vm06.local ceph-mon[109831]: pgmap v217: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:36:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:52 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/3194578090' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
2026-03-09T17:36:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:52 vm06.local ceph-mon[109831]: from='client.? 192.168.123.106:0/389949256' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
2026-03-09T17:36:52.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:52 vm09.local ceph-mon[97995]: pgmap v217: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:36:52.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:52 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/3194578090' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
2026-03-09T17:36:52.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:52 vm09.local ceph-mon[97995]: from='client.? 192.168.123.106:0/389949256' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
2026-03-09T17:36:54.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:54 vm06.local ceph-mon[109831]: pgmap v218: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:36:54.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:54 vm09.local ceph-mon[97995]: pgmap v218: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:36:56.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:56 vm06.local ceph-mon[109831]: pgmap v219: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:36:56.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:56 vm09.local ceph-mon[97995]: pgmap v219: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:36:58.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:58 vm06.local ceph-mon[109831]: pgmap v220: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:36:58.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:58 vm09.local ceph-mon[97995]: pgmap v220: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:36:59.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:36:59 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:36:59.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:36:59 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:00.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:00 vm06.local ceph-mon[109831]: pgmap v221: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:00.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:00 vm09.local ceph-mon[97995]: pgmap v221: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:02.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:02 vm06.local ceph-mon[109831]: pgmap v222: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:02.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:02 vm09.local ceph-mon[97995]: pgmap v222: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:03.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:03 vm06.local ceph-mon[109831]: pgmap v223: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:03.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:03 vm09.local ceph-mon[97995]: pgmap v223: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:05.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:05 vm06.local ceph-mon[109831]: pgmap v224: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:05 vm09.local ceph-mon[97995]: pgmap v224: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:07.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:07 vm09.local ceph-mon[97995]: pgmap v225: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:07.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:07 vm06.local ceph-mon[109831]: pgmap v225: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:09.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:09 vm06.local ceph-mon[109831]: pgmap v226: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:09.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:09 vm09.local ceph-mon[97995]: pgmap v226: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:11.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:11 vm06.local ceph-mon[109831]: pgmap v227: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:11.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:11 vm09.local ceph-mon[97995]: pgmap v227: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:13 vm06.local ceph-mon[109831]: pgmap v228: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:13.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:13.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:13 vm09.local ceph-mon[97995]: pgmap v228: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:13.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:15 vm06.local ceph-mon[109831]: pgmap v229: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:15 vm09.local ceph-mon[97995]: pgmap v229: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:17.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:17 vm06.local ceph-mon[109831]: pgmap v230: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:17 vm09.local ceph-mon[97995]: pgmap v230: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:19 vm06.local ceph-mon[109831]: pgmap v231: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:19.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:19 vm09.local ceph-mon[97995]: pgmap v231: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:37:20.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:37:20.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:37:20.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:37:20.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:37:20.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:37:20.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:37:20.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:37:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:21 vm06.local ceph-mon[109831]: pgmap v232: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:22.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:21 vm09.local ceph-mon[97995]: pgmap v232: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:24.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:23 vm06.local ceph-mon[109831]: pgmap v233: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:24.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:23 vm09.local ceph-mon[97995]: pgmap v233: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:25 vm06.local ceph-mon[109831]: pgmap v234: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:26.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:25 vm09.local ceph-mon[97995]: pgmap v234: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:28.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:27 vm06.local ceph-mon[109831]: pgmap v235: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:28.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:27 vm09.local ceph-mon[97995]: pgmap v235: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:29.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:29.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:30.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:29 vm06.local ceph-mon[109831]: pgmap v236: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:30.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:29 vm09.local ceph-mon[97995]: pgmap v236: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:32.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:31 vm06.local ceph-mon[109831]: pgmap v237: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:32.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:31 vm09.local ceph-mon[97995]: pgmap v237: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:34.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:33 vm06.local ceph-mon[109831]: pgmap v238: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:34.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:33 vm09.local ceph-mon[97995]: pgmap v238: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:36.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:35 vm06.local ceph-mon[109831]: pgmap v239: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:36.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:35 vm09.local ceph-mon[97995]: pgmap v239: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:37.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:37 vm09.local ceph-mon[97995]: pgmap v240: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:38.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:37 vm06.local ceph-mon[109831]: pgmap v240: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:39 vm06.local ceph-mon[109831]: pgmap v241: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:39 vm09.local ceph-mon[97995]: pgmap v241: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:42.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:41 vm06.local ceph-mon[109831]: pgmap v242: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:41 vm09.local ceph-mon[97995]: pgmap v242: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:43 vm06.local ceph-mon[109831]: pgmap v243: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:43 vm09.local ceph-mon[97995]: pgmap v243: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:44.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:46.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:45 vm06.local ceph-mon[109831]: pgmap v244: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:45 vm09.local ceph-mon[97995]: pgmap v244: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:48.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:47 vm09.local ceph-mon[97995]: pgmap v245: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:47 vm06.local ceph-mon[109831]: pgmap v245: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:50.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:49 vm06.local ceph-mon[109831]: pgmap v246: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:50.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:49 vm09.local ceph-mon[97995]: pgmap v246: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:51 vm06.local ceph-mon[109831]: pgmap v247: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:51 vm09.local ceph-mon[97995]: pgmap v247: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:54.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:53 vm06.local ceph-mon[109831]: pgmap v248: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:54.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:53 vm09.local ceph-mon[97995]: pgmap v248: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:56.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:55 vm06.local ceph-mon[109831]: pgmap v249: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:56.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:55 vm09.local ceph-mon[97995]: pgmap v249: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:37:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:57 vm09.local ceph-mon[97995]: pgmap v250: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:58.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:57 vm06.local ceph-mon[109831]: pgmap v250: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:37:59.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:37:59.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:38:00.292 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:37:59 vm06.local ceph-mon[109831]: pgmap v251: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:00.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:37:59 vm09.local ceph-mon[97995]: pgmap v251: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:02.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:01 vm06.local ceph-mon[109831]: pgmap v252: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:02.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:01 vm09.local ceph-mon[97995]: pgmap v252: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:04.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:03 vm06.local ceph-mon[109831]: pgmap v253: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:04.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:03 vm09.local ceph-mon[97995]: pgmap v253: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:06.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:05 vm06.local ceph-mon[109831]: pgmap v254: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:06.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:05 vm09.local ceph-mon[97995]: pgmap v254: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:08.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:07 vm06.local ceph-mon[109831]: pgmap v255: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:08.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:07 vm09.local ceph-mon[97995]: pgmap v255: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:10.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:10 vm06.local ceph-mon[109831]: pgmap v256: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:10.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:10 vm09.local ceph-mon[97995]: pgmap v256: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:12.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:12 vm06.local ceph-mon[109831]: pgmap v257: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:12.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:12 vm09.local ceph-mon[97995]: pgmap v257: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:14.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:14 vm06.local ceph-mon[109831]: pgmap v258: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:14.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:14 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:38:14.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:14 vm09.local ceph-mon[97995]: pgmap v258: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:14.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:14 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T17:38:16.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:16 vm06.local ceph-mon[109831]: pgmap v259: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:16.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:16 vm09.local ceph-mon[97995]: pgmap v259: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:18.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:18 vm06.local ceph-mon[109831]: pgmap v260: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:18 vm09.local ceph-mon[97995]: pgmap v260: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:20.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:20 vm06.local ceph-mon[109831]: pgmap v261: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:20.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:38:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:20 vm09.local ceph-mon[97995]: pgmap v261: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T17:38:21.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:38:21.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:38:21.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:38:21.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T17:38:21.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T17:38:21.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei'
2026-03-09T17:38:22.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:22 vm06.local ceph-mon[109831]: pgmap v262: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:22.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:22 vm09.local ceph-mon[97995]: pgmap v262: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:24.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:24 vm06.local ceph-mon[109831]: pgmap v263: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:24.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:24 vm09.local ceph-mon[97995]: pgmap v263: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:26.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:26 vm06.local ceph-mon[109831]: pgmap v264: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:26.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:26 vm09.local ceph-mon[97995]: pgmap v264: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s
2026-03-09T17:38:28.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:28 vm09.local ceph-mon[97995]: pgmap v265: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:28.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:28 vm06.local ceph-mon[109831]: pgmap v265: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:38:29.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09
17:38:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:38:29.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:38:30.543 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:30 vm06.local ceph-mon[109831]: pgmap v266: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:30.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:30 vm09.local ceph-mon[97995]: pgmap v266: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:32.451 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:32 vm06.local ceph-mon[109831]: pgmap v267: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:32.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:32 vm09.local ceph-mon[97995]: pgmap v267: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:34.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:34 vm06.local ceph-mon[109831]: pgmap v268: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:34.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:34 vm09.local ceph-mon[97995]: pgmap v268: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:36.447 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:36 vm09.local ceph-mon[97995]: pgmap v269: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:36.641 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:36 vm06.local ceph-mon[109831]: pgmap v269: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:38.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:38 vm06.local ceph-mon[109831]: pgmap v270: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:38.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:38 vm09.local ceph-mon[97995]: pgmap v270: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:40.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:40 vm06.local ceph-mon[109831]: pgmap v271: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:40.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:40 vm09.local ceph-mon[97995]: pgmap v271: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:42.560 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:42 vm06.local ceph-mon[109831]: pgmap v272: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:42.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:42 vm09.local ceph-mon[97995]: pgmap v272: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:44.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:44 vm06.local ceph-mon[109831]: pgmap v273: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:44.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-09T17:38:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:44 vm09.local ceph-mon[97995]: pgmap v273: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:44.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:38:46.641 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:46 vm06.local ceph-mon[109831]: pgmap v274: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:46.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:46 vm09.local ceph-mon[97995]: pgmap v274: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:47.615 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:47 vm06.local ceph-mon[109831]: pgmap v275: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:47.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:47 vm09.local ceph-mon[97995]: pgmap v275: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:49.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:49 vm06.local ceph-mon[109831]: pgmap v276: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:49.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:49 vm09.local ceph-mon[97995]: pgmap v276: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:51.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:51 vm06.local ceph-mon[109831]: pgmap v277: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 
2 op/s 2026-03-09T17:38:51.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:51 vm09.local ceph-mon[97995]: pgmap v277: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:53.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:53 vm09.local ceph-mon[97995]: pgmap v278: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:53.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:53 vm06.local ceph-mon[109831]: pgmap v278: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:55 vm06.local ceph-mon[109831]: pgmap v279: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:55 vm09.local ceph-mon[97995]: pgmap v279: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:38:57.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:57 vm06.local ceph-mon[109831]: pgmap v280: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:57.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:57 vm09.local ceph-mon[97995]: pgmap v280: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:58.645 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:38:58.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd 
blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:38:59.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:38:59 vm06.local ceph-mon[109831]: pgmap v281: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:38:59.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:38:59 vm09.local ceph-mon[97995]: pgmap v281: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:01.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:01 vm06.local ceph-mon[109831]: pgmap v282: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:01.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:01 vm09.local ceph-mon[97995]: pgmap v282: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:03.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:03 vm06.local ceph-mon[109831]: pgmap v283: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:03.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:03 vm09.local ceph-mon[97995]: pgmap v283: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:05.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:05 vm06.local ceph-mon[109831]: pgmap v284: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:05.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:05 vm09.local ceph-mon[97995]: pgmap v284: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:07.835 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:07 vm06.local ceph-mon[109831]: pgmap v285: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 
GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:07.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:07 vm09.local ceph-mon[97995]: pgmap v285: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:09.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:09 vm06.local ceph-mon[109831]: pgmap v286: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:09.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:09 vm09.local ceph-mon[97995]: pgmap v286: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:11.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:11 vm06.local ceph-mon[109831]: pgmap v287: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:11.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:11 vm09.local ceph-mon[97995]: pgmap v287: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:13.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:13 vm06.local ceph-mon[109831]: pgmap v288: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:13.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:13.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:13 vm09.local ceph-mon[97995]: pgmap v288: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:13.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:15.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:15 vm06.local ceph-mon[109831]: pgmap v289: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:15.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:15 vm09.local ceph-mon[97995]: pgmap v289: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:17.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:17 vm06.local ceph-mon[109831]: pgmap v290: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:17.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:17 vm09.local ceph-mon[97995]: pgmap v290: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:19 vm06.local ceph-mon[109831]: pgmap v291: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:19.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:19 vm09.local ceph-mon[97995]: pgmap v291: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:39:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:39:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:20 vm06.local ceph-mon[109831]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:39:20.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:39:20.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:39:20.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:39:20.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:39:20.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:39:21.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:21 vm06.local ceph-mon[109831]: pgmap v292: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:21.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:21 vm09.local ceph-mon[97995]: pgmap v292: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:23.895 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:23 vm09.local ceph-mon[97995]: pgmap v293: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:24.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:23 vm06.local ceph-mon[109831]: pgmap v293: 65 pgs: 65 
active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:26.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:25 vm06.local ceph-mon[109831]: pgmap v294: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:26.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:25 vm09.local ceph-mon[97995]: pgmap v294: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:28.058 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:27 vm06.local ceph-mon[109831]: pgmap v295: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:28.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:27 vm09.local ceph-mon[97995]: pgmap v295: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:29.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:29.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:30.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:29 vm06.local ceph-mon[109831]: pgmap v296: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:30.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:29 vm09.local ceph-mon[97995]: pgmap v296: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:32.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:31 vm06.local 
ceph-mon[109831]: pgmap v297: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:32.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:31 vm09.local ceph-mon[97995]: pgmap v297: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:34.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:33 vm06.local ceph-mon[109831]: pgmap v298: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:34.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:33 vm09.local ceph-mon[97995]: pgmap v298: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:36.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:35 vm06.local ceph-mon[109831]: pgmap v299: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:36.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:35 vm09.local ceph-mon[97995]: pgmap v299: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:38.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:37 vm06.local ceph-mon[109831]: pgmap v300: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:38.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:37 vm09.local ceph-mon[97995]: pgmap v300: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:40.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:39 vm06.local ceph-mon[109831]: pgmap v301: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:40.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:39 vm09.local 
ceph-mon[97995]: pgmap v301: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:42.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:41 vm06.local ceph-mon[109831]: pgmap v302: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:42.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:41 vm09.local ceph-mon[97995]: pgmap v302: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:44.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:43 vm06.local ceph-mon[109831]: pgmap v303: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:44.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:43 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:44.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:43 vm09.local ceph-mon[97995]: pgmap v303: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:44.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:43 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:46.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:45 vm06.local ceph-mon[109831]: pgmap v304: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:46.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:45 vm09.local ceph-mon[97995]: pgmap v304: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:48.141 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:47 vm06.local ceph-mon[109831]: pgmap v305: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:48.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:47 vm09.local ceph-mon[97995]: pgmap v305: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:50.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:49 vm06.local ceph-mon[109831]: pgmap v306: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:50.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:49 vm09.local ceph-mon[97995]: pgmap v306: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:52.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:51 vm06.local ceph-mon[109831]: pgmap v307: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:52.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:51 vm09.local ceph-mon[97995]: pgmap v307: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:54.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:53 vm06.local ceph-mon[109831]: pgmap v308: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:54.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:53 vm09.local ceph-mon[97995]: pgmap v308: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:56.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:55 vm06.local ceph-mon[109831]: pgmap v309: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:56.144 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:55 vm09.local ceph-mon[97995]: pgmap v309: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:39:58.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:57 vm06.local ceph-mon[109831]: pgmap v310: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:58.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:57 vm09.local ceph-mon[97995]: pgmap v310: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:39:59.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:39:59.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:00.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:39:59 vm06.local ceph-mon[109831]: pgmap v311: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:00.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:39:59 vm09.local ceph-mon[97995]: pgmap v311: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:01.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:00 vm06.local ceph-mon[109831]: overall HEALTH_OK 2026-03-09T17:40:01.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:00 vm09.local ceph-mon[97995]: overall HEALTH_OK 2026-03-09T17:40:02.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:01 vm06.local ceph-mon[109831]: pgmap v312: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 
119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:02.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:01 vm09.local ceph-mon[97995]: pgmap v312: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:03 vm06.local ceph-mon[109831]: pgmap v313: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:04.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:03 vm09.local ceph-mon[97995]: pgmap v313: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:06.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:05 vm06.local ceph-mon[109831]: pgmap v314: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:06.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:05 vm09.local ceph-mon[97995]: pgmap v314: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:08.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:07 vm06.local ceph-mon[109831]: pgmap v315: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:07 vm09.local ceph-mon[97995]: pgmap v315: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:10.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:09 vm06.local ceph-mon[109831]: pgmap v316: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:10.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:09 vm09.local ceph-mon[97995]: pgmap v316: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 
GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:12.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:11 vm06.local ceph-mon[109831]: pgmap v317: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:12.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:11 vm09.local ceph-mon[97995]: pgmap v317: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:14.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:13 vm06.local ceph-mon[109831]: pgmap v318: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:14.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:14.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:13 vm09.local ceph-mon[97995]: pgmap v318: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:14.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:16.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:15 vm06.local ceph-mon[109831]: pgmap v319: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:16.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:15 vm09.local ceph-mon[97995]: pgmap v319: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:18.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:17 vm06.local ceph-mon[109831]: pgmap v320: 65 pgs: 65 active+clean; 216 
MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:18.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:17 vm09.local ceph-mon[97995]: pgmap v320: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:20.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:19 vm06.local ceph-mon[109831]: pgmap v321: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:20.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:19 vm09.local ceph-mon[97995]: pgmap v321: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:21.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:40:21.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:40:21.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:40:21.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:20 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:40:21.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:40:21.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:20 vm09.local ceph-mon[97995]: from='mgr.34104 
192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:40:21.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:40:21.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:20 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:40:22.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:21 vm06.local ceph-mon[109831]: pgmap v322: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:22.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:21 vm09.local ceph-mon[97995]: pgmap v322: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:24.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:24 vm06.local ceph-mon[109831]: pgmap v323: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:24.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:24 vm09.local ceph-mon[97995]: pgmap v323: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:26.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:26 vm06.local ceph-mon[109831]: pgmap v324: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:26.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:26 vm09.local ceph-mon[97995]: pgmap v324: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:28.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:28 vm06.local ceph-mon[109831]: pgmap v325: 65 pgs: 65 
active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:28.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:28 vm09.local ceph-mon[97995]: pgmap v325: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:29.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:29 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:29.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:29 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:30.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:30 vm06.local ceph-mon[109831]: pgmap v326: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:30.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:30 vm09.local ceph-mon[97995]: pgmap v326: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:32.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:32 vm06.local ceph-mon[109831]: pgmap v327: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:32 vm09.local ceph-mon[97995]: pgmap v327: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:34 vm06.local ceph-mon[109831]: pgmap v328: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:34.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:34 vm09.local 
ceph-mon[97995]: pgmap v328: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:36 vm06.local ceph-mon[109831]: pgmap v329: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:36 vm09.local ceph-mon[97995]: pgmap v329: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:38 vm06.local ceph-mon[109831]: pgmap v330: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:38.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:38 vm09.local ceph-mon[97995]: pgmap v330: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:40 vm06.local ceph-mon[109831]: pgmap v331: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:40.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:40 vm09.local ceph-mon[97995]: pgmap v331: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:42.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:42 vm06.local ceph-mon[109831]: pgmap v332: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:42 vm09.local ceph-mon[97995]: pgmap v332: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:44 vm06.local 
ceph-mon[109831]: pgmap v333: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:44 vm09.local ceph-mon[97995]: pgmap v333: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:46.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:46 vm06.local ceph-mon[109831]: pgmap v334: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:46 vm09.local ceph-mon[97995]: pgmap v334: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:48 vm06.local ceph-mon[109831]: pgmap v335: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:48 vm09.local ceph-mon[97995]: pgmap v335: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:50.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:50 vm06.local ceph-mon[109831]: pgmap v336: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:50.394 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:50 vm09.local ceph-mon[97995]: pgmap v336: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:52.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:52 vm06.local ceph-mon[109831]: pgmap v337: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:52.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:52 vm09.local ceph-mon[97995]: pgmap v337: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:53.642 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:53 vm06.local ceph-mon[109831]: pgmap v338: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:53.644 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:53 vm09.local ceph-mon[97995]: pgmap v338: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:55.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:55 vm06.local ceph-mon[109831]: pgmap v339: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:55.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:55 vm09.local ceph-mon[97995]: pgmap v339: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:40:57.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:57 vm06.local ceph-mon[109831]: pgmap v340: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:57.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:57 vm09.local ceph-mon[97995]: pgmap v340: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:58.891 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:58 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:58.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:58 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:40:59.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:40:59 vm06.local ceph-mon[109831]: pgmap v341: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:40:59.894 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:40:59 vm09.local ceph-mon[97995]: pgmap v341: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:02.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:01 vm06.local ceph-mon[109831]: pgmap v342: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:02.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:01 vm09.local ceph-mon[97995]: pgmap v342: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:04.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:03 vm06.local ceph-mon[109831]: pgmap v343: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:04.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:03 vm09.local ceph-mon[97995]: pgmap v343: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:06.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:05 vm06.local ceph-mon[109831]: pgmap v344: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s 
rd, 2 op/s 2026-03-09T17:41:06.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:05 vm09.local ceph-mon[97995]: pgmap v344: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:08.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:07 vm06.local ceph-mon[109831]: pgmap v345: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:08.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:07 vm09.local ceph-mon[97995]: pgmap v345: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:09.947 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:09 vm09.local ceph-mon[97995]: pgmap v346: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:10.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:09 vm06.local ceph-mon[109831]: pgmap v346: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:12.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:11 vm06.local ceph-mon[109831]: pgmap v347: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:12.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:11 vm09.local ceph-mon[97995]: pgmap v347: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:14.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:13 vm06.local ceph-mon[109831]: pgmap v348: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:14.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:13 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": 
"json"}]: dispatch 2026-03-09T17:41:14.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:13 vm09.local ceph-mon[97995]: pgmap v348: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:14.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:13 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:41:16.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:15 vm06.local ceph-mon[109831]: pgmap v349: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:16.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:15 vm09.local ceph-mon[97995]: pgmap v349: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:18.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:17 vm06.local ceph-mon[109831]: pgmap v350: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:18.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:17 vm09.local ceph-mon[97995]: pgmap v350: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:20.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:19 vm06.local ceph-mon[109831]: pgmap v351: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:20.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:19 vm09.local ceph-mon[97995]: pgmap v351: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: pgmap v352: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB 
avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:41:22.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:21 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: pgmap v352: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T17:41:22.145 
INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm09", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T17:41:22.145 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:21 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' 2026-03-09T17:41:24.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:23 vm06.local ceph-mon[109831]: pgmap v353: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:24.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:23 vm09.local ceph-mon[97995]: pgmap v353: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:26.142 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:25 vm06.local ceph-mon[109831]: pgmap v354: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:26.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:25 vm09.local ceph-mon[97995]: pgmap v354: 65 pgs: 65 active+clean; 216 
MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:28.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:27 vm06.local ceph-mon[109831]: pgmap v355: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:28.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:27 vm09.local ceph-mon[97995]: pgmap v355: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:29.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:28 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:41:29.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:28 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:41:30.144 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:29 vm09.local ceph-mon[97995]: pgmap v356: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:30.392 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:29 vm06.local ceph-mon[109831]: pgmap v356: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:32.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:31 vm06.local ceph-mon[109831]: pgmap v357: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:32.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:31 vm09.local ceph-mon[97995]: pgmap v357: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:34.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:33 vm06.local ceph-mon[109831]: pgmap 
v358: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:34.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:33 vm09.local ceph-mon[97995]: pgmap v358: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:36.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:35 vm06.local ceph-mon[109831]: pgmap v359: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:36.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:35 vm09.local ceph-mon[97995]: pgmap v359: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:38.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:38 vm06.local ceph-mon[109831]: pgmap v360: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:38.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:38 vm09.local ceph-mon[97995]: pgmap v360: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:40.197 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:40 vm09.local ceph-mon[97995]: pgmap v361: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:40.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:40 vm06.local ceph-mon[109831]: pgmap v361: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:42.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:42 vm06.local ceph-mon[109831]: pgmap v362: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:42.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:42 vm09.local ceph-mon[97995]: pgmap v362: 65 pgs: 65 
active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:44 vm06.local ceph-mon[109831]: pgmap v363: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:44.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:44 vm06.local ceph-mon[109831]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:41:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:44 vm09.local ceph-mon[97995]: pgmap v363: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:44.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:44 vm09.local ceph-mon[97995]: from='mgr.34104 192.168.123.106:0/362681300' entity='mgr.vm06.pbgzei' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T17:41:46.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:46 vm06.local ceph-mon[109831]: pgmap v364: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:46.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:46 vm09.local ceph-mon[97995]: pgmap v364: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s 2026-03-09T17:41:48.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:48 vm06.local ceph-mon[109831]: pgmap v365: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:48.394 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:48 vm09.local ceph-mon[97995]: pgmap v365: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s 2026-03-09T17:41:50.391 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:50 vm06.local 
ceph-mon[109831]: pgmap v366: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:41:50.395 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:50 vm09.local ceph-mon[97995]: pgmap v366: 65 pgs: 65 active+clean; 216 MiB data, 922 MiB used, 119 GiB / 120 GiB avail; 853 B/s rd, 1 op/s
2026-03-09T17:41:51.251 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-09T17:41:51.251 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T17:41:51.252 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-09T17:41:51.255 INFO:tasks.cephadm:Teardown begin
2026-03-09T17:41:51.255 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T17:41:51.255 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T17:41:51.281 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-09T17:41:51.309 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-09T17:41:51.309 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 -- ceph mgr module disable cephadm
2026-03-09T17:41:51.478 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/mon.vm06/config
2026-03-09T17:41:51.632 INFO:teuthology.orchestra.run.vm06.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-09T17:41:51.650 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-09T17:41:51.650 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-09T17:41:51.650 DEBUG:teuthology.orchestra.run.vm06:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-09T17:41:51.668 DEBUG:teuthology.orchestra.run.vm09:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-09T17:41:51.684 INFO:tasks.cephadm:Stopping all daemons...
2026-03-09T17:41:51.684 INFO:tasks.cephadm.mon.vm06:Stopping mon.vm06...
2026-03-09T17:41:51.684 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06
2026-03-09T17:41:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:51 vm06.local systemd[1]: Stopping Ceph mon.vm06 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:41:51.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:51 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[109827]: 2026-03-09T17:41:51.803+0000 7f3eeb61a640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T17:41:51.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 17:41:51 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm06[109827]: 2026-03-09T17:41:51.803+0000 7f3eeb61a640 -1 mon.vm06@0(leader) e3 *** Got Signal Terminated ***
2026-03-09T17:41:52.165 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm06.service'
2026-03-09T17:41:52.202 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-09T17:41:52.202 INFO:tasks.cephadm.mon.vm06:Stopped mon.vm06
2026-03-09T17:41:52.202 INFO:tasks.cephadm.mon.vm09:Stopping mon.vm09...
2026-03-09T17:41:52.202 DEBUG:teuthology.orchestra.run.vm09:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm09
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local systemd[1]: Stopping Ceph mon.vm09 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048...
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09[97991]: 2026-03-09T17:41:52.311+0000 7f56b1a90640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm09 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09[97991]: 2026-03-09T17:41:52.311+0000 7f56b1a90640 -1 mon.vm09@1(peon) e3 *** Got Signal Terminated ***
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local podman[127797]: 2026-03-09 17:41:52.387359815 +0000 UTC m=+0.091386331 container died 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local podman[127797]: 2026-03-09 17:41:52.403266869 +0000 UTC m=+0.107293384 container remove 65d270c6a306964790a627a32e51d0f9a5e6e5c0b3971111e299edc53e3c24aa (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
2026-03-09T17:41:52.474 INFO:journalctl@ceph.mon.vm09.vm09.stdout:Mar 09 17:41:52 vm09.local bash[127797]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-mon-vm09
2026-03-09T17:41:52.479 DEBUG:teuthology.orchestra.run.vm09:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@mon.vm09.service'
2026-03-09T17:41:52.517 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-09T17:41:52.517 INFO:tasks.cephadm.mon.vm09:Stopped mon.vm09
2026-03-09T17:41:52.517 INFO:tasks.cephadm.osd.0:Stopping osd.0...
2026-03-09T17:41:52.517 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0 2026-03-09T17:41:52.891 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:52 vm06.local systemd[1]: Stopping Ceph osd.0 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:41:52.891 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:52 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:41:52.607+0000 7f9b2064d640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:41:52.891 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:52 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:41:52.607+0000 7f9b2064d640 -1 osd.0 80 *** Got signal Terminated *** 2026-03-09T17:41:52.892 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:52 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0[115687]: 2026-03-09T17:41:52.607+0000 7f9b2064d640 -1 osd.0 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:41:57.923 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158016]: 2026-03-09 17:41:57.652515472 +0000 UTC m=+5.054635399 container died 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158016]: 2026-03-09 17:41:57.675312561 +0000 UTC m=+5.077432499 container remove 3b19d9fcb067bacd704ab71f5839810b08a99307b95db949ac2e980087b234b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local bash[158016]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158083]: 2026-03-09 17:41:57.829249179 +0000 UTC m=+0.017649316 container create 46ebd8bacbae127a4ebab36b9ded22fd97e5d4af03f1f698b27e784d432b8f49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158083]: 2026-03-09 17:41:57.881055547 +0000 UTC m=+0.069455684 container init 46ebd8bacbae127a4ebab36b9ded22fd97e5d4af03f1f698b27e784d432b8f49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158083]: 2026-03-09 17:41:57.885235791 +0000 UTC m=+0.073635928 container start 46ebd8bacbae127a4ebab36b9ded22fd97e5d4af03f1f698b27e784d432b8f49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158083]: 2026-03-09 17:41:57.892340869 +0000 UTC m=+0.080741017 container attach 46ebd8bacbae127a4ebab36b9ded22fd97e5d4af03f1f698b27e784d432b8f49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS) 2026-03-09T17:41:57.924 INFO:journalctl@ceph.osd.0.vm06.stdout:Mar 09 17:41:57 vm06.local podman[158083]: 2026-03-09 17:41:57.822532277 +0000 UTC m=+0.010932414 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T17:41:58.063 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.0.service' 2026-03-09T17:41:58.095 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:41:58.096 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-09T17:41:58.096 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-09T17:41:58.096 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1 2026-03-09T17:41:58.175 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:41:58 vm06.local systemd[1]: Stopping Ceph osd.1 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:41:58.641 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:41:58 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:41:58.243+0000 7f2c38f00640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:41:58.641 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:41:58 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:41:58.243+0000 7f2c38f00640 -1 osd.1 80 *** Got signal Terminated *** 2026-03-09T17:41:58.641 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:41:58 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1[119994]: 2026-03-09T17:41:58.243+0000 7f2c38f00640 -1 osd.1 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158178]: 2026-03-09 17:42:03.283403749 +0000 UTC m=+5.056155494 container died b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158178]: 2026-03-09 17:42:03.307086217 +0000 UTC m=+5.079837962 container remove b63df0190ed326264a0bcd546fd3898de755b6579c1f0b3f080d173dc98b6dc9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local bash[158178]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1 
2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158254]: 2026-03-09 17:42:03.4533752 +0000 UTC m=+0.016657329 container create 22e99aff898b20c277d6e613a2624e6b6d5eb4ff8d82cdf7bb27e4cbd6cec9c0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158254]: 2026-03-09 17:42:03.497433464 +0000 UTC m=+0.060715613 container init 22e99aff898b20c277d6e613a2624e6b6d5eb4ff8d82cdf7bb27e4cbd6cec9c0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, 
org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158254]: 2026-03-09 17:42:03.500133157 +0000 UTC m=+0.063415296 container start 22e99aff898b20c277d6e613a2624e6b6d5eb4ff8d82cdf7bb27e4cbd6cec9c0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS) 2026-03-09T17:42:03.547 INFO:journalctl@ceph.osd.1.vm06.stdout:Mar 09 17:42:03 vm06.local podman[158254]: 2026-03-09 17:42:03.506328594 +0000 UTC m=+0.069610733 container attach 22e99aff898b20c277d6e613a2624e6b6d5eb4ff8d82cdf7bb27e4cbd6cec9c0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-1-deactivate, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T17:42:03.659 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.1.service' 2026-03-09T17:42:03.734 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:42:03.734 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-09T17:42:03.735 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-09T17:42:03.735 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.2 2026-03-09T17:42:03.813 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:03 vm06.local systemd[1]: Stopping Ceph osd.2 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 2026-03-09T17:42:04.141 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:03 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:42:03.882+0000 7fec2f2a8640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:42:04.141 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:03 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:42:03.882+0000 7fec2f2a8640 -1 osd.2 80 *** Got signal Terminated *** 2026-03-09T17:42:04.141 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:03 vm06.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2[124140]: 2026-03-09T17:42:03.882+0000 7fec2f2a8640 -1 osd.2 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:42:09.170 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:08 vm06.local podman[158351]: 2026-03-09 17:42:08.914900708 +0000 UTC m=+5.046075440 container died 
a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-09T17:42:09.170 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:08 vm06.local podman[158351]: 2026-03-09 17:42:08.948241182 +0000 UTC m=+5.079415904 container remove a5ccd85faf221f22746043728365d5228b18ecd24c0e4f5ac48deeaac5d785a3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-09T17:42:09.170 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:08 vm06.local bash[158351]: 
ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2 2026-03-09T17:42:09.170 INFO:journalctl@ceph.osd.2.vm06.stdout:Mar 09 17:42:09 vm06.local podman[158420]: 2026-03-09 17:42:09.127135895 +0000 UTC m=+0.021320418 container create f54c1cf2fa5eace3010941add8b2c8125be6911886192a256d21b501fa032429 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-2-deactivate, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T17:42:09.349 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.2.service' 2026-03-09T17:42:09.427 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:42:09.427 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-09T17:42:09.427 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-09T17:42:09.427 DEBUG:teuthology.orchestra.run.vm09:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.3 2026-03-09T17:42:09.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:09 vm09.local systemd[1]: Stopping Ceph osd.3 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:42:09.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:09 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:42:09.542+0000 7faf30597640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:42:09.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:09 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:42:09.542+0000 7faf30597640 -1 osd.3 80 *** Got signal Terminated *** 2026-03-09T17:42:09.895 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:09 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3[104326]: 2026-03-09T17:42:09.542+0000 7faf30597640 -1 osd.3 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local podman[127900]: 2026-03-09 17:42:14.583128008 +0000 UTC m=+5.056304663 container died 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 
17:42:14 vm09.local podman[127900]: 2026-03-09 17:42:14.60903596 +0000 UTC m=+5.082212625 container remove 40d8343609336e13e857e2ad041bb33ed6783fb5b5e5a916914f372c1a9c0a22 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3) 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local bash[127900]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local podman[127981]: 2026-03-09 17:42:14.778612211 +0000 UTC m=+0.032308955 container create 0b8e5b896fe049734e2b8b01814d48eee3eb7b371a5c2d6f46d0a27252f65db2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local podman[127981]: 2026-03-09 17:42:14.823534873 +0000 UTC m=+0.077231628 container init 0b8e5b896fe049734e2b8b01814d48eee3eb7b371a5c2d6f46d0a27252f65db2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223) 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local podman[127981]: 2026-03-09 17:42:14.827088575 +0000 UTC m=+0.080785329 container start 0b8e5b896fe049734e2b8b01814d48eee3eb7b371a5c2d6f46d0a27252f65db2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0) 2026-03-09T17:42:14.858 INFO:journalctl@ceph.osd.3.vm09.stdout:Mar 09 17:42:14 vm09.local podman[127981]: 2026-03-09 17:42:14.82820227 +0000 UTC m=+0.081899024 container attach 0b8e5b896fe049734e2b8b01814d48eee3eb7b371a5c2d6f46d0a27252f65db2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-3-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:42:15.017 DEBUG:teuthology.orchestra.run.vm09:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.3.service' 2026-03-09T17:42:15.053 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:42:15.053 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-09T17:42:15.053 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-09T17:42:15.053 DEBUG:teuthology.orchestra.run.vm09:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4 2026-03-09T17:42:15.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:15 vm09.local systemd[1]: Stopping Ceph osd.4 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:42:15.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:15 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:42:15.194+0000 7f9de0885640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:42:15.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:15 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:42:15.194+0000 7f9de0885640 -1 osd.4 80 *** Got signal Terminated *** 2026-03-09T17:42:15.395 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:15 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4[108010]: 2026-03-09T17:42:15.194+0000 7f9de0885640 -1 osd.4 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:42:20.483 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:20 vm09.local podman[128080]: 2026-03-09 17:42:20.229577689 +0000 UTC m=+5.049166434 container died cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3) 2026-03-09T17:42:20.484 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 
17:42:20 vm09.local podman[128080]: 2026-03-09 17:42:20.249284574 +0000 UTC m=+5.068873319 container remove cb6e9cd4fe303e400642c9035b2a2860c284ccbb7a646d0f1b82f1f70af57c2a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid) 2026-03-09T17:42:20.484 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:20 vm09.local bash[128080]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4 2026-03-09T17:42:20.484 INFO:journalctl@ceph.osd.4.vm09.stdout:Mar 09 17:42:20 vm09.local podman[128151]: 2026-03-09 17:42:20.446706183 +0000 UTC m=+0.021230880 container create 5e9b1df837d343ea6e232d9005f70f05ddbf27663e286b3543110da5684b93c0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, 
org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-09T17:42:20.484 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:20 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:20.326+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:20.700 DEBUG:teuthology.orchestra.run.vm09:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.4.service' 2026-03-09T17:42:20.740 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:42:20.740 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-09T17:42:20.740 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-09T17:42:20.740 DEBUG:teuthology.orchestra.run.vm09:> sudo systemctl stop ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.5 2026-03-09T17:42:21.145 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:20 vm09.local systemd[1]: Stopping Ceph osd.5 for bcd3bcc2-1bdc-11f1-97b3-3f61613e7048... 
2026-03-09T17:42:21.145 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:20 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:20.899+0000 7f0374074640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T17:42:21.145 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:20 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:20.899+0000 7f0374074640 -1 osd.5 80 *** Got signal Terminated *** 2026-03-09T17:42:21.145 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:20 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:20.899+0000 7f0374074640 -1 osd.5 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T17:42:21.644 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:21 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:21.359+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:22.644 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:22 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:22.335+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:23.644 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:23 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:23.329+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 
2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:24.644 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:24 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:24.297+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:25.644 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:25 vm09.local ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5[111601]: 2026-03-09T17:42:25.270+0000 7f036fe7b640 -1 osd.5 80 heartbeat_check: no reply from 192.168.123.106:6806 osd.0 since back 2026-03-09T17:41:54.726857+0000 front 2026-03-09T17:41:54.726885+0000 (oldest deadline 2026-03-09T17:42:20.026592+0000) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:25 vm09.local podman[128247]: 2026-03-09 17:42:25.925125078 +0000 UTC m=+5.041595095 container died b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:25 vm09.local 
podman[128247]: 2026-03-09 17:42:25.952416288 +0000 UTC m=+5.068886305 container remove b297663f757a3ec6ea9e8c1fe83654e511cf02e1d1ff5fdec2155a96fab97a79 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:25 vm09.local bash[128247]: ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:26 vm09.local podman[128315]: 2026-03-09 17:42:26.105283563 +0000 UTC m=+0.018335711 container create afb9e1762959ba7b3b3ac0aac4c756d4d7ba4b85eb8aff3cac438358f0413619 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:26 vm09.local podman[128315]: 2026-03-09 17:42:26.168201378 +0000 UTC m=+0.081253536 container init afb9e1762959ba7b3b3ac0aac4c756d4d7ba4b85eb8aff3cac438358f0413619 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:26 vm09.local podman[128315]: 2026-03-09 17:42:26.172671805 +0000 UTC m=+0.085723952 container start afb9e1762959ba7b3b3ac0aac4c756d4d7ba4b85eb8aff3cac438358f0413619 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, 
org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T17:42:26.197 INFO:journalctl@ceph.osd.5.vm09.stdout:Mar 09 17:42:26 vm09.local podman[128315]: 2026-03-09 17:42:26.173944508 +0000 UTC m=+0.086996655 container attach afb9e1762959ba7b3b3ac0aac4c756d4d7ba4b85eb8aff3cac438358f0413619 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048-osd-5-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True) 2026-03-09T17:42:26.366 DEBUG:teuthology.orchestra.run.vm09:> sudo pkill -f 'journalctl -f -n 0 -u ceph-bcd3bcc2-1bdc-11f1-97b3-3f61613e7048@osd.5.service' 2026-03-09T17:42:26.407 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T17:42:26.407 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-09T17:42:26.407 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 --force --keep-logs 2026-03-09T17:42:26.526 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:42:27.996 
INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm06.stderr:ceph-fuse[98178]: fuse finished with error 0 and tester_r 0 2026-03-09T17:42:37.376 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 --force --keep-logs 2026-03-09T17:42:37.481 INFO:teuthology.orchestra.run.vm09.stdout:Deleting cluster with fsid: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:42:42.541 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T17:42:42.569 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T17:42:42.606 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-09T17:42:42.607 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm06/crash 2026-03-09T17:42:42.607 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash -- . 2026-03-09T17:42:42.635 INFO:teuthology.orchestra.run.vm06.stderr:tar: /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash: Cannot open: No such file or directory 2026-03-09T17:42:42.635 INFO:teuthology.orchestra.run.vm06.stderr:tar: Error is not recoverable: exiting now 2026-03-09T17:42:42.636 DEBUG:teuthology.misc:Transferring archived files from vm09:/var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm09/crash 2026-03-09T17:42:42.637 DEBUG:teuthology.orchestra.run.vm09:> sudo tar c -f - -C /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash -- . 
2026-03-09T17:42:42.673 INFO:teuthology.orchestra.run.vm09.stderr:tar: /var/lib/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/crash: Cannot open: No such file or directory
2026-03-09T17:42:42.673 INFO:teuthology.orchestra.run.vm09.stderr:tar: Error is not recoverable: exiting now
2026-03-09T17:42:42.675 INFO:tasks.cephadm:Checking cluster log for badness...
2026-03-09T17:42:42.675 DEBUG:teuthology.orchestra.run.vm06:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1
2026-03-09T17:42:42.740 INFO:tasks.cephadm:Compressing logs...
2026-03-09T17:42:42.740 DEBUG:teuthology.orchestra.run.vm06:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T17:42:42.741 DEBUG:teuthology.orchestra.run.vm09:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T17:42:42.763 INFO:teuthology.orchestra.run.vm06.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T17:42:42.763 INFO:teuthology.orchestra.run.vm06.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T17:42:42.764 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm06.log 2026-03-09T17:42:42.764 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log 2026-03-09T17:42:42.765 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm06.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm06.pbgzei.log 2026-03-09T17:42:42.775 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log: 87.6% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log.gz 2026-03-09T17:42:42.775 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log 2026-03-09T17:42:42.776 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm06.pbgzei.log: 91.8% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T17:42:42.777 INFO:teuthology.orchestra.run.vm09.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T17:42:42.778 
INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T17:42:42.778 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log 2026-03-09T17:42:42.778 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log 2026-03-09T17:42:42.779 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm09.log 2026-03-09T17:42:42.780 INFO:teuthology.orchestra.run.vm09.stderr: 92.6% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T17:42:42.781 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log: 91.3% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log.gz 2026-03-09T17:42:42.782 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log 2026-03-09T17:42:42.782 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm09.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm09.lqzvkh.log 2026-03-09T17:42:42.783 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log: 85.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log.gz 2026-03-09T17:42:42.783 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm06.log 2026-03-09T17:42:42.783 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log: gzip -5 --verbose -- 
/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.0.log 2026-03-09T17:42:42.784 INFO:teuthology.orchestra.run.vm09.stderr: 93.9% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm09.log.gz 2026-03-09T17:42:42.785 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm09.lqzvkh.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm09.log 2026-03-09T17:42:42.790 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm06.log: 93.9% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-client.ceph-exporter.vm06.log.gz 2026-03-09T17:42:42.791 INFO:teuthology.orchestra.run.vm06.stderr: 94.1% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log.gz 2026-03-09T17:42:42.791 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.1.log 2026-03-09T17:42:42.791 INFO:teuthology.orchestra.run.vm09.stderr: 94.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-volume.log.gz 2026-03-09T17:42:42.791 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.2.log 2026-03-09T17:42:42.794 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm09.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log 2026-03-09T17:42:42.796 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log: 87.7% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.log.gz 2026-03-09T17:42:42.796 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- 
/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log 2026-03-09T17:42:42.804 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log: 91.5% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.audit.log.gz 2026-03-09T17:42:42.804 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log 2026-03-09T17:42:42.808 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log: 85.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph.cephadm.log.gz 2026-03-09T17:42:42.809 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.3.log 2026-03-09T17:42:42.810 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.vmzmbb.log 2026-03-09T17:42:42.816 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.gzymac.log 2026-03-09T17:42:42.819 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.4.log 2026-03-09T17:42:42.819 INFO:teuthology.orchestra.run.vm09.stderr: 89.2% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm09.lqzvkh.log.gz 2026-03-09T17:42:42.822 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.5.log 2026-03-09T17:42:42.831 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.4.log: gzip -5 --verbose -- 
/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.cjcawy.log 2026-03-09T17:42:42.832 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.vmzmbb.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log 2026-03-09T17:42:42.839 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.drzmdt.log 2026-03-09T17:42:42.848 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.cjcawy.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log 2026-03-09T17:42:43.403 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.gzymac.log: /var/log/ceph/ceph-client.0.log: 89.4% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mgr.vm06.pbgzei.log.gz 2026-03-09T17:42:43.470 INFO:teuthology.orchestra.run.vm09.stderr:/var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.drzmdt.log: /var/log/ceph/ceph-client.1.log: 92.3% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm09.log.gz 2026-03-09T17:42:44.378 INFO:teuthology.orchestra.run.vm06.stderr: 90.5% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mon.vm06.log.gz 2026-03-09T17:42:51.531 INFO:teuthology.orchestra.run.vm09.stderr: 93.5% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.4.log.gz 2026-03-09T17:42:52.746 INFO:teuthology.orchestra.run.vm06.stderr: 93.8% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.2.log.gz 2026-03-09T17:42:53.359 INFO:teuthology.orchestra.run.vm06.stderr: 93.7% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.0.log.gz 2026-03-09T17:42:53.896 INFO:teuthology.orchestra.run.vm09.stderr: 
93.8% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.5.log.gz 2026-03-09T17:42:53.940 INFO:teuthology.orchestra.run.vm06.stderr: 93.6% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.1.log.gz 2026-03-09T17:42:54.472 INFO:teuthology.orchestra.run.vm09.stderr: 94.9% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.drzmdt.log.gz 2026-03-09T17:42:55.535 INFO:teuthology.orchestra.run.vm09.stderr: 95.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm09.cjcawy.log.gz 2026-03-09T17:42:55.549 INFO:teuthology.orchestra.run.vm09.stderr: 94.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-osd.3.log.gz 2026-03-09T17:42:59.335 INFO:teuthology.orchestra.run.vm09.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping 2026-03-09T17:42:59.335 INFO:teuthology.orchestra.run.vm09.stderr: 93.4% -- replaced with /var/log/ceph/ceph-client.1.log.gz 2026-03-09T17:42:59.338 INFO:teuthology.orchestra.run.vm09.stderr: 2026-03-09T17:42:59.338 INFO:teuthology.orchestra.run.vm09.stderr:real 0m16.577s 2026-03-09T17:42:59.338 INFO:teuthology.orchestra.run.vm09.stderr:user 0m27.449s 2026-03-09T17:42:59.338 INFO:teuthology.orchestra.run.vm09.stderr:sys 0m1.554s 2026-03-09T17:43:01.153 INFO:teuthology.orchestra.run.vm06.stderr: 94.9% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.gzymac.log.gz 2026-03-09T17:43:01.211 INFO:teuthology.orchestra.run.vm06.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping 2026-03-09T17:43:01.403 INFO:teuthology.orchestra.run.vm06.stderr: 93.5% -- replaced with /var/log/ceph/ceph-client.0.log.gz 2026-03-09T17:43:57.405 INFO:teuthology.orchestra.run.vm06.stderr: 93.0% -- replaced with /var/log/ceph/bcd3bcc2-1bdc-11f1-97b3-3f61613e7048/ceph-mds.cephfs.vm06.vmzmbb.log.gz 2026-03-09T17:43:57.408 
INFO:teuthology.orchestra.run.vm06.stderr: 2026-03-09T17:43:57.408 INFO:teuthology.orchestra.run.vm06.stderr:real 1m14.654s 2026-03-09T17:43:57.408 INFO:teuthology.orchestra.run.vm06.stderr:user 1m25.428s 2026-03-09T17:43:57.408 INFO:teuthology.orchestra.run.vm06.stderr:sys 0m6.150s 2026-03-09T17:43:57.409 INFO:tasks.cephadm:Archiving logs... 2026-03-09T17:43:57.409 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm06/log 2026-03-09T17:43:57.409 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T17:44:02.186 DEBUG:teuthology.misc:Transferring archived files from vm09:/var/log/ceph to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm09/log 2026-03-09T17:44:02.186 DEBUG:teuthology.orchestra.run.vm09:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T17:44:03.549 INFO:tasks.cephadm:Removing cluster... 2026-03-09T17:44:03.549 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 --force 2026-03-09T17:44:03.689 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:44:04.030 DEBUG:teuthology.orchestra.run.vm09:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 --force 2026-03-09T17:44:04.147 INFO:teuthology.orchestra.run.vm09.stdout:Deleting cluster with fsid: bcd3bcc2-1bdc-11f1-97b3-3f61613e7048 2026-03-09T17:44:04.439 INFO:tasks.cephadm:Removing cephadm ... 
2026-03-09T17:44:04.439 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-09T17:44:04.454 DEBUG:teuthology.orchestra.run.vm09:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-09T17:44:04.472 INFO:tasks.cephadm:Teardown complete
2026-03-09T17:44:04.472 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-09T17:44:04.474 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T17:44:04.475 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-09T17:44:04.475 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T17:44:04.496 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T17:44:04.550 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-09T17:44:04.550 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T17:44:04.550 DEBUG:teuthology.orchestra.run.vm06:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T17:44:04.550 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y remove $d || true 2026-03-09T17:44:04.550 DEBUG:teuthology.orchestra.run.vm06:> done 2026-03-09T17:44:04.557 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-09T17:44:04.557 DEBUG:teuthology.orchestra.run.vm09:> 2026-03-09T17:44:04.557 DEBUG:teuthology.orchestra.run.vm09:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T17:44:04.557 DEBUG:teuthology.orchestra.run.vm09:> sudo yum -y remove $d || true 2026-03-09T17:44:04.557 DEBUG:teuthology.orchestra.run.vm09:> done 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved. 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout: Package Architecture Version Repository Size 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:Removing: 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies: 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout:Remove 2 Packages 2026-03-09T17:44:04.812 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:04.812 
INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 31 M 2026-03-09T17:44:04.813 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check 2026-03-09T17:44:04.817 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded. 2026-03-09T17:44:04.817 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test 2026-03-09T17:44:04.833 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded. 2026-03-09T17:44:04.833 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction 2026-03-09T17:44:04.868 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T17:44:04.894 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:04.896 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:04.907 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 31 M 2026-03-09T17:44:04.914 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T17:44:04.921 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T17:44:04.921 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T17:44:04.940 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-09T17:44:04.940 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T17:44:04.947 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:04.978 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:04.998 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 
2026-03-09T17:44:05.020 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.022 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:05.032 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:05.047 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:05.047 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:05.047 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:05.109 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:05.110 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.110 INFO:teuthology.orchestra.run.vm09.stdout:Removed: 2026-03-09T17:44:05.110 INFO:teuthology.orchestra.run.vm09.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T17:44:05.110 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.110 INFO:teuthology.orchestra.run.vm09.stdout:Complete! 2026-03-09T17:44:05.119 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:05.119 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.173 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 
2026-03-09T17:44:05.321 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved. 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: Package Architecture Version Repository Size 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Removing: 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies: 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Remove 4 Packages 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 166 M 2026-03-09T17:44:05.322 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check 2026-03-09T17:44:05.325 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded. 
2026-03-09T17:44:05.325 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test 2026-03-09T17:44:05.351 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded. 2026-03-09T17:44:05.352 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T17:44:05.383 INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout:Remove 4 Packages 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 166 M 2026-03-09T17:44:05.384 INFO:teuthology.orchestra.run.vm06.stdout:Running 
transaction check 2026-03-09T17:44:05.386 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T17:44:05.386 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T17:44:05.402 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1 2026-03-09T17:44:05.409 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-09T17:44:05.412 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T17:44:05.412 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T17:44:05.412 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T17:44:05.415 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T17:44:05.431 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T17:44:05.465 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T17:44:05.471 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-09T17:44:05.473 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T17:44:05.478 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T17:44:05.494 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T17:44:05.509 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T17:44:05.509 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-09T17:44:05.509 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T17:44:05.509 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T17:44:05.554 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 
4/4 2026-03-09T17:44:05.554 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.555 INFO:teuthology.orchestra.run.vm09.stdout:Removed: 2026-03-09T17:44:05.555 INFO:teuthology.orchestra.run.vm09.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T17:44:05.555 INFO:teuthology.orchestra.run.vm09.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T17:44:05.555 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.555 INFO:teuthology.orchestra.run.vm09.stdout:Complete! 2026-03-09T17:44:05.567 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T17:44:05.567 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4 2026-03-09T17:44:05.567 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T17:44:05.567 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T17:44:05.617 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T17:44:05.617 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.618 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T17:44:05.618 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T17:44:05.618 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T17:44:05.618 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.618 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T17:44:05.771 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved. 
2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: Package Arch Version Repository Size 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Removing: 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies: 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Remove 8 Packages 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 89 
M 2026-03-09T17:44:05.772 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check 2026-03-09T17:44:05.775 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded. 2026-03-09T17:44:05.775 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test 2026-03-09T17:44:05.798 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded. 2026-03-09T17:44:05.799 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction 2026-03-09T17:44:05.830 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 
@appstream 832 k 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Remove 8 Packages 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 89 M 2026-03-09T17:44:05.831 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T17:44:05.834 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T17:44:05.834 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T17:44:05.837 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1 2026-03-09T17:44:05.838 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-09T17:44:05.856 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.857 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-09T17:44:05.858 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T17:44:05.859 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.868 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.881 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.881 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-09T17:44:05.881 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.882 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.898 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T17:44:05.899 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-09T17:44:05.900 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.903 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-09T17:44:05.905 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T17:44:05.906 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 
2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-09T17:44:05.920 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.922 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-09T17:44:05.926 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.927 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:05.931 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-09T17:44:05.934 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:05.946 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.946 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 
2026-03-09T17:44:05.946 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:05.947 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.953 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-09T17:44:05.953 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.953 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T17:44:05.954 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-09T17:44:05.954 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 2026-03-09T17:44:05.954 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:05.954 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-09T17:44:05.969 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-09T17:44:05.973 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-09T17:44:05.975 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-09T17:44:05.977 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 
2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-09T17:44:05.999 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:06.000 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:06.009 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-09T17:44:06.029 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 
2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.030 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T17:44:06.038 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:06.094 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T17:44:06.122 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.183 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:06.326 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: Package Arch Version Repository Size
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout:Removing dependent packages:
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies:
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-09T17:44:06.331 INFO:teuthology.orchestra.run.vm09.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T17:44:06.332 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:Remove 84 Packages
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 433 M
2026-03-09T17:44:06.333 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check
2026-03-09T17:44:06.356 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded.
2026-03-09T17:44:06.356 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test
2026-03-09T17:44:06.394 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-09T17:44:06.399 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T17:44:06.400 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:Remove 84 Packages
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 433 M
2026-03-09T17:44:06.401 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:44:06.424 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:44:06.424 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:44:06.465 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded.
2026-03-09T17:44:06.466 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:06.534 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:44:06.534 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:44:06.601 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:44:06.601 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-09T17:44:06.609 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:06.629 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.645 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.653 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-09T17:44:06.654 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-09T17:44:06.674 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:44:06.674 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-09T17:44:06.681 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-09T17:44:06.698 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:06.699 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.714 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-09T17:44:06.714 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:06.722 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-09T17:44:06.722 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-09T17:44:06.723 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-09T17:44:06.728 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T17:44:06.728 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-09T17:44:06.741 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-09T17:44:06.751 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T17:44:06.755 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T17:44:06.759 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T17:44:06.764 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T17:44:06.770 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T17:44:06.781 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T17:44:06.783 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-09T17:44:06.792 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-09T17:44:06.795 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T17:44:06.797 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T17:44:06.797 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-09T17:44:06.803 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T17:44:06.810 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-09T17:44:06.815 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T17:44:06.817 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T17:44:06.820 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T17:44:06.822 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T17:44:06.823 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T17:44:06.829 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T17:44:06.834 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T17:44:06.844 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T17:44:06.853 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T17:44:06.857 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T17:44:06.861 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T17:44:06.865 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T17:44:06.865 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T17:44:06.874 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T17:44:06.877 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T17:44:06.883 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T17:44:06.883 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-09T17:44:06.884 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T17:44:06.892 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-09T17:44:06.915 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T17:44:06.925 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T17:44:06.928 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T17:44:06.937 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T17:44:06.945 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T17:44:06.945 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-09T17:44:06.954 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-09T17:44:06.989 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T17:44:07.020 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T17:44:07.027 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T17:44:07.034 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-09T17:44:07.038 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-09T17:44:07.041 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-09T17:44:07.044 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-09T17:44:07.047 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-09T17:44:07.050 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-09T17:44:07.053 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-09T17:44:07.056 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-09T17:44:07.063 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T17:44:07.071 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T17:44:07.079 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-09T17:44:07.086 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-09T17:44:07.094 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T17:44:07.100 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T17:44:07.106 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-09T17:44:07.110 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-09T17:44:07.113 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-09T17:44:07.116 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-09T17:44:07.118 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-09T17:44:07.121 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-09T17:44:07.124 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-09T17:44:07.127 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-09T17:44:07.137 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-09T17:44:07.141 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T17:44:07.149 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-09T17:44:07.150 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-09T17:44:07.153 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-09T17:44:07.154 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-09T17:44:07.156 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-09T17:44:07.159 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-09T17:44:07.161 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-09T17:44:07.181 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.181 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:07.182 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T17:44:07.182 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T17:44:07.182 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T17:44:07.182 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:07.182 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.191 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.206 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-09T17:44:07.212 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.212 INFO:teuthology.orchestra.run.vm09.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:07.212 INFO:teuthology.orchestra.run.vm09.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T17:44:07.212 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:07.212 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.218 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-09T17:44:07.221 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-09T17:44:07.222 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.224 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-09T17:44:07.225 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-09T17:44:07.227 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-09T17:44:07.227 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-09T17:44:07.229 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-09T17:44:07.230 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-09T17:44:07.233 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-09T17:44:07.236 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-09T17:44:07.239 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-09T17:44:07.242 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-09T17:44:07.245 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T17:44:07.252 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:07.253 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.253 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-09T17:44:07.260 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-09T17:44:07.262 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-09T17:44:07.262 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-09T17:44:07.265 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-09T17:44:07.268 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-09T17:44:07.274 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-09T17:44:07.279 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-09T17:44:07.283 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.283 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T17:44:07.283 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T17:44:07.283 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:07.284 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.285 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-09T17:44:07.291 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-09T17:44:07.293 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-09T17:44:07.294 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-09T17:44:07.297 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-09T17:44:07.297 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-09T17:44:07.300 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-09T17:44:07.301 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-09T17:44:07.303 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-09T17:44:07.304 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-09T17:44:07.305 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-09T17:44:07.308 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-09T17:44:07.308 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-09T17:44:07.311 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-09T17:44:07.315 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-09T17:44:07.316 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-09T17:44:07.322 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-09T17:44:07.323 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-09T17:44:07.326 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-09T17:44:07.328 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-09T17:44:07.328 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-09T17:44:07.330 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84
2026-03-09T17:44:07.331 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-09T17:44:07.334 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-09T17:44:07.337 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84
2026-03-09T17:44:07.337 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-09T17:44:07.341 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-09T17:44:07.343 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-09T17:44:07.347 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-09T17:44:07.352 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-09T17:44:07.357 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-09T17:44:07.363 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.363 INFO:teuthology.orchestra.run.vm09.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-09T17:44:07.363 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:07.363 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-09T17:44:07.367 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-09T17:44:07.369 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.369 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-09T17:44:07.373 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-09T17:44:07.381 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-09T17:44:07.387 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-09T17:44:07.389 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.389 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-09T17:44:07.390 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-09T17:44:07.393 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-09T17:44:07.394 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84
2026-03-09T17:44:07.401 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84
2026-03-09T17:44:07.404 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-09T17:44:07.425 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.426 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-09T17:44:07.426 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:07.432 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.452 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-09T17:44:07.452 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp
2026-03-09T17:44:13.073 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:13.088 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-09T17:44:13.117 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-09T17:44:13.120 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84
2026-03-09T17:44:13.123 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84
2026-03-09T17:44:13.126 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-09T17:44:13.126 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-09T17:44:13.142 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-09T17:44:13.145 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-09T17:44:13.148 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-09T17:44:13.151 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-09T17:44:13.151 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84
2026-03-09T17:44:13.249 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84
2026-03-09T17:44:13.250 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84
2026-03-09T17:44:13.250 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84
2026-03-09T17:44:13.251 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84
2026-03-09T17:44:13.252 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84
2026-03-09T17:44:13.253 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-09T17:44:13.329 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T17:44:13.330 
INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:13.330 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /sys 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /proc 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /mnt 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /var/tmp 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /home 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /root 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout:skipping the directory /tmp 2026-03-09T17:44:13.370 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:13.381 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-09T17:44:13.408 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-09T17:44:13.412 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84 2026-03-09T17:44:13.414 
INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-09T17:44:13.416 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T17:44:13.416 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-09T17:44:13.431 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-09T17:44:13.433 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T17:44:13.436 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T17:44:13.439 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T17:44:13.439 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-09T17:44:13.541 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 200 k 2026-03-09T17:44:13.542 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T17:44:13.544 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T17:44:13.544 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T17:44:13.545 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-09T17:44:13.545 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T17:44:13.548 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T17:44:13.550 
INFO:teuthology.orchestra.run.vm09.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : 
python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 
2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T17:44:13.550 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T17:44:13.551 
INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T17:44:13.551 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T17:44:13.562 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T17:44:13.562 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout:Removed: 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.622 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 
2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T17:44:13.623 
INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jsonpointer-2.0-4.el9.noarch 
2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T17:44:13.623 INFO:teuthology.orchestra.run.vm09.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T17:44:13.624 
INFO:teuthology.orchestra.run.vm09.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:13.624 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:13.676 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:13.719 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:13.848 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout: Package Architecture Version Repository Size
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:Remove 1 Package
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 200 k
2026-03-09T17:44:13.849 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check
2026-03-09T17:44:13.851 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded.
2026-03-09T17:44:13.851 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test
2026-03-09T17:44:13.852 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded.
2026-03-09T17:44:13.852 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:13.868 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:44:13.868 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-09T17:44:13.924 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-immutable-object-cache
2026-03-09T17:44:13.924 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:13.927 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:13.927 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:13.927 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:13.999 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:14.042 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:14.104 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr
2026-03-09T17:44:14.104 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:14.106 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:14.107 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:14.107 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:14.236 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-immutable-object-cache
2026-03-09T17:44:14.236 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:14.239 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:14.239 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:14.239 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:14.284 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-dashboard
2026-03-09T17:44:14.284 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:14.287 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:14.288 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:14.288 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:14.411 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-mgr
2026-03-09T17:44:14.411 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:14.414 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:14.415 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:14.415 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:14.467 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-09T17:44:14.467 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:14.470 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:14.470 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:14.471 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:14.585 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-mgr-dashboard
2026-03-09T17:44:14.586 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:14.589 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:14.589 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:14.589 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:14.645 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-rook
2026-03-09T17:44:14.646 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:14.649 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:14.649 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:14.649 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:14.765 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-09T17:44:14.765 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:14.768 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:14.769 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:14.769 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:14.826 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-cephadm
2026-03-09T17:44:14.826 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:14.829 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:14.829 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:14.829 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:14.941 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-mgr-rook
2026-03-09T17:44:14.941 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:14.944 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:14.945 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:14.945 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:15.020 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.4 M
2026-03-09T17:44:15.021 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:44:15.023 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:44:15.023 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:44:15.037 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:44:15.038 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:44:15.066 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:44:15.080 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.115 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: ceph-mgr-cephadm
2026-03-09T17:44:15.115 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:15.118 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:15.118 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:15.118 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:15.142 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.236 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout: Package Architecture Version Repository Size
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Remove 1 Package
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 2.4 M
2026-03-09T17:44:15.370 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check
2026-03-09T17:44:15.372 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded.
2026-03-09T17:44:15.372 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test
2026-03-09T17:44:15.386 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded.
2026-03-09T17:44:15.386 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:15.414 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:44:15.428 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 593 k
2026-03-09T17:44:15.434 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:44:15.436 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:44:15.436 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:44:15.447 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:44:15.447 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:44:15.473 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:44:15.476 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-09T17:44:15.489 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.490 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.533 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:15.555 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.555 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.600 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:15.727 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout: Package Architecture Version Repository Size
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Removing dependent packages:
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Remove 2 Packages
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 593 k
2026-03-09T17:44:15.728 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check
2026-03-09T17:44:15.730 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded.
2026-03-09T17:44:15.730 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test
2026-03-09T17:44:15.740 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded.
2026-03-09T17:44:15.740 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:15.766 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:44:15.768 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-09T17:44:15.781 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.812 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Remove 3 Packages
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.5 M
2026-03-09T17:44:15.813 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:44:15.815 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:44:15.815 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:44:15.828 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:44:15.828 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:44:15.850 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.850 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-09T17:44:15.856 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:44:15.860 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-09T17:44:15.861 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-09T17:44:15.861 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:15.899 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:15.927 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:15.927 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-09T17:44:15.927 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:15.970 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout: Package Arch Version Repository Size
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:Removing dependent packages:
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies:
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary
2026-03-09T17:44:16.100 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:16.101 INFO:teuthology.orchestra.run.vm09.stdout:Remove 3 Packages
2026-03-09T17:44:16.101 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.101 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 2.5 M
2026-03-09T17:44:16.101 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check
2026-03-09T17:44:16.102 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded.
2026-03-09T17:44:16.102 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test
2026-03-09T17:44:16.115 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded.
2026-03-09T17:44:16.116 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:16.141 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: libcephfs-devel
2026-03-09T17:44:16.141 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:16.144 INFO:teuthology.orchestra.run.vm09.stdout: Preparing : 1/1
2026-03-09T17:44:16.144 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:16.145 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:16.145 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:16.146 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-09T17:44:16.147 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-09T17:44:16.148 INFO:teuthology.orchestra.run.vm09.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:16.217 INFO:teuthology.orchestra.run.vm09.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:16.217 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-09T17:44:16.217 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.258 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:16.334 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Remove 21 Packages
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 74 M
2026-03-09T17:44:16.336 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T17:44:16.340 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T17:44:16.340 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T17:44:16.363 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T17:44:16.363 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T17:44:16.406 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T17:44:16.409 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-09T17:44:16.412 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-09T17:44:16.415 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-09T17:44:16.415 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-09T17:44:16.431 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-09T17:44:16.432 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: libcephfs-devel
2026-03-09T17:44:16.432 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:16.433 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-09T17:44:16.435 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:16.435 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-09T17:44:16.435 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:16.435 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:16.437 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-09T17:44:16.439 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-09T17:44:16.439 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-09T17:44:16.455 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-09T17:44:16.455 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.455 INFO:teuthology.orchestra.run.vm06.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-09T17:44:16.455 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T17:44:16.468 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.471 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-09T17:44:16.473 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-09T17:44:16.476 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-09T17:44:16.478 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-09T17:44:16.482 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-09T17:44:16.485 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-09T17:44:16.488 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-09T17:44:16.490 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-09T17:44:16.493 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-09T17:44:16.495 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-09T17:44:16.510 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-09T17:44:16.576 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-09T17:44:16.577 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-09T17:44:16.623 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:16.624 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:16.624 INFO:teuthology.orchestra.run.vm09.stdout: Package Arch Version Repository Size
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:================================================================================
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Removing:
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Removing dependent packages:
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: rbd-fuse
x86_64 2:18.2.0-0.el9 @ceph 230 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Removing unused dependencies: 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Transaction Summary 2026-03-09T17:44:16.625 
INFO:teuthology.orchestra.run.vm09.stdout:================================================================================ 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Remove 21 Packages 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout: 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Freed space: 74 M 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction check 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T17:44:16.625 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: librbd1-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64 
2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T17:44:16.626 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T17:44:16.629 INFO:teuthology.orchestra.run.vm09.stdout:Transaction check succeeded. 2026-03-09T17:44:16.629 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction test 2026-03-09T17:44:16.653 INFO:teuthology.orchestra.run.vm09.stdout:Transaction test succeeded. 
2026-03-09T17:44:16.653 INFO:teuthology.orchestra.run.vm09.stdout:Running transaction
2026-03-09T17:44:16.695 INFO:teuthology.orchestra.run.vm09.stdout:  Preparing        : 1/1
2026-03-09T17:44:16.697 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-09T17:44:16.700 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-09T17:44:16.702 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-09T17:44:16.702 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-09T17:44:16.715 INFO:teuthology.orchestra.run.vm09.stdout:  Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-09T17:44:16.717 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-09T17:44:16.718 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-09T17:44:16.720 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-09T17:44:16.722 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-09T17:44:16.722 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-09T17:44:16.736 INFO:teuthology.orchestra.run.vm09.stdout:  Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-09T17:44:16.736 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.736 INFO:teuthology.orchestra.run.vm09.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-09T17:44:16.736 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.749 INFO:teuthology.orchestra.run.vm09.stdout:  Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.751 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-09T17:44:16.753 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-09T17:44:16.755 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-09T17:44:16.758 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-09T17:44:16.761 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : re2-1:20211101-20.el9.x86_64 15/21
2026-03-09T17:44:16.764 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-09T17:44:16.767 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-09T17:44:16.769 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-09T17:44:16.771 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-09T17:44:16.773 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-09T17:44:16.787 INFO:teuthology.orchestra.run.vm09.stdout:  Erasing          : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-09T17:44:16.845 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: librbd1
2026-03-09T17:44:16.845 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:16.848 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:16.849 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:16.849 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:16.853 INFO:teuthology.orchestra.run.vm09.stdout:  Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-09T17:44:16.854 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-09T17:44:16.855 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : re2-1:20211101-20.el9.x86_64 20/21
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:  Verifying        : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:Removed:
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:  boost-program-options-1.75.0-13.el9.x86_64
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:  gperftools-libs-2.9.1-3.el9.x86_64
2026-03-09T17:44:16.909 INFO:teuthology.orchestra.run.vm09.stdout:  libarrow-9.0.0-15.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  libarrow-doc-9.0.0-15.el9.noarch
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  libpmemobj-1.12.1-1.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  librabbitmq-0.11.0-7.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  librados2-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  librbd1-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  librdkafka-1.6.1-102.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  librgw2-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  libunwind-1.6.2-1.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  lttng-ust-2.12.0-6.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  parquet-libs-9.0.0-15.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  python3-rados-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  re2-1:20211101-20.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:  thrift-0.15.0-4.el9.x86_64
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:
2026-03-09T17:44:16.910 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.044 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rados
2026-03-09T17:44:17.045 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.048 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.048 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.049 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:17.133 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: librbd1
2026-03-09T17:44:17.133 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:17.137 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:17.137 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:17.138 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.234 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rgw
2026-03-09T17:44:17.234 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.237 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.237 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.237 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:17.324 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: python3-rados
2026-03-09T17:44:17.325 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:17.328 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:17.328 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:17.328 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.412 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-cephfs
2026-03-09T17:44:17.412 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.416 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.416 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.417 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:17.499 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: python3-rgw
2026-03-09T17:44:17.499 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:17.502 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:17.503 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:17.503 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.612 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rbd
2026-03-09T17:44:17.612 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.615 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.616 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.616 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:17.670 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: python3-cephfs
2026-03-09T17:44:17.670 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:17.673 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:17.674 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:17.674 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.801 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-fuse
2026-03-09T17:44:17.801 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.804 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.805 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.805 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:17.841 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: python3-rbd
2026-03-09T17:44:17.842 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:17.845 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:17.845 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:17.845 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:17.980 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-mirror
2026-03-09T17:44:17.980 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:17.983 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:17.984 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:17.984 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:18.028 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: rbd-fuse
2026-03-09T17:44:18.028 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:18.031 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:18.032 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:18.032 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:18.168 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-nbd
2026-03-09T17:44:18.169 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T17:44:18.171 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T17:44:18.172 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T17:44:18.172 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T17:44:18.197 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all
2026-03-09T17:44:18.216 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: rbd-mirror
2026-03-09T17:44:18.216 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:18.219 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:18.220 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:18.220 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:18.325 INFO:teuthology.orchestra.run.vm06.stdout:56 files removed
2026-03-09T17:44:18.347 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T17:44:18.372 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean expire-cache
2026-03-09T17:44:18.393 INFO:teuthology.orchestra.run.vm09.stdout:No match for argument: rbd-nbd
2026-03-09T17:44:18.393 INFO:teuthology.orchestra.run.vm09.stderr:No packages marked for removal.
2026-03-09T17:44:18.396 INFO:teuthology.orchestra.run.vm09.stdout:Dependencies resolved.
2026-03-09T17:44:18.397 INFO:teuthology.orchestra.run.vm09.stdout:Nothing to do.
2026-03-09T17:44:18.397 INFO:teuthology.orchestra.run.vm09.stdout:Complete!
2026-03-09T17:44:18.419 DEBUG:teuthology.orchestra.run.vm09:> sudo yum clean all
2026-03-09T17:44:18.539 INFO:teuthology.orchestra.run.vm06.stdout:Cache was expired
2026-03-09T17:44:18.540 INFO:teuthology.orchestra.run.vm06.stdout:0 files removed
2026-03-09T17:44:18.551 INFO:teuthology.orchestra.run.vm09.stdout:56 files removed
2026-03-09T17:44:18.565 DEBUG:teuthology.parallel:result is None
2026-03-09T17:44:18.575 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T17:44:18.600 DEBUG:teuthology.orchestra.run.vm09:> sudo yum clean expire-cache
2026-03-09T17:44:18.761 INFO:teuthology.orchestra.run.vm09.stdout:Cache was expired
2026-03-09T17:44:18.761 INFO:teuthology.orchestra.run.vm09.stdout:0 files removed
2026-03-09T17:44:18.781 DEBUG:teuthology.parallel:result is None
2026-03-09T17:44:18.781 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm06.local
2026-03-09T17:44:18.781 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm09.local
2026-03-09T17:44:18.781 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T17:44:18.782 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-09T17:44:18.808 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T17:44:18.809 DEBUG:teuthology.orchestra.run.vm09:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-09T17:44:18.875 DEBUG:teuthology.parallel:result is None
2026-03-09T17:44:18.876 DEBUG:teuthology.parallel:result is None
2026-03-09T17:44:18.876 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-09T17:44:18.879 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-09T17:44:18.879 DEBUG:teuthology.orchestra.run.vm06:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T17:44:18.917 DEBUG:teuthology.orchestra.run.vm09:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-09T17:44:18.930 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found
2026-03-09T17:44:18.932 INFO:teuthology.orchestra.run.vm09.stderr:bash: line 1: ntpq: command not found
2026-03-09T17:44:19.030 INFO:teuthology.orchestra.run.vm09.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-09T17:44:19.030 INFO:teuthology.orchestra.run.vm09.stdout:===============================================================================
2026-03-09T17:44:19.030 INFO:teuthology.orchestra.run.vm09.stdout:^* stratum2-4.NTP.TechFak.U>     2   6   377    22    -68us[ -58us] +/-   18ms
2026-03-09T17:44:19.030 INFO:teuthology.orchestra.run.vm09.stdout:^+ t2.ipfu.de                    4   6   377    23  +1103us[+1113us] +/-   17ms
2026-03-09T17:44:19.030 INFO:teuthology.orchestra.run.vm09.stdout:^+ time.ndless.net               2   7   377    24  -2183us[-2173us] +/-   18ms
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm09.stdout:^+ 172-104-154-182.ip.linod>     2   6   377    20  +3069us[+3069us] +/-   31ms
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address         Stratum Poll Reach LastRx Last sample
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:===============================================================================
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:^+ 172-104-154-182.ip.linod>     2   7   377    22  +2956us[+2956us] +/-   31ms
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:^+ stratum2-4.NTP.TechFak.U>     2   7   377    25   -359us[ -359us] +/-   18ms
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:^+ t2.ipfu.de                    4   7   377    88   +801us[ +801us] +/-   17ms
2026-03-09T17:44:19.031 INFO:teuthology.orchestra.run.vm06.stdout:^* time.ndless.net               2   7   377    89  -2200us[-2201us] +/-   18ms
2026-03-09T17:44:19.034 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-09T17:44:19.036 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-09T17:44:19.037 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-09T17:44:19.040 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-09T17:44:19.043 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-09T17:44:19.046 INFO:teuthology.task.internal:Duration was 1489.121563 seconds
2026-03-09T17:44:19.046 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-09T17:44:19.049 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-09T17:44:19.049 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T17:44:19.076 DEBUG:teuthology.orchestra.run.vm09:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-09T17:44:19.119 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T17:44:19.121 INFO:teuthology.orchestra.run.vm09.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-09T17:44:19.426 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-09T17:44:19.426 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm06.local
2026-03-09T17:44:19.426 DEBUG:teuthology.orchestra.run.vm06:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T17:44:19.453 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm09.local
2026-03-09T17:44:19.453 DEBUG:teuthology.orchestra.run.vm09:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T17:44:19.482 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-09T17:44:19.482 DEBUG:teuthology.orchestra.run.vm06:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T17:44:19.495 DEBUG:teuthology.orchestra.run.vm09:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T17:44:20.391 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-09T17:44:20.391 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T17:44:20.393 DEBUG:teuthology.orchestra.run.vm09:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T17:44:20.416 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T17:44:20.417 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T17:44:20.417 INFO:teuthology.orchestra.run.vm09.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T17:44:20.417 INFO:teuthology.orchestra.run.vm09.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T17:44:20.417 INFO:teuthology.orchestra.run.vm09.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T17:44:20.417 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T17:44:20.418 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T17:44:20.418 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T17:44:20.418 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T17:44:20.418 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T17:44:20.601 INFO:teuthology.orchestra.run.vm09.stderr: 98.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T17:44:20.660 INFO:teuthology.orchestra.run.vm06.stderr: 97.5% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T17:44:20.662 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-09T17:44:20.666 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-09T17:44:20.666 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T17:44:20.728 DEBUG:teuthology.orchestra.run.vm09:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T17:44:20.752 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-09T17:44:20.756 DEBUG:teuthology.orchestra.run.vm06:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T17:44:20.770 DEBUG:teuthology.orchestra.run.vm09:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T17:44:20.799 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = core
2026-03-09T17:44:20.820 INFO:teuthology.orchestra.run.vm09.stdout:kernel.core_pattern = core
2026-03-09T17:44:20.834 DEBUG:teuthology.orchestra.run.vm06:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T17:44:20.868 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:44:20.868 DEBUG:teuthology.orchestra.run.vm09:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T17:44:20.889 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:44:20.889 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-09T17:44:20.893 INFO:teuthology.task.internal:Transferring archived files...
2026-03-09T17:44:20.893 DEBUG:teuthology.misc:Transferring archived files from vm06:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm06
2026-03-09T17:44:20.893 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T17:44:20.942 DEBUG:teuthology.misc:Transferring archived files from vm09:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-09_11:23:05-orch-squid-none-default-vps/585/remote/vm09
2026-03-09T17:44:20.942 DEBUG:teuthology.orchestra.run.vm09:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T17:44:20.972 INFO:teuthology.task.internal:Removing archive directory...
2026-03-09T17:44:20.972 DEBUG:teuthology.orchestra.run.vm06:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T17:44:20.982 DEBUG:teuthology.orchestra.run.vm09:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T17:44:21.027 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-09T17:44:21.031 INFO:teuthology.task.internal:Not uploading archives.
2026-03-09T17:44:21.031 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-09T17:44:21.034 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-09T17:44:21.034 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T17:44:21.037 DEBUG:teuthology.orchestra.run.vm09:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T17:44:21.053 INFO:teuthology.orchestra.run.vm06.stdout: 8532144 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 9 17:44 /home/ubuntu/cephtest
2026-03-09T17:44:21.053 INFO:teuthology.orchestra.run.vm06.stdout: 84041164 0 d--------- 2 ubuntu ubuntu 6 Mar 9 17:27 /home/ubuntu/cephtest/mnt.0
2026-03-09T17:44:21.053 INFO:teuthology.orchestra.run.vm06.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-09T17:44:21.054 INFO:teuthology.orchestra.run.vm06.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-09T17:44:21.069 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T17:44:21.069 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm06 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-09T17:44:21.069 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-09T17:44:21.073 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-09T17:44:21.074 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1489.1215634346008
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false
2026-03-09T17:44:21.075 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T17:44:21.099 INFO:teuthology.run:FAIL