Conversation
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
Bumps [pynacl](https://github.com/pyca/pynacl) from 1.5.0 to 1.6.2. <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/pyca/pynacl/blob/main/CHANGELOG.rst">pynacl's changelog</a>.</em></p> <blockquote> <h2>1.6.2 (2026-01-01)</h2> <ul> <li>Updated <code>libsodium</code> to 1.0.20-stable (2025-12-31 build) to resolve <code>CVE-2025-69277</code>.</li> </ul> <h2>1.6.1 (2025-11-10)</h2> <ul> <li>The <code>MAKE</code> environment variable can now be used to specify the <code>make</code> binary that should be used in the build process.</li> </ul> <h2>1.6.0 (2025-09-11)</h2> <ul> <li><strong>BACKWARDS INCOMPATIBLE:</strong> Removed support for Python 3.6 and 3.7.</li> <li>Added support for the low level AEAD AES bindings.</li> <li>Added support for <code>crypto_core_ed25519_from_uniform</code>.</li> <li>Update <code>libsodium</code> to 1.0.20-stable (2025-08-27 build).</li> <li>Added support for free-threaded Python 3.14.</li> <li>Added support for Windows on ARM wheels.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/pyca/pynacl/commit/ecf41f55a3d8f1e10ce89c61c4b4d67f3f4467cf"><code>ecf41f5</code></a> changelog and version bump for 1.6.2 (<a href="https://redirect.github.com/pyca/pynacl/issues/923">#923</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/685a5e727772c2df81cfce61fb8768122102fa89"><code>685a5e7</code></a> Switch to PyPI trusted publishing (<a href="https://redirect.github.com/pyca/pynacl/issues/925">#925</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/78e0aa32b1a0acdd51e2b3bf394e7cb911fc1e68"><code>78e0aa3</code></a> missed adding these files as part of the libsodium update (<a href="https://redirect.github.com/pyca/pynacl/issues/924">#924</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/96314884d88d1089ff5f336dba61d7abbcddbbf7"><code>9631488</code></a> Bump libsodium to the latest 1.0.20 (<a href="https://redirect.github.com/pyca/pynacl/issues/922">#922</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/563b25bdedf666e86f0cc2a95321cd23a960260e"><code>563b25b</code></a> Add script to update vendored libsodium (<a href="https://redirect.github.com/pyca/pynacl/issues/921">#921</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/d23310561899d1ca4fd4026a646893b2af6b6c21"><code>d233105</code></a> Include libsodium license in wheels (<a href="https://redirect.github.com/pyca/pynacl/issues/917">#917</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/cabc3a879d142e62f7503a632e62e706ef5eccbb"><code>cabc3a8</code></a> Bump dessant/lock-threads from 5 to 6 (<a href="https://redirect.github.com/pyca/pynacl/issues/914">#914</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/f3596177b3a43a49b211ff2a3f9ea133f1cf8f23"><code>f359617</code></a> Bump actions/download-artifact from 6.0.0 to 7.0.0 (<a href="https://redirect.github.com/pyca/pynacl/issues/915">#915</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/fb6e37f76dcf9f93d3f65bb31abdb548816126b6"><code>fb6e37f</code></a> Bump actions/upload-artifact from 5 to 6 (<a href="https://redirect.github.com/pyca/pynacl/issues/916">#916</a>)</li> <li><a href="https://github.com/pyca/pynacl/commit/526f99278383ffda906bec9d08288191dcbbc3b3"><code>526f992</code></a> Bump actions/checkout from 6.0.0 to 6.0.1 (<a href="https://redirect.github.com/pyca/pynacl/issues/911">#911</a>)</li> <li>Additional commits viewable in <a 
href="https://github.com/pyca/pynacl/compare/1.5.0...1.6.2">compare view</a></li> </ul> </details> <br /> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/element-hq/synapse/network/alerts). </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
…68)" (#19351) Fixes #19349 This reverts commit 3f63638 (element-hq/synapse#19268) as the DB migration was taking too long and blocking media access while it happened. See element-hq/synapse#19349 for further information. ### Pull Request Checklist <!-- Please read https://element-hq.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request --> * [X] Pull request is based on the develop branch * [X] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should: - Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.". - Use markdown where necessary, mostly for `code blocks`. - End with either a period (.) or an exclamation mark (!). - Start with a capital letter. - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry. * [X] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters)) --------- Co-authored-by: Travis Ralston <travpc@gmail.com>
…al storage provider (#19204)
… 25.10 'Questing Quokka' (#19348) Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
…for_user_in_room (#19353) Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.6.0 to 2.6.3. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/releases">urllib3's releases</a>.</em></p> <blockquote> <h2>2.6.3</h2> <h2>🚀 urllib3 is fundraising for HTTP/2 support</h2> <p><a href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3 is raising ~$40,000 USD</a> to release HTTP/2 support and ensure long-term sustainable maintenance of the project after a sharp decline in financial support. If your company or organization uses Python and would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and thousands of other projects <a href="https://opencollective.com/urllib3">please consider contributing financially</a> to ensure HTTP/2 support is developed sustainably and maintained for the long-haul.</p> <p>Thank you for your support.</p> <h2>Changes</h2> <ul> <li>Fixed a security issue where decompression-bomb safeguards of the streaming API were bypassed when HTTP redirects were followed. (CVE-2026-21441 reported by <a href="https://github.com/D47A"><code>@D47A</code></a>, 8.9 High, GHSA-38jv-5279-wg99)</li> <li>Started treating <code>Retry-After</code> times greater than 6 hours as 6 hours by default. (<a href="https://redirect.github.com/urllib3/urllib3/issues/3743">urllib3/urllib3#3743</a>)</li> <li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on Emscripten. (<a href="https://redirect.github.com/urllib3/urllib3/issues/3752">urllib3/urllib3#3752</a>)</li> </ul> <h2>2.6.2</h2> <h2>🚀 urllib3 is fundraising for HTTP/2 support</h2> <p><a href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3 is raising ~$40,000 USD</a> to release HTTP/2 support and ensure long-term sustainable maintenance of the project after a sharp decline in financial support. If your company or organization uses Python and would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and thousands of other projects <a href="https://opencollective.com/urllib3">please consider contributing financially</a> to ensure HTTP/2 support is developed sustainably and maintained for the long-haul.</p> <p>Thank you for your support.</p> <h2>Changes</h2> <ul> <li>Fixed <code>HTTPResponse.read_chunked()</code> to properly handle leftover data in the decoder's buffer when reading compressed chunked responses. (<a href="https://redirect.github.com/urllib3/urllib3/issues/3734">urllib3/urllib3#3734</a>)</li> </ul> <h2>2.6.1</h2> <h2>🚀 urllib3 is fundraising for HTTP/2 support</h2> <p><a href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3 is raising ~$40,000 USD</a> to release HTTP/2 support and ensure long-term sustainable maintenance of the project after a sharp decline in financial support. If your company or organization uses Python and would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and thousands of other projects <a href="https://opencollective.com/urllib3">please consider contributing financially</a> to ensure HTTP/2 support is developed sustainably and maintained for the long-haul.</p> <p>Thank you for your support.</p> <h2>Changes</h2> <ul> <li>Restore previously removed <code>HTTPResponse.getheaders()</code> and <code>HTTPResponse.getheader()</code> methods. 
(<a href="https://redirect.github.com/urllib3/urllib3/issues/3731">#3731</a>)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's changelog</a>.</em></p> <blockquote> <h1>2.6.3 (2026-01-07)</h1> <ul> <li>Fixed a high-severity security issue where decompression-bomb safeguards of the streaming API were bypassed when HTTP redirects were followed. (<code>GHSA-38jv-5279-wg99 <https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99></code>__)</li> <li>Started treating <code>Retry-After</code> times greater than 6 hours as 6 hours by default. (<code>[#3743](urllib3/urllib3#3743) <https://github.com/urllib3/urllib3/issues/3743></code>__)</li> <li>Fixed <code>urllib3.connection.VerifiedHTTPSConnection</code> on Emscripten. (<code>[#3752](urllib3/urllib3#3752) <https://github.com/urllib3/urllib3/issues/3752></code>__)</li> </ul> <h1>2.6.2 (2025-12-11)</h1> <ul> <li>Fixed <code>HTTPResponse.read_chunked()</code> to properly handle leftover data in the decoder's buffer when reading compressed chunked responses. (<code>[#3734](urllib3/urllib3#3734) <https://github.com/urllib3/urllib3/issues/3734></code>__)</li> </ul> <h1>2.6.1 (2025-12-08)</h1> <ul> <li>Restore previously removed <code>HTTPResponse.getheaders()</code> and <code>HTTPResponse.getheader()</code> methods. (<code>[#3731](urllib3/urllib3#3731) <https://github.com/urllib3/urllib3/issues/3731></code>__)</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/urllib3/urllib3/commit/0248277dd7ac0239204889ca991353ad3e3a1ddc"><code>0248277</code></a> Release 2.6.3</li> <li><a href="https://github.com/urllib3/urllib3/commit/8864ac407bba8607950025e0979c4c69bc7abc7b"><code>8864ac4</code></a> Merge commit from fork</li> <li><a href="https://github.com/urllib3/urllib3/commit/70cecb27ca99d56aaaeb63ac27ee270ef2b24c5c"><code>70cecb2</code></a> Fix Scorecard issues related to vulnerable dev dependencies (<a href="https://redirect.github.com/urllib3/urllib3/issues/3755">#3755</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/41f249abe1ef3e20768588969c4035aba060a359"><code>41f249a</code></a> Move "v2.0 Migration Guide" to the end of the table of contents (<a href="https://redirect.github.com/urllib3/urllib3/issues/3747">#3747</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/fd4dffd2fc544166b76151a2fa3d7b7c0eab540c"><code>fd4dffd</code></a> Patch <code>VerifiedHTTPSConnection</code> for Emscripten (<a href="https://redirect.github.com/urllib3/urllib3/issues/3752">#3752</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/13f0bfd55e4468fe1ea9c6f809d3a87b0f93ebab"><code>13f0bfd</code></a> Handle massive values in Retry-After when calculating time to sleep for (<a href="https://redirect.github.com/urllib3/urllib3/issues/3743">#3743</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/8c480bf87bcefd321b3a1ae47f04e908b6b2ed7b"><code>8c480bf</code></a> Bump actions/upload-artifact from 5.0.0 to 6.0.0 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3748">#3748</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/4b40616e959c0a2c466e8075f2a785a9f99bb0c1"><code>4b40616</code></a> Bump actions/cache from 4.3.0 to 5.0.1 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3750">#3750</a>)</li> <li><a 
href="https://github.com/urllib3/urllib3/commit/82b8479663d037d220c883f1584dd01a43bb273b"><code>82b8479</code></a> Bump actions/download-artifact from 6.0.0 to 7.0.0 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3749">#3749</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/34284cb01700bb7d4fdd472f909e22393e9174e2"><code>34284cb</code></a> Mention experimental features in the security policy (<a href="https://redirect.github.com/urllib3/urllib3/issues/3746">#3746</a>)</li> <li>Additional commits viewable in <a href="https://github.com/urllib3/urllib3/compare/2.6.0...2.6.3">compare view</a></li> </ul> </details> <br /> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/element-hq/synapse/network/alerts). </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
These metrics were [removed completely from the codebase](https://github.com/element-hq/synapse/blob/444bc56cda05953cb24f95f291d1d2906f3045cc/docs/changelogs/CHANGES-2022.md#synapse-1730-2022-12-06) in Synapse v1.73.0 (2022-12-06). Three years is plenty of time :fast_forward: The deprecation/removal is still noted in our [upgrade notes](https://github.com/element-hq/synapse/blob/444bc56cda05953cb24f95f291d1d2906f3045cc/docs/upgrade.md#deprecation-of-legacy-prometheus-metric-names), which point to a durable versioned link with the info still available: https://element-hq.github.io/synapse/v1.69/metrics-howto.html#renaming-of-metrics--deprecation-of-old-names-in-12
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
Bumps [authlib](https://github.com/authlib/authlib) from 1.6.5 to 1.6.6. <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/authlib/authlib/blob/main/docs/changelog.rst">authlib's changelog</a>.</em></p> <blockquote> <h2>Version 1.6.6</h2> <p><strong>Released on Dec 12, 2025</strong></p> <ul> <li><code>get_jwt_config</code> takes a <code>client</code> parameter, :pr:<code>844</code>.</li> <li>Fix incorrect signature when <code>Content-Type</code> is x-www-form-urlencoded for OAuth 1.0 Client, :pr:<code>778</code>.</li> <li>Use <code>expires_in</code> in <code>OAuth2Token</code> when <code>expires_at</code> is unparsable, :pr:<code>842</code>.</li> <li>Always track <code>state</code> in session for OAuth client integrations.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/authlib/authlib/commit/bb7a315befbad333faf9a23ef574d6e3134a6774"><code>bb7a315</code></a> chore: release 1.6.6</li> <li><a href="https://github.com/authlib/authlib/commit/0a423d4638bed1c0fe4597b2296a85c5bb59fba2"><code>0a423d4</code></a> Merge pull request <a href="https://redirect.github.com/authlib/authlib/issues/844">#844</a> from azmeuk/806-get-jwt-config-client</li> <li><a href="https://github.com/authlib/authlib/commit/2808378611dd6fb2532b189a9087877d8f0c0489"><code>2808378</code></a> Merge commit from fork</li> <li><a href="https://github.com/authlib/authlib/commit/714502a4738bc29f26eb245b0c66718d8536cdda"><code>714502a</code></a> feat: get_jwt_config takes a client parameter</li> <li><a href="https://github.com/authlib/authlib/commit/260d04edee23d8470057ea659c16fb8a2c7b0dc2"><code>260d04e</code></a> Fix: Use <code>expires_in</code> when <code>expires_at</code> is unparsable</li> <li><a href="https://github.com/authlib/authlib/commit/eb37124bbbec6ccbfba3699d8960f9710d330ad8"><code>eb37124</code></a> Merge pull request <a href="https://redirect.github.com/authlib/authlib/issues/778">#778</a> from shc261392/fix-httpx-oauth1-form-data-incorrect-s...</li> <li><a href="https://github.com/authlib/authlib/commit/0ba9ec4feeb8e19f572c454e2d1dbbdc1d30ae62"><code>0ba9ec4</code></a> docs: fix guide on requests self signed certificate</li> <li><a href="https://github.com/authlib/authlib/commit/a2e9943815bb5161863b1fa144ac0aaa50d97e91"><code>a2e9943</code></a> docs: indicate that <a href="https://redirect.github.com/authlib/authlib/issues/743">#743</a> needs a migration</li> <li><a href="https://github.com/authlib/authlib/commit/06015d20652a23eff8350b6ad71b32fe41dae4ba"><code>06015d2</code></a> test: factorize the token fixture</li> <li>See full diff in <a href="https://github.com/authlib/authlib/compare/v1.6.5...v1.6.6">compare view</a></li> </ul> </details> <br /> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. 
[//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/element-hq/synapse/network/alerts). </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Fixes #19347. This deprecates MSC2697, which has been closed since May 2024. As per #19347, this seems to be a thing we can just rip out. The crypto team have moved on to MSC3814 and are suggesting that developers who rely on MSC2697 should use MSC3814 instead. The MSC2697 implementation was originally introduced by matrix-org/synapse#8380.
…n users (#19321) Fix element-hq/synapse#19120 by always falling back to checking power levels for local users if a local creator cannot be found in a v12 room. Complement tests: matrix-org/complement#836
…tchup_room_event_ids`. (#19320) A couple of tiny tweaks pulled out of #18968.
### Pull Request Checklist <!-- Please read https://element-hq.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request --> * [X] Pull request is based on the develop branch * [X] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should: - Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.". - Use markdown where necessary, mostly for `code blocks`. - End with either a period (.) or an exclamation mark (!). - Start with a capital letter. - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry. * [X] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
- Update `synapse_xxx` (server-level) metrics to use `server_name="$server_name"` instead of `instance="$instance"`
- Add `synapse_server_name_info` metric to map Synapse `server_name`s to the `instance`s they're hosted on.
- For process level metrics, update to use `xxx * on (instance, job, index) group_left(server_name) synapse_server_name_info{server_name="$server_name"}`
All of the changes here are backwards compatible with whatever people
were doing before with their Prometheus/Grafana dashboards.
Previously, the recommendation was to use the `instance` label to group everything under the same server (https://github.com/element-hq/synapse/blob/803e4b4d884b2de4b9e20dc47ffb59a983b8a4b5/docs/metrics-howto.md#L93-L147). But the `instance` label has a special meaning in Prometheus, and we were abusing it by using it that way:
> `instance`: The `<host>:<port>` part of the target's URL that was scraped.
>
> *-- https://prometheus.io/docs/concepts/jobs_instances/#automatically-generated-labels-and-time-series*
Since element-hq/synapse#18592 (Synapse
`v1.139.0`), we now have the `server_name` label to use instead.
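To make the relabelling concrete, here is a minimal, hypothetical sketch (using `prometheus_client` directly rather than Synapse's actual registration code; the port and server name are taken from the testing steps below) of an info-style `synapse_server_name_info` gauge and the `group_left` join it enables:

```python
# Hypothetical sketch only: Synapse registers this metric internally.
import time

from prometheus_client import Gauge, start_http_server

server_name_info = Gauge(
    "synapse_server_name_info",
    "Maps a Synapse server_name to the instance/job/index that serves it",
    labelnames=["server_name"],
)

# Each process exports a single sample with value 1 for the homeserver it serves.
server_name_info.labels(server_name="my.docker.synapse.server").set(1)

if __name__ == "__main__":
    start_http_server(19090)
    # Process-level series (which only carry instance/job/index labels) can then
    # be grouped per homeserver in PromQL, e.g.:
    #   process_cpu_seconds_total
    #     * on (instance, job, index) group_left(server_name)
    #       synapse_server_name_info{server_name="$server_name"}
    while True:
        time.sleep(60)
```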
---
Additionally, the assumption that a single process is serving a single
server is no longer true with [Synapse Pro for small
hosts](https://docs.element.io/latest/element-server-suite-pro/synapse-pro-for-small-hosts/overview/).
Part of element-hq/synapse-small-hosts#106
### Motivating use case
Although this change also benefits [Synapse Pro for small hosts](https://docs.element.io/latest/element-server-suite-pro/synapse-pro-for-small-hosts/overview/) (element-hq/synapse-small-hosts#106), it actually spawned from adding Prometheus metrics to our workerized Docker image (element-hq/synapse#19324, element-hq/synapse#19336) with a more correct label setup (without `instance`) and wanting the dashboard to be better.
### Testing strategy
1. Make sure your firewall allows the Docker containers to communicate with the host (`host.docker.internal`) so they can access exposed ports of other Docker containers. We want to allow the Prometheus container to access Synapse's metrics listener and the Grafana container to access the Prometheus container.
   - `sudo ufw allow in on docker0 comment "Allow traffic from the default Docker network to the host machine (host.docker.internal)"`
   - `sudo ufw allow in on br-+ comment "(from Matrix Complement testing) Allow traffic from custom Docker networks to the host machine (host.docker.internal)"`
   - [Complement firewall docs](https://github.com/matrix-org/complement/blob/ee6acd9154bbae2d0071a9cb39593c0a5e37268b/README.md#potential-conflict-with-firewall-software)
1. Build the Docker image for Synapse: `docker build -t matrixdotorg/synapse -f docker/Dockerfile .` ([docs](https://github.com/element-hq/synapse/blob/7a24fafbc376b9bffeb3277b1ad4aa950720c96c/docker/README-testing.md#building-and-running-the-images-manually))
1. Generate config for Synapse:
```
docker run -it --rm \
--mount type=volume,src=synapse-data,dst=/data \
-e SYNAPSE_SERVER_NAME=my.docker.synapse.server \
-e SYNAPSE_REPORT_STATS=yes \
-e SYNAPSE_ENABLE_METRICS=1 \
matrixdotorg/synapse:latest generate
```
1. Start Synapse:
```
docker run -d --name synapse \
--mount type=volume,src=synapse-data,dst=/data \
-p 8008:8008 \
-p 19090:19090 \
matrixdotorg/synapse:latest
```
1. You should be able to see metrics from Synapse at
http://localhost:19090/_synapse/metrics
1. Create a Prometheus config (`prometheus.yml`)
```yaml
global:
  scrape_interval: 15s
  scrape_timeout: 15s
  evaluation_interval: 15s
scrape_configs:
  - job_name: prometheus
    scrape_interval: 15s
    metrics_path: /_synapse/metrics
    scheme: http
    static_configs:
      - targets:
          # This should point to the Synapse metrics listener (we're using
          # `host.docker.internal` because this is from within the Prometheus
          # container)
          - host.docker.internal:19090
```
1. Start Prometheus (update the volume bind mount to the config you just
saved somewhere):
```
docker run \
--detach \
--name=prometheus \
--add-host host.docker.internal:host-gateway \
-p 9090:9090 \
-v ~/Documents/code/random/prometheus-config/prometheus.yml:/etc/prometheus/prometheus.yml \
prom/prometheus
```
1. Make sure you're seeing some data in Prometheus. On
http://localhost:9090/query, search for `synapse_build_info`
1. Start [Grafana](https://hub.docker.com/r/grafana/grafana)
```
docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana
```
1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials:
`admin`/`admin`)
1. **Connections** -> **Data Sources** -> **Add data source** ->
**Prometheus**
- Prometheus server URL: `http://host.docker.internal:9090`
1. Import the Synapse dashboard: `contrib/grafana/synapse.json`
To test workers, you can use the testing strategy from
element-hq/synapse#19336 (assumes both changes
from this PR and the other PR are combined)
…all workers in Docker image (#19336) Add Prometheus [HTTP service discovery](https://prometheus.io/docs/prometheus/latest/http_sd/) endpoint for easy discovery of all workers in Docker image. Follow-up to element-hq/synapse#19324 Spawning from wanting to [run a load test](element-hq/synapse-rust-apps#397) against the Complement Docker image of Synapse and see metrics from the homeserver. `GET http://<synapse_container>:9469/metrics/service_discovery` ```json5 [ { "targets": [ "<host>", ... ], "labels": { "<labelname>": "<labelvalue>", ... } }, ... ] ``` The metrics from each worker can also be accessed via `http://<synapse_container>:9469/metrics/worker/<worker_name>` which is what the service discovery response points to behind the scenes. This way, you only need to expose a single port (9469) to access all metrics. <details> <summary>Real HTTP service discovery response</summary> ```json5 [ { "targets": [ "localhost:9469" ], "labels": { "job": "event_persister", "index": "1", "__metrics_path__": "/metrics/worker/event_persister1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "event_persister", "index": "2", "__metrics_path__": "/metrics/worker/event_persister2" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "background_worker", "index": "1", "__metrics_path__": "/metrics/worker/background_worker1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "event_creator", "index": "1", "__metrics_path__": "/metrics/worker/event_creator1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "user_dir", "index": "1", "__metrics_path__": "/metrics/worker/user_dir1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "media_repository", "index": "1", "__metrics_path__": "/metrics/worker/media_repository1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "federation_inbound", "index": "1", "__metrics_path__": "/metrics/worker/federation_inbound1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "federation_reader", "index": "1", "__metrics_path__": "/metrics/worker/federation_reader1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "federation_sender", "index": "1", "__metrics_path__": "/metrics/worker/federation_sender1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "synchrotron", "index": "1", "__metrics_path__": "/metrics/worker/synchrotron1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "client_reader", "index": "1", "__metrics_path__": "/metrics/worker/client_reader1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "appservice", "index": "1", "__metrics_path__": "/metrics/worker/appservice1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "pusher", "index": "1", "__metrics_path__": "/metrics/worker/pusher1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "device_lists", "index": "1", "__metrics_path__": "/metrics/worker/device_lists1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "device_lists", "index": "2", "__metrics_path__": "/metrics/worker/device_lists2" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "stream_writers", "index": "1", "__metrics_path__": "/metrics/worker/stream_writers1" } }, { "targets": [ "localhost:9469" ], "labels": { "job": "main", "index": "1", "__metrics_path__": "/metrics/worker/main" } } ] ``` </details> And how it ends up as targets in Prometheus (http://localhost:9090/targets): (image) ### Testing strategy 1. 
Make sure your firewall allows the Docker containers to communicate to the host (`host.docker.internal`) so they can access exposed ports of other Docker containers. We want to allow Synapse to access the Prometheus container and Grafana to access to the Prometheus container. - `sudo ufw allow in on docker0 comment "Allow traffic from the default Docker network to the host machine (host.docker.internal)"` - `sudo ufw allow in on br-+ comment "(from Matrix Complement testing) Allow traffic from custom Docker networks to the host machine (host.docker.internal)"` - [Complement firewall docs](https://github.com/matrix-org/complement/blob/ee6acd9154bbae2d0071a9cb39593c0a5e37268b/README.md#potential-conflict-with-firewall-software) 1. Build the Docker image for Synapse: `docker build -t matrixdotorg/synapse -f docker/Dockerfile . && docker build -t matrixdotorg/synapse-workers -f docker/Dockerfile-workers .` ([docs](https://github.com/element-hq/synapse/blob/7a24fafbc376b9bffeb3277b1ad4aa950720c96c/docker/README-testing.md#building-and-running-the-images-manually)) 1. Start Synapse: ``` docker run -d --name synapse \ --mount type=volume,src=synapse-data,dst=/data \ -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \ -e SYNAPSE_REPORT_STATS=no \ -e SYNAPSE_ENABLE_METRICS=1 \ -p 8008:8008 \ -p 9469:9469 \ matrixdotorg/synapse-workers:latest ``` - Also try with workers: ``` docker run -d --name synapse \ --mount type=volume,src=synapse-data,dst=/data \ -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \ -e SYNAPSE_REPORT_STATS=no \ -e SYNAPSE_ENABLE_METRICS=1 \ -e SYNAPSE_WORKER_TYPES="\ event_persister:2, \ background_worker, \ event_creator, \ user_dir, \ media_repository, \ federation_inbound, \ federation_reader, \ federation_sender, \ synchrotron, \ client_reader, \ appservice, \ pusher, \ device_lists:2, \ stream_writers=account_data+presence+receipts+to_device+typing" \ -p 8008:8008 \ -p 9469:9469 \ matrixdotorg/synapse-workers:latest ``` 1. You should be able to see Prometheus service discovery endpoint at http://localhost:9469/metrics/service_discovery 1. Create a Prometheus config (`prometheus.yml`) ```yaml global: scrape_interval: 15s scrape_timeout: 15s evaluation_interval: 15s scrape_configs: - job_name: synapse scrape_interval: 15s metrics_path: /_synapse/metrics scheme: http # We set `honor_labels` so that each service can set their own `job` label # # > honor_labels controls how Prometheus handles conflicts between labels that are # > already present in scraped data and labels that Prometheus would attach # > server-side ("job" and "instance" labels, manually configured target # > labels, and labels generated by service discovery implementations). # > # > *-- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config* honor_labels: true # Use HTTP service discovery # # Reference: # - https://prometheus.io/docs/prometheus/latest/http_sd/ # - https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config http_sd_configs: - url: 'http://localhost:9469/metrics/service_discovery' ``` 1. Start Prometheus (update the volume bind mount to the config you just saved somewhere): ``` docker run \ --detach \ --name=prometheus \ --add-host host.docker.internal:host-gateway \ -p 9090:9090 \ -v ~/Documents/code/random/prometheus-config/prometheus.yml:/etc/prometheus/prometheus.yml \ prom/prometheus ``` 1. Make sure you're seeing some data in Prometheus. On http://localhost:9090/query, search for `synapse_build_info` 1. 
Start [Grafana](https://hub.docker.com/r/grafana/grafana) ``` docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana ``` 1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials: `admin`/`admin`) 1. **Connections** -> **Data Sources** -> **Add data source** -> **Prometheus** - Prometheus server URL: `http://host.docker.internal:9090` 1. Import the Synapse dashboard: https://github.com/element-hq/synapse/blob/develop/contrib/grafana/synapse.json
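As a quick sanity check of the service discovery document described above, a short sketch like the following (it only assumes the `localhost:9469` endpoint and the label layout shown in the example response) prints the scrape URL Prometheus would derive for each worker:

```python
# Sketch: fetch the HTTP service discovery document and print each worker's
# effective scrape URL (target + __metrics_path__), mirroring what Prometheus does.
import json
from urllib.request import urlopen

SD_URL = "http://localhost:9469/metrics/service_discovery"

with urlopen(SD_URL) as resp:
    target_groups = json.load(resp)

for group in target_groups:
    labels = group.get("labels", {})
    metrics_path = labels.get("__metrics_path__", "/metrics")
    for target in group.get("targets", []):
        print(f"http://{target}{metrics_path}  job={labels.get('job')} index={labels.get('index')}")
```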
Store the JSON content of scheduled delayed events as text instead of a byte array. This brings it in line with the `event_json` table's `json` column, and fixes the inability to schedule a delayed event with non-ASCII characters in its content. Fixes #19242
…#19383) Spawning from not seeing any reactor metrics in the Grafana dashboard in some load tests, noticing `python_twisted_reactor_tick_time_bucket` is `0` in Prometheus, following it back to Synapse and seeing that we don't warn about skipping reactor metrics in all cases (when using an unknown reactor type). A follow-up to this would be to actually figure out how to instrument the `ProxiedReactor` or why `ProxiedReactor` is being chosen in the first place and see if we can get it to use a more normal type :thinking: ### Reproduction instructions 1. Using the Complement scripts **with workers**: `WORKERS=1 ./scripts-dev/complement.sh ./tests/csapi` 1. `docker logs complement_csapi_dirty_hs1 2>&1 | grep -i "reactor"` 1. With these changes, notice `Skipping configuring ReactorLastSeenMetric: unexpected reactor type: <__main__.ProxiedReactor object at 0x7fc0adaaea50>` and `Twisted reactor: ProxiedReactor` 1. Cleanup: - `docker stop $(docker ps --all --filter "label=complement_context" --quiet)` - `docker rm $(docker ps --all --filter "label=complement_context" --quiet)` I'm unable to reproduce with the normal Synapse images or `complement-synapse` without workers. They all use `Twisted reactor: EPollReactor` <details> <summary>Checking <code>docker/Dockerfile-workers</code></summary> 1. Build the Docker image for Synapse: `docker build -t matrixdotorg/synapse -f docker/Dockerfile . && docker build -t matrixdotorg/synapse-workers -f docker/Dockerfile-workers .` ([docs](https://github.com/element-hq/synapse/blob/7a24fafbc376b9bffeb3277b1ad4aa950720c96c/docker/README-testing.md#building-and-running-the-images-manually)) 1. Start Synapse: ``` docker run -d --name synapse \ --mount type=volume,src=synapse-data,dst=/data \ -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \ -e SYNAPSE_REPORT_STATS=no \ -e SYNAPSE_ENABLE_METRICS=1 \ -p 8008:8008 \ -p 9469:9469 \ matrixdotorg/synapse-workers:latest ``` 1. `docker logs synapse 2>&1 | grep -i "reactor"` 1. Says `Twisted reactor: EPollReactor` </details>
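A rough sketch of the behaviour change described above (this is not the actual code in Synapse's metrics module; the reactor allow-list is purely illustrative):

```python
# Illustrative sketch of "warn instead of silently skipping": if the running
# reactor is not one we know how to instrument, log why the metric is missing.
import logging

from twisted.internet import reactor

logger = logging.getLogger(__name__)

# Purely illustrative allow-list; Synapse's real check may differ.
_INSTRUMENTABLE_REACTORS = ("EPollReactor", "PollReactor", "SelectReactor")


def maybe_install_reactor_last_seen_metric(reactor_) -> None:
    if type(reactor_).__name__ not in _INSTRUMENTABLE_REACTORS:
        # The fix: warn in *all* skip cases (e.g. the ProxiedReactor used by the
        # Complement forking launcher) instead of silently exporting nothing.
        logger.warning(
            "Skipping configuring ReactorLastSeenMetric: unexpected reactor type: %r",
            reactor_,
        )
        return
    # ... otherwise, wrap the reactor and register the collector here ...


if __name__ == "__main__":
    maybe_install_reactor_last_seen_metric(reactor)
```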
Previously, because `conn.rollback()` was inside the `if i < MAX_NUMBER_OF_RETRIES:` condition, it never rolled back on the final retry. Part of element-hq/synapse#19202 There are other problems mentioned in element-hq/synapse#19202 but this is a nice standalone change.
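A minimal sketch of the corrected retry shape (not the actual `new_transaction` code; the function name and the "retry every exception" behaviour here are simplifications for illustration):

```python
# Sketch of the fix: roll back on *every* failed attempt, including the last one,
# before giving up and re-raising.
MAX_NUMBER_OF_RETRIES = 5


def run_interaction_with_retries(conn, func, *args, **kwargs):
    for i in range(MAX_NUMBER_OF_RETRIES + 1):
        try:
            result = func(conn, *args, **kwargs)
            conn.commit()
            return result
        except Exception:
            # Previously this rollback was guarded by `if i < MAX_NUMBER_OF_RETRIES:`,
            # so the final failed attempt left the transaction un-rolled-back.
            conn.rollback()
            if i < MAX_NUMBER_OF_RETRIES:
                continue
            raise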
These are automatic changes from importing/exporting from Grafana 12.3.1. In order to verify that I'm not sneaking in any changes, you can follow these steps to get the same output. Reproduction instructions: 1. Start [Grafana](https://hub.docker.com/r/grafana/grafana) ``` docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana ``` 1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials: `admin`/`admin`) 1. Import the Synapse dashboard: `contrib/grafana/synapse.json` 1. Export the Synapse dashboard. On the dashboard page -> **Export** -> **Export as code** -> Using the **Classic** model -> Check **Export for sharing externally** -> Copy 1. Paste into `contrib/grafana/synapse.json` 1. `git status`/`git diff` to check if there is any diff Sanity checked the dashboard itself by importing the dashboard on https://grafana.matrix.org/ (Grafana 10.4.1 according to https://grafana.matrix.org/api/health). The process-level metrics won't work because element-hq/synapse#19337 just merged and isn't on `matrix.org` yet. Also just generally, this dashboard works for me locally with the [load-tests](element-hq/synapse-rust-apps#397) I've been doing. ### Motivation There are few fixes I want to make to the Grafana dashboard and it sucks having to manually translate everything back over because we have different formatting. Hopefully after this bulk change, future exports will have exactly what we want to change.
Bumps [pyasn1](https://github.com/pyasn1/pyasn1) from 0.6.1 to 0.6.2. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/pyasn1/pyasn1/releases">pyasn1's releases</a>.</em></p> <blockquote> <h2>Release 0.6.2</h2> <p>It's a minor release.</p> <ul> <li>Fixed continuation octet limits in OID/RELATIVE-OID decoder (CVE-2026-23490).</li> <li>Added support for Python 3.14.</li> <li>Added SECURITY.md policy.</li> <li>Migrated to pyproject.toml packaging.</li> </ul> <p>All changes are noted in the <a href="https://github.com/pyasn1/pyasn1/blob/master/CHANGES.rst">CHANGELOG</a>.</p> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/pyasn1/pyasn1/blob/main/CHANGES.rst">pyasn1's changelog</a>.</em></p> <blockquote> <h2>Revision 0.6.2, released 16-01-2026</h2> <ul> <li>CVE-2026-23490 (GHSA-63vm-454h-vhhq): Fixed continuation octet limits in OID/RELATIVE-OID decoder (thanks to tsigouris007)</li> <li>Added support for Python 3.14 [pr <a href="https://redirect.github.com/pyasn1/pyasn1/issues/97">#97</a>](<a href="https://redirect.github.com/pyasn1/pyasn1/pull/97">pyasn1/pyasn1#97</a>)</li> <li>Added SECURITY.md policy</li> <li>Fixed unit tests failing due to missing code [issue <a href="https://redirect.github.com/pyasn1/pyasn1/issues/91">#91</a>](<a href="https://redirect.github.com/pyasn1/pyasn1/issues/91">pyasn1/pyasn1#91</a>) [pr <a href="https://redirect.github.com/pyasn1/pyasn1/issues/92">#92</a>](<a href="https://redirect.github.com/pyasn1/pyasn1/pull/92">pyasn1/pyasn1#92</a>)</li> <li>Migrated to pyproject.toml packaging [pr <a href="https://redirect.github.com/pyasn1/pyasn1/issues/90">#90</a>](<a href="https://redirect.github.com/pyasn1/pyasn1/pull/90">pyasn1/pyasn1#90</a>)</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/pyasn1/pyasn1/commit/e7356f89cf32c130d65b1a0e903fe7ecce426424"><code>e7356f8</code></a> Prepare release 0.6.2</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/3908f144229eed4df24bd569d16e5991ace44970"><code>3908f14</code></a> Merge commit from fork</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/0a7e067674b1ec558f9d233a3bc173472fe27c6c"><code>0a7e067</code></a> Add support for Python 3.14 (<a href="https://redirect.github.com/pyasn1/pyasn1/issues/97">#97</a>)</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/33656e986d8e2355123799e7267104ac7d8a6fb1"><code>33656e9</code></a> Create Security Policy</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/fa62307253f4423effac71a618e20fabaa37e22e"><code>fa62307</code></a> fix for issue <a href="https://redirect.github.com/pyasn1/pyasn1/issues/91">#91</a>: unit tests failing due to missing code (<a href="https://redirect.github.com/pyasn1/pyasn1/issues/92">#92</a>)</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/f1ed02e41c193a66741572185bab94d34f43daec"><code>f1ed02e</code></a> Package pyasn1 with pyproject.toml (<a href="https://redirect.github.com/pyasn1/pyasn1/issues/90">#90</a>)</li> <li><a href="https://github.com/pyasn1/pyasn1/commit/93c4d4f0b6af84c13517b5700104ac57fb6d3fe5"><code>93c4d4f</code></a> Switch documentation user to pyasn1 (<a href="https://redirect.github.com/pyasn1/pyasn1/issues/89">#89</a>)</li> <li>See full diff in <a href="https://github.com/pyasn1/pyasn1/compare/v0.6.1...v0.6.2">compare view</a></li> </ul> </details> <br /> 
[](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/element-hq/synapse/network/alerts). </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
…0.24 (#19379) Fixes #19375 `prometheus_client` 0.24 makes `Collector` a generic type. Previously, `InFlightGauge` inherited from both `Generic[MetricsEntry]` and `Collector`, resulting in the error `TypeError: cannot create a consistent MRO` when using `prometheus_client` >= 0.24. This behaviour of disallowing multiple `Generic` inheritance is more strictly enforced starting with python 3.14, but can still lead to issues with earlier versions of python. This PR separates runtime and typing inheritance for `InFlightGauge`: - Runtime: `InFlightGauge` inherits only from `Collector` - Typing: `InFlightGauge` is generic This preserves static typing, avoids MRO conflicts, and supports both `prometheus_client` <0.24 and >=0.24. I have tested these changes out locally with `prometheus_client` 0.23.1 & 0.24 on python 3.14 while sending a bunch of messages over federation and watching a grafana dashboard configured to show `synapse_util_metrics_block_in_flight_total` & `synapse_util_metrics_block_in_flight_real_time_sum` (the only metric setup to use `InFlightGauge`) and things are working in each case. https://github.com/element-hq/synapse/blob/a1e9abc7df3e6c43a95cba059348546a4c9d4491/synapse/util/metrics.py#L112-L119 ### Pull Request Checklist <!-- Please read https://element-hq.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request --> * [X] Pull request is based on the develop branch * [X] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should: - Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.". - Use markdown where necessary, mostly for `code blocks`. - End with either a period (.) or an exclamation mark (!). - Start with a capital letter. - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry. * [X] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
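A rough sketch of the runtime/typing split described above (illustrative only, not the exact code from this PR):

```python
from typing import TYPE_CHECKING, Generic, Iterable, TypeVar

from prometheus_client.core import Metric
from prometheus_client.registry import Collector

MetricsEntry = TypeVar("MetricsEntry")

if TYPE_CHECKING:
    # For type checkers only: the base is generic, so InFlightGauge keeps its
    # static typing over the entry type.
    class _InFlightGaugeBase(Collector, Generic[MetricsEntry]):
        pass

else:
    # At runtime: inherit from Collector alone, so prometheus_client >= 0.24
    # (where Collector itself became generic) cannot produce an MRO conflict.
    class _InFlightGaugeBase(Collector):
        def __class_getitem__(cls, item):
            # Make `_InFlightGaugeBase[MetricsEntry]` a no-op at runtime.
            return cls


class InFlightGauge(_InFlightGaugeBase[MetricsEntry]):
    """Sketch of a gauge tracking in-flight blocks; real logic omitted."""

    def collect(self) -> Iterable[Metric]:
        return []
```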
…ker Complement tests (#19385) Follow-up to element-hq/synapse#19383 The [`ProxiedReactor`](https://github.com/element-hq/synapse/blob/079c52e16b3e75929380d59c5f1d85b66c1a4ccf/synapse/app/complement_fork_starter.py#L38-L71) is a special custom reactor used in the `synapse/app/complement_fork_starter.py`. It's used by default when using `WORKERS=1 ./scripts-dev/complement.sh` (see `SYNAPSE_USE_EXPERIMENTAL_FORKING_LAUNCHER`). The point of the forking launcher is to improve start-up times and the point of the [`ProxiedReactor`](https://github.com/element-hq/synapse/blob/079c52e16b3e75929380d59c5f1d85b66c1a4ccf/synapse/app/complement_fork_starter.py#L38-L71) is explained in the [docstring](https://github.com/element-hq/synapse/blob/079c52e16b3e75929380d59c5f1d85b66c1a4ccf/synapse/app/complement_fork_starter.py#L38-L56) (introduced in matrix-org/synapse#13127) ### Reproduction instructions 1. Using the Complement scripts **with workers**: `WORKERS=1 COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh ./tests/csapi` 1. `docker logs complement_csapi_dirty_hs1 2>&1 | grep -i "reactor"` 1. With these changes, notice `Twisted reactor: ProxiedReactor` but no warning about `Skipping configuring ReactorLastSeenMetric: unexpected reactor type: <__main__.ProxiedReactor object at 0x7fc0adaaea50>` 1. Cleanup: - `docker stop $(docker ps --all --filter "label=complement_context" --quiet)` - `docker rm $(docker ps --all --filter "label=complement_context" --quiet)` To test that we're actually seeing reactor metrics, I've been using this with the load-test runs in element-hq/synapse-rust-apps#397
Codecov Report

❌ Patch coverage is …

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #232      +/-   ##
==========================================
- Coverage   80.14%   80.12%   -0.03%
==========================================
  Files         498      500       +2
  Lines       71133    71200      +67
  Branches    10683    10700      +17
==========================================
+ Hits        57013    57050      +37
- Misses      10881    10905      +24
- Partials     3239     3245       +6
```

... and 2 files with indirect coverage changes. Continue to review the full report in Codecov by Sentry.
The branch was force-pushed from fc89d8f to bd2d8e4, and then from bd2d8e4 to ad6cddb.
I added the commit `chore: away from mdbook-admonish`.
Pull request overview
This PR merges upstream Synapse v1.146.0 into the Famedly fork, updating from v1.145.0 to v1.146.0_1. The release includes significant new features, bug fixes, breaking changes, and infrastructure improvements.
Changes:
- Adds `enable_local_media_storage` configuration option to allow off-site media storage without local caching
- Removes MSC2697 (legacy dehydrated devices) as the MSC is closed; developers should migrate to MSC3814
- Stabilizes MSC4312's `m.oauth` User-Interactive Auth stage for cross-signing reset (deprecates `org.matrix.cross_signing_reset`)
- Adds task cancellation API to the task scheduler
- Improves database transaction retry logic to always rollback on retry attempts
- Adds Prometheus HTTP service discovery endpoint for Docker workers setup
- Drops support for Ubuntu 25.04, adds support for Ubuntu 25.10
Reviewed changes
Copilot reviewed 76 out of 78 changed files in this pull request and generated 9 comments.
Show a summary per file
| File | Description |
|---|---|
| pyproject.toml | Version bump to 1.146.0 |
| CHANGES.md | Comprehensive changelog for v1.146.0 release |
| synapse/types/__init__.py | Adds CANCELLED status to TaskStatus enum |
| synapse/util/task_scheduler.py | Implements task cancellation functionality |
| synapse/storage/databases/main/devices.py | Makes keys_for_device parameter non-optional after MSC2697 removal |
| synapse/handlers/device.py | Removes rehydrate_device method (MSC2697 removal) |
| synapse/rest/client/devices.py | Removes DehydratedDeviceServlet and ClaimDehydratedDeviceServlet |
| synapse/rest/client/keys.py | Stabilizes m.oauth UIA stage, removes device_id parameter from keys upload |
| synapse/rest/client/auth.py | Adds support for m.oauth alongside unstable org.matrix.cross_signing_reset |
| synapse/config/repository.py | Adds enable_local_media_storage configuration option |
| synapse/config/experimental.py | Removes msc2697_enabled configuration |
| synapse/media/media_storage.py | Major refactoring to support optional local storage provider |
| synapse/media/media_repository.py | Updates to use new media storage interface |
| synapse/storage/database.py | Improves transaction retry logic with proper rollback handling |
| synapse/storage/schema/main/delta/93/04_make_delayed_event_content_text.py | Database migration to fix delayed event content encoding |
| synapse/metrics/__init__.py | Fixes InFlightGauge typing for prometheus_client compatibility |
| synapse/server.py | Adds synapse_server_name_info metric |
| docker/configure_workers_and_start.py | Adds Prometheus HTTP service discovery for workers |
| tests/* | Replaces deprecated assertEquals with assertEqual, adds new test coverage |
| docs/* | Updates documentation including metrics, configuration, and upgrade guides |
| poetry.lock | Updates dependencies including authlib, pyasn1, pynacl, urllib3 |
| .github/workflows/* | Updates action versions and mdbook to 0.5.2 |
```
txn_mocks.after_callback.assert_called_once_with(123, 456, extra=789)
txn_mocks.exception_callback.assert_not_called()

# Should have commited right away
```

Typo in comment: "commited" should be "committed"
```
local_path = os.path.join(self.local_media_directory, path)  # type: ignore[arg-type]
if os.path.isfile(local_path):
    # Import here to avoid circular import
    from .media_storage import FileResponder
```

The module 'synapse.media.media_storage' imports itself.
```
# Another worker downloaded the media first. Clean up our file.
try:
    os.remove(fname)
except Exception:
```

'except' clause does nothing but pass and there is no explanatory comment.
```
if is_temp_file:
    try:
        os.remove(media_filepath)
    except Exception:
```

'except' clause does nothing but pass and there is no explanatory comment.
```
if temp_path:
    try:
        os.remove(temp_path)
    except Exception:
```

'except' clause does nothing but pass and there is no explanatory comment.
```
)
await self.hs.get_clock().sleep(Duration(microseconds=1))

return TaskStatus.COMPLETE, None, None  # type: ignore[unreachable]
```

This statement is unreachable.
```diff
-N = 5
-while True:
+MAX_NUMBER_OF_ATTEMPTS = 5
+for attempt_number in range(1, MAX_NUMBER_OF_ATTEMPTS + 1):
```

This 'for' statement has a redundant 'else' as no 'break' is present in the body.
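For context on that warning, a tiny self-contained example of the pattern (hypothetical code, not from the diff): the `else` branch of a `for` loop only has an effect when the body can `break`; with no `break`, it always runs and is redundant.

```python
MAX_NUMBER_OF_ATTEMPTS = 5

for attempt_number in range(1, MAX_NUMBER_OF_ATTEMPTS + 1):
    print(f"attempt {attempt_number}")
else:
    # With no `break` in the body, this block runs unconditionally after the loop,
    # so it could simply follow the loop without the `else`.
    print("all attempts finished")
```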
```
if remote_res:
    with remote_res:
        consumer = BackgroundFileConsumer(
            open(local_path, "wb"), self.reactor
```

File is opened but is not closed.
```
temp_path = tmp.name
with res:
    consumer = BackgroundFileConsumer(
        open(temp_path, "wb"), self.reactor
```

File is opened but is not closed.
Famedly Synapse Release v1.146.0_1
Famedly additions for v1.146.0_1
depends on: famedly/complement#10
Notes for Famedly:
Deprecations and Removals
Updates to the Docker image
- Add a Prometheus HTTP service discovery endpoint for all workers in the `docker/Dockerfile-workers` image (see the Metrics section of our Docker testing docs). (#19336)

Features

- Add new configuration option `enable_local_media_storage` which controls whether media is additionally stored locally when using configured `media_storage_providers`. Setting this to `false` allows off-site media storage without a local cache. Contributed by Patrice Brend'amour @dr.allgood. (#19204)
- Stabilize MSC4312's `m.oauth` User-Interactive Auth stage for resetting cross-signing identity with the OAuth 2.0 API. The old, unstable name (`org.matrix.cross_signing_reset`) is now deprecated and will be removed in a future release. (#19273)
- Update server-level `synapse_*` metrics to report a `server_name` label (instead of `instance`). (#19337)