Instrument `_ByteProducer` with tracing to measure potential dead time while writing bytes to the request #18804
Merged: MadLittleMods merged 7 commits into `develop` from `madlittlemods/17722-trace-byte-producer` on Aug 27, 2025
Conversation
MadLittleMods commented Aug 12, 2025

MadLittleMods changed the title from Instrument `_ByteProducer` with tracing to Instrument `_ByteProducer` with tracing to measure potential dead time while writing bytes to the request
MadLittleMods added a commit that referenced this pull request on Aug 20, 2025
These changes are being introduced in #18849 instead.
MadLittleMods added a commit that referenced this pull request on Aug 20, 2025
Spawning from #18804 (comment)
anoadragon453 approved these changes on Aug 27, 2025
Thanks for the detailed explanation and screenshots!
MadLittleMods added a commit that referenced this pull request on Aug 27, 2025
Switch to OpenTracing's `ContextVarsScopeManager` instead of our own custom `LogContextScopeManager`. This is now possible because the linked Twisted issue from the comment in our custom `LogContextScopeManager` is resolved: https://twistedmatrix.com/trac/ticket/10301

This PR spawned from exploring different possibilities to solve the `scope` loss problem I was encountering in #18804 (comment). It appears to solve the problem and I've added the additional test from there to this PR ✅
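For context, a minimal sketch of what choosing a scope manager looks like when building a Jaeger tracer with `jaeger_client`. This is not Synapse's actual tracer-initialisation code; the sampler config and service name are placeholders.

```python
# Sketch only: jaeger_client's Config accepts a scope_manager, which is what
# makes swapping a custom LogContextScopeManager for OpenTracing's
# ContextVarsScopeManager possible.
from jaeger_client import Config as JaegerConfig
from opentracing.scope_managers.contextvars import ContextVarsScopeManager

jaeger_config = JaegerConfig(
    config={"sampler": {"type": "const", "param": 1}},  # placeholder config
    service_name="synapse",  # placeholder service name
    # Previously a custom LogContextScopeManager instance would be passed here.
    scope_manager=ContextVarsScopeManager(),
)
tracer = jaeger_config.initialize_tracer()
```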
Thanks for the review @anoadragon453 🐆
This was referenced Oct 1, 2025
MadLittleMods added a commit that referenced this pull request on Oct 2, 2025
…19007) Revert #18849

Go back to our custom `LogContextScopeManager` after trying OpenTracing's `ContextVarsScopeManager`. Fix #19004

### Why revert?

For reference, with the normal reactor, `ContextVarsScopeManager` worked just as well as our custom `LogContextScopeManager` as far as I can tell (and even better in some cases). But since Twisted appears to not fully support `ContextVar`s, it doesn't work as expected in all cases. Compounding things, `ContextVarsScopeManager` was causing errors with the experimental `SYNAPSE_ASYNC_IO_REACTOR` option. Since we're not getting the full benefit that we originally desired, we might as well revert and figure out alternatives for extending the logcontext lifetimes to support the use case we were trying to unlock (c.f. #18804). See #19004 (comment) for more info.

### Does this require backporting and patch releases?

No. Since `ContextVarsScopeManager` works just as well with the normal reactor and was only causing actual errors with the experimental `SYNAPSE_ASYNC_IO_REACTOR` option, I don't think this requires us to backport and make patch releases at all.

### Maintain cross-links between main trace and background process work

In order to maintain the functionality introduced in #18932 (cross-links between the background process trace and the currently active trace), we also needed a small change. Previously, when we were using `ContextVarsScopeManager`, it tracked the tracing scope across logcontext changes without issue. Now that we're using our own custom `LogContextScopeManager` again, we need to capture the active span from the logcontext before we reset to the sentinel context because of the `PreserveLoggingContext()` below. Added some tests to ensure we maintain the `run_as_background` tracing behavior regardless of the tracing scope manager we use.
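The cross-link change described above can be sketched with the plain OpenTracing API. The helper below is hypothetical (it is not Synapse's actual background-process code); it only illustrates capturing the active span before the logcontext is reset and linking back to it with `follows_from`.

```python
# Hypothetical helper: capture the currently active span *before* the
# logcontext is reset to the sentinel, then reference it from the
# background-process span so the two traces stay cross-linked.
import opentracing


def start_background_process_span(operation_name: str) -> opentracing.Scope:
    tracer = opentracing.global_tracer()
    # Grab the active span now; once the logcontext has been reset, a
    # logcontext-based scope manager no longer reports it as active.
    active_span = tracer.active_span
    references = (
        [opentracing.follows_from(active_span.context)] if active_span else None
    )
    return tracer.start_active_span(
        operation_name,
        references=references,
        ignore_active_span=True,  # the background work gets its own trace
    )
```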
MadLittleMods added a commit that referenced this pull request on Oct 2, 2025
…_available` This test is still expected to fail with our current `LogContextScopeManager` implementation, but there were some flaws in how we were handling the logcontext before (like pumping the reactor in a non-sentinel logcontext).

This test is originally from #18849, spawning from #18804 (comment).
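To illustrate the logcontext point above, a small sketch assuming Synapse's `PreserveLoggingContext` (from `synapse.logging.context`) and a Twisted test clock; the helper name is made up. The idea is that reactor work like pumping a test clock should happen from the sentinel logcontext, not a request's logcontext.

```python
# Hypothetical test helper: pump a test clock from the sentinel logcontext
# rather than whatever logcontext the test happens to be in, restoring the
# previous logcontext afterwards.
from synapse.logging.context import PreserveLoggingContext


def pump_in_sentinel_context(clock, seconds: float = 0.0) -> None:
    # PreserveLoggingContext() switches to the sentinel context for the
    # duration of the block and restores the old context on exit.
    with PreserveLoggingContext():
        clock.advance(seconds)  # e.g. a twisted.internet.task.Clock
```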
Instrument `_ByteProducer` with tracing. This will allow us to easily see how much time is taken up writing bytes to the request, by filtering on the `write_bytes_to_request` operation in Jaeger.

Spawning from #17722
The `write_bytes_to_request` span won't show up in the trace until #18849 is merged.

Note: it's totally fine for a child span to finish after its parent. See https://opentracing.io/specification/#references-between-spans, which shows "Child Span D" outliving the "Parent Span".
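As a rough sketch of the idea (not the actual `_ByteProducer` change in this PR; the class and attributes below are illustrative, and only the `write_bytes_to_request` operation name comes from the PR): the producer opens a span when it starts handing bytes to the request and finishes it once everything has been written, so any dead time between writes shows up in the trace.

```python
# Illustrative pull producer that wraps its whole lifetime in a tracing span.
# Because resumeProducing() is invoked repeatedly by Twisted, a `with` block
# doesn't fit; the span is started up front and finished when producing stops
# (the same shape as managing a context manager's __enter__/__exit__ by hand,
# as referenced in the dev notes below).
import opentracing


class TracedByteProducer:
    def __init__(self, request, chunks):
        self._request = request
        self._chunks = iter(chunks)
        self._span = opentracing.global_tracer().start_span("write_bytes_to_request")
        # Registered as a pull producer: Twisted calls resumeProducing()
        # whenever the transport can accept more data.
        self._request.registerProducer(self, False)

    def resumeProducing(self) -> None:
        try:
            chunk = next(self._chunks)
        except StopIteration:
            self._finish()
            return
        self._request.write(chunk)

    def stopProducing(self) -> None:
        self._finish()

    def _finish(self) -> None:
        if self._span is None:
            return
        self._request.unregisterProducer()
        # Finish the span only once all bytes have been handed to Twisted, so
        # the span duration covers any dead time between writes.
        self._span.finish()
        self._span = None
```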
Testing strategy
1. Configure tracing in `homeserver.yaml`
2. Make sure the `opentracing` dependency is installed: `poetry install --extras all`
3. Start the homeserver: `poetry run synapse_homeserver --config-path homeserver.yaml`
4. Make a request: `GET http://localhost:8008/_matrix/client/versions`
Dev notes
- PR where we did a similar thing by managing `__enter__` and `__exit__` manually on a `start_active_span` context manager for tracing: matrix-org/synapse#15888
- PRs where our custom `LogContextScopeManager` / `_LogContextScope` was changed:
- Discussion below where we were running into `scope` loss problems on the `LoggingContext` which our custom `LogContextScopeManager` / `_LogContextScope` used → #18804 (comment)
- Log context docs: https://github.com/element-hq/synapse/blob/9135d78b88a429cf0220d6a93bdac7485a3a0f88/docs/log_contexts.md
- Twisted docs on Producers (Producers and Consumers: Efficient High-Volume Streaming): https://docs.twisted.org/en/twisted-16.2.0/core/howto/producers.html
Regex find/replace used to reformat the logs: `^.* INFO - ` → `` (strip the log-line prefix), and `(\S*?)=(\S*)` → `\n $1=$2` (put each `key=value` pair on its own line).
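The same find/replace expressed as a hypothetical Python helper, for clarity (the sample log line is made up):

```python
# Mirrors the find/replace above: strip the log-line prefix, then put each
# key=value pair on its own indented line.
import re


def reformat_log_line(line: str) -> str:
    line = re.sub(r"^.* INFO - ", "", line)
    return re.sub(r"(\S*?)=(\S*)", r"\n    \1=\2", line)


print(reformat_log_line("2025-08-12 12:00:00 INFO - Handled request method=GET code=200"))
```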
Todo

- Wait for the `ContextVarsScopeManager` PR #18849 to merge

Pull Request Checklist

- Pull request includes a changelog file. The entry should:
  - Be a short description of your change which makes sense to users (e.g. not "Moved X method from `EventStore` to `EventWorkerStore`.")
  - Use markdown where necessary, mostly for `code blocks`.