
Performance: Reuse ServerCallContext #369


Closed
JamesNK opened this issue Jul 4, 2019 · 4 comments
Labels: enhancement (New feature or request)

Comments

@JamesNK (Member) commented Jul 4, 2019

HttpContextServerCallContext is currently created on every call. If we make the context reusable (like HttpContext now is) then we can reset and reuse it between calls. That would allow us to cache other per-call objects on it, like the reader, writer, serialization context, etc., and save allocating those per call as well.

Questions:

  1. Is reusing the context object an OK thing to do? Holding onto the context after a call is finished doesn't make sense to me. @jtattermusch is that right?
  2. How should the context be cached? Would a thread local work with the async nature of methods? Otherwise we could use a global object pool. Eventually I would like to use the pool built into hosting for HttpContext (Caching per-request objects on reused HttpContexts, dotnet/aspnetcore#6895), but that will have to be post-3.0.
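A global pool along the lines of option 2 could be sketched like this. All names here are hypothetical, not grpc-dotnet's actual types; a real implementation would bound the pool size and, as noted above, eventually defer to hosting's HttpContext pooling:

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;

// Hypothetical reusable context: per-call helpers are cached for the
// lifetime of the pooled instance and cleared between calls.
internal sealed class ReusableServerCallContext
{
    internal Dictionary<object, object?> UserState { get; } = new();

    // Clear per-call state so the next call starts clean.
    public void Reset() => UserState.Clear();
}

// Hypothetical global object pool (unbounded here for brevity).
internal static class ServerCallContextPool
{
    private static readonly ConcurrentQueue<ReusableServerCallContext> Pool = new();

    public static ReusableServerCallContext Rent() =>
        Pool.TryDequeue(out var context) ? context : new ReusableServerCallContext();

    public static void Return(ReusableServerCallContext context)
    {
        context.Reset();
        Pool.Enqueue(context);
    }
}
```

A thread local would avoid the queue's synchronization cost, but (as discussed below) it doesn't fit async handlers, which can resume on a different thread than the one they started on.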
@jtattermusch (Contributor) commented

I think reusing ServerCallContext makes sense if it can be done safely. We'd need:

  • clearly defined semantics for ServerCallContext going out of scope (e.g. serverCallContext becomes invalid as soon as the server-side handler method implemented by the user returns)

  • some way of detecting that serverCallContext was used illegally - I can easily imagine someone misusing the API and accessing the ServerCallContext shortly after the method handler has returned, which would result in crosstalk between two separate RPCs (something that can be hard to debug).

  • thread local wouldn't work with the async flow of control (the challenge is being able to return the thread-local context to the original thread you've borrowed it from). Btw, in Grpc.Core we currently have https://github.com/grpc/grpc/blob/70dd0b9b3b70e0923149f6c187834eb81b274401/src/csharp/Grpc.Core/Internal/DefaultObjectPool.cs#L29 which could be helpful, but to be honest, the functionality is basic and more performance work (and benchmarking) would be needed.
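One way to surface illegal use, sketched here with hypothetical names: the pooled context tracks whether a call is in flight and throws on any access after the handler has returned, turning silent RPC crosstalk into a loud failure:

```csharp
using System;

// Hypothetical misuse guard; not grpc-dotnet's actual implementation.
internal sealed class PooledServerCallContext
{
    private bool _inFlight;
    private string _method = "";

    // Called by the framework when the context is bound to a new call.
    internal void BeginCall(string method)
    {
        _inFlight = true;
        _method = method;
    }

    // Called by the framework when the user handler returns.
    internal void CompleteCall() => _inFlight = false;

    public string Method
    {
        get
        {
            ThrowIfCompleted();
            return _method;
        }
    }

    private void ThrowIfCompleted()
    {
        if (!_inFlight)
        {
            throw new InvalidOperationException(
                "ServerCallContext was accessed after the handler returned.");
        }
    }
}
```

Every public member would route through a check like ThrowIfCompleted, so the cost is one branch per access in exchange for a debuggable error instead of state leaking into the next RPC.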

@jtattermusch (Contributor) commented

Looking at the ServerCallContext members, I'm not sure how many of them could be reused profitably.
Candidates are:

  • Metadata RequestHeaders and ResponseTrailers (reuse the containers to avoid creating new arrays)
  • internal state (SerializationContext, DeserializationContext etc.)
  • AuthContext (but it's initialized lazily anyway)
  • UserState (reuse the containers, but it's also initialized lazily)

I think it would be good to first have some solid benchmarks and then look into this optimization in a data-centric way.
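As a starting point for such benchmarks, a BenchmarkDotNet harness comparing per-call allocation against reset-and-reuse might look like the following. The context type here is a deliberately simplified stand-in, not the real HttpContextServerCallContext:

```csharp
using System.Collections.Generic;
using BenchmarkDotNet.Attributes;

// Stand-in context with one reusable container, for illustration only.
public class FakeCallContext
{
    public Dictionary<string, object?> UserState { get; } = new();
    public void Reset() => UserState.Clear();
}

[MemoryDiagnoser] // report allocations per operation, not just time
public class CallContextBenchmarks
{
    private readonly FakeCallContext _cached = new();

    [Benchmark(Baseline = true)]
    public FakeCallContext AllocatePerCall() => new();

    [Benchmark]
    public FakeCallContext ResetAndReuse()
    {
        _cached.Reset();
        return _cached;
    }
}
```

MemoryDiagnoser is the important part: the decision hinges on allocated bytes and Gen 0 pressure per call, which is exactly the data-centric view suggested above.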

@JamesNK (Member, Author) commented Jul 8, 2019

Reusable:

  • SerializationContext
  • DeserializationContext
  • Deadline related allocations (deadline manager, SemaphoreSlim)
  • Reader and writer (these can be made reusable, with references to them kept on ServerCallContext)
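Putting the list together, a hypothetical Reset on the context could clear or retain each of these items. Field names below are illustrative and do not match grpc-dotnet's internals:

```csharp
using System.Threading;

// Illustrative only: members mirror the reusable items listed above.
internal sealed class ReusableCallContext
{
    // Serialization helpers: reset in place, not reallocated.
    internal PooledBuffer SerializationContext { get; } = new();
    internal PooledBuffer DeserializationContext { get; } = new();

    // Deadline-related allocations: the SemaphoreSlim can live across calls.
    internal SemaphoreSlim DeadlineLock { get; } = new(1, 1);
    private object? _deadlineManager;

    public void Reset()
    {
        SerializationContext.Reset();
        DeserializationContext.Reset();
        _deadlineManager = null; // recreated lazily if the next call has a deadline
    }
}

// Stand-in for a reusable serialization/deserialization context.
internal sealed class PooledBuffer
{
    public void Reset() { /* return buffers, clear per-call state */ }
}
```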

@JamesNK JamesNK added this to the Preview 8 milestone Jul 8, 2019
@JamesNK JamesNK self-assigned this Jul 9, 2019
@JamesNK JamesNK modified the milestones: Preview 8, Backlog Jul 29, 2019
@JamesNK JamesNK added the enhancement New feature or request label Oct 4, 2019
@JamesNK (Member, Author) commented Jan 20, 2022

Tried this out using IPersistentStateFeature in .NET 6 and it wasn't worth it. The context is too small for reuse to make up for the overhead of getting the cached context from a dictionary and resetting all of its state.
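For reference, the shape of what was tried: IPersistentStateFeature exposes a State dictionary that survives HttpContext reuse, so a call context can be cached there and reset per call. The cache key, context type, and Reset method below are illustrative assumptions, not the actual grpc-dotnet code:

```csharp
using System.Collections.Generic;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http.Features;

// Hypothetical stand-in for the real context type.
internal sealed class HttpContextServerCallContext
{
    public void Reset() { /* clear the previous call's state */ }
}

internal static class CallContextCache
{
    private static readonly object CallContextKey = new();

    public static HttpContextServerCallContext GetOrCreate(HttpContext httpContext)
    {
        // The feature is only present when the server actually reuses contexts.
        var persistence = httpContext.Features.Get<IPersistentStateFeature>();
        if (persistence != null)
        {
            var state = persistence.State ??= new Dictionary<object, object?>();
            if (state.TryGetValue(CallContextKey, out var cached) &&
                cached is HttpContextServerCallContext existing)
            {
                existing.Reset();
                return existing;
            }

            var created = new HttpContextServerCallContext();
            state[CallContextKey] = created;
            return created;
        }

        // No persistence available: fall back to per-call allocation.
        return new HttpContextServerCallContext();
    }
}
```

The dictionary lookup, type check, and reset on every call are exactly the overhead the comment above found too large relative to simply allocating the small context.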
