Request Throttling for ASP.NET Core #29933
Comments
@davidfowl Do you accept a PR for this? 😅 https://github.com/Kahbazi/ThrottlR
@Kahbazi based on the name alone, yes! But no, there's a bunch of design work that needs to happen first. I'm sure the ideas are very similar, though, so maybe once I write down what has been done so far the stars will align. PS: From the quick look I've given that repository, its feature set aligns with what I would want to support here as well. I'll flesh out the proposal so you can give feedback based on what you've done.
Thanks for contacting us.
I hope this will subsume/replace Microsoft.AspNetCore.ConcurrencyLimiter. See #10702 for the original epic.
I think if the abstraction is done right, it might be able to replace the concurrency limiter.
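For reference, here is a minimal sketch of what the existing Microsoft.AspNetCore.ConcurrencyLimiter surface looks like (queue policy plus middleware), i.e. roughly what a new abstraction would need to subsume; the limits are placeholders:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Queue-based policy: at most 4 requests execute concurrently, up to 20 more
// wait in a FIFO queue, and anything beyond that is rejected (503 by default).
builder.Services.AddQueuePolicy(options =>
{
    options.MaxConcurrentRequests = 4;
    options.RequestQueueLimit = 20;
});

var app = builder.Build();

app.UseConcurrencyLimiter();

app.MapGet("/", () => "Hello");

app.Run();
```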
I closed #19823, which asks for action-based (or endpoint-routing-based) concurrency limit support. I think this design should cover that by allowing for the implementation of hard concurrency limits and per-endpoint configuration.
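As an illustration of per-endpoint hard limits, here is a sketch using the concurrency policy from the rate-limiting middleware that eventually shipped in .NET 7; the policy name and limits are assumptions:

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // Hard cap: at most 5 concurrent executions for endpoints that opt in,
    // with up to 10 requests queued in arrival order before rejection.
    options.AddConcurrencyLimiter("reports", limiter =>
    {
        limiter.PermitLimit = 5;
        limiter.QueueLimit = 10;
        limiter.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});

var app = builder.Build();

app.UseRateLimiter();

// Only this endpoint is governed by the "reports" policy.
app.MapGet("/reports", () => "expensive work").RequireRateLimiting("reports");

app.Run();
```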
@davidfowl Any news on this? Is it still on track for 6.0? Will there be an IP-based request throttling variant?
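On the IP-based variant: with the System.Threading.RateLimiting API that later shipped, one way to sketch it is a global limiter partitioned by remote address. The key choice and limits below are assumptions, and real deployments behind proxies would need a forwarded-header-aware key:

```csharp
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // Partition a fixed-window limiter by client IP so each remote address
    // gets its own budget (placeholder: 100 requests per minute).
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 100,
                Window = TimeSpan.FromMinutes(1)
            }));

    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
});

var app = builder.Build();

app.UseRateLimiter();
app.MapGet("/", () => "Hello");
app.Run();
```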
We've moved this issue to the Backlog milestone. This means that it is not going to be worked on for the coming release. We will reassess the backlog following the current release and consider this item at that time. To learn more about our issue management process and to have better expectations regarding different types of issues, you can read our Triage Process.
I would love throttling on the Blazor virtualizer. The ItemsProvider is called with no regard for the latency of that call. In my scenario, the items provider issues a database call which takes some time; not a lot, but long enough that a user could keep scrolling. As they do, the ItemsProvider calls just stack up. What I would prefer is that all but the most recent data request be discarded while a fetch is underway. When the user finally stops scrolling, the most recent data request would then become the final request. Obviously, it has to be configurable because different data sources behave differently, but right now, if a user gets impatient and continues to scroll, that ItemsProvider function gets hammered and the SQL calls just stack up. Because I'm returning the correct item count for the universe of results, they can also drag the scroll bar to a position that seems "right", but a myriad of calls gets stacked up while they're doing it. It would be nice if we could simply ignore scroll events until the user has been idle for 0.5 seconds (or something like that).
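For what it's worth, a debouncing sketch along these lines is possible today inside the ItemsProvider delegate, assuming the Virtualize component cancels superseded requests through request.CancellationToken; OrderRow, _db, and _totalCount are hypothetical placeholders:

```csharp
// Lives in the component's @code block and is wired up as
// <Virtualize ItemsProvider="LoadOrders" Context="order">...</Virtualize>.
// Requires Microsoft.AspNetCore.Components.Web.Virtualization.
private async ValueTask<ItemsProviderResult<OrderRow>> LoadOrders(ItemsProviderRequest request)
{
    try
    {
        // Wait briefly before touching the database. If the user keeps
        // scrolling, this request is superseded and the token is cancelled,
        // so the query below never runs for the abandoned position.
        await Task.Delay(300, request.CancellationToken);

        var rows = await _db.GetOrdersAsync(request.StartIndex, request.Count, request.CancellationToken);
        return new ItemsProviderResult<OrderRow>(rows, _totalCount);
    }
    catch (OperationCanceledException)
    {
        // Superseded by a newer scroll position; nothing to render for this one.
        return new ItemsProviderResult<OrderRow>(Array.Empty<OrderRow>(), _totalCount);
    }
}
```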
Great! Here's a doc you can read to help you get started: https://docs.microsoft.com/en-us/aspnet/core/performance/rate-limit?view=aspnetcore-7.0, plus some info about updates in RC1 here: https://docs.microsoft.com/en-us/aspnet/core/performance/rate-limit?view=aspnetcore-7.0
Summary
1-2 sentences. Say what this is about.
Motivation and goals
1-2 paragraphs, or a bullet-pointed list. What existing pain points does this solve? What evidence shows it's valuable to solve this?
In scope
A list of major scenarios, perhaps in priority order.
Out of scope
Scenarios you explicitly want to exclude.
Risks / unknowns
How might developers misinterpret/misuse this? How might implementing it restrict us from other enhancements in the future? Also list any perf/security/correctness concerns.
Examples
Give brief examples of possible developer experiences (e.g., code they would write).
Don't be deeply concerned with how it would be implemented yet. Your examples could even be from other technology stacks.
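For reference, one shipped developer experience in this space is the attribute-based flavor of the .NET 7 rate-limiting middleware; a sketch with placeholder policy names and limits (the registration side is shown in comments):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

// Registration side (Program.cs), e.g. a named fixed-window policy:
//   builder.Services.AddRateLimiter(o => o.AddFixedWindowLimiter("fixed", f =>
//   {
//       f.PermitLimit = 10;
//       f.Window = TimeSpan.FromSeconds(10);
//   }));
//   app.UseRateLimiter();

[ApiController]
[Route("api/[controller]")]
[EnableRateLimiting("fixed")]   // every action uses the "fixed" policy...
public class OrdersController : ControllerBase
{
    [HttpGet]
    public IEnumerable<string> Get() => new[] { "order-1", "order-2" };

    [HttpGet("health")]
    [DisableRateLimiting]       // ...except this one, which opts out.
    public IActionResult Health() => Ok();
}
```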