Offer a way to calculate and limit the total cost of a query #772
@patrys I think you can achieve this using middleware. I'd imagine you'd set up a counter in the execution context and then increment it in the middleware. I've not actually tried this, but if you're willing to try it out, do let us know how it goes.
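The counter idea above can be sketched as a minimal middleware. This is only an illustration: the class name and the `max_fields` budget are made up here, not a graphene API, and the caveat is that it aborts a query mid-execution rather than rejecting it up front.

```python
class FieldCountMiddleware:
    """Counts resolver invocations and aborts once a budget is exhausted.

    Note: this runs *during* execution, so it stops a query mid-flight
    rather than rejecting it before any resolver runs.
    """

    def __init__(self, max_fields=100):
        self.max_fields = max_fields
        self.count = 0

    def resolve(self, next_, root, info, **args):
        # Graphene calls `resolve` once per field being resolved.
        self.count += 1
        if self.count > self.max_fields:
            raise Exception("Query exceeds the field budget")
        return next_(root, info, **args)
```

You would pass a fresh instance per request, e.g. `schema.execute(query, middleware=[FieldCountMiddleware(max_fields=50)])`.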
@dan98765 I found this comment that implies that you already have something working? Could you share how you do it?
@jkimbo I want to calculate this and reject a query before it starts to be evaluated.
Ah sorry, I missed that part. As far as I know there isn't an "official" way of doing this, but I've been playing around with the new backends feature that @syrusakbary introduced in graphql-core (graphql-python/graphql-core#185) and I think this would be a great use case for it. You should be able to do something like this with graphql-core 2.1rc:

```python
from graphql.backend.core import GraphQLCoreBackend


def measure_depth(selection_set, level=1):
    max_depth = level
    for field in selection_set.selections:
        if field.selection_set:
            new_depth = measure_depth(field.selection_set, level=level + 1)
            if new_depth > max_depth:
                max_depth = new_depth
    return max_depth


class DepthAnalysisBackend(GraphQLCoreBackend):
    def document_from_string(self, schema, document_string):
        document = super().document_from_string(schema, document_string)
        ast = document.document_ast
        for definition in ast.definitions:
            # We are only interested in queries
            if definition.operation != 'query':
                continue
            depth = measure_depth(definition.selection_set)
            if depth > 3:  # set your max depth here
                raise Exception('Query is too complex')
        return document
```

Then when you execute the following query, it will bail out before trying to run any resolvers:

```python
query_string = '''
query {
  myFavouriteBook {
    author {
      bestBook {
        author {
          name
        }
      }
    }
  }
}
'''
backend = DepthAnalysisBackend()
schema = graphene.Schema(query=Query)
result = schema.execute(query_string, backend=backend)
assert result.errors
```

I think there is a lot more that could be done here (like being able to assign complexity values to each field), but hopefully this helps you for now.
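As a sketch of the per-field complexity idea mentioned above, the same AST walk can sum weights instead of tracking depth. The `FIELD_COSTS` table and the default cost are illustrative values invented here, not a graphql-core feature:

```python
# Hypothetical per-field weights; any field not listed costs DEFAULT_COST.
FIELD_COSTS = {"books": 10, "author": 5}
DEFAULT_COST = 1


def measure_cost(selection_set):
    """Walk a selection set and sum the cost of every selected field."""
    total = 0
    for field in selection_set.selections:
        total += FIELD_COSTS.get(field.name.value, DEFAULT_COST)
        if field.selection_set:
            # Nested selections add their own costs on top of the parent's.
            total += measure_cost(field.selection_set)
    return total
```

A `DepthAnalysisBackend`-style backend could then reject documents whose total cost exceeds a threshold, just as the depth check does.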
Here's some more about the topic from the JS world:
Closing this issue in favour of tracking multiple approaches to protecting against malicious queries in #907 |
Hey, I am trying to add this code to my settings file, but it doesn't work!
Temporary solution:
And remove this check:
You can register the middleware with the view and check the depth only once, at the root:

```python
from typing import Any, Callable

from graphql import ResolveInfo
from graphql.language.ast import SelectionSet

MAX_GQL_QUERY_DEPTH = 10  # whatever limit suits your project


class DepthLimitMiddleware:
    def resolve(self, next: Callable, root: Any, info: ResolveInfo, **args):
        # We need to check the depth only at the root; nested queries don't need to be rechecked.
        if root is not None:
            return next(root, info, **args)
        for ast in info.field_asts:
            if not (selection_set := getattr(ast, "selection_set", None)):
                continue
            depth = self._get_query_depth(selection_set)
            if depth > MAX_GQL_QUERY_DEPTH:
                raise ValidationError("Query is too complex.")
        return next(root, info, **args)

    def _get_query_depth(self, selection_set: SelectionSet, level: int = 1) -> int:
        max_depth = level
        for field in getattr(selection_set, "selections", []):
            # The field we are at is already a level deeper, even if it doesn't have its own selection set.
            max_depth = max(max_depth, level + 1)
            if selection_set := getattr(field, "selection_set", None):
                max_depth = max(max_depth, self._get_query_depth(selection_set, level + 1))
        return max_depth
```

Hook it up when constructing the view (`ValidationError` comes from your framework of choice, e.g. Django):

```python
GraphQLView.as_view(schema=schema, middleware=[DepthLimitMiddleware()])
```

Edited based on testing and a point made by @patrys.
Middleware is a poor fit because middleware is executed for each and every field resolver. For anyone interested, here's how we solved it in our codebase: |
When offering a public API there's the problem of malicious clients preparing intentionally expensive queries. For example, one could abuse mutually related objects to make a query arbitrarily expensive:

```
book
  -> author
    -> books
      -> author
        -> books
          -> ...and so on
```

I'd like to request a method to evaluate the estimated cost of a query before actually executing any resolvers, and a way to prevent execution of queries whose estimated cost is above a certain threshold (ideally in a programmatic fashion, so we could for example vary the limits depending on the currently logged-in user's role).