docs: clarify using lru_cache on class method keeps a reference to the instance #96851
Comments
Clarify that when using the lru_cache decorator on class methods the cached function keeps a reference to each instance (self), which prevents the class instance from being garbage collected.
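The behavior described above can be demonstrated directly. Here is a minimal sketch (the `Worker` class and `compute` method are illustrative names, not from the issue) showing that the cache, which lives on the class-level wrapper, keeps each instance alive until the entry is cleared:

```python
import functools
import gc
import weakref

class Worker:
    @functools.lru_cache(maxsize=None)
    def compute(self, x):
        # self is part of the cache key, so the class-level cache
        # holds a strong reference to each instance it has seen
        return x * 2

w = Worker()
ref = weakref.ref(w)
w.compute(10)

del w
gc.collect()
print(ref())   # still alive: the cache keeps a reference to the instance

Worker.compute.cache_clear()
gc.collect()
print(ref())   # None: clearing the cache released the instance
```

Calling `Worker.compute.cache_clear()` is one way to release the references explicitly.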
I've closed the PR because I want this worded differently. The current sentence was already added to address this issue. Likewise, a FAQ entry was added as well. Roughly, this new issue is suggesting that those two efforts didn't go far enough. Per the dev guide, we document in an affirmative manner, saying what a tool does and how to use it. That is what the current sentence does. I'm thinking of adding a sentence to the effect of: "For ideas about using `lru_cache()` to decorate methods, see the FAQ entry, How do I cache method calls?"

IMO it was a profound disservice for the referenced video to call this "a memory leak". Any container such as a list or dict will create a strong reference that keeps an object alive. That is completely normal. The `lru_cache()` is much more favorable because it eventually releases the memory when an object ages out. A person using the decorator is literally instructing the computer to hold a limited number of question/answer pairs. The only issue here is that someone might want more aggressive release of memory if their instances happen to be huge. Otherwise, the entire point of the `lru_cache` is to guarantee that memory use doesn't grow without bound, unlike an actual leak.
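The "ages out" point can be sketched as follows (the `Big` class is a hypothetical example, not from the thread): with a bounded `maxsize`, an instance held only by the cache is released as soon as newer entries evict its entry.

```python
import functools
import gc
import weakref

class Big:
    @functools.lru_cache(maxsize=2)
    def work(self, x):
        return x

a = Big()
ref = weakref.ref(a)
a.work(0)          # cache now holds one entry keyed on (a, 0)
del a              # only the cache still references the instance

# Two calls on fresh instances push the old entry out of the
# bounded cache, releasing the last reference to the original instance.
Big().work(1)
Big().work(2)
gc.collect()
print(ref())       # None: the entry aged out and the instance was freed
```

This is the normal LRU discipline: memory use stays bounded by `maxsize`, which is the opposite of a leak.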
Thank you for your quick response! I do think linking to the FAQ would help because it clearly states that:
I think the issue could be that the documentation doesn't mention whether the cache is at the class level or the instance level when used on a method, even though the previously mentioned sentence implies it's at the class level. This could lead to someone using it with the wrong expectation. Maybe mentioning that the results are stored at the class level when used on a method could be useful as well, since that does describe the affirmative case. Thank you for clarifying and helping!
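For reference, the per-instance alternative the FAQ describes can be sketched like this (the `Weather` class and its method names are illustrative, not from the FAQ): creating the cached wrapper in `__init__` stores it in the instance's `__dict__`, so the cache is garbage collected together with the instance.

```python
import functools

class Weather:
    def __init__(self, station_id):
        self.station_id = station_id
        # Wrap the bound method per instance: the cache lives in this
        # instance's __dict__ and is freed when the instance is.
        self.forecast = functools.lru_cache(maxsize=32)(self._forecast)

    def _forecast(self, day):
        return f"station {self.station_id}, day {day}"

w = Weather("A1")
w.forecast(1)
w.forecast(1)                    # second call is served from the cache
print(w.forecast.cache_info())   # hits=1, misses=1
```

Because the wrapper caches a bound method, `self` is not part of the cache key and no class-level reference to the instance is created.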
…onGH-96902) (cherry picked from commit bbc24b2) Co-authored-by: Raymond Hettinger <[email protected]>
(cherry picked from commit bbc24b2) Co-authored-by: Raymond Hettinger <[email protected]>
Documentation
When using the `lru_cache` decorator on class methods, the cached function keeps a reference to each instance (`self`), preventing the class instance from being garbage collected. This is expected behavior, as `self` is a parameter to the cached function. However, many use it on class methods expecting the cache to be on a per-instance basis. This video explains the issue in detail, and googling `lru_cache memory leak` gives a lot of examples of people who have faced this issue. I suggest that we clarify this behavior in the documentation to prevent others from falling into this trap.
Perhaps modify the following paragraph to something like this:
Would be happy to make a PR if you think this is a good idea.
Looking forward to feedback!
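As a usage note related to the suggested clarification: for methods that take no arguments, `functools.cached_property` stores the computed value on the instance itself, avoiding the class-level cache entirely (the `Dataset` class below is an illustrative name, not from the issue):

```python
import functools

class Dataset:
    def __init__(self, values):
        self.values = values

    @functools.cached_property
    def total(self):
        # Computed on first access, then stored in the instance's
        # __dict__, so it is released together with the instance.
        return sum(self.values)

d = Dataset([1, 2, 3])
print(d.total)                 # 6
print("total" in d.__dict__)   # True after first access
```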