gh-127371: Fix high memory usage in SpooledTemporaryFile.writelines() when handling large iterators #127378


Status: Closed. Wants to merge 9 commits.
6 changes: 3 additions & 3 deletions Lib/tempfile.py
@@ -849,9 +849,9 @@ def write(self, s):

     def writelines(self, iterable):
         file = self._file
-        rv = file.writelines(iterable)
-        self._check(file)
-        return rv
+        for line in iterable:
+            file.write(line)
+            self._check(file)

def detach(self):
return self._file.detach()
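The per-line rollover check can be illustrated with a simplified, hypothetical model of the spooling logic. `SpoolSketch`, its `_check` method, and the `_rolled` flag below only mimic names used inside `tempfile`; this is a sketch of the technique, not the actual class:

```python
import io

class SpoolSketch:
    """Hypothetical minimal model of SpooledTemporaryFile's spooling:
    an in-memory buffer that 'rolls over' once max_size is exceeded."""

    def __init__(self, max_size):
        self._file = io.BytesIO()
        self._max_size = max_size
        self._rolled = False

    def _check(self, file):
        # Roll over as soon as the buffer exceeds max_size.
        if not self._rolled and file.tell() > self._max_size:
            self._rolled = True  # the real code copies data to a disk file here

    def writelines(self, iterable):
        file = self._file
        # Fixed behavior: check after every write rather than once at the
        # end, so the in-memory buffer never grows far beyond max_size.
        for line in iterable:
            file.write(line)
            self._check(file)

s = SpoolSketch(max_size=10)
s.writelines(b"abcde" for _ in range(5))
print(s._rolled)  # → True (rollover triggered after the third line)
```

With the pre-fix behavior (`file.writelines(iterable)` followed by a single `_check`), the same 25 bytes would all accumulate in the buffer before the size check ever ran.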
@@ -0,0 +1 @@
The ``writelines()`` method in ``SpooledTemporaryFile`` previously checked whether to roll over to a disk-backed file only after the entire iterable had been written. This led to high memory usage when handling large iterators, as the in-memory buffer grew without limit. The method now writes each line individually and checks the buffer size after each write; if the maximum size is exceeded, the file rolls over to disk immediately, preventing excessive memory consumption.
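The behavior described above can be observed through the public `tempfile` API. Note that the `_rolled` attribute inspected below is a CPython-internal flag, read here only to show the spool state; this is a demonstration sketch, not part of the documented interface:

```python
import tempfile

def lines(n):
    # Generator yielding n one-KiB lines, so the full payload never
    # needs to exist in memory at once on the producer side.
    for _ in range(n):
        yield b"x" * 1024

with tempfile.SpooledTemporaryFile(max_size=4096) as f:
    f.writelines(lines(1000))  # ~1 MiB total, far beyond max_size
    rolled = f._rolled         # internal flag: True once spooled to disk
    total = f.seek(0, 2)       # total bytes written

print(rolled, total)  # → True 1024000
```

Both before and after this change the file ends up rolled over to disk; the difference is that the fixed version rolls over as soon as the 4 KiB threshold is crossed, instead of first buffering the whole ~1 MiB in memory.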