
Commit 8c663fd

Replace KB unit with KiB (#4293)
The kB (kilobyte) unit means 1000 bytes, whereas KiB ("kibibyte") means 1024 bytes. KB was misused: replace kB or KB with KiB where appropriate. Make the same change for MB and GB, which become MiB and GiB. Change the output of Tools/iobench/iobench.py accordingly, and round the documented size of the EPUB documentation download from 5.5 MB to 5 MiB.
Parent commit: 0e163d2

38 files changed (+76 lines, -76 lines)
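
The distinction the commit enforces is easy to get wrong in reporting code: kB/MB are decimal (SI) units, KiB/MiB are binary (IEC) units. The short Python sketch below is illustrative only (the helper names are invented here, and this is not the actual Tools/iobench/iobench.py code); it shows how the same byte count reads in each system.

    def format_si(nbytes):
        # Decimal (SI) units: 1 kB = 1000 bytes, 1 MB = 1000**2 bytes.
        for unit, factor in (("MB", 1000 ** 2), ("kB", 1000)):
            if nbytes >= factor:
                return "%.1f %s" % (nbytes / factor, unit)
        return "%d bytes" % nbytes

    def format_iec(nbytes):
        # Binary (IEC) units: 1 KiB = 1024 bytes, 1 MiB = 1024**2 bytes.
        for unit, factor in (("MiB", 1024 ** 2), ("KiB", 1024)):
            if nbytes >= factor:
                return "%.1f %s" % (nbytes / factor, unit)
        return "%d bytes" % nbytes

    print(format_si(256 * 1024))   # 262.1 kB
    print(format_iec(256 * 1024))  # 256.0 KiB

Labelling a 256 KiB buffer as "256 KB" therefore mixes the two systems (it is really about 262 kB), which is exactly the inconsistency the patch cleans up.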

Doc/c-api/memory.rst
Lines changed: 1 addition & 1 deletion

@@ -450,7 +450,7 @@ The pymalloc allocator

 Python has a *pymalloc* allocator optimized for small objects (smaller or equal
 to 512 bytes) with a short lifetime. It uses memory mappings called "arenas"
-with a fixed size of 256 KB. It falls back to :c:func:`PyMem_RawMalloc` and
+with a fixed size of 256 KiB. It falls back to :c:func:`PyMem_RawMalloc` and
 :c:func:`PyMem_RawRealloc` for allocations larger than 512 bytes.

 *pymalloc* is the default allocator of the :c:data:`PYMEM_DOMAIN_MEM` (ex:

Doc/library/hashlib.rst
Lines changed: 1 addition & 1 deletion

@@ -267,7 +267,7 @@ include a `salt <https://en.wikipedia.org/wiki/Salt_%28cryptography%29>`_.
 should be about 16 or more bytes from a proper source, e.g. :func:`os.urandom`.

 *n* is the CPU/Memory cost factor, *r* the block size, *p* parallelization
-factor and *maxmem* limits memory (OpenSSL 1.1.0 defaults to 32 MB).
+factor and *maxmem* limits memory (OpenSSL 1.1.0 defaults to 32 MiB).
 *dklen* is the length of the derived key.

 Availability: OpenSSL 1.1+
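
As context for the hunk above, here is a hedged sketch of calling hashlib.scrypt with an explicit maxmem of 32 MiB; the password and the cost parameters are illustrative values, not recommendations, and the call requires a Python built against OpenSSL 1.1+.

    import hashlib
    import os

    salt = os.urandom(16)
    key = hashlib.scrypt(
        b"correct horse battery staple",
        salt=salt,
        n=2 ** 14, r=8, p=1,        # CPU/memory cost, block size, parallelization
        maxmem=32 * 1024 * 1024,    # 32 MiB, i.e. the OpenSSL 1.1.0 default
        dklen=32,                   # length of the derived key in bytes
    )
    print(key.hex())

With n=2**14 and r=8 the KDF needs roughly 128 * r * n = 16 MiB of working memory, so it stays comfortably under the 32 MiB cap.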

Doc/library/locale.rst
Lines changed: 1 addition & 1 deletion

@@ -373,7 +373,7 @@ The :mod:`locale` module defines the following exception and functions:

 Please note that this function works like :meth:`format_string` but will
 only work for exactly one ``%char`` specifier. For example, ``'%f'`` and
-``'%.0f'`` are both valid specifiers, but ``'%f kB'`` is not.
+``'%.0f'`` are both valid specifiers, but ``'%f KiB'`` is not.

 For whole format strings, use :func:`format_string`.
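
The function documented in that hunk accepts exactly one % specifier; whole format strings go through locale.format_string() instead. A small illustrative sketch (the setlocale call is wrapped because not every environment accepts an empty locale name):

    import locale

    try:
        locale.setlocale(locale.LC_ALL, "")   # use the user's default locale if available
    except locale.Error:
        pass                                  # fall back to the default "C" locale
    print(locale.format_string("%.1f KiB used", 262144 / 1024))  # 256.0 KiB used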

Doc/library/multiprocessing.rst
Lines changed: 2 additions & 2 deletions

@@ -1034,7 +1034,7 @@ Connection objects are usually created using :func:`Pipe` -- see also
 Send an object to the other end of the connection which should be read
 using :meth:`recv`.

-The object must be picklable. Very large pickles (approximately 32 MB+,
+The object must be picklable. Very large pickles (approximately 32 MiB+,
 though it depends on the OS) may raise a :exc:`ValueError` exception.

 .. method:: recv()

@@ -1071,7 +1071,7 @@ Connection objects are usually created using :func:`Pipe` -- see also

 If *offset* is given then data is read from that position in *buffer*. If
 *size* is given then that many bytes will be read from buffer. Very large
-buffers (approximately 32 MB+, though it depends on the OS) may raise a
+buffers (approximately 32 MiB+, though it depends on the OS) may raise a
 :exc:`ValueError` exception

 .. method:: recv_bytes([maxlength])
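
For context, a minimal sketch of the Pipe/Connection API those two hunks document (not code from the patch); the 1 MiB payload is an arbitrary size chosen to stay well below the roughly 32 MiB limit mentioned above.

    from multiprocessing import Pipe, Process

    def child(conn):
        conn.send_bytes(b"x" * (1024 * 1024))  # 1 MiB of data
        conn.close()

    if __name__ == "__main__":
        parent_conn, child_conn = Pipe()
        p = Process(target=child, args=(child_conn,))
        p.start()
        data = parent_conn.recv_bytes()
        print(len(data) // 1024, "KiB received")  # 1024 KiB received
        p.join()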

Doc/tools/templates/download.html
Lines changed: 9 additions & 9 deletions

@@ -18,23 +18,23 @@ <h1>Download Python {{ release }} Documentation</h1>
 <table class="docutils">
 <tr><th>Format</th><th>Packed as .zip</th><th>Packed as .tar.bz2</th></tr>
 <tr><td>PDF (US-Letter paper size)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-letter.zip">Download</a> (ca. 13 MB)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-letter.tar.bz2">Download</a> (ca. 13 MB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-letter.zip">Download</a> (ca. 13 MiB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-letter.tar.bz2">Download</a> (ca. 13 MiB)</td>
 </tr>
 <tr><td>PDF (A4 paper size)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-a4.zip">Download</a> (ca. 13 MB)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-a4.tar.bz2">Download</a> (ca. 13 MB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-a4.zip">Download</a> (ca. 13 MiB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-pdf-a4.tar.bz2">Download</a> (ca. 13 MiB)</td>
 </tr>
 <tr><td>HTML</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-html.zip">Download</a> (ca. 9 MB)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-html.tar.bz2">Download</a> (ca. 6 MB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-html.zip">Download</a> (ca. 9 MiB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-html.tar.bz2">Download</a> (ca. 6 MiB)</td>
 </tr>
 <tr><td>Plain Text</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-text.zip">Download</a> (ca. 3 MB)</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs-text.tar.bz2">Download</a> (ca. 2 MB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-text.zip">Download</a> (ca. 3 MiB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs-text.tar.bz2">Download</a> (ca. 2 MiB)</td>
 </tr>
 <tr><td>EPUB</td>
-<td><a href="{{ dlbase }}/python-{{ release }}-docs.epub">Download</a> (ca. 5.5 MB)</td>
+<td><a href="{{ dlbase }}/python-{{ release }}-docs.epub">Download</a> (ca. 5 MiB)</td>
 <td></td>
 </tr>
 </table>

Include/internal/pymalloc.h
Lines changed: 3 additions & 3 deletions

@@ -160,7 +160,7 @@
 */
 #ifdef WITH_MEMORY_LIMITS
 #ifndef SMALL_MEMORY_LIMIT
-#define SMALL_MEMORY_LIMIT (64 * 1024 * 1024) /* 64 MB -- more? */
+#define SMALL_MEMORY_LIMIT (64 * 1024 * 1024) /* 64 MiB -- more? */
 #endif
 #endif

@@ -177,7 +177,7 @@
 * Arenas are allocated with mmap() on systems supporting anonymous memory
 * mappings to reduce heap fragmentation.
 */
-#define ARENA_SIZE (256 << 10) /* 256KB */
+#define ARENA_SIZE (256 << 10) /* 256 KiB */

 #ifdef WITH_MEMORY_LIMITS
 #define MAX_ARENAS (SMALL_MEMORY_LIMIT / ARENA_SIZE)

@@ -435,7 +435,7 @@ currently in use isn't on either list.
 */

 /* How many arena_objects do we initially allocate?
-* 16 = can allocate 16 arenas = 16 * ARENA_SIZE = 4MB before growing the
+* 16 = can allocate 16 arenas = 16 * ARENA_SIZE = 4 MiB before growing the
 * `arenas` vector.
 */
 #define INITIAL_ARENA_OBJECTS 16
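
The arithmetic behind those macros is easy to verify; a quick illustrative calculation in Python, mirroring the C expressions:

    ARENA_SIZE = 256 << 10                   # 256 * 1024 = 262144 bytes = 256 KiB
    SMALL_MEMORY_LIMIT = 64 * 1024 * 1024    # 64 MiB

    print(ARENA_SIZE)                        # 262144
    print(SMALL_MEMORY_LIMIT // ARENA_SIZE)  # 256 arenas fit under the optional limit
    print(16 * ARENA_SIZE)                   # 4194304 bytes = 4 MiB for the initial 16 arenas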

Lib/distutils/cygwinccompiler.py
Lines changed: 2 additions & 2 deletions

@@ -234,8 +234,8 @@ def link(self, target_desc, objects, output_filename, output_dir=None,
 # who wants symbols and a many times larger output file
 # should explicitly switch the debug mode on
 # otherwise we let dllwrap/ld strip the output file
-# (On my machine: 10KB < stripped_file < ??100KB
-# unstripped_file = stripped_file + XXX KB
+# (On my machine: 10KiB < stripped_file < ??100KiB
+# unstripped_file = stripped_file + XXX KiB
 # ( XXX=254 for a typical python extension))
 if not debug:
 extra_preargs.append("-s")

Lib/gzip.py
Lines changed: 1 addition & 1 deletion

@@ -308,7 +308,7 @@ def close(self):
 if self.mode == WRITE:
 fileobj.write(self.compress.flush())
 write32u(fileobj, self.crc)
-# self.size may exceed 2GB, or even 4GB
+# self.size may exceed 2 GiB, or even 4 GiB
 write32u(fileobj, self.size & 0xffffffff)
 elif self.mode == READ:
 self._buffer.close()
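
The masking in that hunk is plain 32-bit truncation: the gzip trailer's ISIZE field stores the input size modulo 2**32 (RFC 1952). A quick illustrative check:

    size = 5 * 1024 ** 3           # 5 GiB, more than a 32-bit field can hold
    print(hex(size))               # 0x140000000
    print(hex(size & 0xffffffff))  # 0x40000000, only the low 32 bits are kept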

Lib/test/_test_multiprocessing.py
Lines changed: 1 addition & 1 deletion

@@ -4221,7 +4221,7 @@ def handler(signum, frame):
 conn.send('ready')
 x = conn.recv()
 conn.send(x)
-conn.send_bytes(b'x'*(1024*1024)) # sending 1 MB should block
+conn.send_bytes(b'x' * (1024 * 1024)) # sending 1 MiB should block

 @unittest.skipUnless(hasattr(signal, 'SIGUSR1'), 'requires SIGUSR1')
 def test_ignore(self):

Lib/test/libregrtest/cmdline.py
Lines changed: 1 addition & 1 deletion

@@ -96,7 +96,7 @@

 largefile - It is okay to run some test that may create huge
 files. These tests can take a long time and may
-consume >2GB of disk space temporarily.
+consume >2 GiB of disk space temporarily.

 network - It is okay to run tests that use external network
 resource, e.g. testing SSL support for sockets.

Lib/test/pickletester.py
Lines changed: 2 additions & 2 deletions

@@ -2276,7 +2276,7 @@ def f():

 class BigmemPickleTests(unittest.TestCase):

-# Binary protocols can serialize longs of up to 2GB-1
+# Binary protocols can serialize longs of up to 2 GiB-1

 @bigmemtest(size=_2G, memuse=3.6, dry_run=False)
 def test_huge_long_32b(self, size):

@@ -2291,7 +2291,7 @@ def test_huge_long_32b(self, size):
 finally:
 data = None

-# Protocol 3 can serialize up to 4GB-1 as a bytes object
+# Protocol 3 can serialize up to 4 GiB-1 as a bytes object
 # (older protocols don't have a dedicated opcode for bytes and are
 # too inefficient)

Lib/test/test_bigaddrspace.py
Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@
 than what the address space allows are properly met with an OverflowError
 (rather than crash weirdly).

-Primarily, this means 32-bit builds with at least 2 GB of available memory.
+Primarily, this means 32-bit builds with at least 2 GiB of available memory.
 You need to pass the -M option to regrtest (e.g. "-M 2.1G") for tests to
 be enabled.
 """

Lib/test/test_bz2.py
Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ class BaseTest(unittest.TestCase):
 BAD_DATA = b'this is not a valid bzip2 file'

 # Some tests need more than one block of uncompressed data. Since one block
-# is at least 100 kB, we gather some data dynamically and compress it.
+# is at least 100,000 bytes, we gather some data dynamically and compress it.
 # Note that this assumes that compression works correctly, so we cannot
 # simply use the bigger test data for all tests.
 test_size = 0

Lib/test/test_io.py
Lines changed: 5 additions & 5 deletions

@@ -564,7 +564,7 @@ def test_raw_bytes_io(self):

 def test_large_file_ops(self):
 # On Windows and Mac OSX this test comsumes large resources; It takes
-# a long time to build the >2GB file and takes >2GB of disk space
+# a long time to build the >2 GiB file and takes >2 GiB of disk space
 # therefore the resource must be enabled to run this test.
 if sys.platform[:3] == 'win' or sys.platform == 'darwin':
 support.requires(

@@ -736,7 +736,7 @@ def test_unbounded_file(self):
 if sys.maxsize > 0x7FFFFFFF:
 self.skipTest("test can only run in a 32-bit address space")
 if support.real_max_memuse < support._2G:
-self.skipTest("test requires at least 2GB of memory")
+self.skipTest("test requires at least 2 GiB of memory")
 with self.open(zero, "rb", buffering=0) as f:
 self.assertRaises(OverflowError, f.read)
 with self.open(zero, "rb") as f:

@@ -1421,7 +1421,7 @@ class CBufferedReaderTest(BufferedReaderTest, SizeofTest):
 def test_constructor(self):
 BufferedReaderTest.test_constructor(self)
 # The allocation can succeed on 32-bit builds, e.g. with more
-# than 2GB RAM and a 64-bit kernel.
+# than 2 GiB RAM and a 64-bit kernel.
 if sys.maxsize > 0x7FFFFFFF:
 rawio = self.MockRawIO()
 bufio = self.tp(rawio)

@@ -1733,7 +1733,7 @@ class CBufferedWriterTest(BufferedWriterTest, SizeofTest):
 def test_constructor(self):
 BufferedWriterTest.test_constructor(self)
 # The allocation can succeed on 32-bit builds, e.g. with more
-# than 2GB RAM and a 64-bit kernel.
+# than 2 GiB RAM and a 64-bit kernel.
 if sys.maxsize > 0x7FFFFFFF:
 rawio = self.MockRawIO()
 bufio = self.tp(rawio)

@@ -2206,7 +2206,7 @@ class CBufferedRandomTest(BufferedRandomTest, SizeofTest):
 def test_constructor(self):
 BufferedRandomTest.test_constructor(self)
 # The allocation can succeed on 32-bit builds, e.g. with more
-# than 2GB RAM and a 64-bit kernel.
+# than 2 GiB RAM and a 64-bit kernel.
 if sys.maxsize > 0x7FFFFFFF:
 rawio = self.MockRawIO()
 bufio = self.tp(rawio)

Lib/test/test_largefile.py
Lines changed: 3 additions & 3 deletions

@@ -9,12 +9,12 @@
 import io # C implementation of io
 import _pyio as pyio # Python implementation of io

-# size of file to create (>2GB; 2GB == 2147483648 bytes)
+# size of file to create (>2 GiB; 2 GiB == 2,147,483,648 bytes)
 size = 2500000000

 class LargeFileTest:
 """Test that each file function works as expected for large
-(i.e. > 2GB) files.
+(i.e. > 2 GiB) files.
 """

 def setUp(self):

@@ -142,7 +142,7 @@ def setUpModule():
 pass

 # On Windows and Mac OSX this test comsumes large resources; It
-# takes a long time to build the >2GB file and takes >2GB of disk
+# takes a long time to build the >2 GiB file and takes >2 GiB of disk
 # space therefore the resource must be enabled to run this test.
 # If not, nothing after this line stanza will be executed.
 if sys.platform[:3] == 'win' or sys.platform == 'darwin':

Lib/test/test_mmap.py
Lines changed: 1 addition & 1 deletion

@@ -777,7 +777,7 @@ def test_large_filesize(self):
 with mmap.mmap(f.fileno(), 0x10000, access=mmap.ACCESS_READ) as m:
 self.assertEqual(m.size(), 0x180000000)

-# Issue 11277: mmap() with large (~4GB) sparse files crashes on OS X.
+# Issue 11277: mmap() with large (~4 GiB) sparse files crashes on OS X.

 def _test_around_boundary(self, boundary):
 tail = b' DEARdear '

Lib/test/test_os.py
Lines changed: 2 additions & 2 deletions

@@ -171,7 +171,7 @@ def test_large_read(self, size):
 with open(support.TESTFN, "rb") as fp:
 data = os.read(fp.fileno(), size)

-# The test does not try to read more than 2 GB at once because the
+# The test does not try to read more than 2 GiB at once because the
 # operating system is free to return less bytes than requested.
 self.assertEqual(data, b'test')

@@ -2573,7 +2573,7 @@ def handle_error(self):
 @unittest.skipUnless(hasattr(os, 'sendfile'), "test needs os.sendfile()")
 class TestSendfile(unittest.TestCase):

-DATA = b"12345abcde" * 16 * 1024 # 160 KB
+DATA = b"12345abcde" * 16 * 1024 # 160 KiB
 SUPPORT_HEADERS_TRAILERS = not sys.platform.startswith("linux") and \
 not sys.platform.startswith("solaris") and \
 not sys.platform.startswith("sunos")

Lib/test/test_socket.py
Lines changed: 1 addition & 1 deletion

@@ -5299,7 +5299,7 @@ class SendfileUsingSendTest(ThreadedTCPSocketTest):
 Test the send() implementation of socket.sendfile().
 """

-FILESIZE = (10 * 1024 * 1024) # 10MB
+FILESIZE = (10 * 1024 * 1024) # 10 MiB
 BUFSIZE = 8192
 FILEDATA = b""
 TIMEOUT = 2

Lib/test/test_tarfile.py
Lines changed: 2 additions & 2 deletions

@@ -779,12 +779,12 @@ class Bz2DetectReadTest(Bz2Test, DetectReadTest):
 def test_detect_stream_bz2(self):
 # Originally, tarfile's stream detection looked for the string
 # "BZh91" at the start of the file. This is incorrect because
-# the '9' represents the blocksize (900kB). If the file was
+# the '9' represents the blocksize (900,000 bytes). If the file was
 # compressed using another blocksize autodetection fails.
 with open(tarname, "rb") as fobj:
 data = fobj.read()

-# Compress with blocksize 100kB, the file starts with "BZh11".
+# Compress with blocksize 100,000 bytes, the file starts with "BZh11".
 with bz2.BZ2File(tmpname, "wb", compresslevel=1) as fobj:
 fobj.write(data)
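
The "BZh1" versus "BZh9" detail in that test comes from the bzip2 header: the byte after "BZh" is the compression level, i.e. the block size in units of 100,000 bytes. A small illustrative check:

    import bz2

    print(bz2.compress(b"some data", compresslevel=1)[:4])  # b'BZh1' -> 100,000-byte blocks
    print(bz2.compress(b"some data", compresslevel=9)[:4])  # b'BZh9' -> 900,000-byte blocks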

Lib/test/test_threading.py
Lines changed: 4 additions & 4 deletions

@@ -132,10 +132,10 @@ def f():
 # Kill the "immortal" _DummyThread
 del threading._active[ident[0]]

-# run with a small(ish) thread stack size (256kB)
+# run with a small(ish) thread stack size (256 KiB)
 def test_various_ops_small_stack(self):
 if verbose:
-print('with 256kB thread stack size...')
+print('with 256 KiB thread stack size...')
 try:
 threading.stack_size(262144)
 except _thread.error:

@@ -144,10 +144,10 @@ def test_various_ops_small_stack(self):
 self.test_various_ops()
 threading.stack_size(0)

-# run with a large thread stack size (1MB)
+# run with a large thread stack size (1 MiB)
 def test_various_ops_large_stack(self):
 if verbose:
-print('with 1MB thread stack size...')
+print('with 1 MiB thread stack size...')
 try:
 threading.stack_size(0x100000)
 except _thread.error:
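
threading.stack_size() as used in those hunks takes a size in bytes. A hedged sketch of running one thread with a 256 KiB stack; some platforms reject custom or small stack sizes, hence the except clause:

    import threading

    try:
        threading.stack_size(256 * 1024)  # 256 KiB, the 262144 used by the test
    except (ValueError, RuntimeError):
        pass                              # size not supported on this platform
    t = threading.Thread(target=lambda: print("running"))
    t.start()
    t.join()
    threading.stack_size(0)               # restore the platform default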

Lib/test/test_zipfile64.py
Lines changed: 1 addition & 1 deletion

@@ -39,7 +39,7 @@ def zipTest(self, f, compression):
 # Create the ZIP archive.
 zipfp = zipfile.ZipFile(f, "w", compression)

-# It will contain enough copies of self.data to reach about 6GB of
+# It will contain enough copies of self.data to reach about 6 GiB of
 # raw data to store.
 filecount = 6*1024**3 // len(self.data)

Lib/test/test_zlib.py
Lines changed: 2 additions & 2 deletions

@@ -72,7 +72,7 @@ def test_same_as_binascii_crc32(self):
 self.assertEqual(binascii.crc32(b'spam'), zlib.crc32(b'spam'))


-# Issue #10276 - check that inputs >=4GB are handled correctly.
+# Issue #10276 - check that inputs >=4 GiB are handled correctly.
 class ChecksumBigBufferTestCase(unittest.TestCase):

 @bigmemtest(size=_4G + 4, memuse=1, dry_run=False)

@@ -130,7 +130,7 @@ def test_overflow(self):
 class BaseCompressTestCase(object):
 def check_big_compress_buffer(self, size, compress_func):
 _1M = 1024 * 1024
-# Generate 10MB worth of random, and expand it by repeating it.
+# Generate 10 MiB worth of random, and expand it by repeating it.
 # The assumption is that zlib's memory is not big enough to exploit
 # such spread out redundancy.
 data = b''.join([random.getrandbits(8 * _1M).to_bytes(_1M, 'little')

Lib/xmlrpc/client.py
Lines changed: 1 addition & 1 deletion

@@ -1046,7 +1046,7 @@ def gzip_encode(data):
 # in the HTTP header, as described in RFC 1952
 #
 # @param data The encoded data
-# @keyparam max_decode Maximum bytes to decode (20MB default), use negative
+# @keyparam max_decode Maximum bytes to decode (20 MiB default), use negative
 # values for unlimited decoding
 # @return the unencoded data
 # @raises ValueError if data is not correctly coded.

Misc/NEWS.d/3.5.0a1.rst
Lines changed: 2 additions & 2 deletions

@@ -3035,7 +3035,7 @@ by Phil Elson.
 .. section: Library

 os.read() now uses a :c:func:`Py_ssize_t` type instead of :c:type:`int` for
-the size to support reading more than 2 GB at once. On Windows, the size is
+the size to support reading more than 2 GiB at once. On Windows, the size is
 truncted to INT_MAX. As any call to os.read(), the OS may read less bytes
 than the number of requested bytes.

@@ -3144,7 +3144,7 @@ by Pablo Torres Navarrete and SilentGhost.
 .. nonce: u_oiv9
 .. section: Library

-ssl.RAND_add() now supports strings longer than 2 GB.
+ssl.RAND_add() now supports strings longer than 2 GiB.

 ..