
Conversation


@hnnynh hnnynh commented Oct 30, 2025

hnnynh (Author) commented Oct 30, 2025

  • Benchmark Result
encode_basestring_medium: Mean +- std dev: [base] 2.11 us +- 0.05 us -> [ac] 2.33 us +- 0.04 us: 1.11x slower
encode_basestring_medium_escapes: Mean +- std dev: [base] 15.3 us +- 0.4 us -> [ac] 17.6 us +- 0.2 us: 1.15x slower
scanstring_medium: Mean +- std dev: [base] 158 ns +- 2 ns -> [ac] 110 ns +- 2 ns: 1.43x faster

Benchmark hidden because not significant (4): encode_basestring_ascii_medium, encode_basestring_ascii_medium_escapes, encode_basestring_ascii_medium_unicode, encode_basestring_medium_unicode

Geometric mean: 1.02x faster
  • Script
import pyperf
import _json

# Test data - medium objects (typical API responses)
medium_string = "The quick brown fox jumps over the lazy dog. " * 20  # ~900 chars
medium_string_with_escapes = '''Line 1: "status": "success"
Line 2: Response received at 2024-01-15T10:30:00Z
\tError code: null
\tMessage: "Operation completed successfully"
''' * 50
medium_unicode = "User: 김철수 | Email: [email protected] | Status: ✓ Active | Location: 東京都 | Rating: ★★★★☆ | " * 30
medium_json_string = '"' + '{"id": 12345, "name": "John Doe", "email": "[email protected]"}' * 20 + '"'


def bench_encode_basestring_ascii(s):
    return _json.encode_basestring_ascii(s)

@hnnynh hnnynh marked this pull request as ready for review October 30, 2025 06:51
hnnynh (Author) commented Oct 30, 2025

I've checked for memory leaks with the reference leak test (3 warm-up runs followed by 3 runs in which leaks are counted):

>  ./python.exe -m test test_json -R3:3

Using random seed: 2482685647
0:00:00 load avg: 5.30 Run 1 test sequentially in a single process
0:00:00 load avg: 5.30 [1/1] test_json
beginning 6 repetitions. Showing number of leaks (. for 0 or less, X for 10 or more)
123:456
XX. ...
0:00:30 load avg: 4.92 [1/1] test_json passed in 30.6 sec

== Tests result: SUCCESS ==

1 test OK.

Total duration: 30.6 sec
Total tests: run=203 skipped=3
Total test files: run=1/1
Result: SUCCESS

@corona10 corona10 (Member) left a comment


LGTM, there is no parsing overhead and it becomes better!

@corona10 corona10 enabled auto-merge (squash) October 30, 2025 07:18
@corona10 corona10 disabled auto-merge October 30, 2025 07:20
@corona10 corona10 enabled auto-merge (squash) October 30, 2025 07:54
@corona10 corona10 disabled auto-merge October 30, 2025 07:54
@corona10 corona10 enabled auto-merge (squash) October 30, 2025 07:55
@corona10 corona10 disabled auto-merge October 30, 2025 08:23
@corona10 corona10 enabled auto-merge (squash) October 30, 2025 09:13
