
add port_yield from #10795 only on zephyr to fix #10822#10824

Merged
dhalbert merged 2 commits into adafruit:10.1.x from
dhalbert:revert-port_yield-10795-10.1.x
Feb 17, 2026

Conversation

@dhalbert
Collaborator

Fixes #10822

Disables the calls to port_yield() added in #10795, except on the zephyr port.

Thank you @daniel-alsen.

Tested on Metro ESP32-S3 to confirm it restores proper performance.

@dhalbert dhalbert requested review from FoamyGuy and tannewt February 16, 2026 21:28
Collaborator

@FoamyGuy FoamyGuy left a comment


Looks good to me. Restricts the change to the zephyr build only.

@dhalbert
Collaborator Author

@tannewt if you have a more in-depth fix, that is fine. I was just trying to reverse the effects of the change on the non-zephyr builds.

The original port_yield() was for #6742, which may be a different use case for port_yield(). I think it would be good to figure out whether portYIELD() is sufficient, instead of adding a delay. It could also be done only on msec ticks instead of on every HOOK (RUN_BACKGROUND_TASKS) call, which happens much more often. We can discuss.

Member

@tannewt tannewt left a comment


Sorry about this! We should definitely try FreeRTOS portYIELD. IIRC I added the delay in order to get the idle task to run and reset its timer. There's got to be another way to fix it, though. We can do that in a follow-up.

@dhalbert dhalbert merged commit 9b72899 into adafruit:10.1.x Feb 17, 2026
17 checks passed
@dhalbert dhalbert deleted the revert-port_yield-10795-10.1.x branch February 17, 2026 18:11
@daniel-alsen

Thanks guys!

Is there any way to run an automated performance regression test in an emulator as part of the build process? E.g. running a wide range of operations and checking the durations against known baselines for a board?

I saw that there were changes to the garbage collector as well. These changes might be difficult to test, and maybe a "performance guard" could help in some way?

@bablokb

bablokb commented Feb 18, 2026

@daniel-alsen

Automated performance regression tests would be great. But I am not so sure about the emulator part. Even if there were an emulator, it would not detect problems triggered by USB or the network stack.

Running tests on dedicated hardware is possible with GitHub Actions, see e.g. https://github.com/ChandimaJayaneththi/ESP32-GitHub-Actions-CI

The CircuitPython sources do contain a number of performance tests, in the directory tests/perf_bench. But these tests are targeted at MicroPython and don't run out-of-the-box on CircuitPython. I "ported" a few to CircuitPython, see https://github.com/bablokb/circuitpython-examples/tree/master/perf, but that was for manual testing.

@tannewt
Member

tannewt commented Feb 18, 2026

I don't think an emulator would work because I don't know of one that is rigorous enough to get timing right. For performance testing, I think we'd want to do it on actual devices.

tannewt added a commit that referenced this pull request Feb 18, 2026
@daniel-alsen

daniel-alsen commented Feb 19, 2026

Consistent emulator runs in the same environment would be the key.

At least one emulator for one port would need to produce reasonably consistent timings across test runs in the same CI environment on the same hardware. There is no need to match the exact timing of a run on a real device. Then it could say "Loop test is up by 10%", "BLE serial slowed down by 20%", "Overall improvement by 15%".
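The "performance guard" idea could look roughly like the sketch below: compare measured benchmark durations against stored baselines and flag anything that moved beyond a tolerance. All names and thresholds here are hypothetical, not part of the CircuitPython build or CI.

```python
TOLERANCE = 0.10  # flag changes larger than +/-10% (illustrative value)

def compare(baseline: dict, measured: dict, tolerance: float = TOLERANCE):
    """Return (test_name, relative_change) pairs exceeding the tolerance.

    relative_change is (measured - baseline) / baseline, so +0.12 means
    the test got 12% slower and -0.15 means 15% faster.
    """
    flagged = []
    for name, base in baseline.items():
        if name not in measured:
            continue  # test absent from this run; skip rather than guess
        ratio = (measured[name] - base) / base
        if abs(ratio) > tolerance:
            flagged.append((name, ratio))
    return flagged

# Hypothetical per-test durations in seconds from a baseline run and a PR run.
baseline = {"loop": 1.00, "gc_collect": 0.50}
measured = {"loop": 1.12, "gc_collect": 0.51}

for name, ratio in compare(baseline, measured):
    print(f"{name} changed by {ratio:+.0%}")  # only "loop" exceeds tolerance
```

A CI job could fail (or just comment on the PR) when `compare` returns anything, which would have surfaced a regression like #10822 before merge, assuming the emulated or on-device timings are stable enough.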


5 participants