Description
What problem does this solve or what need does it fill?
- Frame pacing assumes the event loop drives a single display. Frame pacing/limiting works by pausing the schedule to delay input gathering until just enough time remains before presentation to render the frame. If multiple windows share one event loop but sit on displays with different refresh rates (consider XR), there is no way for them to share that loop without introducing visual/temporal artifacts: either stuttering or excess input lag.
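To make the conflict concrete, here is a minimal sketch of the timing arithmetic behind frame pacing. `input_delay` and the numbers are illustrative assumptions, not Bevy or winit APIs: the point is that two displays with different refresh intervals demand two different wake-up times, and a single shared schedule can only honor one of them.

```rust
use std::time::Duration;

/// Hypothetical frame-pacing helper: given a display's refresh interval and an
/// estimate of how long a frame takes to render, compute how long to delay
/// input gathering so the frame finishes just before presentation.
fn input_delay(refresh_interval: Duration, estimated_frame_time: Duration) -> Duration {
    refresh_interval.saturating_sub(estimated_frame_time)
}

fn main() {
    // A 60 Hz monitor and a 90 Hz XR headset sharing one event loop,
    // both assuming ~4 ms of render time:
    let delay_60hz = input_delay(Duration::from_micros(16_667), Duration::from_millis(4));
    let delay_90hz = input_delay(Duration::from_micros(11_111), Duration::from_millis(4));

    // The shared schedule can only sleep for one of these durations, so the
    // other window is either woken too late (stutter) or too early (lag).
    println!("60 Hz delay: {delay_60hz:?}, 90 Hz delay: {delay_90hz:?}");
}
```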
- AFAIK, winit only reports touch events for the primary window, making it impossible to support touch on multiple windows within the same winit context.
What solution would you like?
- Give every window its own event loop. It is unclear whether it would even be possible to sync worlds when the schedules, by definition, cannot run in lockstep.
- Alternatively, gather inputs continuously into channels and pass them into the app, with each window's schedule choosing when to read from its input channel. This is a pseudo event loop where inputs are pulled rather than pushed.
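The pull model above can be sketched with a plain channel. `InputEvent` and `drain_inputs` are illustrative assumptions, not Bevy types: the OS side pushes events as they arrive, and each window's schedule drains its channel at whatever point in its own frame it wants to sample input.

```rust
use std::sync::mpsc::{channel, Receiver, TryRecvError};

/// Illustrative input event type; not Bevy's actual input events.
#[derive(Debug, PartialEq)]
enum InputEvent {
    Key(char),
    Touch { x: f32, y: f32 },
}

/// Sketch of the pull side: drain everything currently queued without
/// blocking, so the caller decides *when* input is read, not the event loop.
fn drain_inputs(rx: &Receiver<InputEvent>) -> Vec<InputEvent> {
    let mut events = Vec::new();
    loop {
        match rx.try_recv() {
            Ok(ev) => events.push(ev),
            Err(TryRecvError::Empty) | Err(TryRecvError::Disconnected) => break,
        }
    }
    events
}

fn main() {
    let (tx, rx) = channel();

    // Events accumulate whenever the platform delivers them...
    tx.send(InputEvent::Key('w')).unwrap();
    tx.send(InputEvent::Touch { x: 10.0, y: 20.0 }).unwrap();

    // ...and this window's schedule pulls them when it is ready to,
    // independently of any other window's timing.
    let frame_inputs = drain_inputs(&rx);
    println!("{frame_inputs:?}");
}
```

Each window would own one receiver, so two windows paced against different refresh rates could read input at different moments without contending for a single shared event loop.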
What alternative(s) have you considered?
- Accept this as a limitation of Bevy, rely on IPC to handle multi-window applications, and document the limitation.
- With "true" render pipelining, decouple render sync points and frequency from the main app schedule. This doesn't solve the input latency vs. stuttering tradeoff, but it would allow independent rendering framerates per window.