Description
I have reached a point where I need some ns delays, which can't be achieved using the `Delay` trait. I did a bit of research and found that I am not the only one with this requirement. My exact use case is timing GPIO pins from within an embedded-hal driver.
The user nagisa suggested on #rust-embedded to use DMA to control the pins. Unfortunately, not every MCU comes with DMA support.
Another idea is an implementation similar to the one in the Linux kernel. It works by determining `loops_per_jiffy` at start-up: this value describes how many iterations of a given "do nothing" instruction can be executed in a fixed amount of time. The value is then used to calibrate the `ndelay` function.
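To make the idea concrete, here is a minimal sketch of that calibration approach. All names (`calibrate`, `ndelay`, `busy_loop`) are hypothetical; it uses `std::time::Instant` as the reference clock so it runs on a host, whereas a real MCU implementation would calibrate against a hardware timer or cycle counter instead:

```rust
use std::time::{Duration, Instant};

// A "do nothing" busy loop. `inline(never)` plus `spin_loop` hints
// discourage the compiler from optimizing the loop away entirely.
#[inline(never)]
fn busy_loop(iterations: u64) {
    for _ in 0..iterations {
        std::hint::spin_loop();
    }
}

/// Count how many loop iterations fit into `window` -- the
/// `loops_per_jiffy` idea from the Linux kernel.
fn calibrate(window: Duration) -> u64 {
    let chunk = 10_000u64;
    let mut total = 0u64;
    let start = Instant::now();
    while start.elapsed() < window {
        busy_loop(chunk);
        total += chunk;
    }
    total
}

/// Approximate nanosecond delay derived from the calibration value.
/// Note the multiplication: for large `ns` this overflows, which is
/// why such a delay is only usable for very short intervals.
fn ndelay(ns: u64, loops_per_window: u64, window: Duration) {
    let iters = ns.saturating_mul(loops_per_window) / window.as_nanos() as u64;
    busy_loop(iters.max(1));
}

fn main() {
    let window = Duration::from_millis(10);
    let lpw = calibrate(window);
    println!("loops per 10 ms window: {lpw}");

    let start = Instant::now();
    ndelay(1_000_000, lpw, window); // request ~1 ms
    println!("requested 1 ms, actually spun for {:?}", start.elapsed());
}
```

On a desktop OS the result is noisy (preemption, frequency scaling); on a bare-metal MCU with a fixed clock the same scheme is more predictable, but still only approximate.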
I don't know how reliable such a solution would be, but I know it comes with limitations: the delays are only approximate, and it should only be used for very small delays because of overflows.
What do you think: would such an implementation make sense in embedded-hal? Do you know of better solutions for this kind of problem?