Description
Hi again - I think this is probably setup-agnostic, but as a quick rundown: I'm running on an Android device with the SDK enabled. I've got a binary application that generates an etdump file while registering the XNNPack delegate, and I'm running a version of mobilenet_v2 that has been partitioned for the same delegate.
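For reference, the host-side readout that produces the table below is roughly along these lines (a minimal sketch, assuming the Python Inspector API from the devtools/SDK package; the etdump path is a placeholder):

```python
# Minimal sketch of how the event blocks get printed on the host side.
# Assumes the ExecuTorch Inspector API; the etdump path is a placeholder.
from executorch.devtools import Inspector  # executorch.sdk.Inspector on older builds

inspector = Inspector(etdump_path="mobilenet_v2_xnnpack.etdump")

# Prints one row per event, grouped by event block ("Default", "Execute", ...).
inspector.print_data_tabular()
```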
When I print out the resulting event blocks to find the operator timings, I'm seeing the following pattern:
event_block_name | event_name | raw | p10 (ms) |
---|---|---|---|
Default | Method::init | [18.762083] | 18.76208 |
Default | Program::load_method | [18.774635] | 18.77464 |
Execute | Transpose (ND, X32) #1 | [1172000.0] | 1172000 |
Execute | Convolution (NHWC, F32) IGEMM #1 | [802000.0] | 802000 |
Execute | Convolution (NHWC, F32) DWConv #1 | [439000.0] | 439000 |
Execute | Convolution (NHWC, F32) GEMM #1 | [357000.0] | 357000 |
Execute | Convolution (NHWC, F32) GEMM #2 | [911000.0] | 911000 |
The issue is specifically that the time units look inconsistent: the ExecuTorch runtime events (Method::init, Program::load_method) and the column headers are in milliseconds, while, from what I gather from XNNProfiler.cpp, the delegate events are recorded in "PAL ticks" / system ticks.
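For now I can hack around it on the host with something like the sketch below, but it leans on two assumptions I haven't verified: that the delegate rows are raw ticks at a fixed rate, and that I know that rate for my device (the `TICKS_PER_MS` constant here is a placeholder, not something the etdump carries):

```python
# Workaround sketch: rescale the delegate ("Execute") rows so everything is in ms.
# ASSUMPTION: XNNPACK delegate events are raw ticks at a fixed, known rate.
# TICKS_PER_MS is a placeholder I'd have to measure on the target device.
from executorch.devtools import Inspector  # executorch.sdk.Inspector on older builds

TICKS_PER_MS = 1_000_000.0  # placeholder, e.g. a nanosecond-resolution tick source

def ticks_to_ms(ticks: float) -> float:
    return ticks / TICKS_PER_MS

inspector = Inspector(etdump_path="mobilenet_v2_xnnpack.etdump")
for block in inspector.event_blocks:
    for event in block.events:
        raw = event.perf_data.raw if event.perf_data else []
        if block.name == "Execute":
            # Delegate events: treat the raw values as ticks and convert.
            print(block.name, event.name, [ticks_to_ms(t) for t in raw], "ms (converted)")
        else:
            # Runtime events: already reported in ms.
            print(block.name, event.name, raw, "ms")
```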
Is this a bug, and/or is there a way to get the output in a more comparable time unit? Thanks for any assistance.