[cadence][hifi] update quantized_relu_per_tensor_out signature to match internal flow #8143
Conversation
See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/8143. As of commit ae50298 with merge base ee6f2d9: 1 new failure (one job failed), 1 currently active SEV. Note: links to docs will display an error until the docs builds have been completed.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D69015308
Force-pushed from 588969b to 6a0c088.
Summary: fix quantized_relu_per_tensor. Differential Revision: D69015308
Force-pushed from 6a0c088 to ae50298.
@@ -48,24 +48,27 @@ void quantized_relu_(
void quantized_relu_per_tensor_out(
Is this not the operator called by ET? If so, please move this inside an anonymous namespace.
Realized most of the HiFi ops are not in an anonymous namespace yet. Will do that separately.