
implement flip layer and pnnx torch.flip conversion #6233

Merged: 22 commits merged into Tencent:master on Aug 6, 2025

Conversation

@nihui (Member) commented Aug 5, 2025

No description provided.

@tencent-adm (Member)

CLA assistant check
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
0 out of 2 committers have signed the CLA.

❌ Baiyuetribe
❌ nihui
Have you already signed the CLA but the status is still pending? Let us recheck it.

@codecov-commenter commented Aug 5, 2025

Codecov Report

❌ Patch coverage is 93.22034% with 4 lines in your changes missing coverage. Please review.
✅ Project coverage is 95.96%. Comparing base (e207b3b) to head (4871990).
⚠️ Report is 2 commits behind head on master.

Files with missing lines    Patch %    Lines
src/layer/flip.cpp          93.22%     4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #6233      +/-   ##
==========================================
+ Coverage   95.91%   95.96%   +0.04%     
==========================================
  Files         835      836       +1     
  Lines      264420   264401      -19     
==========================================
+ Hits       253631   253738     +107     
+ Misses      10789    10663     -126     


github-actions bot commented Aug 5, 2025

Binary size change of libncnn.so (bytes):

architecture    base size    PR size      difference
x86_64          15643128     15648008     +4880 ⚠️
armhf            6648156      6649020      +864 ⚠️
aarch64          9987296      9987736      +440 ⚠️

@nihui nihui requested a review from Copilot August 6, 2025 02:15
@Copilot (Copilot AI) left a comment

Pull Request Overview

This PR implements the flip layer for ncnn and PNNX, adding support for converting the torch.flip operation. The implementation includes the core Flip layer in ncnn, PNNX conversion passes for both TorchScript and ONNX models, and comprehensive test coverage; a short, illustrative usage sketch follows the list of key changes below.

Key changes:

  • Added a new Flip layer to ncnn that reverses tensor dimensions along specified axes
  • Implemented PNNX passes to convert torch.flip operations from TorchScript and ONNX formats
  • Added comprehensive test suites for the flip functionality across different tensor dimensions
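
To make these changes concrete, the sketch below shows the kind of PyTorch module whose torch.flip call the new PNNX pass is meant to capture. The module name, tensor shapes, and the pnnx command line are illustrative assumptions, not taken from this PR's test scripts.

```python
import torch
import torch.nn as nn

# Hypothetical module: its torch.flip call is what the new torch_flip
# PNNX pass is meant to recognize and convert.
class FlipModel(nn.Module):
    def forward(self, x):
        # reverse the feature map along its height and width axes
        return torch.flip(x, dims=[2, 3])

model = FlipModel().eval()
x = torch.rand(1, 3, 16, 16)

# Trace to TorchScript, the usual input format for pnnx conversion,
# and save it for the converter.
mod = torch.jit.trace(model, x)
mod.save("flip_model.pt")

# A pnnx invocation roughly like
#   ./pnnx flip_model.pt inputshape=[1,3,16,16]
# would then exercise the new conversion pass (the exact CLI arguments
# are an assumption of this sketch).
```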

Reviewed Changes

Copilot reviewed 20 out of 20 changed files in this pull request and generated 3 comments.

Summary per file:

  • src/layer/flip.h: header defining the Flip layer class interface
  • src/layer/flip.cpp: core implementation of the Flip layer with its forward pass logic
  • tools/pnnx/src/pass_level2/torch_flip.cpp: PNNX pass for converting torch.flip operations from TorchScript and ONNX
  • tools/pnnx/src/pass_ncnn/torch_flip.cpp: PNNX pass for generating ncnn-compatible Flip operations
  • tests/test_flip.cpp: comprehensive test suite for the Flip layer
  • Multiple test files: scripts validating torch.flip conversion in different contexts
  • Multiple CMakeLists.txt: build configuration updates that include the new flip files
  • Documentation: operator documentation updated to cover the Flip layer
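
For reference, torch.flip reverses a tensor along each listed dimension; this is the behaviour the new Flip layer and its tests need to match. A small standalone illustration (plain PyTorch, independent of this PR):

```python
import torch

x = torch.arange(6).reshape(2, 3)
# x is [[0, 1, 2],
#       [3, 4, 5]]

print(torch.flip(x, dims=[1]))
# tensor([[2, 1, 0],
#         [5, 4, 3]])

print(torch.flip(x, dims=[0, 1]))
# tensor([[5, 4, 3],
#         [2, 1, 0]])
```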

@nihui nihui closed this Aug 6, 2025
@nihui nihui reopened this Aug 6, 2025
@nihui nihui merged commit 9b91fe5 into Tencent:master Aug 6, 2025
111 of 125 checks passed