
Conversation


@XinweiHe XinweiHe commented Apr 6, 2025

Fixes #9112

@XinweiHe XinweiHe force-pushed the xinwei_support_attention_explainer_hetero_v1 branch from 0cbde7a to 61e25e6 Compare April 6, 2025 00:37
@XinweiHe XinweiHe marked this pull request as ready for review April 6, 2025 00:52
Member

@wsad1 wsad1 left a comment


Nice work.
Can we test for HGTConv and HANConv, to ensure that it works for hetero GNNs not created by the to_hetero operation?

@XinweiHe XinweiHe force-pushed the xinwei_support_attention_explainer_hetero_v1 branch from 1c533cc to 3062317 Compare April 12, 2025 23:32
@XinweiHe
Copy link
Author

Can we test for HGTConv and HANConv, to ensure that it works for hetero GNNs not created by the to_hetero operation?

HGTConv and HANConv are not relevant here, because AttentionExplainer only works with GNN models that use attention, in particular GATConv, GATv2Conv, or TransformerConv.

We will, however, support HeteroConv with these attention-based GNN layers in this PR.
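
To make the intended usage concrete, here is a minimal sketch: a two-layer GAT turned heterogeneous via to_hetero and explained with AttentionExplainer. The toy graph, model, and configuration below are illustrative only (not taken from this PR's tests), and the output format assumes the usual heterogeneous Explanation conventions.

```python
import torch

from torch_geometric.data import HeteroData
from torch_geometric.explain import AttentionExplainer, Explainer
from torch_geometric.nn import GATConv, to_hetero

# Toy heterogeneous graph (shapes and edge indices are arbitrary):
data = HeteroData()
data['paper'].x = torch.randn(8, 16)
data['author'].x = torch.randn(6, 16)
data['author', 'writes', 'paper'].edge_index = torch.tensor([[0, 1, 2], [0, 1, 2]])
data['paper', 'written_by', 'author'].edge_index = torch.tensor([[0, 1, 2], [0, 1, 2]])


class GAT(torch.nn.Module):
    def __init__(self, hidden_channels: int, out_channels: int):
        super().__init__()
        self.conv1 = GATConv((-1, -1), hidden_channels, add_self_loops=False)
        self.conv2 = GATConv((-1, -1), out_channels, add_self_loops=False)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)


model = to_hetero(GAT(32, 4), data.metadata())
with torch.no_grad():  # Initialize lazy modules before explaining:
    model(data.x_dict, data.edge_index_dict)

explainer = Explainer(
    model=model,
    algorithm=AttentionExplainer(),
    explanation_type='model',
    edge_mask_type='object',
    model_config=dict(
        mode='multiclass_classification',
        task_level='node',
        return_type='raw',
    ),
)
explanation = explainer(data.x_dict, data.edge_index_dict)
print(explanation.edge_mask_dict)  # One attention-based mask per edge type.
```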

@XinweiHe XinweiHe force-pushed the xinwei_support_attention_explainer_hetero_v1 branch from 3062317 to eb396e6 Compare April 12, 2025 23:37

codecov bot commented Apr 12, 2025

Codecov Report

Attention: Patch coverage is 89.13043% with 10 lines in your changes missing coverage. Please review.

Please upload report for BASE (xinwei_support_pg_explainer_hetero_v3@744f830). Learn more about missing BASE report.

Files with missing lines | Patch % | Lines
...geometric/explain/algorithm/attention_explainer.py | 89.13% | 10 Missing ⚠️
Additional details and impacted files
@@                           Coverage Diff                            @@
##             xinwei_support_pg_explainer_hetero_v3   #10169   +/-   ##
========================================================================
  Coverage                                         ?   85.36%           
========================================================================
  Files                                            ?      496           
  Lines                                            ?    33840           
  Branches                                         ?        0           
========================================================================
  Hits                                             ?    28889           
  Misses                                           ?     4951           
  Partials                                         ?        0           


@XinweiHe XinweiHe requested a review from wsad1 April 13, 2025 05:43
Member

wsad1 commented Apr 16, 2025

HGTConv and HANConv are not relevant here, because AttentionExplainer only works with GNN models that use attention, in particular GATConv, GATv2Conv, or TransformerConv.

HANConv and HGTConv use attention during message passing.

Base automatically changed from xinwei_support_pg_explainer_hetero_v3 to master April 16, 2025 09:10
@XinweiHe
Author

HANConv and HGTConv use attention during message passing.

You are right, but at the moment AttentionExplainer is only designed to handle the case where alpha is exposed as an input parameter of the message(...) function. Since HANConv and HGTConv are not set up that way, supporting them would require extra effort, and it is probably best not to mix that with this initial heterogeneous graph support for AttentionExplainer.
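
For context, here is a rough, simplified sketch of the mechanism involved (not the actual attention_explainer.py code, and the hook payload details may vary across PyG versions): a message forward hook that picks up the attention coefficients a conv either passes to message() as alpha or caches in _alpha.

```python
import torch

from torch_geometric.nn import GATConv, MessagePassing

conv = GATConv(16, 16, heads=2)
x = torch.randn(5, 16)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])

alphas = []

def hook(module: MessagePassing, inputs, output):
    msg_kwargs = inputs[0]  # Arguments that were passed to `message()`.
    if 'alpha' in msg_kwargs:  # e.g. GATConv exposes `alpha` directly.
        alphas.append(msg_kwargs['alpha'].detach())
    elif getattr(module, '_alpha', None) is not None:  # e.g. TransformerConv caches it.
        alphas.append(module._alpha.detach())

handle = conv.register_message_forward_hook(hook)
conv(x, edge_index)
handle.remove()

print(alphas[0].shape)  # One attention coefficient per message edge and head.
```

If I read the current implementations correctly, HGTConv and HANConv compute their coefficients inside message() instead, so neither branch above fires for them; closing that gap is what a follow-up would cover.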

@XinweiHe
Copy link
Author

Could we get this effort merged and create a follow-up ticket to handle the HANConv and HGTConv cases?

Member

@wsad1 wsad1 left a comment


Looks good.
Let's have a quick follow-up for HANConv and other hetero GNNs.

@XinweiHe XinweiHe merged commit 6e4a563 into master Apr 23, 2025
17 checks passed
@XinweiHe XinweiHe deleted the xinwei_support_attention_explainer_hetero_v1 branch April 23, 2025 08:01
@XinweiHe XinweiHe changed the title Added support for heterogenous graphs in AttentionExplainer Added support for heterogeneous graphs in AttentionExplainer May 9, 2025
chrisn-pik pushed a commit to chrisn-pik/pytorch_geometric that referenced this pull request Jun 30, 2025
…am#10169)

Fixes pyg-team#9112

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Akihiro Nitta <[email protected]>
Co-authored-by: Jinu Sunil <[email protected]>

Development

Successfully merging this pull request may close these issues.

[Roadmap] Heterogeneous Graphs Explainability Support
