
Conversation

@xnuohz (Contributor) commented Aug 11, 2025

Issue

Usage

python examples/ogbn_train.py --model nodeformer
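For orientation, a minimal sketch of roughly what the `--model nodeformer` path wires up. The `NodeFormer` class name follows the file added in this PR (`torch_geometric/nn/models/nodeformer.py`), but the constructor arguments and `forward` signature shown below are assumptions and may not match the final implementation.

```python
# Hypothetical sketch only: constructor/forward signatures are assumptions,
# not the PR's actual API.
import torch
from torch_geometric.nn.models import NodeFormer  # assumed export path

model = NodeFormer(
    in_channels=128,      # ogbn-arxiv node feature dimension
    hidden_channels=64,   # assumed hidden size
    out_channels=40,      # ogbn-arxiv has 40 classes
    num_layers=3,         # assumed depth
)

x = torch.randn(1024, 128)                      # node features for one mini-batch
edge_index = torch.randint(0, 1024, (2, 4096))  # random edges, illustration only
out = model(x, edge_index)                      # assumed forward(x, edge_index)
```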

Highlight

[Screenshot of the NodeFormer training run omitted]

Result

  • GAT and SAGE use batch_size=1024.
  • SGFormer and Polynormer use batch_size=512 to avoid OOM (a loader sketch follows the table).

| Dataset    | GAT    | SAGE   | SGFormer | Polynormer | NodeFormer |
|------------|--------|--------|----------|------------|------------|
| ogbn-arxiv | 68.53% | 70.17% | 71.86%   | 65.90%     | 66.98%     |
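For reference, a minimal sketch of how the mini-batch loading behind these runs could look. Only the batch sizes (1024 vs. 512) come from the list above; the neighbor fanout and other loader settings are assumptions.

```python
# Sketch of loading ogbn-arxiv with per-model batch sizes.
from ogb.nodeproppred import PygNodePropPredDataset
from torch_geometric.loader import NeighborLoader

dataset = PygNodePropPredDataset(name='ogbn-arxiv', root='data')
data = dataset[0]
split_idx = dataset.get_idx_split()

batch_size = 1024  # GAT / SAGE; use 512 for SGFormer / Polynormer to avoid OOM
train_loader = NeighborLoader(
    data,
    input_nodes=split_idx['train'],
    num_neighbors=[10, 10],  # assumed fanout
    batch_size=batch_size,
    shuffle=True,
)

for batch in train_loader:
    # Each mini-batch exposes batch.x, batch.edge_index, and batch.batch_size
    # (the number of seed nodes at the front of the batch).
    pass
```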

@xnuohz mentioned this pull request Aug 11, 2025
codecov bot commented Aug 11, 2025

Codecov Report

❌ Patch coverage is 81.11888% with 54 lines in your changes missing coverage. Please review.
✅ Project coverage is 85.06%. Comparing base (c211214) to head (7ad95d4).
⚠️ Report is 142 commits behind head on master.

| Files with missing lines                | Patch % | Lines         |
|-----------------------------------------|---------|---------------|
| torch_geometric/nn/models/nodeformer.py | 81.05%  | 54 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #10409      +/-   ##
==========================================
- Coverage   86.11%   85.06%   -1.05%     
==========================================
  Files         496      511      +15     
  Lines       33655    36250    +2595     
==========================================
+ Hits        28981    30836    +1855     
- Misses       4674     5414     +740     

☔ View full report in Codecov by Sentry.

@puririshi98 (Contributor) commented:

@xnuohz please get CI passing

@puririshi98 (Contributor) left a comment:


LGTM, just get CI green.

@puririshi98 enabled auto-merge (squash) September 15, 2025 16:19
@puririshi98 (Contributor) commented:

@rusty1s @akihironitta @wsad1 ready for final review/merge

@puririshi98 (Contributor) left a comment:


@xnuohz can you explain where we ensure that attention is not computed across the full mini-batch? We should be utilizing the batch tensor, no? (Remember what we had to change for SGFormer.)
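For context, a minimal sketch of the masking idea being asked about: restricting dense attention so that nodes only attend to nodes from the same sampled example, using the `batch` assignment vector. This is illustrative only and is not the PR's actual implementation.

```python
# Illustrative only: restrict dense attention to nodes within the same sample,
# using the `batch` assignment vector (one graph/sample id per node).
import torch

def masked_attention(q, k, v, batch):
    # q, k, v: [N, d]; batch: [N] with sample ids per node.
    scores = q @ k.t() / q.size(-1) ** 0.5                   # [N, N] logits
    same_sample = batch.unsqueeze(0) == batch.unsqueeze(1)   # [N, N] bool mask
    scores = scores.masked_fill(~same_sample, float('-inf')) # block cross-sample attention
    attn = scores.softmax(dim=-1)
    return attn @ v

# Toy usage: two samples of 3 nodes each.
q = k = v = torch.randn(6, 8)
batch = torch.tensor([0, 0, 0, 1, 1, 1])
out = masked_attention(q, k, v, batch)
```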

This was referenced Oct 29, 2025
auto-merge was automatically disabled November 1, 2025 17:12

Head branch was pushed to by a user without write access
