@HarryShomer (Contributor) commented Jan 16, 2025

This pull request adds an implementation of LPFormer, the state-of-the-art link prediction model from the KDD'24 paper "LPFormer: An Adaptive Graph Transformer for Link Prediction".

Example

python examples/lpformer.py

I am able to replicate the reported performance on ogbl-ppa.

The results are shown below, using the same hyperparameters as in the paper, averaged over 5 seeds.

| Reported | PyG |
| --- | --- |
| 63.32 ± 0.6 | 63.15 ± 0.9 |

Testing

The tests for LPFormer can be found in test/nn/models/test_lpformer.py; all of them pass.

Docs

I've also included documentation for the code. It builds and renders correctly under Sphinx with no errors.

codecov bot commented Jan 17, 2025

Codecov Report

❌ Patch coverage is 88.95899% with 35 lines in your changes missing coverage. Please review.
✅ Project coverage is 85.93%. Comparing base (c211214) to head (ff86503).
⚠️ Report is 138 commits behind head on master.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| torch_geometric/nn/models/lpformer.py | 88.92% | 35 Missing ⚠️ |
Additional details and impacted files
```
@@            Coverage Diff             @@
##           master    #9956      +/-   ##
==========================================
- Coverage   86.11%   85.93%   -0.18%
==========================================
  Files         496      502       +6
  Lines       33655    35128    +1473
==========================================
+ Hits        28981    30189    +1208
- Misses       4674     4939     +265
```


@HarryShomer (author) commented:

@wsad1 @EdisonLeeeee Can I get an ETA on a review? Thanks!

@puririshi98 puririshi98 self-requested a review May 30, 2025 18:36
@puririshi98 left a comment:

Can you update this to use modern PyG (torch-geometric and pyg-lib) rather than deprecated packages such as torch_sparse?

@HarryShomer (author) commented May 30, 2025

@puririshi98 Thanks for the heads up!

I'm curious, though: what's the alternative to torch_sparse? As far as I can tell, PyG still doesn't support torch.sparse for memory-efficient aggregations, which is why I use SparseTensor.

Could you let me know what I should use instead of torch_sparse?

@puririshi98 replied:

There should be a PyTorch-native version; look into it. If you struggle to find an option, let me know and I can try to check.

@HarryShomer (author) replied:

@puririshi98 I've updated the code to remove the torch_sparse references.

For some reason I was under the impression that the MessagePassing layers could not accept torch.sparse for the edge_index argument. I think this may have been true for some older versions.

Please let me know if there's anything else needed on my side.
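For reference, the replacement pattern is roughly the following (a toy sketch, not the actual LPFormer code; the graph and feature tensor here are made up for illustration): a native torch.sparse adjacency takes the place of torch_sparse.SparseTensor, and the sum aggregation becomes a sparse-dense matmul.

```python
import torch

# Toy graph: 4 nodes, 4 directed edges in COO format (made-up example).
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 0]])
num_nodes = 4

# Native torch.sparse adjacency (one 1.0 entry per edge) instead of a
# torch_sparse.SparseTensor:
adj = torch.sparse_coo_tensor(
    edge_index,
    torch.ones(edge_index.size(1)),
    size=(num_nodes, num_nodes),
).coalesce()

# Memory-efficient sum aggregation as a sparse-dense matmul.
x = torch.randn(num_nodes, 8)  # node features
out = torch.sparse.mm(adj, x)  # [num_nodes, 8]; row i sums x over i's out-neighbors
```

The same torch.sparse tensor can also be passed directly as the edge_index argument to recent MessagePassing layers, which is what the updated code relies on.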

@puririshi98

Will merge once we get the ONNX issue fixed upstream and merged in. Aiming for EoD tomorrow.

@puririshi98 left a comment:

LGTM now

@puririshi98 puririshi98 merged commit 1648a0c into pyg-team:master Aug 27, 2025
17 checks passed