Add LPFormer model and example
#9956
Conversation
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

@@            Coverage Diff             @@
##           master    #9956      +/-   ##
==========================================
- Coverage   86.11%   85.93%    -0.18%
==========================================
  Files         496      502        +6
  Lines       33655    35128     +1473
==========================================
+ Hits        28981    30189     +1208
- Misses       4674     4939      +265
==========================================

☔ View full report in Codecov by Sentry.
@wsad1 @EdisonLeeeee Can I get an ETA on a review? Thanks!
puririshi98 left a comment
Can you update this to use modern PyG (torch-geometric and pyg-lib) rather than deprecated packages such as torch_sparse?
@puririshi98 Thanks for the heads up! I'm curious, though: what's the alternative to torch_sparse? As far as I can tell, PyG still doesn't support torch.sparse for memory-efficient aggregations, which is why I use SparseTensor. Could you let me know what I should use instead of torch_sparse?
There should be a PyTorch-native version; look into it. If you struggle to find an option, let me know and I can try to check.
@puririshi98 I've updated the code to remove the torch_sparse references. For some reason I was under the impression that the MessagePassing layers could not accept torch.sparse tensors for the edge_index argument. I think this may have been true for some older versions. Please let me know if there's anything else needed on my side.
Will merge once we get the ONNX issue fixed upstream and merged in. Aiming for EoD tomorrow.
puririshi98 left a comment
LGTM now
This pull request adds an implementation of the SOTA link prediction model LPFormer from the KDD'24 paper "LPFormer: An Adaptive Graph Transformer for Link Prediction".

Example

The example can be run with python examples/lpformer.py. I am able to replicate the reported performance on ogbl-ppa. The results are shown below. I use the same hyperparameters as the paper; results are over 5 seeds.
Testing

The tests for LPFormer can be found in test/nn/models/test_lpformer.py. They each pass.

Docs
I've also included documentation for the code. It builds and renders correctly in Sphinx with no errors.