Add LLM-generated explanations to TAGDataset
#9918
Conversation
This is awesome, but can you add an example to run? (Also make sure the CI checks are all green.)
Thanks, @puririshi98. Can you clarify what you mean by a runnable example? A simple option is to apply it to the GLEM model; a better one is to add a TAPE model, which is worth opening a new PR for easier review. Hi @akihironitta, the CI error is strange and seems to have nothing to do with my changes. Can you help take a look?
@xnuohz I think for now just adding it as an optional flag to the GLEM example is okay. Feel free to submit a separate PR for TAPE, and please ping me on Slack since I have GitHub emails heavily filtered; otherwise my inbox would explode. Feel free to include this as a flag there as well when you do it.
@xnuohz Ignore those for now; I was previously just talking about the linters that were red, and your CI was functionally green before. Once you address my comment above, I'm sure these new issues will go away, since they're unrelated to your code. I've had this happen to me many times and they always go away on future respins. Just my experience.
puririshi98 left a comment:
Looks good, but can you add an argparser option that combines the raw text and the LLM explanation and run that? Curious how it affects accuracy.
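A minimal sketch of what such a flag could look like (illustrative only, not the code in this PR; the flag name and the `text` / `llm_explanation` string lists are assumptions about how TAGDataset exposes its per-node strings):

```python
# Hypothetical sketch of an argparse flag that concatenates the raw node text
# with the LLM-generated explanation before it is fed to the LM encoder.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    '--use_llm_explanation', action='store_true',
    help='Append LLM-generated explanations to the raw node text.')
args = parser.parse_args()


def build_node_text(raw_texts, explanations, use_explanation):
    """Combine each node's raw text with its explanation when requested."""
    if not use_explanation:
        return list(raw_texts)
    return [f'{t}\nExplanation: {e}' for t, e in zip(raw_texts, explanations)]


# Usage in the GLEM example would roughly be (attribute names assumed):
# texts = build_node_text(dataset.text, dataset.llm_explanation,
#                         args.use_llm_explanation)
# The combined strings are then tokenized and encoded exactly like the raw
# text, so the accuracy effect of the explanations can be compared directly.
```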
@puririshi98 @akihironitta Inspector is broken in this CI run, any thoughts?
@xnuohz I recommend analyzing the latest CI run and suggesting some potential solutions, and I can review them and see if we can help unblock you.
@puririshi98 In my local environment, I found that the individual tests run without errors, but running all tests simultaneously produces the same type inference errors as CI/CD. This suggests that PyTorch's JIT type inference state may be getting corrupted, causing Optional[Tensor] to be incorrectly inferred as Optional[Any]. I think we still need to clean the cache; modifying the assert result directly, as in this PR, would be wrong.
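For anyone reproducing this locally, here is a standalone sketch (unrelated to the PR code, just standard TorchScript usage) that scripts a function with an Optional[Tensor] argument; inspecting the compiled code shows whether the annotation was preserved or degraded as described above:

```python
# Standalone check (not part of the PR): script a function with an
# Optional[Tensor] argument and inspect what type TorchScript inferred.
from typing import Optional

import torch


@torch.jit.script
def fill_default(x: Optional[torch.Tensor]) -> torch.Tensor:
    # TorchScript should narrow `x` from Optional[Tensor] to Tensor here.
    if x is None:
        return torch.zeros(1)
    return x


print(fill_default.code)            # shows the inferred annotations
print(fill_default(None))           # tensor([0.])
print(fill_default(torch.ones(2)))  # tensor([1., 1.])
```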
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff             @@
##           master    #9918      +/-   ##
==========================================
- Coverage   86.11%   85.16%    -0.95%
==========================================
  Files         496      510       +14
  Lines       33655    35952     +2297
==========================================
+ Hits        28981    30620     +1639
- Misses       4674     5332      +658
```

☔ View full report in Codecov by Sentry.
Can you rerun with and without the TAG dataset using the latest NVIDIA container: https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pyg/tags
puririshi98 left a comment:
lgtm
LGTM. @akihironitta @rusty1s @wsad1 to merge.
Issue: #9361
Script