Conversation

@rusty1s
Member

@rusty1s rusty1s commented Mar 25, 2023

No description provided.

@codecov

codecov bot commented Mar 25, 2023

Codecov Report

Merging #7034 (e48d058) into master (743c1c1) will decrease coverage by 0.29%.
The diff coverage is 0.00%.

❗ Current head e48d058 differs from pull request most recent head af5010b. Consider uploading reports for the commit af5010b to get more accurate results

@@            Coverage Diff             @@
##           master    #7034      +/-   ##
==========================================
- Coverage   91.79%   91.51%   -0.29%     
==========================================
  Files         435      435              
  Lines       23779    23776       -3     
==========================================
- Hits        21829    21759      -70     
- Misses       1950     2017      +67     
Impacted Files Coverage Δ
torch_geometric/data/collate.py 91.72% <0.00%> (ø)

... and 16 files with indirect coverage changes


@rusty1s rusty1s merged commit a51e1db into master Mar 25, 2023
@rusty1s rusty1s deleted the fix_storage_warn branch March 25, 2023 14:05
@davodogster

davodogster commented Mar 30, 2023

Hi, is this the warning I'm getting?

I'm using Python 3.9. I was very happy to get a successful PyG + PyTorch 2.0 + CU118 installation, but when I train, this warning shows up so I can't see the training results very well.

Regards, Sam

@rusty1s
Member Author

rusty1s commented Mar 30, 2023

Yes, I am sorry for that. We will patch this soon. For now, you can just use

import warnings
warnings.filterwarnings('ignore', '.*TypedStorage is deprecated.*')

at the top of your script.
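As a quick self-contained check of the suggested workaround (the demo warning text below is hypothetical, standing in for PyTorch's actual deprecation message), the filter can be verified like this:

```python
import warnings

# The workaround suggested above: ignore any warning whose message
# matches the "TypedStorage is deprecated" pattern.
warnings.filterwarnings('ignore', '.*TypedStorage is deprecated.*')

# Demonstrate that a matching warning is now silenced. The filter is
# re-applied inside catch_warnings because the context manager saves
# and restores the warnings state.
with warnings.catch_warnings(record=True) as caught:
    warnings.filterwarnings('ignore', '.*TypedStorage is deprecated.*')
    warnings.warn('TypedStorage is deprecated and will be removed')
    print(len(caught))  # 0 -> the warning was filtered out
```

Note that the first argument to `filterwarnings` is a regular expression matched against the start of the warning message, so the leading `.*` is only needed if the message might not begin with that phrase.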

rusty1s added a commit that referenced this pull request Apr 27, 2023