open_dataset not closing NetCDF file (Windows) #1629
Thank you for the report! This is expected behavior: users have the responsibility to open and close their files. You can either close it manually:

```python
test_ds.close()
```

Or (better), use a context manager:

```python
with xr.open_dataset(path_to_nc) as ds:
    # do stuff
```
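The close-before-delete requirement is easiest to demonstrate with plain file handles, since the locking behavior belongs to the OS rather than to xarray. A minimal stdlib-only sketch, using `open()` as a stand-in for `xr.open_dataset`:

```python
import os
import tempfile

# Create a throwaway file standing in for a NetCDF dataset.
fd, path = tempfile.mkstemp(suffix=".nc")
os.close(fd)

# Unsafe pattern: a handle left dangling. On Windows, calling
# os.remove(path) while this handle is open raises PermissionError,
# because the open handle locks the file.
f = open(path, "rb")
f.close()  # must be closed before the file can be deleted on Windows

# Safe pattern: the context manager closes the handle even if the
# body raises, which is why it is the recommended idiom.
with open(path, "rb") as f:
    _ = f.read()

os.remove(path)  # succeeds on every platform once all handles are closed
print(os.path.exists(path))  # False
```

On Linux the unsafe pattern often goes unnoticed because POSIX allows unlinking a file that still has open handles, which is exactly why tests written on Linux can fail only on Windows.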
I just read the second part of your question: you can't delete the file and then still read data out of it. What is your use case?
Thanks for the response. Basically I've inherited some legacy code that subclasses an xarray dataset, and all of the tests pass on Linux but one fails on Windows from this issue. I don't think the code ever closes a dataset, nor does it use context managers. We're loading data stored in a NetCDF into the subclass instance and then doing a bunch of calculations on it. Anyway, I agree that the current behavior is actually the ideal scenario; it was just weird at first when the tests passed on Linux and not Windows. I'll have to see if I can patch the code to use context managers, since manually closing the file will probably not always happen.
In my experience, Windows is certainly much pickier than Linux about deleting files that are still open.
MWE

Entering

[code block not preserved in this copy]

results in

[traceback not preserved in this copy]

If I use the option `autoclose=True`, I can delete the file without this error, but then when I try to access `test_ds` I get `OSError: No such file or directory`. (This second part might be expected behavior depending on how `autoclose` is supposed to function.)

conda info
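The delete-then-read failure follows from lazy loading: `open_dataset` may fetch values from disk after the call returns, and with `autoclose=True` each access re-opens the file by path. A stdlib-only analogy of that re-open-on-access behavior, using a plain text file in place of NetCDF and a hypothetical `read_value` helper:

```python
import os
import tempfile

# Write a throwaway file standing in for the dataset on disk.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("42")

def read_value(p):
    # Analogue of autoclose=True: re-open the file on every access
    # instead of holding a long-lived handle.
    with open(p) as f:
        return f.read()

print(read_value(path))  # 42  (file still exists on disk)
os.remove(path)          # deletion succeeds: no handle is held open
try:
    read_value(path)
except OSError:
    print("OSError")     # the re-open fails, matching the reported error
```

This matches the reported symptom: with `autoclose=True` the deletion itself succeeds, but any later access to unloaded data has no file to read from.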