Quadtree read/write, no build #226


Merged
merged 27 commits into from
Jan 22, 2025

Conversation

roeldegoede (Collaborator)

Issue addressed

Fixes #

Explanation

Explain how you addressed the bug/feature request, what choices you made and why.

Checklist

  • Updated tests or added new tests
  • Branch is up to date with main
  • Updated documentation if needed
  • Updated changelog.rst if needed

Additional Notes (optional)

Add any additional notes or information that may be helpful.

@roeldegoede (Collaborator, Author)

Done

  • Contains reading/writing of Quadtree files
  • Reading/writing of Quadtree subgrid files
  • New property bbox that can be used to clip, for example, forcing data
  • Up to date with other open PRs and the main branch

Not working/tested:

  • Plot basemap
  • Read results
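The new bbox property could be used roughly like this to clip external forcing data to the model extent. This is a minimal sketch, not the hydromt_sfincs implementation: the dataset, coordinates, and bbox values are all made up for illustration.

```python
import numpy as np
import xarray as xr

# Illustrative forcing dataset on a regular lon/lat grid
forcing = xr.Dataset(
    {"precip": (("lat", "lon"), np.ones((10, 10)))},
    coords={"lat": np.arange(10.0), "lon": np.arange(10.0)},
)

# bbox as (xmin, ymin, xmax, ymax), e.g. taken from the model's
# new bbox property; these values are invented for the example
xmin, ymin, xmax, ymax = 2.0, 3.0, 6.0, 8.0

# Label-based slicing clips the forcing data to the model extent
clipped = forcing.sel(lon=slice(xmin, xmax), lat=slice(ymin, ymax))
```

Label-based `slice` in xarray is inclusive on both ends, so the clipped grid here keeps lon 2..6 and lat 3..8.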

@roeldegoede roeldegoede requested a review from Leynse November 29, 2024 16:48
@Leynse left a comment (Collaborator)

Nice work Roel!

Looks good and as discussed

* bugfix so that the .nc file is closed automatically when erroring or when leaving the context manager (with block). see pydata/xarray#1629 (comment)

* fixed all occurrences of xr.open_dataset with the safe open&close pattern

* review comments by Roel

* fix linting

---------

Co-authored-by: roeldegoede <[email protected]>
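The "safe open&close pattern" mentioned in the commits above can be sketched as follows: using xr.open_dataset as a context manager releases the file handle even when the body raises, while calling .load() inside the block keeps the data usable afterwards. The file name and variable below are invented for the example; this is a sketch of the pattern, not the actual hydromt_sfincs code.

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Write a small demo file so we have something to open again
path = os.path.join(tempfile.mkdtemp(), "demo.nc")
xr.Dataset({"zb": ("x", np.zeros(4))}).to_netcdf(path)

# Safe open & close: the with-block closes the file handle on exit,
# including when an error occurs; .load() pulls the data into memory
# first so the Dataset remains usable after the file is closed.
with xr.open_dataset(path) as ds:
    ds = ds.load()

# The handle is released here; the data lives in memory
print(int(ds["zb"].size))
```

Compare pydata/xarray#1629 for the background on why leaving the handle open is a problem on errors.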
@@ -3126,7 +3196,7 @@ def read_forcing(self, data_vars: List = None):
         elif name in ["netbndbzsbzi", "netsrcdis"]:
             ds = GeoDataset.from_netcdf(fn, crs=self.crs, chunks="auto")
         else:
-            ds = xr.open_dataset(fn, chunks="auto")
+            ds = xr.load_dataset(fn, chunks="auto")
roeldegoede (Collaborator, Author)

@LuukBlom shouldn't this also have been one of those indented statements?

Collaborator

No, here the whole ds is loaded in one go, so load_dataset is fine here, I think.

The xarray docs say this:
Open, load into memory, and close a Dataset from a file or file-like object.

This is a thin wrapper around open_dataset(). It differs from open_dataset in that it loads the Dataset into memory, closes the file, and returns the Dataset. In contrast, open_dataset keeps the file handle open and lazy loads its contents. All parameters are passed directly to open_dataset. See that documentation for further details.

https://docs.xarray.dev/en/stable/generated/xarray.load_dataset.html
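As the quoted docs say, xr.load_dataset opens, loads into memory, and closes in one call, so no file handle stays open afterwards. A small sketch of that behavior; the file name and variable are made up for the example:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Write a small demo file to load back
fn = os.path.join(tempfile.mkdtemp(), "forcing.nc")
xr.Dataset({"dis": ("time", np.array([1.0, 2.0, 3.0]))}).to_netcdf(fn)

# load_dataset reads the whole Dataset into memory and closes the
# file immediately; open_dataset would keep the handle open and
# load lazily instead.
ds = xr.load_dataset(fn)

# Because no handle remains open, the file can be removed right away
os.remove(fn)
print(float(ds["dis"].sum()))
```

This is why the non-lazy branch above is fine with load_dataset: the whole ds is wanted in memory anyway, and the handle is released without an explicit close.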

* added xu_open_dataset wrapper.

* load_dataset -> open_dataset

* linting

---------

Co-authored-by: roeldegoede <[email protected]>
@roeldegoede roeldegoede merged commit 8de6cf2 into main Jan 22, 2025
5 checks passed
@roeldegoede roeldegoede mentioned this pull request Jan 22, 2025
@savente93

🎉 ❤️

LuukBlom added a commit to Deltares-research/FloodAdapt that referenced this pull request Apr 30, 2025
Making this PR since I accidentally merged the workflow PR when we still
have to wait on the hydromt_sfincs release.

- [x] [this PR in
hydromt_sfincs](Deltares/hydromt_sfincs#226) has
to be merged and released to pypi.
- [x] create and upload release of flood-adapt to pypi
@roeldegoede roeldegoede deleted the quadtree_io branch July 31, 2025 09:50
5 participants