Conversation

@kmuehlbauer (Contributor) commented Aug 16, 2025

ping @xylar

@xylar left a comment


@kmuehlbauer, yes, this solves the issue reported in #10647. Thanks very much!

When I run my reproducer from that issue, I now see:

$ ./reproducer.py 
/home/xylar/Desktop/reproducer/tmp/./reproducer.py:25: UserWarning: Unlimited dimension(s) {'time'} declared in 'dataset.encoding', but not part of current dataset dimensions. Consider removing {'time'} from 'dataset.encoding'.
  ds0.to_netcdf('dataset_time0.nc')
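The fix turns this misconfiguration from a hard error into a `UserWarning`: when `dataset.encoding['unlimited_dims']` names dimensions that no longer exist in the dataset, the write proceeds and the stale entries are simply not passed on. A minimal stdlib-only sketch of that check follows; the function name and signature are hypothetical illustrations, not xarray's internal API:

```python
import warnings


def check_unlimited_dims(encoding_unlimited, dataset_dims):
    """Hypothetical sketch of the fix: warn (instead of raising) when
    'unlimited_dims' in dataset.encoding references dimensions that are
    not part of the current dataset."""
    missing = set(encoding_unlimited) - set(dataset_dims)
    if missing:
        warnings.warn(
            f"Unlimited dimension(s) {missing} declared in "
            f"'dataset.encoding', but not part of current dataset "
            f"dimensions. Consider removing {missing} from "
            f"'dataset.encoding'.",
            UserWarning,
        )
    # Only dimensions that still exist are forwarded to the backend.
    return set(encoding_unlimited) & set(dataset_dims)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # Mirrors the reproducer: 'time' was dropped (e.g. by selecting a
    # single time step), but the encoding still declares it unlimited.
    kept = check_unlimited_dims({"time"}, {"x", "y"})

print(kept)          # set()
print(len(caught))   # 1
```

This matches the behavior xylar reports above: the write succeeds and a single `UserWarning` points at the stale `'time'` entry in `dataset.encoding`.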

@kmuehlbauer (Contributor, Author)
Thanks, @xylar, for confirming; much appreciated. I'll try to finish this up later today.

@kmuehlbauer kmuehlbauer enabled auto-merge (squash) August 18, 2025 05:55
@kmuehlbauer kmuehlbauer merged commit e3359bc into pydata:main Aug 18, 2025
35 of 36 checks passed
@kmuehlbauer kmuehlbauer deleted the fix-unlimited_dims branch August 18, 2025 06:27
dcherian added a commit to dhruvak001/xarray that referenced this pull request Aug 24, 2025
* main: (46 commits)
  use the new syntax of ignoring bots (pydata#10668)
  modification methods on `Coordinates` (pydata#10318)
  Silence warnings from test_tutorial.py (pydata#10661)
  test: update write_empty test for zarr 3.1.2 (pydata#10665)
  Bump actions/checkout from 4 to 5 in the actions group (pydata#10652)
  Add load_datatree function (pydata#10649)
  Support compute=False from DataTree.to_netcdf (pydata#10625)
  Fix typos (pydata#10655)
  In case of misconfiguration of dataset.encoding `unlimited_dims` warn instead of raise (pydata#10648)
  fix ``auto_complex`` for ``open_datatree`` (pydata#10632)
  Fix bug indexing with boolean scalars (pydata#10635)
  Improve DataTree typing (pydata#10644)
  Update Cartopy and Iris references (pydata#10645)
  Empty release notes (pydata#10642)
  release notes for v2025.08.0 (pydata#10641)
  Fix `ds.merge` to prevent altering original object depending on join value (pydata#10596)
  Add asynchronous load method (pydata#10327)
  Add DataTree.prune() method … (pydata#10598)
  Avoid refining parent dimensions in NetCDF files (pydata#10623)
  clarify lazy behaviour and eager loading chunks=None in open_*-functions (pydata#10627)
  ...
Successfully merging this pull request may close these issues.

ds.encoding['unlimited_dims'] not getting updated properly
3 participants