Fix format in volume annotation download zarr.json #8800
Conversation
📝 Walkthrough

This change updates the handling of Zarr3 array metadata and chunk paths for volume annotation downloads. It ensures compression codec information is included in the generated zarr.json, adjusts the chunk path naming convention, and updates a route parameter. The Zarr3 array header logic is extended to accept additional codec configurations.
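For illustration, a Zarr v3 `zarr.json` declares compression in its `codecs` array; a blosc-compressed array could look roughly like the sketch below (the concrete blosc parameters are assumptions, not necessarily the values WEBKNOSSOS writes):

```json
{
  "zarr_format": 3,
  "node_type": "array",
  "codecs": [
    { "name": "bytes", "configuration": { "endian": "little" } },
    {
      "name": "blosc",
      "configuration": { "cname": "lz4", "clevel": 5, "shuffle": "shuffle", "typesize": 1 }
    }
  ]
}
```

Without a blosc entry in `codecs`, a reader treats the chunk files as raw uncompressed bytes and fails to decode the downloaded volume data.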
Thanks for fixing all this.
The changes lgtm, but when I tested the download and used the zipped data as a dataset by putting it into the binaryData folder, the voxelSizeFactor was still at [1,1,1]. I also peeked at the datasource-properties.json, and it said the same. Therefore, sorry, but this part does not seem to be fully fixed yet. The scale in the NML was correct, though.
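For context, the voxel size ends up in the downloaded `datasource-properties.json`; a minimal sketch of the relevant section (the exact field names are an assumption based on the `voxelSizeFactor`/`voxelSizeUnit` parameters, and the numbers are placeholders):

```json
{
  "scale": {
    "factor": [11.24, 11.24, 28.0],
    "unit": "nanometer"
  }
}
```

In the broken case the factor stays at `[1, 1, 1]` instead of the dataset's actual voxel size.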
POST /volume/:tracingId/initialDataMultiple @com.scalableminds.webknossos.tracingstore.controllers.VolumeTracingController.initialDataMultiple(annotationId: ObjectId, tracingId: String)
GET /volume/:tracingId @com.scalableminds.webknossos.tracingstore.controllers.VolumeTracingController.get(tracingId: String, annotationId: ObjectId, version: Option[Long])
GET /volume/:tracingId/allDataZip @com.scalableminds.webknossos.tracingstore.controllers.VolumeTracingController.allDataZip(tracingId: String, annotationId: Option[ObjectId], version: Option[Long], volumeDataZipFormat: String, voxelSize: Option[String], voxelSizeUnit: Option[String])
GET /volume/:tracingId/allDataZip @com.scalableminds.webknossos.tracingstore.controllers.VolumeTracingController.allDataZip(tracingId: String, annotationId: Option[ObjectId], version: Option[Long], volumeDataZipFormat: String, voxelSizeFactor: Option[String], voxelSizeUnit: Option[String])
oh no 🙈 Good find!
Ok nvm 🙈
I just noticed that during testing I had accidentally reverted your changes to webknossos-tracingstore/conf/tracingstore.latest.routes locally, which led to the regression. With your version of the file, the downloaded datasource-properties.json now has the correct voxelScaleFactor 👍
Fixes a regression introduced in #7995 – the volume buckets are blosc compressed, but that was no longer mentioned in the zarr header.

Also, the chunks had the `c.` name prefix, but `chunk_key_encoding` was already `v2` in the zarr header. Now I actually remove the `c.` name prefix so that it is actually `v2` compliant.

Also, fixes passing the voxel size to the volume zip download so that it is correct in the included datasource-properties.json.
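To illustrate the chunk naming part: Zarr v3 distinguishes a `default` and a `v2` chunk key encoding. A minimal sketch of the relevant `zarr.json` section (the separator value is assumed):

```json
{
  "chunk_key_encoding": {
    "name": "v2",
    "configuration": { "separator": "." }
  }
}
```

With the `default` encoding, the chunk at index (0, 0, 0) is stored under a key like `c.0.0.0` (or `c/0/0/0` with the `/` separator), whereas `v2` expects plain `0.0.0`. The downloaded zips declared `v2` but still used the `c.` prefix, which is the mismatch removed here.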
Steps to test:

Issues:

Changelog: add a `$PR_NUMBER.md` file in `unreleased_changes` or use `./tools/create-changelog-entry.py`