unreleased_changes/8832.md (2 additions, 0 deletions)
@@ -0,0 +1,2 @@
### Fixed
- Fixed a bug where hdf5 attachments explicitly mentioned in the datasource-properties.json would not be readable if they have an absolute path but no `file://` prefix.
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DatasetLayerAttachments.scala
@@ -46,7 +46,11 @@ case class LayerAttachment(name: String,
throw new Exception(
"Trying to open non-local hdf5 file. Hdf5 files are only supported on the datastore-local file system.")
}
-      Path.of(path)
+      if (path.getScheme == null) {
+        Path.of(path.toString)
+      } else {
+        Path.of(path)
+      }
Comment on lines +49 to +53
Contributor

💡 Verification agent
🧩 Analysis chain

LGTM! The URI scheme handling fix looks correct.

The change properly addresses the bug by handling URIs with null schemes (absolute paths without file:// prefix) by converting them to strings before creating a Path. This maintains backward compatibility while fixing the reported issue.
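
For illustration, a minimal standalone sketch of the behavior the fix relies on (the object name and example paths below are hypothetical, and the real method additionally rejects non-file remote schemes before this point): java.nio.file.Path.of(URI) throws IllegalArgumentException for a URI without a scheme, so a bare absolute path has to be routed through toString.

import java.net.URI
import java.nio.file.Path

object LocalPathSketch {
  // Mirrors the patched branch: scheme-less URIs are treated as plain paths,
  // because Path.of(URI) requires a scheme such as file://.
  def localPath(path: URI): Path =
    if (path.getScheme == null) Path.of(path.toString) // e.g. "/datasets/sample/foo.h5"
    else Path.of(path)                                 // e.g. "file:///datasets/sample/foo.h5"

  def main(args: Array[String]): Unit = {
    println(localPath(new URI("/datasets/sample/foo.h5")))        // -> /datasets/sample/foo.h5
    println(localPath(new URI("file:///datasets/sample/foo.h5"))) // -> /datasets/sample/foo.h5
  }
}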

Consider testing edge cases to ensure robustness:


🏁 Script executed:

#!/bin/bash
# Verify that the Path.of behavior works correctly for various URI formats
# Look for existing tests that might cover these scenarios

rg -A 10 -B 5 "localPath|Path\.of" --type scala

Length of output: 74815


Approve URI scheme handling in localPath; add unit tests for edge cases

The conditional correctly covers both “no scheme” URIs (absolute paths without file://) and preserves original behavior for file:// URIs, while rejecting remote schemes. To ensure full coverage, please add unit tests for the following scenarios in a new DatasetLayerAttachmentsSpec (or similar):

  • URI with null scheme (e.g. new URI("/absolute/path/data.h5")) → localPath returns Path.of(path.toString)
  • URI with explicit file scheme (e.g. new URI("file:///absolute/path/data.h5")) → localPath returns Path.of(uri)
  • URI with non-file scheme (e.g. new URI("http://example.com/data.h5")) → localPath throws an exception
  • (Optional) Relative URI literal (e.g. new URI("data/data.h5")) if you expect to support it

Example test stub:

"localPath" should {
  "handle null scheme URIs as absolute paths" in {
    val uri = new URI("/tmp/foo.h5")
    val attachment = LayerAttachment("foo", uri, LayerAttachmentDataformat.HDF5)
    attachment.localPath shouldBe Path.of(uri.toString)
  }
  "handle file:// URIs correctly" in {
    val uri = new URI("file:///tmp/foo.h5")
    val attachment = LayerAttachment("foo", uri, LayerAttachmentDataformat.HDF5)
    attachment.localPath shouldBe Path.of(uri)
  }
  "reject non-file schemes" in {
    val uri = new URI("http://example.com/foo.h5")
    val attachment = LayerAttachment("foo", uri, LayerAttachmentDataformat.HDF5)
    an [Exception] should be thrownBy attachment.localPath
  }
}
🤖 Prompt for AI Agents
In webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DatasetLayerAttachments.scala around lines 49 to 53, the URI scheme handling logic is correct but lacks unit test coverage. Add a new test suite named DatasetLayerAttachmentsSpec (or similar) with tests covering: URIs with null scheme returning Path.of(path.toString), URIs with explicit file scheme returning Path.of(uri), and URIs with non-file schemes throwing an exception. Optionally, include tests for relative URI literals if supported. Implement these tests to ensure all edge cases are properly validated.

}
}

@@ -14,7 +14,7 @@ import com.scalableminds.webknossos.datastore.models.requests.DataServiceDataReq
import com.scalableminds.webknossos.datastore.services.DataConverter
import com.scalableminds.webknossos.datastore.storage._

-import java.nio.file.{Files, Path}
+import java.nio.file.Files
import java.nio.{ByteBuffer, ByteOrder, LongBuffer}
import javax.inject.Inject
import scala.annotation.tailrec
@@ -263,7 +263,7 @@ class Hdf5AgglomerateService @Inject()(config: DataStoreConfig) extends DataConv
// Otherwise, we read configurable sized blocks from the agglomerate file and save them in a LRU cache.
private def openAsCachedAgglomerateFile(agglomerateFileKey: AgglomerateFileKey) = {
val cumsumPath =
-      Path.of(agglomerateFileKey.attachment.path).getParent.resolve(cumsumFileName)
+      agglomerateFileKey.attachment.localPath.getParent.resolve(cumsumFileName)

val reader = openHdf5(agglomerateFileKey)
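
As a side note, a minimal sketch (for a Scala REPL; the helper, the example path, and the cumsum.json file name are assumptions — the real name comes from cumsumFileName) of why resolving the cumsum sibling file through localPath matters here: calling Path.of directly on a scheme-less attachment URI, as the old line did, would throw.

import java.net.URI
import java.nio.file.Path

// Assumed stand-in for LayerAttachment.localPath from DatasetLayerAttachments.scala.
def localPath(path: URI): Path =
  if (path.getScheme == null) Path.of(path.toString) else Path.of(path)

val attachmentUri = new URI("/datasets/sample/agglomerates/agglomerate_view.hdf5") // no file:// prefix
// Old behavior: Path.of(attachmentUri) throws IllegalArgumentException ("Missing scheme").
val cumsumPath = localPath(attachmentUri).getParent.resolve("cumsum.json")
// -> /datasets/sample/agglomerates/cumsum.json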
