Apply and Resample #5420

Status: Closed (wants to merge 57 commits)

Commits
2c682fe  Apply and MetaMatrix; partial functionality (atbenmurray, Oct 27, 2022)
dbbe26c  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
9598480  making import for MetaTensor more specific to avoid circular reference (atbenmurray, Oct 27, 2022)
bf99722  Merge branch 'lr_apply' of github.com:project-monai/monai into lr_apply (atbenmurray, Oct 27, 2022)
0ed7c2d  Making Affine import more specific in apply to avoid circular reference (atbenmurray, Oct 27, 2022)
56717f6  Fixing typing signature issue on apply method (atbenmurray, Oct 27, 2022)
f15f46d  Adding missing license boilerplate (atbenmurray, Oct 27, 2022)
6904106  Minimal docstrings for apply / Apply (atbenmurray, Oct 27, 2022)
54c6f2a  Splitting apply.py into lazy/functional and lazy/array (atbenmurray, Oct 27, 2022)
472c3cd  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
5f52959  Removing spurious test file tempscript (atbenmurray, Oct 27, 2022)
59bc288  Merge branch 'lr_apply' of github.com:project-monai/monai into lr_apply (atbenmurray, Oct 27, 2022)
f0b010a  Auto formatting fixes (atbenmurray, Oct 27, 2022)
69568ff  Fixing issues raised by linter (atbenmurray, Oct 27, 2022)
9bf2ed8  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
6e77d57  5422 update attentionunet parameters (#5423) (wyli, Oct 28, 2022)
95e37c4  Starting tests for apply (atbenmurray, Oct 28, 2022)
0d1d250  Resolving conflicts (atbenmurray, Oct 28, 2022)
01b8e12  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 28, 2022)
5d0db75  Further array functionality and testing; waiting on PR #5107 (atbenmurray, Oct 28, 2022)
3fc2c8a  Fixing merge conflicts (atbenmurray, Oct 28, 2022)
aca9a8b  Adding resample function for unified resampling and test placeholder (atbenmurray, Oct 28, 2022)
5788a23  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 28, 2022)
3f82016  5425 conda tests (#5426) (wyli, Oct 28, 2022)
8b1f0c3  Type classes for lazy resampling (#5418) (atbenmurray, Oct 28, 2022)
350fe6e  5432 convert metadict types (#5433) (wyli, Oct 29, 2022)
f201883  4922 adding a minimal lazy transform interface (#5407) (wyli, Oct 29, 2022)
5b9345b  Apply and MetaMatrix; partial functionality (atbenmurray, Oct 27, 2022)
e201853  making import for MetaTensor more specific to avoid circular reference (atbenmurray, Oct 27, 2022)
4f5c44e  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
4370e5a  Making Affine import more specific in apply to avoid circular reference (atbenmurray, Oct 27, 2022)
d959925  Fixing typing signature issue on apply method (atbenmurray, Oct 27, 2022)
5f254c7  Adding missing license boilerplate (atbenmurray, Oct 27, 2022)
cd28575  Minimal docstrings for apply / Apply (atbenmurray, Oct 27, 2022)
5653a64  Splitting apply.py into lazy/functional and lazy/array (atbenmurray, Oct 27, 2022)
70badbf  Removing spurious test file tempscript (atbenmurray, Oct 27, 2022)
227b58b  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
47f00ca  Auto formatting fixes (atbenmurray, Oct 27, 2022)
3043ae7  Fixing issues raised by linter (atbenmurray, Oct 27, 2022)
4aae9da  Starting tests for apply (atbenmurray, Oct 28, 2022)
b6f1bd9  Further array functionality and testing; waiting on PR #5107 (atbenmurray, Oct 28, 2022)
f6b889f  Adding resample function for unified resampling and test placeholder (atbenmurray, Oct 28, 2022)
98143d0  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 28, 2022)
eae5825  Apply and MetaMatrix; partial functionality (atbenmurray, Oct 27, 2022)
6ba693b  making import for MetaTensor more specific to avoid circular reference (atbenmurray, Oct 27, 2022)
209eb30  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
e7cb648  Making Affine import more specific in apply to avoid circular reference (atbenmurray, Oct 27, 2022)
bfd9458  Fixing typing signature issue on apply method (atbenmurray, Oct 27, 2022)
45551d1  Adding missing license boilerplate (atbenmurray, Oct 27, 2022)
3dd6164  Minimal docstrings for apply / Apply (atbenmurray, Oct 27, 2022)
29123aa  Splitting apply.py into lazy/functional and lazy/array (atbenmurray, Oct 27, 2022)
b827cff  Removing spurious test file tempscript (atbenmurray, Oct 27, 2022)
1cfe0a0  Auto formatting fixes (atbenmurray, Oct 27, 2022)
adbd5fa  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot], Oct 27, 2022)
e62a316  Further array functionality and testing; waiting on PR #5107 (atbenmurray, Oct 28, 2022)
d10d74f  Adding resample function for unified resampling and test placeholder (atbenmurray, Oct 28, 2022)
f9a000c  Resolving conflicts (atbenmurray, Oct 30, 2022)
Files changed
4 changes: 4 additions & 0 deletions .github/workflows/pythonapp.yml
@@ -88,6 +88,10 @@ jobs:
        name: Install torch cpu from pytorch.org (Windows only)
        run: |
          python -m pip install torch==1.12.1+cpu torchvision==0.13.1+cpu -f https://download.pytorch.org/whl/torch_stable.html
+     - if: runner.os == 'Linux'
+       name: Install itk pre-release (Linux only)
+       run: |
+         python -m pip install --pre -U itk
      - name: Install the dependencies
        run: |
          python -m pip install torch==1.12.1 torchvision==0.13.1
20 changes: 20 additions & 0 deletions docs/source/transforms.rst
@@ -22,11 +22,31 @@ Generic Interfaces
    :members:
    :special-members: __call__

+`RandomizableTrait`
+^^^^^^^^^^^^^^^^^^^
+.. autoclass:: RandomizableTrait
+   :members:
+
+`LazyTrait`
+^^^^^^^^^^^
+.. autoclass:: LazyTrait
+   :members:
+
+`MultiSampleTrait`
+^^^^^^^^^^^^^^^^^^
+.. autoclass:: MultiSampleTrait
+   :members:
+
`Randomizable`
^^^^^^^^^^^^^^
.. autoclass:: Randomizable
    :members:

+`LazyTransform`
+^^^^^^^^^^^^^^^
+.. autoclass:: LazyTransform
+   :members:
+
`RandomizableTransform`
^^^^^^^^^^^^^^^^^^^^^^^
.. autoclass:: RandomizableTransform
7 changes: 6 additions & 1 deletion monai/data/image_reader.py
@@ -318,7 +318,12 @@ def _get_meta_dict(self, img) -> Dict:

"""
img_meta_dict = img.GetMetaDataDictionary()
meta_dict = {key: img_meta_dict[key] for key in img_meta_dict.GetKeys() if not key.startswith("ITK_")}
meta_dict = {}
for key in img_meta_dict.GetKeys():
if key.startswith("ITK_"):
continue
val = img_meta_dict[key]
meta_dict[key] = np.asarray(val) if type(val).__name__.startswith("itk") else val

meta_dict["spacing"] = np.asarray(img.GetSpacing())
return meta_dict
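The new loop differs from the old dict comprehension in one way: itk-native metadata values (type names starting with "itk") are converted to numpy arrays before being stored. A minimal sketch of that conversion rule in isolation, using a hypothetical stand-in class for an itk value type:

```python
import numpy as np

class itkMatrixF44:  # hypothetical stand-in for an itk-native metadata value
    pass

def coerce_meta_value(val):
    # itk-native objects are converted so the meta dict holds plain numpy/Python
    # values; ordinary entries (str, float, ...) pass through unchanged
    return np.asarray(val) if type(val).__name__.startswith("itk") else val

print(coerce_meta_value(1.5))                   # 1.5, unchanged
print(type(coerce_meta_value(itkMatrixF44())))  # <class 'numpy.ndarray'>
```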
14 changes: 14 additions & 0 deletions monai/data/meta_obj.py
@@ -82,6 +82,7 @@ class MetaObj:
    def __init__(self):
        self._meta: dict = MetaObj.get_default_meta()
        self._applied_operations: list = MetaObj.get_default_applied_operations()
+       self._pending_operations: list = MetaObj.get_default_applied_operations()  # the same default as applied_ops
        self._is_batch: bool = False

    @staticmethod
@@ -199,6 +200,19 @@ def push_applied_operation(self, t: Any) -> None:
    def pop_applied_operation(self) -> Any:
        return self._applied_operations.pop()

+   @property
+   def pending_operations(self) -> list[dict]:
+       """Get the pending operations. Defaults to ``[]``."""
+       if hasattr(self, "_pending_operations"):
+           return self._pending_operations
+       return MetaObj.get_default_applied_operations()  # the same default as applied_ops
+
+   def push_pending_operation(self, t: Any) -> None:
+       self._pending_operations.append(t)
+
+   def pop_pending_operation(self) -> Any:
+       return self._pending_operations.pop()
+
    @property
    def is_batch(self) -> bool:
        """Return whether object is part of batch or not."""
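A minimal usage sketch of the pending-operation bookkeeping added above, relying only on the push/pop/property API shown in this diff (MetaTensor inherits it from MetaObj):

```python
import torch
from monai.data import MetaTensor  # MetaTensor derives from MetaObj

t = MetaTensor(torch.zeros(1, 8, 8))
t.push_pending_operation({"note": "a queued spatial op"})  # any dict-like record
assert len(t.pending_operations) == 1
t.pop_pending_operation()
assert t.pending_operations == []
```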
18 changes: 16 additions & 2 deletions monai/data/meta_tensor.py
@@ -23,8 +23,8 @@
from monai.data.meta_obj import MetaObj, get_track_meta
from monai.data.utils import affine_to_spacing, decollate_batch, list_data_collate, remove_extra_metadata
from monai.utils import look_up_option
-from monai.utils.enums import MetaKeys, PostFix, SpaceKeys
-from monai.utils.type_conversion import convert_data_type, convert_to_tensor
+from monai.utils.enums import LazyAttr, MetaKeys, PostFix, SpaceKeys
+from monai.utils.type_conversion import convert_data_type, convert_to_numpy, convert_to_tensor

__all__ = ["MetaTensor"]

@@ -445,6 +445,20 @@ def pixdim(self):
            return [affine_to_spacing(a) for a in self.affine]
        return affine_to_spacing(self.affine)

+   def peek_pending_shape(self):
+       """Get the currently expected spatial shape as if all the pending operations are executed."""
+       res = None
+       if self.pending_operations:
+           res = self.pending_operations[-1].get(LazyAttr.SHAPE, None)
+       # default to spatial shape (assuming channel-first input)
+       return tuple(convert_to_numpy(self.shape, wrap_sequence=True).tolist()[1:]) if res is None else res
+
+   def peek_pending_affine(self):
+       res = None
+       if self.pending_operations:
+           res = self.pending_operations[-1].get(LazyAttr.AFFINE, None)
+       return self.affine if res is None else res
+
    def new_empty(self, size, dtype=None, device=None, requires_grad=False):
        """
        must be defined for deepcopy to work
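A short sketch of the two peek methods, based only on what the diff shows (pending-operation dicts keyed by LazyAttr.SHAPE / LazyAttr.AFFINE):

```python
import torch
from monai.data import MetaTensor
from monai.utils.enums import LazyAttr

img = MetaTensor(torch.zeros(1, 32, 32))  # channel-first: 1 channel, 32x32 spatial
print(img.peek_pending_shape())           # (32, 32): falls back to the spatial shape

# queue a pending operation that would resize the image to 64x64
img.push_pending_operation({LazyAttr.SHAPE: (64, 64)})
print(img.peek_pending_shape())           # (64, 64): read from the last pending op
print(img.peek_pending_affine())          # falls back to img.affine; no AFFINE key queued
```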
35 changes: 27 additions & 8 deletions monai/networks/nets/attentionunet.py
@@ -143,12 +143,27 @@ def forward(self, g: torch.Tensor, x: torch.Tensor) -> torch.Tensor:


class AttentionLayer(nn.Module):
-   def __init__(self, spatial_dims: int, in_channels: int, out_channels: int, submodule: nn.Module, dropout=0.0):
+   def __init__(
+       self,
+       spatial_dims: int,
+       in_channels: int,
+       out_channels: int,
+       submodule: nn.Module,
+       up_kernel_size=3,
+       strides=2,
+       dropout=0.0,
+   ):
        super().__init__()
        self.attention = AttentionBlock(
            spatial_dims=spatial_dims, f_g=in_channels, f_l=in_channels, f_int=in_channels // 2
        )
-       self.upconv = UpConv(spatial_dims=spatial_dims, in_channels=out_channels, out_channels=in_channels, strides=2)
+       self.upconv = UpConv(
+           spatial_dims=spatial_dims,
+           in_channels=out_channels,
+           out_channels=in_channels,
+           strides=strides,
+           kernel_size=up_kernel_size,
+       )
        self.merge = Convolution(
            spatial_dims=spatial_dims, in_channels=2 * in_channels, out_channels=in_channels, dropout=dropout
        )
@@ -174,7 +189,7 @@ class AttentionUnet(nn.Module):
        channels (Sequence[int]): sequence of channels. Top block first. The length of `channels` should be no less than 2.
        strides (Sequence[int]): stride to use for convolutions.
        kernel_size: convolution kernel size.
-       upsample_kernel_size: convolution kernel size for transposed convolution layers.
+       up_kernel_size: convolution kernel size for transposed convolution layers.
        dropout: dropout ratio. Defaults to no dropout.
    """

@@ -210,9 +225,9 @@ def __init__(
        )
        self.up_kernel_size = up_kernel_size

-       def _create_block(channels: Sequence[int], strides: Sequence[int], level: int = 0) -> nn.Module:
+       def _create_block(channels: Sequence[int], strides: Sequence[int]) -> nn.Module:
            if len(channels) > 2:
-               subblock = _create_block(channels[1:], strides[1:], level=level + 1)
+               subblock = _create_block(channels[1:], strides[1:])
                return AttentionLayer(
                    spatial_dims=spatial_dims,
                    in_channels=channels[0],
@@ -227,17 +242,19 @@ def _create_block(channels: Sequence[int], strides: Sequence[int], level: int =
                        ),
                        subblock,
                    ),
+                   up_kernel_size=self.up_kernel_size,
+                   strides=strides[0],
                    dropout=dropout,
                )
            else:
                # the next layer is the bottom so stop recursion,
-               # create the bottom layer as the sublock for this layer
-               return self._get_bottom_layer(channels[0], channels[1], strides[0], level=level + 1)
+               # create the bottom layer as the subblock for this layer
+               return self._get_bottom_layer(channels[0], channels[1], strides[0])

        encdec = _create_block(self.channels, self.strides)
        self.model = nn.Sequential(head, encdec, reduce_channels)

-   def _get_bottom_layer(self, in_channels: int, out_channels: int, strides: int, level: int) -> nn.Module:
+   def _get_bottom_layer(self, in_channels: int, out_channels: int, strides: int) -> nn.Module:
        return AttentionLayer(
            spatial_dims=self.dimensions,
            in_channels=in_channels,
@@ -249,6 +266,8 @@ def _get_bottom_layer(self, in_channels: int, out_channels: int, strides: int, l
                strides=strides,
                dropout=self.dropout,
            ),
+           up_kernel_size=self.up_kernel_size,
+           strides=strides,
            dropout=self.dropout,
        )

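With up_kernel_size and strides now threaded through every AttentionLayer rather than being fixed internally, the network can be built and run end to end; a quick smoke test (the shapes are illustrative):

```python
import torch
from monai.networks.nets import AttentionUnet

model = AttentionUnet(
    spatial_dims=2,
    in_channels=1,
    out_channels=2,
    channels=(16, 32, 64),
    strides=(2, 2),
    up_kernel_size=3,  # now reaches the UpConv blocks instead of being dropped
)
y = model(torch.randn(1, 1, 64, 64))
print(y.shape)  # torch.Size([1, 2, 64, 64])
```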
24 changes: 23 additions & 1 deletion monai/transforms/__init__.py
@@ -227,6 +227,8 @@
from .inverse_batch_transform import BatchInverseTransform, Decollated, DecollateD, DecollateDict
from .io.array import SUPPORTED_READERS, LoadImage, SaveImage
from .io.dictionary import LoadImaged, LoadImageD, LoadImageDict, SaveImaged, SaveImageD, SaveImageDict
+from .lazy.array import Apply
+from .lazy.functional import apply
from .meta_utility.dictionary import (
    FromMetaTensord,
    FromMetaTensorD,
@@ -235,6 +237,13 @@
    ToMetaTensorD,
    ToMetaTensorDict,
)
+from .meta_matrix import (
+   Grid,
+   matmul,
+   Matrix,
+   MatrixFactory,
+   MetaMatrix,
+)
from .nvtx import (
    Mark,
    Markd,
@@ -449,7 +458,18 @@
    ZoomD,
    ZoomDict,
)
-from .transform import MapTransform, Randomizable, RandomizableTransform, ThreadUnsafe, Transform, apply_transform
+from .transform import (
+   LazyTrait,
+   LazyTransform,
+   MapTransform,
+   MultiSampleTrait,
+   Randomizable,
+   RandomizableTrait,
+   RandomizableTransform,
+   ThreadUnsafe,
+   Transform,
+   apply_transform,
+)
from .utility.array import (
AddChannel,
AddCoordinateChannels,
@@ -621,6 +641,8 @@
    generate_label_classes_crop_centers,
    generate_pos_neg_label_crop_centers,
    generate_spatial_bounding_box,
+   get_backend_from_tensor_like,
+   get_device_from_tensor_like,
    get_extreme_points,
    get_largest_connected_component_mask,
    get_number_image_type_conversions,
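After these re-exports, the new lazy-resampling names resolve from the package root; for example (assuming the modules land as shown in this PR):

```python
from monai.transforms import Apply, LazyTrait, LazyTransform, MatrixFactory, MetaMatrix, apply
```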
10 changes: 10 additions & 0 deletions monai/transforms/lazy/__init__.py
@@ -0,0 +1,10 @@
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
48 changes: 48 additions & 0 deletions monai/transforms/lazy/array.py
@@ -0,0 +1,48 @@
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from monai.transforms.lazy.functional import apply
from monai.transforms.inverse import InvertibleTransform

__all__ = ["Apply"]


class Apply(InvertibleTransform):
    """
    Apply wraps the apply method and can function as a Transform in either array or dictionary
    mode.
    """

    def __init__(self):
        super().__init__()

    def __call__(self, *args, **kwargs):
        return apply(*args, **kwargs)

    def inverse(self, data):
        raise NotImplementedError()


# class Applyd(MapTransform, InvertibleTransform):
#
#     def __init__(self):
#         super().__init__()
#
#     def __call__(
#         self,
#         d: dict
#     ):
#         rd = dict()
#         for k, v in d.items():
#             rd[k] = apply(v)
#         return rd
#
#     def inverse(self, data):
#         raise NotImplementedError()
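Since Apply.__call__ simply forwards to apply, the class can stand in for the functional form inside a transform pipeline. A hypothetical sketch, assuming apply takes the image as its first argument (its signature lives in lazy/functional.py, which is not part of this diff):

```python
from monai.transforms import Apply

apply_pending = Apply()
# img would be a MetaTensor whose pending_operations were queued by lazy transforms:
# result = apply_pending(img)  # equivalent to calling apply(img) directly
```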