This repository was archived by the owner on Nov 15, 2022. It is now read-only.

Move to __torch_function__ #11

Merged
merged 22 commits on Dec 11, 2019
40 changes: 6 additions & 34 deletions .circleci/config.yml
@@ -6,52 +6,24 @@ version: 2
jobs:
build:
docker:
# specify the version you desire here
# use `-browsers` prefix for selenium tests, e.g. `3.6.1-browsers`
- image: circleci/python:3.6.1

# Specify service dependencies here if necessary
# CircleCI maintains a library of pre-built images
# documented at https://circleci.com/docs/2.0/circleci-images/
# - image: circleci/postgres:9.4
- image: continuumio/miniconda3

working_directory: ~/repo

steps:
- checkout

# Download and cache dependencies
- restore_cache:
keys:
- v1-dependencies-{{ checksum "requirements.txt" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-

- run:
name: install dependencies
command: |
python3 -m venv venv
. venv/bin/activate
pip3 install -q -r requirements.txt
pip3 install -q torch

- save_cache:
paths:
- ./venv
key: v1-dependencies-{{ checksum "requirements.txt" }}

- run:
name: run tests
command: |
. venv/bin/activate
pip3 install .
conda install pytorch cpuonly -c pytorch-nightly -q -y
apt-get update
apt-get install -y gcc
apt-get install -y g++
pip install .
python test/test_nested_tensor_autograd.py
python test/test_nested_tensor_class.py
python test/test_nested_tensor_functional.py
python test/test_nested_tensor_masking.py
python test/test_nested_tensor_nary.py
python test/test_nested_tensor_tensorwise.py

- store_artifacts:
path: test-reports
destination: test-reports
15 changes: 9 additions & 6 deletions README.md
@@ -1,6 +1,7 @@
# The nestedtensor package

NOTE: PLEASE NOTE, NESTEDTENSOR IS UNDER ACTIVE DEVELOPMENT AND EVERYTHING HERE IS SUBJECT TO CHANGE.
NOTE: nestedtensor is under active development and various aspects may change.
NOTE: We test and develop against nightlies! Please use the most recent version of PyTorch if you plan to use this code.

## Motivation

@@ -44,7 +45,7 @@ Import nested tensors and torch via ```from nestedtensor import torch```
### Creation

```
nt = torch.nested_tensor(
nt = nestedtensor.nested_tensor(
[
[
torch.rand(2, 3),
@@ -62,13 +63,15 @@ b = torch.tensor([[2, 2],
[3, 3],
[4, 4],
[5, 5]])
nt2 = torch.nested_tensor([[a],[b]])
nt2 = nestedtensor.nested_tensor([[a],[b]])
```

The level of nesting is inferred from the input. The constructor always copies: whatever you pass in will share no data with what the constructor returns. This matches the behavior of torch.tensor.

If given a NestedTensor or a Tensor, it will return a detached copy, which is again consistent with torch.tensor. Note that you cannot mix Tensors and NestedTensors within a given list.
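
To make the copy semantics concrete, here is a minimal sketch (it assumes `nestedtensor` is installed and imported as in the examples below; the particular values are made up for illustration):

```
import torch
import nestedtensor

a = torch.tensor([1, 2])
nt = nestedtensor.nested_tensor([a])  # the constructor copies its input

a += 1       # mutating the original tensor...
print(nt)    # ...does not change nt, since no data is shared
```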

A side note on naming: nestedtensor is a Python package and as such [should be lower case without underscores](https://www.python.org/dev/peps/pep-0008/#package-and-module-names), while nested_tensor is a Python function and as such [should use underscores](https://www.python.org/dev/peps/pep-0008/#function-and-variable-names), in contrast to the [CapWorded NestedTensor class](https://www.python.org/dev/peps/pep-0008/#class-names).

### Conversion/unbind()
A user can retrieve the constituent Tensors via unbind. torch currently uses unbind to turn a Tensor into a tuple of Tensors, and unbind always returns a tuple of views.
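
As a small sketch of that call (the shapes and variable names here are arbitrary, chosen only for illustration):

```
import torch
import nestedtensor

nt = nestedtensor.nested_tensor([torch.rand(2, 3), torch.rand(4, 5)])
tensors = nt.unbind()    # tuple of the constituent Tensors (views, per the note above)
print(len(tensors))      # 2
print(tensors[0].shape)  # torch.Size([2, 3])
```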

@@ -80,7 +83,7 @@ A user can retrieve the constituent Tensors via unbind. Unbind is currently used
... [torch.rand(3, 2)]
... ]
>>>
>>> b = torch.nested_tensor(a)
>>> b = nestedtensor.nested_tensor(a)
>>> print(b)
nested_tensor([
[
@@ -136,8 +139,8 @@ tensor([ 9, 11])
>>> print(simple_fn(c, d))
tensor([10, 10])
>>>
>>> n = torch.nested_tensor([a, c])
>>> m = torch.nested_tensor([b, d])
>>> n = nestedtensor.nested_tensor([a, c])
>>> m = nestedtensor.nested_tensor([b, d])
>>> print(simple_fn(n, m))
nested_tensor([
tensor([ 9, 11]),
73 changes: 37 additions & 36 deletions examples/basic.ipynb
@@ -15,7 +15,8 @@
"metadata": {},
"outputs": [],
"source": [
"from nestedtensor import torch\n",
"import torch\n",
"import nestedtensor\n",
"from IPython.display import Markdown, display\n",
"\n",
"def print_eval(s):\n",
@@ -54,23 +55,23 @@
"text": [
"nested_tensor([\n",
"\t[\n",
"\t\ttensor([[0.8264, 0.2200, 0.4197],\n",
"\t\t [0.6789, 0.7460, 0.1694]]),\n",
"\t\ttensor([[0.7467, 0.8433, 0.6429, 0.9890, 0.0170],\n",
"\t\t [0.6297, 0.3899, 0.7025, 0.0812, 0.9585],\n",
"\t\t [0.1113, 0.4260, 0.4245, 0.7971, 0.7910],\n",
"\t\t [0.7077, 0.6765, 0.0228, 0.5461, 0.4095]])\n",
"\t\ttensor([[0.1525, 0.9457, 0.8438],\n",
"\t\t [0.6784, 0.9376, 0.5344]]),\n",
"\t\ttensor([[0.5654, 0.6054, 0.2726, 0.8868, 0.3417],\n",
"\t\t [0.1225, 0.4104, 0.9022, 0.6978, 0.2081],\n",
"\t\t [0.5641, 0.2983, 0.7589, 0.5495, 0.1304],\n",
"\t\t [0.1999, 0.3803, 0.0336, 0.4855, 0.9838]])\n",
"\t],\n",
"\t[\n",
"\t\ttensor([[0.0660, 0.9756]])\n",
"\t\ttensor([[0.8105, 0.6778]])\n",
"\t]\n",
"])\n",
"\n"
]
}
],
"source": [
"nt = torch.nested_tensor(\n",
"nt = nestedtensor.nested_tensor(\n",
" [\n",
" [\n",
" torch.rand(2, 3),\n",
@@ -227,7 +228,7 @@
" [3, 3],\n",
" [4, 4],\n",
" [5, 5]])\n",
"nt2 = torch.nested_tensor([[a],[b]])\n",
"nt2 = nestedtensor.nested_tensor([[a],[b]])\n",
"print_eval(\"nt2.nested_dim()\")\n",
"print_eval(\"nt2.tensor_dim()\")\n",
"print_eval(\"nt2.dim()\")"
@@ -281,15 +282,15 @@
"text": [
"nested_tensor([\n",
"\t[\n",
"\t\ttensor([[0.8264, 0.2200, 0.4197],\n",
"\t\t [0.6789, 0.7460, 0.1694]]),\n",
"\t\ttensor([[0.7467, 0.8433, 0.6429, 0.9890, 0.0170],\n",
"\t\t [0.6297, 0.3899, 0.7025, 0.0812, 0.9585],\n",
"\t\t [0.1113, 0.4260, 0.4245, 0.7971, 0.7910],\n",
"\t\t [0.7077, 0.6765, 0.0228, 0.5461, 0.4095]])\n",
"\t\ttensor([[0.1525, 0.9457, 0.8438],\n",
"\t\t [0.6784, 0.9376, 0.5344]]),\n",
"\t\ttensor([[0.5654, 0.6054, 0.2726, 0.8868, 0.3417],\n",
"\t\t [0.1225, 0.4104, 0.9022, 0.6978, 0.2081],\n",
"\t\t [0.5641, 0.2983, 0.7589, 0.5495, 0.1304],\n",
"\t\t [0.1999, 0.3803, 0.0336, 0.4855, 0.9838]])\n",
"\t],\n",
"\t[\n",
"\t\ttensor([[0.0660, 0.9756]])\n",
"\t\ttensor([[0.8105, 0.6778]])\n",
"\t]\n",
"])\n",
"\n"
@@ -621,7 +622,7 @@
{
"data": {
"text/markdown": [
"**<span style='color:darkred'>$ torch.nested_tensor_from_tensor_mask(tensor, mask)</span>**"
"**<span style='color:darkred'>$ nestedtensor.nested_tensor_from_tensor_mask(tensor, mask)</span>**"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
@@ -646,7 +647,7 @@
{
"data": {
"text/markdown": [
"**<span style='color:darkred'>$ torch.nested_tensor_from_padded_tensor(tensor, padding=0)</span>**"
"**<span style='color:darkred'>$ nestedtensor.nested_tensor_from_padded_tensor(tensor, padding=0)</span>**"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
@@ -688,9 +689,9 @@
" [ True, True, True, True]]])\n",
"print_eval(\"tensor\")\n",
"print_eval(\"mask\")\n",
"nt2 = torch.nested_tensor_from_tensor_mask(tensor, mask)\n",
"print_eval(\"torch.nested_tensor_from_tensor_mask(tensor, mask)\")\n",
"print_eval(\"torch.nested_tensor_from_padded_tensor(tensor, padding=0)\")"
"nt2 = nestedtensor.nested_tensor_from_tensor_mask(tensor, mask)\n",
"print_eval(\"nestedtensor.nested_tensor_from_tensor_mask(tensor, mask)\")\n",
"print_eval(\"nestedtensor.nested_tensor_from_padded_tensor(tensor, padding=0)\")"
]
},
{
@@ -795,12 +796,12 @@
"output_type": "stream",
"text": [
"nested_tensor([\n",
"\ttensor([[0.8264, 0.2200, 0.4197],\n",
"\t [0.6789, 0.7460, 0.1694]]),\n",
"\ttensor([[0.7467, 0.8433, 0.6429, 0.9890, 0.0170],\n",
"\t [0.6297, 0.3899, 0.7025, 0.0812, 0.9585],\n",
"\t [0.1113, 0.4260, 0.4245, 0.7971, 0.7910],\n",
"\t [0.7077, 0.6765, 0.0228, 0.5461, 0.4095]])\n",
"\ttensor([[0.1525, 0.9457, 0.8438],\n",
"\t [0.6784, 0.9376, 0.5344]]),\n",
"\ttensor([[0.5654, 0.6054, 0.2726, 0.8868, 0.3417],\n",
"\t [0.1225, 0.4104, 0.9022, 0.6978, 0.2081],\n",
"\t [0.5641, 0.2983, 0.7589, 0.5495, 0.1304],\n",
"\t [0.1999, 0.3803, 0.0336, 0.4855, 0.9838]])\n",
"])\n",
"\n"
]
@@ -822,7 +823,7 @@
"output_type": "stream",
"text": [
"nested_tensor([\n",
"\ttensor([[0.0660, 0.9756]])\n",
"\ttensor([[0.8105, 0.6778]])\n",
"])\n",
"\n"
]
@@ -857,15 +858,15 @@
"text": [
"nested_tensor([\n",
"\t[\n",
"\t\ttensor([[0.6776, 0.9759, 0.9132],\n",
"\t\t [0.7783, 0.7344, 0.9857]]),\n",
"\t\ttensor([[0.7467, 0.8433, 0.6429, 0.9890, 0.0170],\n",
"\t\t [0.6297, 0.3899, 0.7025, 0.0812, 0.9585],\n",
"\t\t [0.1113, 0.4260, 0.4245, 0.7971, 0.7910],\n",
"\t\t [0.7077, 0.6765, 0.0228, 0.5461, 0.4095]])\n",
"\t\ttensor([[0.9884, 0.5852, 0.6646],\n",
"\t\t [0.7786, 0.5917, 0.8606]]),\n",
"\t\ttensor([[0.5654, 0.6054, 0.2726, 0.8868, 0.3417],\n",
"\t\t [0.1225, 0.4104, 0.9022, 0.6978, 0.2081],\n",
"\t\t [0.5641, 0.2983, 0.7589, 0.5495, 0.1304],\n",
"\t\t [0.1999, 0.3803, 0.0336, 0.4855, 0.9838]])\n",
"\t],\n",
"\t[\n",
"\t\ttensor([[0.0660, 0.9756]])\n",
"\t\ttensor([[0.8105, 0.6778]])\n",
"\t]\n",
"])\n",
"\n"