"The PyG dataloader supports a form of mini-batching which is [decribed here](https://pytorch-geometric.readthedocs.io/en/latest/notes/batching.html). Effectively each mini-batch is a concatenation of multiple graphs (molecules in QM9). Another way to understand this is that each mini-batch is one large graph comprised of multiple disconnected sub-graphs. The PyG dataloader will generate a `batch` vector that assigns each feature in the mini-batch into a distinct subgraph. This is useful for message passing networks (such as SchNet) and pooling layers to produce a distinct regression prediction for each molecule. Refer to the following tutorials for additional background:\n",
347
+
"The PyG dataloader supports a form of mini-batching which is [described here](https://pytorch-geometric.readthedocs.io/en/latest/notes/batching.html). Effectively each mini-batch is a concatenation of multiple graphs (molecules in QM9). Another way to understand this is that each mini-batch is one large graph comprised of multiple disconnected sub-graphs. The PyG dataloader will generate a `batch` vector that assigns each feature in the mini-batch into a distinct subgraph. This is useful for message passing networks (such as SchNet) and pooling layers to produce a distinct regression prediction for each molecule. Refer to the following tutorials for additional background:\n",