Summary:
This change makes README.md compatible with both the GitHub and VSTS markdown engines. Images can be reduced in size if necessary.
Pull Request resolved: pytorch#9296
Differential Revision: D8874931
Pulled By: soumith
fbshipit-source-id: 0c530c1e00b06fc891301644c92c33007060bf27
@@ -34,32 +34,14 @@ See also the [ci.pytorch.org HUD](https://ezyang.github.io/pytorch-ci-hud/build/
 
 At a granular level, PyTorch is a library that consists of the following components:
 
-<table>
-<tr>
-    <td><b> torch </b></td>
-    <td> a Tensor library like NumPy, with strong GPU support </td>
-</tr>
-<tr>
-    <td><b> torch.autograd </b></td>
-    <td> a tape-based automatic differentiation library that supports all differentiable Tensor operations in torch </td>
-</tr>
-<tr>
-    <td><b> torch.nn </b></td>
-    <td> a neural networks library deeply integrated with autograd designed for maximum flexibility </td>
-</tr>
-<tr>
-    <td><b> torch.multiprocessing </b></td>
-    <td> Python multiprocessing, but with magical memory sharing of torch Tensors across processes. Useful for data loading and Hogwild training. </td>
-</tr>
-<tr>
-    <td><b> torch.utils </b></td>
-    <td> DataLoader, Trainer and other utility functions for convenience </td>
-</tr>
-<tr>
-    <td><b> torch.legacy(.nn/.optim) </b></td>
-    <td> legacy code that has been ported over from torch for backward compatibility reasons </td>
-</tr>
-</table>
+| Component | Description |
+| ---- | --- |
+| **torch** | a Tensor library like NumPy, with strong GPU support |
+| **torch.autograd** | a tape-based automatic differentiation library that supports all differentiable Tensor operations in torch |
+| **torch.nn** | a neural networks library deeply integrated with autograd designed for maximum flexibility |
+| **torch.multiprocessing** | Python multiprocessing, but with magical memory sharing of torch Tensors across processes. Useful for data loading and Hogwild training |
+| **torch.utils** | DataLoader, Trainer and other utility functions for convenience |
+| **torch.legacy(.nn/.optim)** | legacy code that has been ported over from torch for backward compatibility reasons |
 
 Usually one uses PyTorch either as:
 
@@ -72,7 +54,7 @@ Elaborating further:
 
 If you use NumPy, then you have used Tensors (a.k.a ndarray).
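A minimal sketch of the first two components in the table above (torch as a NumPy-like Tensor library, and torch.autograd for tape-based differentiation), assuming PyTorch and NumPy are installed; this example is illustrative and not part of the PR itself:

```python
import numpy as np
import torch

# A torch Tensor behaves much like a NumPy ndarray; from_numpy shares
# the underlying memory with the source array rather than copying it.
a = np.ones((2, 3))
t = torch.from_numpy(a)
assert t.shape == (2, 3)

# Unlike an ndarray, a Tensor can record operations on a tape and
# differentiate through them: d(x^2)/dx evaluated at x=2 is 4.
x = torch.tensor([2.0], requires_grad=True)
y = (x ** 2).sum()
y.backward()
assert x.grad.item() == 4.0
```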