distdl.nn¶
Overview¶
These are the public interfaces of DistDL's neural network module.
Distributed Containers¶
Base class of all distributed layers in DistDL.
Primitive Distributed Data Movement Layers¶
We implement a number of primitive data movement layers, from which the distributed computation layers are built.
- Performs an all-sum-reduction within a partition.
- Performs a broadcast of a tensor from one partition to another.
- Performs the halo exchange operation of a tensor on a partition.
- Performs a sum-reduction of a tensor from one partition to another.
- Performs a repartition of a tensor from one partition to another.
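To make the primitives above concrete, here is a single-process sketch of their semantics. Each "partition" is modeled as a plain Python list of per-worker values; in DistDL the workers are separate MPI ranks, so this is only a conceptual illustration, not the library's API.

```python
# Single-process sketches of the data-movement primitives.
# A "partition" here is just a list with one entry per hypothetical worker.

def all_sum_reduce(workers):
    """Every worker ends up with the sum of all workers' values."""
    total = sum(workers)
    return [total for _ in workers]

def broadcast(src_value, n_dest_workers):
    """Copy one worker's value to every worker in the destination partition."""
    return [src_value] * n_dest_workers

def sum_reduce(src_workers):
    """Collapse a source partition onto a single destination value."""
    return sum(src_workers)

def halo_exchange(chunks, halo=1):
    """Each worker's chunk gains `halo` entries copied from its neighbors."""
    padded = []
    for i, chunk in enumerate(chunks):
        left = chunks[i - 1][-halo:] if i > 0 else []
        right = chunks[i + 1][:halo] if i < len(chunks) - 1 else []
        padded.append(left + chunk + right)
    return padded

print(all_sum_reduce([1, 2, 3]))                  # every worker holds 6
print(halo_exchange([[0, 1], [2, 3], [4, 5]]))    # interior chunks grow on both sides
```

The halo exchange is what lets a spatially partitioned convolution compute correct outputs near subdomain boundaries: each worker borrows just enough of its neighbors' data to cover the kernel's receptive field.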
Distributed Computation Layers¶
We implement a number of distributed layers based on the standard PyTorch layers and the primitives above.
- Distributed convolutional layers.
- Distributed pooling layers.
- Distributed linear layer.
- Distributed upsampling layer.
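As a sketch of how a distributed computation layer composes local work with the primitives, consider a linear layer whose input features are sharded across two hypothetical workers: each worker multiplies its shard by the matching weight columns, and a sum-reduction of the partial products recovers the full output. This is a conceptual, single-process illustration under assumed sharding, not DistDL's implementation.

```python
# Feature-parallel matrix-vector product: local partial products + sum-reduce.

def matvec(rows, x):
    """Dense matrix-vector product on one worker."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in rows]

# Full problem: y = W @ x, with W of shape 2x4 and x of length 4.
W = [[1.0, 2.0, 3.0, 4.0],
     [5.0, 6.0, 7.0, 8.0]]
x = [1.0, 1.0, 2.0, 2.0]

# Shard the input (and the matching weight columns) across two workers.
x_shards = [x[:2], x[2:]]
W_shards = [[row[:2] for row in W], [row[2:] for row in W]]

# Each worker computes a partial output from its shard alone...
partials = [matvec(Ws, xs) for Ws, xs in zip(W_shards, x_shards)]

# ...and a sum-reduction combines the partials into the true result.
y = [sum(vals) for vals in zip(*partials)]

assert y == matvec(W, x)
```

The same pattern generalizes: convolution and pooling additionally need a halo exchange beforehand so each worker sees its neighbors' boundary data.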
Distributed Loss Functions¶
We implement a number of distributed loss functions based on the PyTorch losses and the primitives.
Distributed loss functions.
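The idea behind a distributed loss can be sketched in one process: each worker computes a partial loss on its shard of the data, and a sum-reduction produces the global value. For a mean-reduced loss, sums and element counts are reduced separately so the global mean stays exact even when shards are unequal. This is a hedged illustration of the pattern, not DistDL's code.

```python
# Distributed mean-squared-error, simulated on one process.

def local_sq_error(pred, target):
    """Per-worker partial: (sum of squared errors, element count)."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)), len(pred)

# Two hypothetical workers with unequal shard sizes.
shards = [([1.0, 2.0], [1.0, 0.0]),   # worker 0: 2 elements
          ([3.0], [0.0])]             # worker 1: 1 element

partials = [local_sq_error(p, t) for p, t in shards]
total_sq = sum(s for s, _ in partials)   # sum-reduce the partial sums
total_n = sum(n for _, n in partials)    # sum-reduce the counts
global_mse = total_sq / total_n          # exact global mean
```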
Additional Sequential Layers¶
We implement some useful sequential modules.
- Interpolate Layer: N-dimensional interpolation.
- Padding Layer: N-dimensional, unbalanced padding.
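The two sequential modules above can be illustrated in one dimension. Unbalanced padding adds a different amount to each side of a tensor, which is what the halo regions of an unevenly partitioned tensor require; linear interpolation resamples a sequence to a new length. Both functions below are minimal hypothetical sketches, not the DistDL implementations.

```python
# 1-D sketches of unbalanced padding and linear interpolation.

def pad_unbalanced(seq, left, right, value=0):
    """Pad with `left` entries on one side and `right` on the other."""
    return [value] * left + list(seq) + [value] * right

def interp1d(seq, n_out):
    """Linearly resample `seq` to length `n_out`, endpoints aligned."""
    n_in = len(seq)
    out = []
    for j in range(n_out):
        pos = j * (n_in - 1) / (n_out - 1) if n_out > 1 else 0.0
        i = int(pos)
        frac = pos - i
        nxt = seq[min(i + 1, n_in - 1)]
        out.append(seq[i] * (1 - frac) + nxt * frac)
    return out

print(pad_unbalanced([1, 2, 3], left=2, right=1))  # [0, 0, 1, 2, 3, 0]
print(interp1d([0.0, 2.0, 4.0], 5))                # [0.0, 1.0, 2.0, 3.0, 4.0]
```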