distdl.nn

Overview

These are the public interfaces of the distdl.nn module: distributed containers, primitive data movement layers, distributed computation layers, and distributed loss functions.

Distributed Containers

Base Distributed Module

Base class of all distributed layers in DistDL.

Primitive Distributed Data Movement Layers

We implement a number of primitive data movement layers, from which the distributed computation layers are built.
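
All of these primitives operate on partitions of workers, over which tensors are decomposed. As a minimal sketch of the setup they share, assuming the MPI backend and at least four workers (see the backend reference for the precise interface)::

    import numpy as np
    from mpi4py import MPI

    import distdl

    # Partition containing every available MPI worker.
    P_world = distdl.backends.mpi.Partition(MPI.COMM_WORLD)

    # Select four workers and arrange them in a 1 x 1 x 2 x 2 Cartesian
    # grid, matching a 2D feature tensor with batch and channel dimensions.
    P_base = P_world.create_partition_inclusive(np.arange(4))
    P_x = P_base.create_cartesian_topology_partition([1, 1, 2, 2])

The remaining sketches in this section reuse this setup.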

AllSumReduce Layer

Performs an all-sum-reduction within a partition.
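
For example, a sketch reusing the partition above (the axes_reduce argument name is an assumption; consult the layer's reference page for the exact signature)::

    import torch

    # Sum the local subtensors along both spatial partition dimensions.
    layer = distdl.nn.AllSumReduce(P_x, axes_reduce=(2, 3))

    x = torch.ones(3, 5, 10, 10)  # local subtensor on each worker
    y = layer(x)                  # every worker in P_x now holds the sum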

Broadcast Layer

Performs a broadcast of a tensor from one partition to another.
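
A sketch of broadcasting from a single worker to four workers (partition calls as above; the argument order is an assumption)::

    # One-worker source partition and four-worker destination partition.
    P_x_base = P_world.create_partition_inclusive([0])
    P_x = P_x_base.create_cartesian_topology_partition([1, 1])

    P_y_base = P_world.create_partition_inclusive(np.arange(4))
    P_y = P_y_base.create_cartesian_topology_partition([1, 4])

    layer = distdl.nn.Broadcast(P_x, P_y)
    y = layer(x)  # each worker in P_y receives a copy of x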

Halo Exchange Layer

Performs a halo exchange on a tensor within a partition.
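
The layer determines its send and receive buffers from the partition and the footprint of the kernel that needs the halo regions. As a conceptual illustration only (plain mpi4py with periodic neighbors, not DistDL's actual interface)::

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Local 1D subdomain with a one-cell halo on each side.
    x = np.zeros(10)
    x[1:-1] = rank

    left, right = (rank - 1) % size, (rank + 1) % size

    # Fill each halo cell with the neighboring worker's boundary value.
    comm.Sendrecv(sendbuf=x[1:2], dest=left, recvbuf=x[-1:], source=right)
    comm.Sendrecv(sendbuf=x[-2:-1], dest=right, recvbuf=x[0:1], source=left)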

SumReduce Layer

Performs a sum-reduction of a tensor from one partition to another.
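
This is the adjoint of the broadcast; a sketch mirroring the broadcast example above::

    # P_x, P_y as in the broadcast sketch; data flows the opposite way.
    layer = distdl.nn.SumReduce(P_y, P_x)
    y = layer(x)  # the worker in P_x receives the sum over P_y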

Repartition Layer

Performs a repartition of a tensor from one partition to another.
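
For example, a sketch moving a tensor between two decompositions of the same four workers (shapes are illustrative)::

    P_u = P_base.create_cartesian_topology_partition([1, 1, 4, 1])
    P_v = P_base.create_cartesian_topology_partition([1, 1, 1, 4])

    # Same global tensor before and after; only the decomposition changes.
    layer = distdl.nn.Repartition(P_u, P_v)
    y = layer(x)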

Distributed Computation Layers

We implement a number of distributed layers based on their PyTorch counterparts and the data movement primitives.

Convolution Layers

Distributed convolutional layers.
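
A usage sketch, assuming a feature-space distributed layer named DistributedConv2d with torch.nn.Conv2d-style arguments (the reference page lists the variants actually provided)::

    # P_x: the 1 x 1 x 2 x 2 input partition sketched earlier.
    conv = distdl.nn.DistributedConv2d(P_x,
                                       in_channels=8, out_channels=16,
                                       kernel_size=3, padding=1)
    y = conv(x)  # halo exchange, then local convolution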

Pooling Layers

Distributed pooling layers.
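
Usage mirrors the convolution sketch, assuming names of the form DistributedMaxPool2d with torch.nn-style arguments::

    pool = distdl.nn.DistributedMaxPool2d(P_x, kernel_size=2, stride=2)
    y = pool(x)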

Linear Layer

Distributed linear layer.
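
A sketch; the constructor is assumed to take partitions for the input, the output, and the weight matrix alongside the usual torch.nn.Linear arguments (verify against the reference page)::

    # P_x, P_y, P_w: partitions of the input, output, and weights.
    linear = distdl.nn.DistributedLinear(P_x, P_y, P_w,
                                         in_features=64, out_features=32)
    y = linear(x)  # weights are applied blockwise, then summed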

Upsampling Layer

Distributed upsampling layer.
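
A sketch, assuming a DistributedUpsample layer mirroring torch.nn.Upsample::

    up = distdl.nn.DistributedUpsample(P_x, scale_factor=2)
    y = up(x)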

Distributed Loss Functions

We implement a number of distributed loss functions based on the PyTorch losses and the primitives.

Loss Functions

Distributed loss functions.
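
A sketch, assuming a DistributedMSELoss mirroring torch.nn.MSELoss (the name and reduction behavior should be checked against the reference page)::

    criterion = distdl.nn.DistributedMSELoss(P_x)
    loss = criterion(y_hat, y)  # per-worker partial losses are reduced
    loss.backward()             # gradients flow back through the partition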

Additional Sequential Layers

We implement some useful sequential modules.

Interpolate Layer

N-dimensional interpolation.

Padding Layer

N-dimensional, unbalanced padding.