
ReplicationPad2d

class torch.nn.ReplicationPad2d(padding) [source]

Pads the input tensor using replication of the input boundary.

For N-dimensional padding, use torch.nn.functional.pad().
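As a sketch of the functional equivalent, `torch.nn.functional.pad()` with `mode='replicate'` produces the same result as this module (the padding tuple below is an arbitrary illustration):

```python
import torch
import torch.nn.functional as F

x = torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)

# pad is (left, right, top, bottom) for the last two dimensions
y = F.pad(x, (1, 1, 2, 0), mode='replicate')
print(y.shape)  # torch.Size([1, 1, 5, 5])
```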

Parameters

padding (int, tuple) – the size of the padding. If an int, the same padding is used on all boundaries. If a 4-tuple, uses (padding_left, padding_right, padding_top, padding_bottom).

Shape:
  • Input: (N, C, H_in, W_in) or (C, H_in, W_in).
  • Output: (N, C, H_out, W_out) or (C, H_out, W_out), where

    H_out = H_in + padding_top + padding_bottom

    W_out = W_in + padding_left + padding_right
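The shape formulas above can be checked directly; the padding tuple and input size here are illustrative choices, not values from the examples below:

```python
import torch
import torch.nn as nn

# padding = (left, right, top, bottom)
m = nn.ReplicationPad2d((1, 2, 3, 4))
x = torch.randn(1, 3, 8, 10)
out = m(x)

# H_out = 8 + 3 + 4 = 15, W_out = 10 + 1 + 2 = 13
print(out.shape)  # torch.Size([1, 3, 15, 13])
```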

Examples:

>>> m = nn.ReplicationPad2d(2)
>>> input = torch.arange(9, dtype=torch.float).reshape(1, 1, 3, 3)
>>> input
tensor([[[[0., 1., 2.],
          [3., 4., 5.],
          [6., 7., 8.]]]])
>>> m(input)
tensor([[[[0., 0., 0., 1., 2., 2., 2.],
          [0., 0., 0., 1., 2., 2., 2.],
          [0., 0., 0., 1., 2., 2., 2.],
          [3., 3., 3., 4., 5., 5., 5.],
          [6., 6., 6., 7., 8., 8., 8.],
          [6., 6., 6., 7., 8., 8., 8.],
          [6., 6., 6., 7., 8., 8., 8.]]]])
>>> # using different paddings for different sides
>>> m = nn.ReplicationPad2d((1, 1, 2, 0))
>>> m(input)
tensor([[[[0., 0., 1., 2., 2.],
          [0., 0., 1., 2., 2.],
          [0., 0., 1., 2., 2.],
          [3., 3., 4., 5., 5.],
          [6., 6., 7., 8., 8.]]]])

© 2024, PyTorch Contributors
PyTorch has a BSD-style license, as found in the LICENSE file.
https://pytorch.org/docs/2.1/generated/torch.nn.ReplicationPad2d.html