PyTorch
This page lists all attention modules and non-local layers for computer vision available in Echo with the PyTorch backend.
Triplet Attention
Parameters:
no_spatial - switches off the spatial attention branch in Triplet Attention. Default: False
kernel_size - window size of the convolution filters in Triplet Attention. Default: 7
Shape:
Input: 4-dimensional feature map tensor.
Output: same shape as input
Reference:
Rotate to Attend: Convolutional Triplet Attention Module
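The sketch below illustrates the idea behind Triplet Attention: three branches, each applying a Z-pool (concatenated max- and mean-pooling) followed by a convolution with the `kernel_size` window, attend over a different pair of tensor dimensions, with `no_spatial` dropping the plain spatial (H, W) branch. This is a minimal reimplementation for illustration, not Echo's actual class or import path.

```python
import torch
import torch.nn as nn

class ZPool(nn.Module):
    # Concatenate max- and mean-pooled features along the channel axis
    def forward(self, x):
        return torch.cat([x.amax(dim=1, keepdim=True),
                          x.mean(dim=1, keepdim=True)], dim=1)

class AttentionGate(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.pool = ZPool()
        self.conv = nn.Conv2d(2, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.bn = nn.BatchNorm2d(1)
    def forward(self, x):
        return x * torch.sigmoid(self.bn(self.conv(self.pool(x))))

class TripletAttention(nn.Module):
    # Illustrative sketch; parameter names follow the docs above
    def __init__(self, kernel_size=7, no_spatial=False):
        super().__init__()
        self.cw = AttentionGate(kernel_size)  # channel-width branch
        self.hc = AttentionGate(kernel_size)  # height-channel branch
        self.no_spatial = no_spatial
        if not no_spatial:
            self.hw = AttentionGate(kernel_size)  # plain spatial branch
    def forward(self, x):
        # Rotate the tensor so each branch attends over a different dim pair
        x_cw = self.cw(x.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)
        x_hc = self.hc(x.permute(0, 3, 2, 1)).permute(0, 3, 2, 1)
        if self.no_spatial:
            return (x_cw + x_hc) / 2
        return (x_cw + x_hc + self.hw(x)) / 3
```

As the Shape section notes, the output tensor has the same shape as the 4-dimensional input, so the module can be dropped after any convolutional block.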
Squeeze Excite Attention
Parameters:
gate_channels - number of channels in the input tensor. Datatype: Integer
reduction_ratio - squeeze bottleneck factor of the MLP in Squeeze Excite Attention. Default: 16
Shape:
Input: 4-dimensional feature map tensor.
Output: same shape as input
Reference:
Squeeze-and-Excitation Networks
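A minimal sketch of the squeeze-and-excite mechanism the parameters above configure: global average pooling squeezes each channel to a scalar, a two-layer MLP with a `gate_channels // reduction_ratio` bottleneck produces per-channel gates, and the input is rescaled. The class name here is illustrative, not Echo's actual API.

```python
import torch
import torch.nn as nn

class SqueezeExcite(nn.Module):
    # Illustrative sketch; parameter names follow the docs above
    def __init__(self, gate_channels, reduction_ratio=16):
        super().__init__()
        hidden = gate_channels // reduction_ratio  # squeeze bottleneck
        self.mlp = nn.Sequential(
            nn.Linear(gate_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, gate_channels),
            nn.Sigmoid(),
        )
    def forward(self, x):
        b, c, _, _ = x.shape
        # Squeeze: global average pool to (B, C), then excite via the MLP
        scale = self.mlp(x.mean(dim=(2, 3)))
        # Rescale each channel of the input by its learned gate
        return x * scale.view(b, c, 1, 1)
```

Note that `gate_channels` must match the channel count of the incoming tensor, and `reduction_ratio` trades parameter count against gating capacity.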
Convolutional Block Attention Module
Supports both the Convolutional Block Attention Module (CBAM) and the Bottleneck Attention Module (BAM)
Parameters:
gate_channels - number of channels in the input tensor. Datatype: Integer
kernel_size - window size of the convolution filters in CBAM/ BAM. Default: 3
reduction_ratio - width factor of the MLP in CBAM/BAM. Default: 16
pool_types - list of global pooling operators for the channel attention gate in CBAM/BAM. Default: ['avg', 'max']. Note: this is the default for CBAM, which expects two operators; if BAM is switched on, pass ['avg']. Available options: avg, lp, max
no_spatial - switches off the spatial attention gate in CBAM. Default: False
bam - initializes BAM. Default: False
num_layers - controls the number of hidden layers in the MLP of channel attention gate in CBAM/BAM. Default: 1
bn - adds a Batch Normalization layer in the MLP of the channel attention gate in CBAM/BAM. Default: False. Pass True when bam is True.
dilation_conv_num - number of dilated channel-preserving convolution layers in the spatial attention gate in BAM. Default: 2
dilation_val - dilation factor for the convolution layers in the spatial attention gate in BAM. Default: 4
Note: By default, CBAM is initialized.
Shape:
Input: 4-dimensional feature map tensor.
Output: same shape as input
References:
CBAM: Convolutional Block Attention Module
BAM: Bottleneck Attention Module
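The sketch below shows how the main CBAM parameters interact: `pool_types` selects the global pooling operators feeding the shared channel MLP (with its `reduction_ratio` bottleneck), `kernel_size` sizes the spatial-gate convolution, and `no_spatial` drops the spatial gate. This is an illustrative reimplementation under those assumptions, not Echo's actual class; BAM's dilated spatial gate (`dilation_conv_num`, `dilation_val`) is omitted for brevity.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    # Channel attention: global pooling -> shared MLP -> sigmoid gate
    def __init__(self, gate_channels, reduction_ratio=16,
                 pool_types=('avg', 'max')):
        super().__init__()
        self.pool_types = pool_types
        hidden = gate_channels // reduction_ratio
        self.mlp = nn.Sequential(
            nn.Linear(gate_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, gate_channels),
        )
    def forward(self, x):
        att = 0
        for p in self.pool_types:  # sum MLP outputs over pooling operators
            if p == 'avg':
                att = att + self.mlp(x.mean(dim=(2, 3)))
            elif p == 'max':
                att = att + self.mlp(x.amax(dim=(2, 3)))
        return x * torch.sigmoid(att).unsqueeze(-1).unsqueeze(-1)

class SpatialGate(nn.Module):
    # Spatial attention: channel-pooled map -> conv -> sigmoid gate
    def __init__(self, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)
    def forward(self, x):
        pooled = torch.cat([x.amax(dim=1, keepdim=True),
                            x.mean(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(pooled))

class CBAM(nn.Module):
    # Illustrative sketch; parameter names follow the docs above
    def __init__(self, gate_channels, reduction_ratio=16, kernel_size=3,
                 pool_types=('avg', 'max'), no_spatial=False):
        super().__init__()
        self.channel = ChannelGate(gate_channels, reduction_ratio, pool_types)
        self.spatial = None if no_spatial else SpatialGate(kernel_size)
    def forward(self, x):
        x = self.channel(x)  # channel gate first, then spatial gate
        return x if self.spatial is None else self.spatial(x)
```

As with the other modules on this page, the output has the same shape as the 4-dimensional input, so the block can be inserted after any convolutional stage.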