MegEngine

This page documents all activation functions supported by Echo for the MegEngine backend.

Mish

echoAI.Activation.m_ops.Mish()

Applies the element-wise function:

Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

Mish: A Self Regularized Non-Monotonic Activation Function
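
A minimal usage sketch, assuming the module is callable on a MegEngine tensor like any standard megengine.module.Module:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import Mish

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
mish = Mish()
y = mish(x)  # element-wise, output shape (4, 8)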

Swish

echoAI.Activation.m_ops.Swish(eswish = False, swish = True, beta = 1.735, flatten = False)

Allows the following element-wise functions:

SiLU(x) = x * sigmoid(x)

Swish(x) = x * sigmoid(beta * x)

E-Swish(x) = beta * x * sigmoid(x)

FTS(x) = x * sigmoid(x) if x >= 0, 0 otherwise (Flatten T-Swish)

Parameters:

  • eswish - Uses the E-Swish activation function. Default: False.

  • swish - Uses the Swish activation function. Default: True.

  • beta - The β value used by the E-Swish formulation. Default: 1.735.

  • flatten - Uses the Flatten T-Swish activation function. Default: False.

Note: When eswish, swish and flatten are all set to False, the SiLU activation function is used by default.

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

References:

Searching for Activation Functions

E-swish: Adjusting Activations to Different Network Depths

Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning
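
A sketch of selecting the different variants through the constructor flags (same callable-module assumption as the Mish example above; the flag combinations follow the parameter descriptions and the note):

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import Swish

x = mge.tensor(np.random.randn(4, 8).astype("float32"))

swish = Swish()                                         # Swish with beta = 1.735
eswish = Swish(eswish=True, swish=False, beta=1.5)      # E-Swish
fts = Swish(flatten=True, swish=False)                  # Flatten T-Swish
silu = Swish(eswish=False, swish=False, flatten=False)  # SiLU

y = eswish(x)  # element-wise, same shape as x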

Aria2

echoAI.Activation.m_ops.Aria2(beta = 0.5, alpha = 1.0)

Applies the element-wise function:

ARiA2(x) = x * (1 + exp(-beta * x))^(-alpha)

Parameters:

  • beta - The β hyperparameter of ARiA2. Default: 0.5

  • alpha - The α hyperparameter of ARiA2. Default: 1.0

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

ARiA: Utilizing Richard's Curve for Controlling the Non-monotonicity of the Activation Function in Deep Neural Nets
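
A usage sketch with explicitly chosen hyperparameters (the values here are illustrative only):

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import Aria2

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
aria2 = Aria2(beta=1.0, alpha=1.5)  # illustrative, not recommended defaults
y = aria2(x)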

ELiSH

echoAI.Activation.m_ops.Elish(hard = False)

Allows the following element-wise functions:

ELiSH(x) = x * sigmoid(x) if x >= 0, (exp(x) - 1) * sigmoid(x) otherwise

HardELiSH(x) = x * max(0, min(1, (x + 1) / 2)) if x >= 0, (exp(x) - 1) * max(0, min(1, (x + 1) / 2)) otherwise

Parameter:

  • hard - Uses the Hard ELiSH activation function. Default: False

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

The Quest for the Golden Activation Function
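
A sketch switching between the two variants via the hard flag:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import Elish

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
elish = Elish()                 # ELiSH
hard_elish = Elish(hard=True)   # Hard ELiSH
y = hard_elish(x)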

ISRU

echoAI.Activation.m_ops.ISRU(alpha = 1.0, isrlu = False)

Allows the following element-wise functions:

ISRU(x) = x / sqrt(1 + alpha * x^2)

ISRLU(x) = x if x >= 0, x / sqrt(1 + alpha * x^2) otherwise

Parameters:

  • alpha - The α hyperparameter controlling the saturation range. Default: 1.0

  • isrlu - Uses ISRLU activation function. Default: False

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs)
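
A sketch of both the ISRU and ISRLU variants:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import ISRU

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
isru = ISRU(alpha=1.0)               # saturates toward +/- 1/sqrt(alpha)
isrlu = ISRU(alpha=1.0, isrlu=True)  # linear for x >= 0
y = isrlu(x)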

NLReLU

echoAI.Activation.m_ops.NLReLU(beta = 1.0)

Applies the element-wise function:

NLReLU(x) = ln(beta * max(0, x) + 1)

Parameters:

  • beta - The β hyperparameter scaling the input. Default: 1.0

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks
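
A minimal sketch; the comment on the output range follows from the formula above:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import NLReLU

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
nlrelu = NLReLU(beta=1.0)
y = nlrelu(x)  # non-negative output, same shape as x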

Soft Clipping

echoAI.Activation.m_ops.SoftClipping(alpha = 0.5)

Applies the element-wise function:

SC(x) = (1 / alpha) * ln((1 + exp(alpha * x)) / (1 + exp(alpha * (x - 1))))

Parameters:

  • alpha - The α hyperparameter controlling how sharply the output is clipped to the range [0, 1]. Default: 0.5

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

Neural Network-Based Approach to Phase Space Integration
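
A minimal sketch:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import SoftClipping

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
sc = SoftClipping(alpha=0.5)
y = sc(x)  # output squashed toward the [0, 1] range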

Soft Exponential

echoAI.Activation.m_ops.SoftExponential(alpha = None)

Applies the element-wise function:

SoftExponential(alpha, x) =
  -ln(1 - alpha * (x + alpha)) / alpha   if alpha < 0
  x                                      if alpha = 0
  (exp(alpha * x) - 1) / alpha + alpha   if alpha > 0

Parameters:

  • alpha - Initial value of the trainable α parameter. Default: None

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks
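
A sketch that also inspects the module's learnable parameters, assuming alpha is registered as a trainable MegEngine parameter:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import SoftExponential

x = mge.tensor(np.random.randn(4, 8).astype("float32"))
softexp = SoftExponential()
y = softexp(x)
print(list(softexp.parameters()))  # expected to contain the trainable alpha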

SQNL

echoAI.Activation.m_ops.SQNL()

Applies the element-wise function:

SQNL(x) =
  1               if x > 2
  x - (x^2 / 4)   if 0 <= x <= 2
  x + (x^2 / 4)   if -2 <= x < 0
  -1              if x < -2

Shape:

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Reference:

SQNL: A New Computationally Efficient Activation Function
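
A sketch on a small 1D ramp so the saturation at -1 and 1 is easy to see:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import SQNL

x = mge.tensor(np.linspace(-3.0, 3.0, 7, dtype="float32"))
sqnl = SQNL()
y = sqnl(x)  # values saturate at -1 and 1 outside [-2, 2]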

SReLU

echoAI.Activation.m_ops.SReLU(in_features, parameters = None)

Applies the element-wise function:

SReLU(x) =
  t_r + a_r * (x - t_r)   if x >= t_r
  x                       if t_l < x < t_r
  t_l + a_l * (x - t_l)   if x <= t_l

where t_l, a_l, t_r and a_r are learnable parameters.

Parameters:

  • in_features - Shape of the input. Datatype: Tuple

  • parameters - Optional values used to initialize the four learnable parameters; if None, they are initialized randomly. Default: None

Shape:

  • Input: (N, *) where * matches in_features

  • Output: same shape as the input

Reference:

Deep Learning with S-shaped Rectified Linear Activation Units
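
A sketch for an input of shape (N, 64); passing the trailing feature shape as the in_features tuple is an assumption made for illustration:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import SReLU

x = mge.tensor(np.random.randn(16, 64).astype("float32"))
srelu = SReLU(in_features=(64,))  # assumed: trailing feature shape as a tuple
y = srelu(x)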

FReLU

echoAI.Activation.m_ops.FReLU(in_channels)

Applies the element-wise function:

FReLU(x) = max(x, T(x))

where T(x) is a 2D spatial condition implemented as a depthwise convolution over the input, followed by batch normalization.

Parameter:

  • in_channels - Number of channels in the input tensor. Datatype: Integer

Shape:

  • Input: (N, C, H, W)

  • Output: (N, C, H, W), same shape as the input

Reference:

Funnel Activation for Visual Recognition
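
A sketch with a 4D (N, C, H, W) input, which the funnel condition's depthwise convolution requires:

import numpy as np
import megengine as mge
from echoAI.Activation.m_ops import FReLU

x = mge.tensor(np.random.randn(2, 16, 32, 32).astype("float32"))  # (N, C, H, W)
frelu = FReLU(in_channels=16)  # in_channels must match C
y = frelu(x)  # output shape (2, 16, 32, 32)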
