SwiGLU Activation Function

Abdulkader Helwan
3 min read · Mar 23, 2023

SwiGLU (Swish-Gated Linear Unit) is an activation function that combines the advantages of the Swish activation function and the Gated Linear Unit (GLU). It was proposed by Noam Shazeer at Google in the 2020 paper "GLU Variants Improve Transformer" and has since gained popularity in the deep learning community.

In this blog post, we will explore the SwiGLU activation function in detail and discuss its advantages over other activation functions. This post was originally published on AI-ContentLab.

What is an Activation Function?

In neural networks, activation functions are used to introduce non-linearity into the output of a neuron. They are responsible for deciding whether or not a neuron should be activated, based on the input it receives.

Activation functions help neural networks to learn complex non-linear relationships between inputs and outputs. There are several types of activation functions used in deep learning, such as the ‘sigmoid’, ‘ReLU’, and ‘tanh’.
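
As a quick illustration, the minimal sketch below (assuming PyTorch; the input values are arbitrary) applies these three activations to the same pre-activation values:

```python
# Minimal sketch: applying common activation functions, assuming PyTorch.
import torch

x = torch.linspace(-3.0, 3.0, steps=7)  # arbitrary pre-activation values

print(torch.sigmoid(x))  # squashes values into (0, 1)
print(torch.relu(x))     # zeroes out negative values
print(torch.tanh(x))     # squashes values into (-1, 1)
```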

Left: GeLU, Right: Swish (Source: https://towardsdatascience.com/on-the-disparity-between-swish-and-gelu-1ddde902d64b)

What is Swish Activation Function?

Swish is a non-monotonic activation function that was proposed by Google researchers in 2017. Swish is defined as follows:

Swish(x) = x * sigmoid(beta * x)

where ‘beta’ is a constant or trainable parameter (with beta = 1, Swish is also known as SiLU).

Swish has been shown to perform better than ReLU in many applications, especially in deep networks. The main advantage of Swish is that it is smoother than ReLU, which can lead to better optimization and faster convergence.
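
As a concrete reference, here is a minimal sketch of Swish as a module with a trainable beta, assuming PyTorch; the class name and the default beta of 1.0 are illustrative choices.

```python
# Minimal sketch of Swish with a trainable beta, assuming PyTorch.
import torch
import torch.nn as nn

class Swish(nn.Module):
    def __init__(self, beta: float = 1.0):
        super().__init__()
        # beta is learned during training; with beta fixed at 1.0,
        # Swish(x) = x * sigmoid(x), also known as SiLU.
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(self.beta * x)

x = torch.randn(4, 8)
print(Swish()(x).shape)  # torch.Size([4, 8])
```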

What is GLU Activation Function?

Gated Linear Units (GLU) are a type of activation function proposed by researchers at Facebook AI Research in 2016. GLU is defined as follows:

GLU(x) = (xW + b) ⊗ sigmoid(xV + c)

where ‘W’, ‘V’, ‘b’, and ‘c’ are trainable parameters, and ⊗ denotes element-wise multiplication.

GLU is similar to Swish in that it combines a linear function with a non-linear function. However, in GLU, one linear projection is gated element-wise by a sigmoid of a second linear projection, as shown in the sketch below.
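
To make the comparison concrete, here is a minimal sketch of a GLU layer together with the SwiGLU variant that gives this post its title, assuming PyTorch. The layer sizes, the weight names (W, V), and the use of Swish with beta fixed at 1 (i.e. SiLU) are illustrative assumptions rather than a reference implementation.

```python
# Minimal sketch of GLU and SwiGLU layers, assuming PyTorch.
# Weight names follow the formulation above; sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GLU(nn.Module):
    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        self.W = nn.Linear(d_in, d_out)  # linear ("value") branch: xW + b
        self.V = nn.Linear(d_in, d_out)  # gating branch: xV + c

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # GLU: linear branch gated element-wise by a sigmoid of the gating branch
        return self.W(x) * torch.sigmoid(self.V(x))

class SwiGLU(nn.Module):
    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        self.W = nn.Linear(d_in, d_out)
        self.V = nn.Linear(d_in, d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SwiGLU: the sigmoid gate is replaced by Swish (SiLU, i.e. beta = 1)
        return F.silu(self.W(x)) * self.V(x)

x = torch.randn(4, 16)
print(GLU(16, 32)(x).shape)     # torch.Size([4, 32])
print(SwiGLU(16, 32)(x).shape)  # torch.Size([4, 32])
```

In practice, a layer of this shape is typically followed by another linear projection, for example when SwiGLU is used in place of the first linear layer plus ReLU in a Transformer feed-forward block.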
