Tuesday, October 8, 2024

PReLU in Deep Learning: How Parametric ReLU Improves Neural Networks


PReLU vs ReLU Explained – Activation Functions Made Simple

🧠 ReLU vs PReLU – The Smart Gatekeepers of Neural Networks

In deep learning, activation functions decide what information should pass forward in a neural network. Think of them as intelligent filters.

👉 ReLU is strict. 👉 PReLU is flexible.

⚡ What is ReLU?

ReLU (Rectified Linear Unit) is one of the most commonly used activation functions.

\[ f(x) = \max(0, x) \]

👉 If the input is positive → keep it. 👉 If the input is negative → output 0.

⚠️ Problem: Dying ReLU

When a neuron's pre-activation stays negative, its output is stuck at

\[ f(x) = 0 \]

and, because the gradient of ReLU is also zero for negative inputs, no learning signal flows back — the weights feeding that neuron stop updating and it stops learning entirely.

👉 This is called the "dying ReLU problem".
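To see the effect concretely, here is a minimal PyTorch sketch (not from the original post): for a negative input, ReLU outputs zero and its gradient is zero, so nothing flows back to update the weights.

```python
import torch
import torch.nn as nn

relu = nn.ReLU()

# A neuron whose pre-activation is negative: ReLU outputs 0
x = torch.tensor([-3.0], requires_grad=True)
y = relu(x)
y.backward()

print("output:", y.item())         # 0.0
print("gradient:", x.grad.item())  # 0.0 -- no learning signal flows back
```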

🚀 What is PReLU?

PReLU (Parametric ReLU) solves this problem by letting negative inputs through at a small, learnable scale instead of zeroing them out.


๐Ÿ“ PReLU Mathematics (Simple)

\[ f(x) = \begin{cases} x, & x > 0 \\ \alpha x, & x \leq 0 \end{cases} \]

Explanation:

  • \(x\): input
  • \(\alpha\): small learnable parameter, updated by backpropagation like any other weight

👉 Instead of blocking negatives, we scale them.
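The piecewise formula above can be sketched in plain Python (using the α = 0.2 value from the worked example that follows; in practice α is learned):

```python
def prelu(x, alpha=0.2):
    """Parametric ReLU: pass positives through, scale negatives by alpha."""
    return x if x > 0 else alpha * x

print(prelu(4.0))   # 4.0  (positive: unchanged)
print(prelu(-2.0))  # -0.4 (negative: scaled by 0.2)
```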

๐Ÿ” Example

Let’s assume:

\[ \alpha = 0.2 \]

| Input | ReLU Output | PReLU Output |
|-------|-------------|--------------|
| 4     | 4           | 4            |
| -2    | 0           | -0.4         |
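The table can be reproduced with PyTorch's functional API by passing α = 0.2 explicitly (a sketch, not part of the original post):

```python
import torch

x = torch.tensor([4.0, -2.0])
alpha = torch.tensor(0.2)

# F.prelu takes alpha as an explicit weight tensor
out = torch.nn.functional.prelu(x, alpha)
print(out)  # values match the table: [4.0, -0.4]
```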

💻 Code Example

import torch
import torch.nn as nn

relu = nn.ReLU()
prelu = nn.PReLU()  # default learnable alpha, initialized to 0.25

x = torch.tensor([-2.0, 4.0])
print("ReLU:", relu(x))
print("PReLU:", prelu(x))

🖥️ CLI Output

ReLU: tensor([0., 4.])
PReLU: tensor([-0.5, 4.])

Note: nn.PReLU() initializes α at 0.25, so here -2 × 0.25 = -0.5 — not -0.4 as in the worked example above, which assumed α = 0.2.

📊 Comparison

| Feature             | ReLU | PReLU   |
|---------------------|------|---------|
| Negative values     | 0    | Scaled  |
| Learnable parameter | No   | Yes (α) |
| Dying neuron issue  | Yes  | Reduced |
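Because α is a learnable parameter, gradients flow to it during backpropagation just like a weight. A small sketch (assuming PyTorch's default single-α PReLU):

```python
import torch
import torch.nn as nn

prelu = nn.PReLU()  # one shared alpha, initialized to 0.25

x = torch.tensor([-2.0, 4.0])
loss = prelu(x).sum()
loss.backward()

# alpha receives a gradient, so an optimizer can update it during training.
# Only the negative input contributes: d(alpha * -2)/d(alpha) = -2
print("alpha:", prelu.weight.item())      # 0.25
print("grad:", prelu.weight.grad.item())  # -2.0
```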

💡 Key Takeaways

  • ReLU is simple and efficient
  • PReLU adds flexibility with a learnable negative slope α
  • The math reduces to scaling negative inputs by α instead of zeroing them
  • PReLU can improve learning, especially when many neurons would otherwise die

🎯 Final Thought

ReLU is like a strict teacher. PReLU is a smarter one—it still corrects mistakes but doesn’t ignore useful signals.

And in deep learning, that flexibility can make all the difference.
