Saturday, October 5, 2024

Implementing a Single-Layer Perceptron for AND Logic Gate Simulation




🧠 Building an AND Gate Using a Simple Neuron

Let’s build one of the simplest forms of intelligence—a neuron that behaves exactly like an AND logic gate.

This is the foundation of neural networks.

📊 Dataset

Input (x₁, x₂)    Output
(0, 0)            0
(0, 1)            0
(1, 0)            0
(1, 1)            1

⚙️ Perceptron Model

A perceptron works by calculating a weighted sum:

\[ z = w_1 x_1 + w_2 x_2 + b \]

Then applying an activation function:

\[ y = \begin{cases} 1 & \text{if } z > 0 \\ 0 & \text{otherwise} \end{cases} \]

👉 If the weighted sum crosses the threshold → output = 1
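To make this concrete, here is a minimal sketch of the forward pass in plain Python. The weights w₁ = w₂ = 1 and bias b = −1.5 are hand-picked placeholders for illustration, not learned values:

```python
def predict(x1, x2, w1, w2, b):
    # Weighted sum z, then the step activation
    z = w1 * x1 + w2 * x2 + b
    return 1 if z > 0 else 0

# Hand-picked placeholder parameters: w1 = w2 = 1, b = -1.5
print(predict(1, 1, 1, 1, -1.5))  # 1: only (1, 1) crosses the threshold
print(predict(1, 0, 1, 1, -1.5))  # 0
```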

๐Ÿ“ Understanding the Math (Easy)

Why AND Works Linearly

The AND gate is linearly separable.

We can draw a line:

\[ w_1 x_1 + w_2 x_2 + b = 0 \]

That separates:

  • (1,1) → one side
  • All others → other side
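
As a concrete check, plug the same hand-picked values w₁ = w₂ = 1, b = −1.5 (one of many valid choices) into each input:

  • (0, 0): z = 0 + 0 − 1.5 = −1.5 → output 0
  • (0, 1): z = 0 + 1 − 1.5 = −0.5 → output 0
  • (1, 0): z = 1 + 0 − 1.5 = −0.5 → output 0
  • (1, 1): z = 1 + 1 − 1.5 = 0.5 → output 1

Only (1, 1) lands on the positive side of the line, exactly as AND requires.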

Error Calculation

\[ \text{Error} = y_{\text{true}} - y_{\text{pred}} \]

Weight Update Rule

\[ w = w + \eta \cdot \text{Error} \cdot x \]

\[ b = b + \eta \cdot \text{Error} \]

Where:

  • \(\eta\) = learning rate
👉 The model adjusts itself when it makes mistakes.
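
To see the rule in action, here is one hand-worked update with illustrative numbers. Suppose η = 0.1, the current parameters are w₁ = w₂ = 0.2 and b = −0.5, and the training example is (1, 1) with true output 1:

  • z = 0.2 + 0.2 − 0.5 = −0.1, so the prediction is 0 (a mistake)
  • Error = 1 − 0 = 1
  • w₁ = 0.2 + 0.1 · 1 · 1 = 0.3, and likewise w₂ = 0.3
  • b = −0.5 + 0.1 · 1 = −0.4

After the update, z for (1, 1) rises from −0.1 to 0.2, so the neuron has moved toward the correct answer.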

🔄 Training Process

  • Loop through dataset
  • Predict output
  • Calculate error
  • Update weights
  • Repeat until every example is classified correctly (the perceptron is guaranteed to converge on linearly separable data)

💻 Code Example

```python
import numpy as np

# AND gate training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

# Start with random weights and bias
weights = np.random.rand(2)
bias = np.random.rand()
lr = 0.1  # learning rate (eta)

def step(z):
    # Step activation: fire only when the weighted sum crosses the threshold
    return 1 if z > 0 else 0

for epoch in range(100):
    for i in range(len(X)):
        z = np.dot(X[i], weights) + bias  # weighted sum
        pred = step(z)                    # predicted output
        error = y[i] - pred               # error = y_true - y_pred
        weights += lr * error * X[i]      # weight update rule
        bias += lr * error                # bias update rule

print("Weights:", weights)
print("Bias:", bias)
```

🖥️ CLI Output

Weights: [0.8, 0.7]
Bias: -1.1
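
(The exact numbers vary from run to run because the weights start random; any solution that puts only (1, 1) above the threshold works.)

To reproduce the predictions listed in the Results section, a short check like this sketch (reusing X, weights, bias, and step from the code above) can be appended after training:

```python
# Check the trained neuron on all four inputs
for i in range(len(X)):
    z = np.dot(X[i], weights) + bias
    print(f"({X[i][0]}, {X[i][1]}) -> {step(z)}")
```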

✅ Results

Model Predictions

  • (0, 0) → 0
  • (0, 1) → 0
  • (1, 0) → 0
  • (1, 1) → 1

🎉 The neuron successfully learned the AND gate!

💡 Key Takeaways

  • Perceptron = simplest neural model
  • AND gate is linearly separable
  • Learning happens via error correction
  • This is the foundation of deep learning

🎯 Final Thought

If a machine can learn AND, it can learn logic. If it can learn logic… it can learn the world.
