Neural Network: Building an AND Gate

Building an AND Gate with a Single Neuron

One of the simplest and most powerful ideas in neural networks is that a single neuron can be trained to perform logical operations. In this post, we'll walk through how a single artificial neuron can model the AND gate—a foundational building block of digital logic.

Understanding the AND Gate

The AND operator returns 1 only when both inputs are 1; otherwise, it returns 0. Here's the truth table:

| $x_1$ | $x_2$ | $x_1$ AND $x_2$ |
|-------|-------|-----------------|
| 0     | 0     | 0               |
| 0     | 1     | 0               |
| 1     | 0     | 0               |
| 1     | 1     | 1               |
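
As a quick sanity check, here is a minimal Python snippet (purely illustrative) that prints the same truth table using the bitwise AND operator:

```python
# Enumerate every input combination and print the AND truth table.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"{x1} AND {x2} = {x1 & x2}")
```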

Our goal is to build a neuron that learns to replicate this logic.


The Neuron: Architecture and Activation

A single neuron, in the spirit of the classic perceptron, takes input values ($x_1$, $x_2$), computes a weighted sum, adds a bias term, and passes the result through an activation function.

We add a fixed bias input $x_0 = 1$ so the neuron can shift its decision boundary rather than being forced through the origin. The neuron's hypothesis function is:

$$h_\theta(x) = g(\theta_0 x_0 + \theta_1 x_1 + \theta_2 x_2)$$

Where:

  • $x_0 = 1$ is the fixed bias input.

  • $\theta_0$, $\theta_1$, $\theta_2$ are the neuron’s weights.

  • $g(z)$ is the activation function.

For this example, we'll use the sigmoid function:

$$g(z) = \frac{1}{1 + e^{-z}}$$

This function smoothly maps input values to a range between 0 and 1.
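
To make the definitions concrete, here is a minimal Python sketch of this neuron; the function names `sigmoid` and `neuron` are illustrative choices:

```python
import math

def sigmoid(z):
    """Sigmoid activation g(z) = 1 / (1 + e^(-z)); maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(theta, x1, x2):
    """Hypothesis h_theta(x) = g(theta_0 * x0 + theta_1 * x1 + theta_2 * x2), with x0 = 1."""
    z = theta[0] * 1.0 + theta[1] * x1 + theta[2] * x2  # x0 = 1 is the bias input
    return sigmoid(z)
```

For example, `neuron([-30, 20, 20], 1, 1)` evaluates to roughly 0.99995, as we'll verify case by case below.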


Choosing the Right Weights

The behavior of our neuron depends on the weights we assign. We'll use:

$$\theta = [-30, 20, 20]$$

Let’s test these weights with each possible input combination.


Case 1: $x_1 = 0$, $x_2 = 0$

$$z = -30 + 20(0) + 20(0) = -30, \qquad h_\theta(x) = g(-30) \approx 0$$

Case 2: $x_1 = 0$, $x_2 = 1$

$$z = -30 + 20(0) + 20(1) = -10, \qquad h_\theta(x) = g(-10) \approx 0$$

Case 3: $x_1 = 1$, $x_2 = 0$

$$z = -30 + 20(1) + 20(0) = -10, \qquad h_\theta(x) = g(-10) \approx 0$$

Case 4: $x_1 = 1$, $x_2 = 1$

$$z = -30 + 20(1) + 20(1) = 10, \qquad h_\theta(x) = g(10) \approx 1$$

These results perfectly replicate the AND gate. The neuron outputs a value close to 1 only when both inputs are 1.
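
Using the `neuron` sketch from earlier, a short loop confirms all four cases at once:

```python
theta = [-30, 20, 20]

for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"{x1} AND {x2} -> {neuron(theta, x1, x2):.6f}")

# Prints values of about 0.000000, 0.000045, 0.000045, and 0.999955,
# matching the AND truth table once rounded.
```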


Decision Boundary and Intuition

Setting the weighted sum $z$ to zero gives the neuron's linear decision boundary:

$$-30 + 20x_1 + 20x_2 = 0$$

This equation defines a line in the input space; dividing through by 20 gives $x_1 + x_2 = 1.5$. For points with $x_1 + x_2 < 1.5$, the neuron outputs a value close to 0. Only the point $(1, 1)$ lies on the other side of the line, where the output is close to 1.
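
To see this numerically, the small sketch below (again illustrative) checks the sign of $z$ for each input, which is all the sigmoid needs to push the output toward 0 or 1:

```python
def side_of_boundary(x1, x2):
    """The sign of z = -30 + 20*x1 + 20*x2 decides which side of the line we're on."""
    z = -30 + 20 * x1 + 20 * x2
    return "output near 1" if z > 0 else "output near 0"

for point in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(point, side_of_boundary(*point))
```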




