Single Layer Perceptron • Multi-Layer Perceptron (MLP) • Simple Classification & Regression Tasks
In this chapter, you will learn how to build your very first neural networks. These networks are the foundation of Deep Learning and help computers recognize patterns, much as humans learn from examples. You will explore the Single Layer Perceptron, the Multi-Layer Perceptron (MLP), and learn how to solve simple tasks like classification and regression using neural networks.
1. Single Layer Perceptron (SLP)
The Single Layer Perceptron is the simplest type of neural network. It has:
- One input layer
- One output layer
- No hidden layers
It is mostly used for simple tasks where the data can be separated into two classes by a straight line (linearly separable data).
How It Works
A perceptron:
- Takes input values
- Multiplies them by weights
- Adds a bias
- Uses an activation function
- Produces an output (0 or 1)
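The steps above can be sketched directly in NumPy. The weights, bias, and inputs below are illustrative values chosen by hand, not learned ones; this particular perceptron happens to compute a logical AND:

```python
import numpy as np

def step(z):
    """Step activation: output 1 if z > 0, else 0."""
    return 1 if z > 0 else 0

def perceptron(inputs, weights, bias):
    # Multiply inputs by weights, add the bias, then apply the activation
    z = np.dot(inputs, weights) + bias
    return step(z)

# Hand-picked values: fires only when both inputs are 1 (logical AND)
weights = np.array([1.0, 1.0])
bias = -1.5

print(perceptron(np.array([1, 1]), weights, bias))  # 1
print(perceptron(np.array([1, 0]), weights, bias))  # 0
```

In a real perceptron the weights and bias start random and are adjusted during training; only the forward computation is shown here.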
Real-World Example
A perceptron can help classify:
- Whether an email is spam or not
- Whether a student passed or failed
- Whether a fruit is ripe or unripe
Simple Diagram (Conceptual)
Inputs → [Perceptron] → Output
Code Example: Single Layer Perceptron (Keras)
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Single Layer Perceptron
model = Sequential([
    Dense(1, activation='sigmoid', input_shape=(2,))
])

model.compile(
    optimizer='sgd',
    loss='binary_crossentropy',
    metrics=['accuracy']
)

model.summary()

2. Multi-Layer Perceptron (MLP)
A Multi-Layer Perceptron has more than one layer.
It includes:
- Input layer
- One or more hidden layers
- Output layer
Each hidden layer lets the network learn progressively more complex patterns.
Why MLPs Are Powerful
- Can learn curved boundaries
- Can identify shapes in data
- Can solve problems SLP cannot solve
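A classic example of a problem an SLP cannot solve is XOR: its two classes cannot be separated by a straight line, but a network with one hidden layer can represent it. The weights below are hand-picked for illustration rather than learned:

```python
import numpy as np

def step(z):
    # Step activation, applied element-wise
    return (z > 0).astype(int)

def mlp_xor(x):
    # Hidden layer: unit 1 behaves like OR, unit 2 behaves like AND
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([-0.5, -1.5])
    h = step(x @ W1 + b1)
    # Output layer: "OR and not AND" is exactly XOR
    W2 = np.array([1.0, -1.0])
    b2 = -0.5
    return step(h @ W2 + b2)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, int(mlp_xor(np.array(x))))  # 0, 1, 1, 0
```

A trained MLP would find weights like these by gradient descent; the point here is only that the hidden layer makes the XOR boundary representable at all.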
Example of Hidden Layer Magic
If we want to classify images of handwritten digits (0–9), an MLP can, roughly speaking, build up its answer layer by layer:
- Early hidden layers detect simple strokes and edges
- Middle layers combine them into shapes
- The final layer identifies the actual digit
Code Example: Simple MLP for Classification
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
# Multi-Layer Perceptron
model = Sequential([
    Dense(16, activation='relu', input_shape=(4,)),
    Dense(8, activation='relu'),
    Dense(3, activation='softmax')
])

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)

model.summary()

3. Simple Classification Tasks
Classification means predicting a category.
Examples of classification:
- Predicting whether a picture contains a cat or dog
- Predicting whether a student will pass or fail
- Predicting weather type (sunny, rainy, cloudy)
How Classification Works
- Input data → Neural Network
- Network learns patterns
- Output → probability of each class
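The "probability of each class" in the last step typically comes from a softmax activation on the network's raw scores. A minimal sketch of what softmax does, using made-up scores for three weather classes:

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize
    exp = np.exp(scores - np.max(scores))
    return exp / exp.sum()

# Illustrative raw scores for three classes: sunny, rainy, cloudy
scores = np.array([2.0, 0.5, 0.1])
probs = softmax(scores)

print(probs)        # largest probability goes to the largest score
print(probs.sum())  # the probabilities always sum to 1
```

Whatever the raw scores are, softmax squashes them into positive values that sum to 1, which is why the output can be read as a probability per class.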
Code Example: Classification with MLP
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Fake data (for demonstration)
X = np.random.rand(200, 4) # 200 samples, 4 input features
y = np.random.randint(0, 2, 200) # Binary labels: 0 or 1
model = Sequential([
    Dense(8, activation='relu', input_shape=(4,)),
    Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=5, batch_size=16)

4. Simple Regression Tasks
Regression predicts numbers, not categories.
Examples of Regression
- Predicting house prices
- Predicting marks based on study hours
- Predicting temperature
How Regression Works
- The neural network outputs a continuous value
- No softmax or sigmoid needed
- Usually uses a linear output layer
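The usual loss for regression is mean squared error (MSE): the average of the squared differences between predictions and targets. A small worked example with made-up numbers:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of the squared errors
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.5, 5.0, 8.0])

# Errors: -0.5, 0.0, 1.0 -> squared: 0.25, 0.0, 1.0 -> mean ≈ 0.4167
print(mse(y_true, y_pred))
```

This is the same quantity Keras computes when you pass `loss='mse'`, as in the example below.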
Code Example: Regression with MLP
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Fake data for regression
X = np.random.rand(300, 3) # 300 samples, 3 features
y = X[:, 0] * 5 + 2 # A simple relationship for demonstration
model = Sequential([
    Dense(8, activation='relu', input_shape=(3,)),
    Dense(1)  # Linear output for regression
])

model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=5, batch_size=16)

5. Understanding the Difference: Classification vs Regression
| Feature | Classification | Regression |
|---|---|---|
| Output type | Category/label | Continuous number |
| Activation | Sigmoid / Softmax | Linear |
| Loss function | Cross-entropy | MSE |
| Example | Cat vs Dog | Predict house price |
6. Real-World Examples of Neural Networks
A. Single Layer Perceptron
Used for:
- Yes/No questions
- Basic sensor decisions
- Spam filtering
B. Multi-Layer Perceptron
Used for:
- Handwriting recognition
- Predicting marks
- Simple image classification
C. Classification Tasks
Used for:
- Medical disease detection
- Student performance prediction
- Fraud detection
D. Regression Tasks
Used for:
- Predicting salaries
- Energy consumption forecasting
- Predicting crop yield