Whenever the target variable contains more than two categories, multiclass classification methods are used. Many real-world applications belong to this category, such as handwritten digit recognition, product category prediction, disease diagnosis type selection, sentiment analysis with positive, neutral, and negative labels, and document topic classification.
This article explains multiclass classification in a practical way, covering core intuition, probability outputs, training logic, common strategies, evaluation metrics, and implementation using Python.
Suppose we want to predict a student's final grade category based on study hours. The possible outcomes are A Grade, B Grade, or C Grade. This cannot be solved using binary classification directly because there are more than two possible outputs. The model must compare several classes and choose the most likely one.
Core Idea of Multiclass Classification
In multiclass classification, the model receives input features and produces a probability for each possible class. For example, a model may estimate A Grade = 0.70, B Grade = 0.20, and C Grade = 0.10. Since A Grade has the highest probability, the final prediction becomes A Grade.

Unlike binary classification, where the decision is between only two outcomes, multiclass classification compares multiple classes at the same time. This makes it useful whenever a system must assign one best category from several choices.
Many machine learning algorithms support multiclass problems directly, while others extend binary classification methods using special strategies.
One common method is One-vs-Rest (OvR). In this approach, a separate binary classifier is trained for each class. For example, one model learns Apple vs Others, another learns Banana vs Others, and another learns Orange vs Others. During prediction, the class with the highest confidence score is selected.
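As a minimal sketch of One-vs-Rest, scikit-learn's OneVsRestClassifier can wrap any binary learner and fit one classifier per class. The fruit feature values below (redness and length) are hypothetical, purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Hypothetical features: [redness, length]
# Classes: 0 = Apple, 1 = Banana, 2 = Orange
X = np.array([[0.9, 0.2], [0.8, 0.3],
              [0.1, 0.9], [0.2, 0.8],
              [0.5, 0.2], [0.6, 0.3]])
y = np.array([0, 0, 1, 1, 2, 2])

# One binary LogisticRegression is trained per class behind the scenes
ovr = OneVsRestClassifier(LogisticRegression())
ovr.fit(X, y)

print("Binary classifiers trained:", len(ovr.estimators_))  # 3, one per class
print("Prediction:", ovr.predict([[0.85, 0.25]])[0])
```

Internally, each of the three estimators answers a yes/no question ("Is it an Apple?"), and the class whose classifier is most confident wins.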
Another common method is the Softmax or Multinomial approach. Here, the model learns all classes together and returns probabilities that sum to 1. For example, Apple = 0.15, Banana = 0.25, and Orange = 0.60 would lead to a final prediction of Orange.
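The softmax computation itself is easy to verify by hand. This sketch, using arbitrary illustrative raw scores, turns three class scores into probabilities that sum to 1:

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize
    exp = np.exp(scores - np.max(scores))
    return exp / exp.sum()

# Hypothetical raw model scores for Apple, Banana, Orange
scores = np.array([1.0, 1.5, 2.5])
probs = softmax(scores)

print("Probabilities:", probs.round(2))
print("Sum:", probs.sum())  # always 1.0
```

The class with the largest raw score (Orange here) always receives the largest probability, which is why the prediction is simply the argmax of the softmax output.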
Training the Model
During training, the model learns weights that connect features to different classes. These weights are adjusted repeatedly so that predicted probabilities move closer to the correct labels.

For example, in fruit classification, redness may increase the probability of Apple, length may favor Banana, and roundness may favor Orange. The algorithm gradually improves these weights until it can classify examples more accurately.
In multiclass Logistic Regression, this process is commonly optimized using cross-entropy loss.
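A small hand calculation shows why cross-entropy pushes the model toward confident, correct predictions. The probability vectors below are illustrative, not from a trained model:

```python
import numpy as np

# Cross-entropy loss for one example:
# the negative log of the probability assigned to the true class
def cross_entropy(probs, true_class):
    return -np.log(probs[true_class])

# Confident and correct: small loss
print(cross_entropy(np.array([0.70, 0.20, 0.10]), 0).round(3))  # 0.357

# Confident but wrong: heavily penalized
print(cross_entropy(np.array([0.05, 0.90, 0.05]), 0).round(3))  # 2.996
```

Because the loss grows without bound as the true class's probability approaches zero, gradient descent moves the weights most aggressively on the examples the model gets badly wrong.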
Implementation with sklearn
Single Feature
In practice, scikit-learn makes multiclass classification simple.

import numpy as np
from sklearn.linear_model import LogisticRegression
# Feature: study hours
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
# Target:
# 0 = C Grade
# 1 = B Grade
# 2 = A Grade
y = np.array([0, 0, 0, 1, 1, 1, 2, 2])
model = LogisticRegression()
model.fit(X, y)
pred = model.predict([[6]])[0]
prob = model.predict_proba([[6]])[0]
print("Predicted Class:", pred)
print("Probabilities:", prob.round(3))
This code trains a Logistic Regression model where lower study hours are linked to C Grade, medium study hours to B Grade, and higher study hours to A Grade.
Possible Output:
Predicted Class: 1
Probabilities: [0.041 0.601 0.358]
This means the student has about a 4% chance of C Grade, a 60% chance of B Grade, and a 36% chance of A Grade. Since B Grade has the highest probability, the final prediction is class 1, which represents B Grade.
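Beyond inspecting probabilities, the same model can be scored with standard evaluation metrics. The sketch below retrains the single-feature model and evaluates it on its own training data, which only checks fit quality, not generalization; in practice you would hold out a test set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix

# Same toy data: study hours -> grade class (0 = C, 1 = B, 2 = A)
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2])

model = LogisticRegression()
model.fit(X, y)

pred = model.predict(X)
acc = accuracy_score(y, pred)   # fraction of correct predictions
cm = confusion_matrix(y, pred)  # rows = true class, columns = predicted class

print("Accuracy:", acc)
print("Confusion matrix:\n", cm)
```

The confusion matrix is especially useful in multiclass problems because it shows which pairs of classes the model mixes up, not just the overall error rate.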
Multiple Features
Multiclass models become much stronger when they use several features together instead of relying on only one variable. Suppose a college predicts final grade category using study hours, attendance, assignment score, test score, and classroom participation.

A student may earn A Grade not because of one feature alone, but because several positive signals combine together. Likewise, weak attendance and low test scores together may indicate lower grade categories. This is why feature engineering often improves multiclass model performance significantly.
Python Example:
This code trains a multiclass Logistic Regression model to predict a student's final grade category using multiple features such as study hours, attendance, assignment score, test score, and classroom participation.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
X = np.array([
[2, 60, 55, 50, 40],
[3, 65, 58, 54, 45],
[4, 72, 68, 65, 60],
[5, 78, 72, 70, 65],
[6, 85, 82, 80, 78],
[7, 92, 90, 88, 85],
[5, 75, 70, 72, 68],
[6, 88, 85, 84, 80]
])
y = np.array([0, 0, 1, 1, 2, 2, 1, 2])
# Scale features
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)
# Train model
model = LogisticRegression(max_iter=1000)
model.fit(X_scaled, y)
# Predict new student
student = [[6, 86, 84, 82, 79]]
student_scaled = scaler.transform(student)
pred = model.predict(student_scaled)[0]
prob = model.predict_proba(student_scaled)[0]
print("Predicted Class:", pred)
print("Probabilities:", prob.round(3))
Different features have different ranges:
- Study hours = 2 to 7
- Attendance = 60 to 92
- Scores = 50 to 90
Scaling converts them to a similar range so Logistic Regression can learn faster and more accurately. Before prediction, the student data must be scaled using the same scaler. We use transform(), not fit_transform(), because we apply the already learned scaling rules.
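A quick check with hypothetical two-column data confirms what StandardScaler does: after fit_transform(), every column has mean 0 and standard deviation 1:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical columns: study hours and attendance, on very different scales
X = np.array([[2.0, 60.0],
              [4.0, 70.0],
              [6.0, 80.0],
              [8.0, 90.0]])

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Each column is now centered at 0 with unit standard deviation
print("Means:", X_scaled.mean(axis=0).round(6))
print("Stds:", X_scaled.std(axis=0).round(6))
```

Because the scaler stores the column means and standard deviations learned during fit, calling transform() on new data applies exactly the same shift and scale, which is why prediction-time data must go through transform(), never a fresh fit_transform().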
Example output:
Predicted Class: 2
Probabilities: [0.004 0.218 0.778]
The model uses all five features together instead of relying on only one signal. Strong attendance, high scores, and good participation together increase the probability of a better grade. This is why multiple-feature multiclass models are often much more powerful than single-feature models.
Conclusion
Multiclass classification is used when the target contains three or more categories. Instead of predicting only yes or no, the model compares several possible outcomes and selects the most likely one.

With good features, proper evaluation, and enough training data, multiclass classification becomes a powerful solution for many real-world prediction systems.