1. Support Vector Machine (SVM)
   •   Concept
       SVM aims to find the optimal hyperplane that best separates different classes in the
       feature space with the maximum margin.
   •   Working Principle
           o   For linearly separable data, SVM finds a hyperplane that maximizes the
               margin between classes.
           o   For non-linear data, it uses kernel functions (e.g., RBF, polynomial) to
               transform data into a higher-dimensional space where it may be linearly
               separable.
   •   Key Parameters (see the sketch after this list)
           o   KernelFunction: Defines the type of kernel used — options include 'linear',
               'rbf', 'polynomial'.
           o   BoxConstraint (C): Controls the trade-off between a wide margin and
               classification errors (penalty for misclassification).
           o   KernelScale: A scaling factor applied to the kernel function, affecting the
               shape of the decision boundary.
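   A minimal sketch of these parameters in a MATLAB fitcsvm call (X, y, and the chosen values
   are placeholders, not part of the original question). fitcsvm trains binary SVMs; Q2 below
   wraps them in fitcecoc for the three-class iris data:
       % Hypothetical data: X is an N-by-d feature matrix, y a vector of binary labels
       mdl = fitcsvm(X, y, ...
           'KernelFunction', 'rbf', ...   % kernel type: 'linear', 'rbf', 'polynomial'
           'BoxConstraint', 1, ...        % C: penalty weight on misclassifications
           'KernelScale', 'auto');        % kernel input scaling (heuristic selection)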
2. Decision Tree (DT)
   •   Concept
       A Decision Tree is a flowchart-like structure, where:
           o   Internal nodes represent feature-based decision tests.
           o   Branches represent outcomes of these tests.
           o   Leaf nodes represent class labels or decisions.
   •   Working Principle
       The tree splits the dataset using features that provide the best separation between
       classes, typically using Gini impurity or entropy (information gain).
   •   Key Parameters (see the sketch after this list)
           o   MaxNumSplits: The maximum number of binary splits allowed — controls
               tree depth.
            o   SplitCriterion: Criterion for splitting — in MATLAB, 'gdi' (Gini's diversity
                index) or 'deviance' (cross entropy).
          o   MinLeafSize: Minimum number of observations required in a leaf node to
              avoid overfitting.
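   For a node with class proportions p_k, Gini impurity is 1 − Σ_k p_k², and the split that
   reduces it most is chosen. A minimal fitctree sketch (X, y, and the values are placeholders):
       mdl = fitctree(X, y, ...
           'MaxNumSplits', 20, ...       % cap on binary splits (limits tree depth)
           'SplitCriterion', 'gdi', ...  % Gini's diversity index; 'deviance' = entropy
           'MinLeafSize', 5);            % require at least 5 observations per leaf
       view(mdl, 'Mode', 'graph');       % inspect the fitted tree interactively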
3. Random Forest
   •   Concept
       A Random Forest is an ensemble learning method that combines multiple decision
       trees trained on bootstrapped subsets of data and features.
   •   Working Principle
          o   Each tree is trained independently using a random subset of features and
              data (bagging).
          o   Final prediction is made by majority voting (for classification) or averaging
              (for regression).
          o   Helps improve accuracy and reduce overfitting.
   •   Key Parameters (see the sketch after this list)
          o   NumLearningCycles: The number of trees (iterations) in the ensemble.
          o   Method: Type of task — 'classification' or 'regression'.
          o   Learners: Specifies the type of base learners, typically decision trees.
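   A minimal sketch with fitcensemble, which exposes exactly these parameter names (X, y, and
   the values are placeholders); Q2 below uses the alternative TreeBagger interface, whose
   Method option is the one that takes 'classification' or 'regression':
       t = templateTree('MinLeafSize', 5);   % base learner: a single decision tree
       mdl = fitcensemble(X, y, ...
           'Method', 'Bag', ...              % bootstrap aggregation (random-forest style)
           'NumLearningCycles', 100, ...     % number of trees grown
           'Learners', t);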
4. k-Nearest Neighbors (KNN)
   •   Concept
       KNN is a non-parametric, instance-based learning algorithm. It classifies a data point
       based on the majority label among its k closest neighbors in the feature space.
   •   Working Principle
          o   It calculates distances (e.g., Euclidean) from the test point to all training
              points.
          o   Finds the k nearest points and assigns the most common label among them.
   •   Key Parameters (see the sketch after this list)
          o   NumNeighbors: The number of neighbors (k) used to make the prediction.
          o   Distance: The distance metric — e.g., 'euclidean', 'cityblock', 'cosine'.
           o   DistanceWeight: How neighbors are weighted — 'equal' (all neighbors count
               equally), 'inverse', or 'squaredinverse' (closer points have more weight).
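   A minimal fitcknn sketch showing these parameters (X, y, and the chosen values are
   placeholders):
       mdl = fitcknn(X, y, ...
           'NumNeighbors', 5, ...          % k: how many neighbors vote
           'Distance', 'euclidean', ...    % alternatives: 'cityblock', 'cosine', ...
           'DistanceWeight', 'inverse');   % closer neighbors get larger weight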
Q2. Write a MATLAB program to compare the classification results of the above classifiers. Use
built-in functions for each classifier and describe their parameters in detail.
clc;
clear;
close all;
%% Load and preprocess dataset
load fisheriris
X = meas;
y = grp2idx(species); % Convert categorical to numeric (1, 2, 3)
% Standardize features (z-score normalization)
[X, mu, sigma] = zscore(X);
%% Initialize classifier names and results
models = {'SVM', 'Decision Tree', 'Random Forest', 'KNN'};
accuracy = zeros(size(models));
confMatrices = cell(size(models));
%% 10-fold cross-validation setup
cv = cvpartition(y, 'KFold', 10);
%% Classifier Loop
for i = 1:length(models)
    predictions = zeros(size(y));
    for fold = 1:cv.NumTestSets
        % Split data for current fold
        trainIdx = training(cv, fold);
        testIdx = test(cv, fold);
        Xtrain = X(trainIdx, :);
        Ytrain = y(trainIdx);
        Xtest = X(testIdx, :);
        Ytest = y(testIdx);
        switch models{i}
            case 'SVM'
                % Multiclass SVM via ECOC with linear-kernel binary learners
                model = fitcecoc(Xtrain, Ytrain, 'Learners', ...
                    templateSVM('KernelFunction', 'linear', 'BoxConstraint', 1));
            case 'Decision Tree'
                % Fine tree (maximum splits = 100)
                model = fitctree(Xtrain, Ytrain, 'MaxNumSplits', 100, ...
                    'SplitCriterion', 'gdi');
            case 'Random Forest'
                % 100-tree bagged ensemble
                model = TreeBagger(100, Xtrain, Ytrain, ...
                    'Method', 'classification', ...
                    'OOBPrediction', 'off', ...
                    'MinLeafSize', 5);
            case 'KNN'
                % KNN with 3 neighbors using Euclidean distance
                model = fitcknn(Xtrain, Ytrain, ...
                    'NumNeighbors', 3, ...
                    'Distance', 'euclidean', ...
                    'Standardize', false);
        end
        % Predict on the held-out fold
        if strcmp(models{i}, 'Random Forest')
            ypred = str2double(predict(model, Xtest)); % TreeBagger returns a cell array of labels
        else
            ypred = predict(model, Xtest);
        end
        % Store predictions
        predictions(testIdx) = ypred;
    end
    % Evaluate aggregated cross-validated predictions
    confMatrices{i} = confusionmat(y, predictions);
    accuracy(i) = sum(predictions == y) / length(y);
    % Display results
    fprintf('\n%s Accuracy (10-fold CV): %.2f%%\n', models{i}, accuracy(i)*100);
    disp('Confusion Matrix:');
    disp(confMatrices{i});
    % Confusion chart
    figure;
    confusionchart(confMatrices{i}, {'Setosa', 'Versicolor', 'Virginica'});
    title([models{i} ' - Confusion Matrix']);
end
%% Bar Chart for Comparison
figure;
bar(accuracy * 100);
title('Classification Accuracy Comparison (10-Fold CV)');
ylabel('Accuracy (%)');
xticklabels(models);
grid on;
Q3. Fuzzy Logic Controller Using Mamdani-Type FIS in MATLAB
Objective
To design a Mamdani-type Fuzzy Logic Controller (FLC) for motor speed control with the following parameters:
     •     Inputs:
               1.    Error (E) = Desired Speed − Current Speed
               2.    Change in Error (dE) = Current Error − Previous Error
     •     Output:
               o     Control Signal (Voltage) to adjust motor speed.
MATLAB Code Implementation
1. Create Mamdani FIS
fis = mamfis('Name', 'MotorSpeedController');
2. Define Input 1: Error (E)
fis = addInput(fis, [-10 10], 'Name', 'Error');
fis = addMF(fis, 'Error', 'trimf', [-10 -10 0], 'Name', 'Negative');
fis = addMF(fis, 'Error', 'trimf', [-10 0 10], 'Name', 'Zero');
fis = addMF(fis, 'Error', 'trimf', [0 10 10], 'Name', 'Positive');
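In trimf, the parameter vector [a b c] gives the left foot, peak, and right foot of the triangle,
so [-10 -10 0] is a shoulder-shaped membership function that peaks at -10 and falls to zero at 0.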
3. Define Input 2: Change in Error (dE)
fis = addInput(fis, [-5 5], 'Name', 'dError');
fis = addMF(fis, 'dError', 'trimf', [-5 -5 0], 'Name', 'Negative');
fis = addMF(fis, 'dError', 'trimf', [-5 0 5], 'Name', 'Zero');
fis = addMF(fis, 'dError', 'trimf', [0 5 5], 'Name', 'Positive');
4. Define Output: Control Signal
fis = addOutput(fis, [-1 1], 'Name', 'Control');
fis = addMF(fis, 'Control', 'trimf', [-1 -1 0], 'Name', 'Decrease');
fis = addMF(fis, 'Control', 'trimf', [-1 0 1], 'Name', 'Zero');
fis = addMF(fis, 'Control', 'trimf', [0 1 1], 'Name', 'Increase');
5. Define Fuzzy Rules
% Each row: [Error-MF  dError-MF  Control-MF  weight  connective(1 = AND)]
ruleList = [
     1 1 1 1 1;  % Error Negative, dError Negative -> Decrease
     1 2 1 1 1;  % Error Negative, dError Zero     -> Decrease
     1 3 2 1 1;  % Error Negative, dError Positive -> Zero
     2 1 1 1 1;  % Error Zero,     dError Negative -> Decrease
     2 2 2 1 1;  % Error Zero,     dError Zero     -> Zero
     2 3 3 1 1;  % Error Zero,     dError Positive -> Increase
     3 1 2 1 1;  % Error Positive, dError Negative -> Zero
     3 2 3 1 1;  % Error Positive, dError Zero     -> Increase
     3 3 3 1 1;  % Error Positive, dError Positive -> Increase
];
fis = addRule(fis, ruleList);
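To confirm the rule base reads as intended, showrule prints each rule in linguistic form:
showrule(fis)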
6. Visualize the FIS and Surface
figure;
plotfis(fis);
figure;
gensurf(fis); % Shows 3D surface of output vs inputs
7. Test FIS with Sample Input
in = [4 -1]; % Example: Error = 4, dError = -1 ('in' avoids shadowing the built-in input function)
out = evalfis(fis, in);
disp(['Control Signal Output: ', num2str(out)]);
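As a rough closed-loop sanity check (not part of the original task), the FIS can drive a
hypothetical first-order plant; the constants a and b below are assumptions, and some
steady-state error is expected because the rule base has no integral action:
% Hypothetical first-order motor model: speed(k+1) = a*speed(k) + b*u(k)
a = 0.9; b = 1;                       % assumed plant constants
desired = 5; speed = 0; prevErr = 0;
for k = 1:50
    err  = desired - speed;           % Error input
    dErr = err - prevErr;             % Change-in-error input
    u = evalfis(fis, [max(min(err,10),-10), max(min(dErr,5),-5)]); % clip to FIS ranges
    speed = a*speed + b*u;            % plant update
    prevErr = err;
end
fprintf('Speed after 50 steps: %.2f (target %.0f)\n', speed, desired);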
Simulink Integration
    1.     Open Simulink.
    2.     Add the "Fuzzy Logic Controller" block from the Fuzzy Logic Toolbox library.
    3.     Export the FIS to a file and point the block to it:
writeFIS(fis, 'MotorSpeedController.fis');
    4.     Design a Simulink model:
                o    Inputs: Desired Speed − Actual Speed (Error)
                o    Output: Control Signal to a DC motor or plant model
Summary
This controller uses a Mamdani-type Fuzzy Inference System to encode expert control logic for motor
speed regulation, applying linguistic rules over the error and its rate of change to generate a
voltage output. The approach offers:
    •      A smooth control response
    •      Intuitive control without requiring a precise mathematical model of the plant
    •      Robust handling of non-linearities in motor behavior