ML CP-23-24 EVEN As On 81.25
DEPT. OF CSE
YEAR/SEMESTER: II/IV
PREPARED BY
MS. B. SANGEETHA, AP/CSE
Review of Linear Algebra for machine learning - Introduction and motivation for machine learning - Examples of machine learning applications - Vapnik-Chervonenkis (VC) dimension - Probably Approximately Correct (PAC) learning - Hypothesis spaces - Inductive bias - Generalization - Bias-variance trade-off.
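The bias-variance trade-off listed above lends itself to a small experiment. The following is a minimal illustrative sketch in Python, not part of the prescribed texts; the dataset, polynomial degrees, and noise level are all invented for demonstration:

```python
# Illustrative sketch of the bias-variance trade-off (all constants invented).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
true_f = np.sin(2 * np.pi * x)          # assumed ground-truth function

def bias_variance(degree, n_trials=200, noise=0.3):
    """Estimate squared bias and variance of a degree-d polynomial fit."""
    preds = np.empty((n_trials, x.size))
    for t in range(n_trials):
        y = true_f + rng.normal(0, noise, x.size)   # fresh noisy sample
        coeffs = np.polyfit(x, y, degree)
        preds[t] = np.polyval(coeffs, x)
    bias_sq = np.mean((preds.mean(axis=0) - true_f) ** 2)
    variance = preds.var(axis=0).mean()
    return bias_sq, variance

# Low degree: high bias, low variance; high degree: the reverse.
for d in (1, 3, 10):
    b, v = bias_variance(d)
    print(f"degree {d:2d}: bias^2 = {b:.4f}, variance = {v:.4f}")
```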
Linear Regression Models: Least squares, single & multiple variables - Bayesian linear regression - Gradient descent. Linear Classification Models: Discriminant function - Perceptron algorithm - Probabilistic discriminative model: Logistic regression - Probabilistic generative model: Naive Bayes - Maximum margin classifier: Support Vector Machine - Decision Tree - Random Forests.
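For the least-squares and gradient-descent topics above, here is a hedged NumPy sketch (synthetic data; the true weights, learning rate, and iteration count are arbitrary choices) contrasting the closed-form normal-equation solution with batch gradient descent:

```python
# Sketch: closed-form least squares vs. batch gradient descent (invented data).
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.uniform(0, 5, 100)])  # bias column + feature
w_true = np.array([2.0, -1.5])
y = X @ w_true + rng.normal(0, 0.2, 100)

# Closed form: w = (X^T X)^{-1} X^T y
w_ls = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent on the mean squared error
w = np.zeros(2)
lr = 0.05
for _ in range(2000):
    grad = 2 / len(y) * X.T @ (X @ w - y)
    w -= lr * grad

print("least squares:    ", w_ls)   # both should be close to [2.0, -1.5]
print("gradient descent: ", w)
```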
Guidelines for machine learning experiments - Cross-Validation (CV) and resampling - K-fold CV, bootstrapping - Measuring classifier performance - Assessing a single classification algorithm and comparing two classification algorithms - t-test, McNemar's test, K-fold CV paired t-test.
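For the K-fold CV paired t-test named above, the following sketch shows one common realisation using scikit-learn and SciPy; the dataset and the two classifiers compared are arbitrary stand-ins, not prescribed by the syllabus:

```python
# Sketch: K-fold cross-validated paired t-test between two classifiers.
import numpy as np
from scipy import stats
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset
kf = KFold(n_splits=10, shuffle=True, random_state=0)

acc_a, acc_b = [], []
for train, test in kf.split(X):
    a = LogisticRegression(max_iter=5000).fit(X[train], y[train])
    b = DecisionTreeClassifier(random_state=0).fit(X[train], y[train])
    acc_a.append(a.score(X[test], y[test]))
    acc_b.append(b.score(X[test], y[test]))

# Paired t-test on the per-fold accuracies
t, p = stats.ttest_rel(acc_a, acc_b)
print(f"t = {t:.3f}, p = {p:.3f}")   # a small p suggests a real difference
```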
TOTAL: 45 PERIODS
COURSE OBJECTIVES
1. To understand the basic concepts of machine learning.
2. To understand and build supervised learning models.
3. To understand and build unsupervised learning models.
4. To evaluate the algorithms based on the corresponding metrics identified.
TEXT BOOKS
T1. Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Fourth Edition, 2020.
T2. Stephen Marsland, "Machine Learning: An Algorithmic Perspective", Second Edition, CRC Press, 2014.
REFERENCE BOOKS
R1. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
R2. Tom Mitchell, "Machine Learning", McGraw Hill, 1997.
R3. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”,
Second Edition, MIT Press, 2018.
R4. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
R5. Sebastian Raschka, Vahid Mirjalili, "Python Machine Learning", Packt Publishing, 3rd Edition, 2019.
WEB RESOURCES
W1. https://www.tableau.com/learn/articles/machine-learning-examples (Topic No. 03)
W4. https://www.techtarget.com/searchenterpriseai/definition/backpropagation-algorithm (Topic No. 35)
Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
UNIT I INTRODUCTION TO MACHINE LEARNING 8+1
1 | Review of Linear Algebra for machine learning | T1 | 1-3 | BB/PPT | 1 | 1
2 | Introduction and motivation for machine learning | T1 | 3-4 | BB/PPT | 1 | 2
3 | Examples of machine learning applications | T1, W1 | 4-13 | BB/PPT | 1 | 3
4 | Vapnik-Chervonenkis (VC) dimension | T1 | 21-27 | L.VIDEO | 1 | 4
5 | Probably Approximately Correct (PAC) learning | T1 | 27-29 | BB/PPT | 1 | 5
6 | Hypothesis spaces | T2 | 40-42 | BB/PPT | 1 | 6
7 | Inductive bias | T1 | 329-330 | SEMINAR | 1 | 7
8 | Generalization | T1 | 331-335 | BB/PPT | 1 | 8
9 | Bias-variance trade-off | T1 | 336-340 | BB/PPT | 1 | 9
LEARNING OUTCOME
At the end of the unit, students will be able to
Gain knowledge of foundational learning theory, including the VC dimension and PAC learning
Explain hypothesis spaces, inductive bias and the bias-variance trade-off
Identify and implement machine learning applications
UNIT II SUPERVISED LEARNING 11+1
Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
10 | Linear Regression Models | R4 | 1-8 | NPTEL | 1 | 10
20 | Support vector machine, Decision Tree | R3 | 79-90 | BB/PPT | 1 | 20
21 | Random Forests | R2 | 75-90 | BB/PPT | 1 | 21
LEARNING OUTCOME
At the end of the unit, students will be able to
Understand classification under uncertainty using probabilistic models
Implement probabilistic reasoning techniques to solve problems
Gain knowledge of logistic regression
LEARNING OUTCOME
At the end of the unit, students will be able to
Implement various supervised machine learning algorithms and build models
Appraise the performance of the built models
Write the K-means and KNN algorithms
Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
37 | Unit saturation (aka the vanishing gradient problem) | T2 | 43-45 | BB/PPT | 1 | 38
38 | ReLU | T2 | 50-55 | L.VIDEO | 1 | 39
39 | Hyperparameter tuning | T2 | 58-59 | BB/PPT | 1 | 40
40 | Batch normalization, regularization, dropout | R2 | 110-115 | BB/PPT | 1 | 41
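As a companion to topics 37-38 above, this small NumPy sketch (the pre-activation value and depth are invented for illustration) shows why saturating sigmoid units shrink gradients multiplicatively with depth while ReLU units do not:

```python
# Sketch: vanishing gradients with sigmoid vs. ReLU (invented numbers).
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = 2.5        # a moderately large pre-activation (saturating region)
depth = 10     # number of layers the gradient is propagated through

# The sigmoid derivative is at most 0.25, so products across layers shrink fast.
sig_grad = (sigmoid(z) * (1 - sigmoid(z))) ** depth
# The ReLU derivative is exactly 1 for positive inputs, so nothing shrinks.
relu_grad = 1.0 ** depth

print(f"sigmoid gradient after {depth} layers: {sig_grad:.2e}")
print(f"ReLU gradient after {depth} layers:    {relu_grad:.2e}")
```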
LEARNING OUTCOME
At the end of the unit, students will be able to
Apply ensemble techniques to improve model performance
Implement various unsupervised machine learning algorithms and build models
Appraise the performance of neural networks
LEARNING OUTCOME
At the end of the unit, students will be able to
Gain knowledge of neural networks
Implement deep learning techniques and build models
Design and analyse machine learning experiments
COURSE OUTCOME
After the completion of this course, students will be able to:
ASSIGNMENT DETAILS
Assignment | I | II
Topic Nos. for reference | 1-26 | 27-49
Deadline | |
EVALUATION
Coding: Problem Understanding - 10 Marks; Coding - 10 Marks; Execution - 10 Marks; Q&A - 10 Marks
Poster Presentation: Poster Design - 20 Marks; Explanation - 15 Marks; Q&A - 5 Marks
Flash Card: Card Design - 10 Marks; Presentation - 25 Marks; Q&A - 5 Marks
Case Study: Presentation - 40 Marks; Q&A - 20 x 2 = 40 Marks
Seminar: Presentation - 15 Marks; Communication - 5 Marks; Report - 15 Marks; Q&A - 5 Marks
Problem Solving: Successful Completion of Badge - 40 Marks
13.a.i Construct a decision tree for the expression A = (X AND Y) OR Z. 6 L3 CO3
ii. Provide an outline of the ID3 algorithm used for inducing a decision tree from training tuples. Also list the different attribute selection measures used in the process of decision tree induction. 7
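One possible worked sketch for Q13.a (illustrative, not a prescribed solution): enumerate the truth table of A = (X AND Y) OR Z and compute the ID3 information gain of each attribute at the root, which is the attribute selection step the question asks about:

```python
# Sketch: truth table of A = (X AND Y) OR Z and ID3 information gain at the root.
import itertools
import math

rows = [(x, y, z, (x and y) or z) for x, y, z in itertools.product([0, 1], repeat=3)]

def entropy(labels):
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

labels = [a for *_, a in rows]
base = entropy(labels)           # entropy of A over all 8 rows

for i, name in enumerate(["X", "Y", "Z"]):
    gain = base
    for v in (0, 1):
        subset = [r[3] for r in rows if r[i] == v]
        gain -= len(subset) / len(rows) * entropy(subset)
    print(f"gain({name}) = {gain:.3f}")   # ID3 splits on the largest gain (Z here)
```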
13.b.i Explain the Support Vector Machine from the perspective of a non-linear kernel by means of an algorithm. 6 L3 CO3
ii. Derive the margin of the support vectors with an example and depict it with the necessary diagrams. 7
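For Q13.b.ii, a hedged sketch of recovering the margin 2/||w|| from a fitted linear SVM; scikit-learn is assumed and the six training points are invented for illustration:

```python
# Sketch: margin of a hard-margin linear SVM on invented 2-D data.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 1], [2, 1], [1, 2],      # class -1
              [4, 4], [5, 4], [4, 5]])     # class +1
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # very large C approximates hard margin

w = clf.coef_[0]
margin = 2 / np.linalg.norm(w)               # distance between the two margin planes
print("support vectors:\n", clf.support_vectors_)
print(f"margin = {margin:.3f}")
```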
14.a. Explain the steps in the K-means algorithm. Cluster the following set of 4 objects into two clusters using K-means: A(3,5), B(4,5), C(1,3), D(2,4). Consider objects A and C as the initial cluster centers. 13 L3 CO4
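A sketch of one possible solution path for Q14.a, iterating the assignment and mean-update steps in NumPy from the initial centers A and C that the question specifies:

```python
# Sketch: K-means on the four points from Q14.a, initial centers A and C.
import numpy as np

points = np.array([[3, 5], [4, 5], [1, 3], [2, 4]], dtype=float)  # A, B, C, D
centers = points[[0, 2]].copy()                                   # start at A and C

for step in range(10):
    # Assignment step: each point goes to its nearest center (Euclidean distance)
    d = np.linalg.norm(points[:, None] - centers[None], axis=2)
    assign = d.argmin(axis=1)
    # Update step: each center moves to the mean of its assigned points
    new_centers = np.array([points[assign == k].mean(axis=0) for k in range(2)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

print("assignments (0 = cluster seeded by A, 1 = by C):", assign)
print("final centers:\n", centers)
```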
14.b. With a suitable illustration, explain the Gaussian mixture model. 13 L3 CO4
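For Q14.b, a minimal sketch (assuming scikit-learn; the two synthetic blobs are invented) of fitting a two-component Gaussian mixture and reading off its soft assignments:

```python
# Sketch: two-component Gaussian mixture model on invented 2-D data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 2)),      # blob around (0, 0)
               rng.normal(5, 1, (50, 2))])     # blob around (5, 5)

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print("estimated means:\n", gmm.means_)
# Soft (probabilistic) assignment, unlike K-means' hard assignment:
print("soft assignment of first point:", gmm.predict_proba(X[:1]).round(3))
```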
15.a.i Design a multilayer perceptron that solves the XOR problem. 7 L3 CO5
ii. Write the algorithm for the above and illustrate it. 6
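One classic construction for Q15.a (one of many valid answers): a 2-2-1 network of step units with hand-chosen weights, where the hidden units compute OR and NAND and the output unit ANDs them, yielding XOR:

```python
# Sketch: a 2-2-1 multilayer perceptron computing XOR with fixed step-unit weights.
import numpy as np

step = lambda z: (z >= 0).astype(int)   # Heaviside threshold activation

# Hidden layer: h1 = OR(x1, x2), h2 = NAND(x1, x2)
W1 = np.array([[1.0, 1.0],        # weights into h1
               [-1.0, -1.0]])     # weights into h2
b1 = np.array([-0.5, 1.5])
# Output layer: y = AND(h1, h2)  ->  XOR(x1, x2)
W2 = np.array([1.0, 1.0])
b2 = -1.5

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    h = step(W1 @ np.array(x) + b1)
    y = step(W2 @ h + b2)
    print(x, "->", int(y))        # prints the XOR truth table
```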
15.b.i. Suppose that we want to build a neural network that classifies two-dimensional data (i.e., X = [x1, x2]) into two classes: diamonds and crosses. We have a set of training data that is plotted as follows: 7 L3 CO5
[Question distribution summary: 15a, 15b | 14a, 14b | 16a, 16b; TOTAL 19 30 51]
Prepared by: Mrs. B. SANGEETHA
Verified by: HOD/CSE
Approved by: PRINCIPAL