ML CP-23-24 EVEN As On 81.25

The document outlines the course plan for a Machine Learning class (AL3451) offered by the Department of Artificial Intelligence and Data Science for the B.Tech program. It includes course objectives, unit topics, learning outcomes, assessment details, and references for study materials. The course covers various aspects of machine learning including supervised and unsupervised learning, neural networks, and experimental design.

Uploaded by sangeetha.cse

FORMAT:QB09 KCE/DEPT.OFCSE

DEPARTMENT OF ARTIFICIAL INTELLIGENCE AND DATA SCIENCE

SUB NAME: MACHINE LEARNING

COURSE PLAN (AL3451)


(Version 1)

YEAR/SEMESTER: II/IV

PREPARED BY
MS. B. SANGEETHA, AP/CSE

ML 3.1 KCE/B. Tech/AIDS/QB/IIYR/ML



AL3451 MACHINE LEARNING L T P C
SDG-09 3 0 0 3

UNIT I INTRODUCTION TO MACHINE LEARNING 8

Review of Linear Algebra for machine learning- Introduction and motivation for machine learning;
Examples of machine learning applications - Vapnik-Chervonenkis (VC) dimension- Probably
Approximately Correct (PAC) learning- Hypothesis spaces- Inductive bias –Generalization-Bias
variance trade-off.

UNIT II SUPERVISED LEARNING 11

Linear Regression Models: Least squares, single & multiple variables-Bayesian linear regression-
gradient descent, Linear Classification Models-Discriminant function – Perceptron algorithm-
Probabilistic discriminative model - Logistic regression-Probabilistic generative model – Naive
Bayes-Maximum margin classifier – Support vector machine- Decision Tree-Random Forests.

UNIT III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9

Combining multiple learners: Model combination schemes-Voting, Ensemble Learning – bagging-


boosting, stacking-Unsupervised learning: K-means-Instance Based Learning: KNN- Gaussian
mixture models and Expectation maximization.

UNIT IV NEURAL NETWORKS 9

Multilayer perceptron, activation functions-network training – gradient descent optimization –


stochastic gradient descent-error back propagation- from shallow networks to deep networks –Unit
saturation (aka the vanishing gradient problem) – ReLU, hyper parameter tuning-batch
normalization-regularization, dropout.

UNIT V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8

Guidelines for machine learning experiments-Cross Validation (CV) and resampling – K-fold CV,
bootstrapping-measuring classifier performance- assessing a single classification algorithm and
comparing two classification algorithms – t test, McNemar’s test, K-fold CV paired t test.

TOTAL:45 PERIODS

SIGNATURE OF STAFF IN-CHARGE HOD/CSE


(B.SANGEETHA)


DEPARTMENT OF ARTIFICIAL INTELLIGENCE AND DATA SCIENCE


COURSE PLAN

Sub. Code  : AL3451             Branch/Year/Sem : B.Tech AIDS/II/IV
Sub. Name  : Machine Learning   Batch           : 2023-2027
Staff Name : Ms. B. Sangeetha   Academic Year   : 2024-25 (EVEN)

COURSE OBJECTIVES
1. To understand the basic concepts of machine learning.
2. To understand and build supervised learning models.
3. To understand and build unsupervised learning models.
4. To evaluate the algorithms based on the corresponding metrics identified.

TEXT BOOK

T1. Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Fourth Edition, 2020.

T2. Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Second Edition, CRC
Press, 2014.

REFERENCE BOOKS

R1. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
R2. Tom Mitchell, “Machine Learning”, McGraw Hill, 3rd Edition, 1997.
R3. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”,
Second Edition, MIT Press, 2018.
R4. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
R5. Sebastian Raschka, Vahid Mirjalili, “Python Machine Learning”, Packt Publishing, 3rd Edition,
2019.

WEB RESOURCES

W1. https://www.tableau.com/learn/articles/machine-learning-examples (Topic No. 03)

W2. https://www.javatpoint.com/perceptron-in-machine-learning (Topic No. 15)

W3. https://www.geeksforgeeks.org/k-nearest-neighbours/ (Topic No. 29)

W4. https://www.techtarget.com/searchenterpriseai/definition/backpropagation-algorithm (Topic No. 35)


UNIT I INTRODUCTION TO MACHINE LEARNING 8+1

Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
1. | Review of Linear Algebra for machine learning | T1 | 1-3 | BB/PPT | 1 | 1
2. | Introduction and motivation for machine learning | T1 | 3-4 | BB/PPT | 1 | 2
3. | Examples of machine learning applications | T1, W1 | 4-13 | BB/PPT | 1 | 3
4. | Vapnik-Chervonenkis (VC) dimension | T1 | 21-27 | L.VIDEO | 1 | 4
5. | Probably Approximately Correct (PAC) learning | T1 | 27-29 | BB/PPT | 1 | 5
6. | Hypothesis spaces | T2 | 40-42 | BB/PPT | 1 | 6
7. | Inductive bias | T1 | 329-330 | SEMINAR | 1 | 7
8. | Generalization | T1 | 331-335 | BB/PPT | 1 | 8
9. | Bias variance trade-off | T1 | 336-340 | BB/PPT | 1 | 9

LEARNING OUTCOME
At the end of the unit, students will be able to
• Gain knowledge about problem solving agents
• Implement the search algorithms to solve problems
• Implement the machine learning applications
UNIT II SUPERVISED LEARNING 11+1

Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
10. | Linear Regression Models | R4 | 1-8 | NPTEL | 1 | 10
11. | Least squares, single & multiple variables | T1 | 676-686 | BB/PPT | 1 | 11
12. | Bayesian linear regression | T1 | 456-467 | BB/PPT | 1 | 12
13. | Gradient descent, Linear Classification Models | T2 | 467-482 | BB/PPT | 1 | 13
14. | Discriminant function | T1 | 55-57 | L.VIDEO | 1 | 14
15. | Perceptron algorithm | R2, W2 | 478-480 | BB/PPT | 1 | 15
16. | Probabilistic discriminative model | R1 | 196-200 | BB/PPT | 1 | 16
17. | Logistic regression | R1 | 205-207 | BB/PPT | 1 | 17
18. | Probabilistic generative model | R1 | 198-200 | BB/PPT | 1 | 18
19. | Naive Bayes, Maximum margin classifier | R1 | 147-152 | BB/PPT | 1 | 19


20. | Support vector machine, Decision Tree | R3 | 79-90 | BB/PPT | 1 | 20
21. | Random Forests | R2 | 75-90 | BB/PPT | 1 | 21
LEARNING OUTCOME
At the end of the unit, students will be able to
• Understand the uncertain environment
• Implement probabilistic reasoning techniques and solve problems
• Gain knowledge about logistic regression

UNIT III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9+1

Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
22. | Combining multiple learners | T2 | 225-227 | BB/PPT | 1 | 22
23. | Model combination schemes | T2 | 227-229 | BB/PPT | 1 | 23
24. | Voting, Ensemble Learning | T2 | 150-155 | BB/PPT | 1 | 24
25. | bagging, boosting | R3 | 145-154 | L.VIDEO | 1 | 25
26. | stacking, Unsupervised learning | T1 | 11-10 | BB/PPT | 1 | 26
27. | K-means | T1 | 432-435 | BB/PPT | 1 | 27
28. | Instance Based Learning | R2 | 230-247 | BB/PPT | 1 | 28
29. | KNN | R2, W3 | 247-250 | SEMINAR | 1 | 29
30. | Gaussian mixture models and Expectation maximization | T2 | 451-482 | BB/PPT | 2 | 31

LEARNING OUTCOME
At the end of the unit, students will be able to
• Implement various supervised machine learning algorithms and build models
• Appraise the performance of the built model
• Write the K-means and KNN algorithms

UNIT IV NEURAL NETWORKS 9+1

Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
31. | Multilayer perceptron, activation functions | R1 | 225-230 | BB/PPT | 1 | 32
32. | network training | R1 | 231-232 | BB/PPT | 1 | 33
33. | gradient descent optimization | R1 | 235-236 | BB/PPT | 1 | 34
34. | stochastic gradient descent | R1 | 239-240 | BB/PPT | 1 | 35
35. | error back propagation | R1, W4 | 240-247 | BB/PPT | 1 | 36
36. | from shallow networks to deep networks | T2 | 40-42 | BB/PPT | 1 | 37


37. | Unit saturation (aka the vanishing gradient problem) | T2 | 43-45 | BB/PPT | 1 | 38
38. | ReLU | T2 | 50-55 | L.VIDEO | 1 | 39
39. | Hyper parameter tuning | T2 | 58-59 | BB/PPT | 1 | 40
40. | batch normalization, regularization, dropout | R2 | 110-115 | BB/PPT | 1 | 41

LEARNING OUTCOME
At the end of the unit, students will be able to
• Apply ensemble techniques to improve model performance
• Implement various unsupervised machine learning algorithms and build models
• Appraise the performance of neural networks

UNIT V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8+1

Topic No | Topic | Books for Reference | Page No. | Teaching Methodology | No. of Hours Required | Cumulative No. of Periods
41. | Guidelines for machine learning experiments | T1 | 547-550 | BB/PPT | 1 | 42
42. | Cross Validation (CV) and resampling | T1, R5 | 558-560; 191-192 | L.VIDEO | 1 | 43
43. | K-fold CV | T1 | 558-559 | BB/PPT | 1 | 44
44. | Bootstrapping | T1 | 560-561 | BB/PPT | 1 | 45
45. | measuring classifier performance | T1, R5 | 560-562; 255-256 | BB/PPT | 1 | 46
46. | assessing a single classification algorithm | T1 | 10-15 | SEMINAR | 1 | 47
47. | comparing two classification algorithms | T1 | 16-20 | BB/PPT | 1 | 48
48. | t test, McNemar’s test | R5 | 300-305 | BB/PPT | 1 | 49
49. | K-fold CV paired t test | R5 | 306-310 | BB/PPT | 1 | 50

LEARNING OUTCOME
At the end of the unit, students should be able to
• Gain knowledge about neural networks
• Implement deep learning techniques and build models
• Design and analyse machine learning experiments


COURSE OUTCOME
After the completion of this course, students will be able to:

CO1: Explain the basic concepts of machine learning

CO2: Construct supervised learning models

CO3: Construct unsupervised learning algorithms

CO4: Evaluate and compare different models

CONTENT BEYOND THE SYLLABUS

Regression model using Cyber Security

INTERNAL ASSESSMENT DETAILS

ASSESSMENT NO. | CAT I | CAT II | CAT III
Topic Nos.     | 1-15  | 16-30  | 31-49
Date           |       |        |

ASSIGNMENT DETAILS

ASSIGNMENT               | I    | II
Topic Nos. for reference | 1-26 | 27-49
Deadline                 |      |


Class Strength: 27    Before CAT-I    MARKS: 40


ASSIGNMENT–I
Sl.NO Roll No ACTIVITY TOPIC
1 23AI01 SEMINAR Review of Linear Algebra for machine learning
2 23AI02 SEMINAR Examples of machine learning applications
3 23AI03 FLASH CARD Vapnik-Chervonenkis (VC) dimension
4 23AI04 FLASH CARD Probably Approximately Correct (PAC)
5 23AI05 FLASH CARD Hypothesis spaces
6 23AI06 POSTER PRESENTATION Review of Linear Algebra for machine learning
7 23AI07 POSTER PRESENTATION Examples of machine learning applications
8 23AI08 POSTER PRESENTATION Linear Regression Models
9 23AI09 POSTER PRESENTATION Least squares, single & multiple variables
10 23AI10 THINK BREAK Bayesian linear regression
11 23AI11 THINK BREAK gradient descent, Linear Classification Models
12 23AI12 FLASH CARD Discriminant function
13 23AI13 SEMINAR Perceptron algorithm
14 23AI15 FLASH CARD Probabilistic discriminative model
15 23AI16 FLASH CARD Logistic regression
16 23AI17 CASE STUDY PRESENTATION Probabilistic generative model
17 23AI18 FLASH CARD PRESENTATION Naive Bayes, Maximum margin classifier
18 23AI19 THINK BREAK PRESENTATION Support vector machine
19 23AI20 CASE STUDY Decision Tree
20 23AI21 SEMINAR PRESENTATION Random Forests
21 23AI22 THINK BREAK Combining multiple learners
22 23AI23 CODING Model combination schemes
23 23AI24 CODING Ensemble Learning
24 23AI25 CASE STUDY bagging
25 23AI26 THINK BREAK PRESENTATION boosting
26 23AI27 CODING Stacking
27 23AI28 CODING Unsupervised learning

EVALUATION

Coding:
• Problem Understanding: 10 Marks
• Coding: 10 Marks
• Execution: 10 Marks
• Q&A: 10 Marks

Poster Presentation:
• Poster Design: 20 Marks
• Explanation: 15 Marks
• Q&A: 5 Marks

Flash Card:
• Card Design: 10 Marks
• Presentation: 25 Marks
• Q&A: 5 Marks

Case Study:
• Presentation: 40 Marks
• Q&A: 20 x 2 = 40 Marks

Seminar:
• Presentation: 15 Marks
• Communication: 5 Marks
• Report: 15 Marks
• Q&A: 5 Marks

Problem Solving:
• Successful Completion of Badge: 40 Marks


Class Strength: 27    Before CAT-II    MARKS: 40


ASSIGNMENT–II

Sl.No | Roll No | ACTIVITY | TOPIC
1 23AI01 APPLICATION ON CONCEPT Multilayer perceptron & activation functions
2 23AI02 MIND MAP network training
3 23AI03 CASE STUDY gradient descent optimization
4 23AI04 CASE STUDY stochastic gradient descent
5 23AI05 CASE STUDY error back propagation
6 23AI06 MIND MAP from shallow networks to deep networks
7 23AI07 PICTURE PROMPT Unit saturation (aka the vanishing gradient problem)
8 23AI08 MIND MAP ReLU
9 23AI09 MIND MAP Hyper parameter tuning
10 23AI10 CASE STUDY batch normalization
11 23AI11 APPLICATION ON CONCEPT regularization, dropout
12 23AI12 PICTURE PROMPT Guidelines for machine learning experiments
13 23AI13 MIND MAP Cross Validation (CV)
14 23AI14 MIND MAP resampling
15 23AI15 CASE STUDY K-fold CV
16 23AI16 APPLICATION ON CONCEPT Bootstrapping
17 23AI17 CASE STUDY measuring classifier performance
18 23AI18 MIND MAP assessing a single classification algorithm
19 23AI19 APPLICATION ON CONCEPT t test
20 23AI20 CASE STUDY McNemar’s test
21 23AI21 PICTURE PROMPT Introduction and motivation for machine learning
22 23AI22 PICTURE PROMPT KNN types
23 23AI23 APPLICATION ON CONCEPT causal networks
24 23AI24 APPLICATION ON CONCEPT K-means types
25 23AI25 PICTURE PROMPT Agents
26 23AI26 APPLICATION ON CONCEPT Heuristic search strategies
27 23AI27 PICTURE PROMPT Gaussian mixture models and Expectation maximization

EVALUATION

Application on Concept:
• Design: 20 Marks
• Description document: 20 Marks

Mind Map:
• Design: 20 Marks
• Description document: 20 Marks

Quiz:
• 20 questions (20 x 2 = 40 Marks)

Picture Prompt:
• Design: 20 Marks
• Presentation: 20 Marks

Case Study:
• Case study report: 20 Marks for executive summary; 20 Marks for introduction and analysis

Problem Solving:
• Successful Completion of Badge: 40 Marks
ASSESSMENT PLAN

CO | CO Description | Weightage | CAT1 | CAT2 | CAT3 | ASS-1 | ASS-2 | ESE
CO1 | Explain the basic concepts of machine learning | 20% | √ √
CO2 | Construct supervised learning models | 20% | √ √ √
CO3 | Construct unsupervised learning algorithms | 30% | √ √ √ √ √
CO4 | Evaluate and compare different models | 30% | √ √ √


COURSE OUTCOME ALIGNMENT MATRIX–MODEL EXAM SAMPLE QUESTION SET

Q.No | Question | Marks | BTL | CO
1 List down the characteristics of an Intelligent agent. 2 L1 CO1
2 Explain briefly about the Heuristic function. 2 L2 CO1
3 Mention the causes of uncertainty in the real world. 2 L1 CO2
4 State Bayes’ rule and explain it in brief. 2 L1 CO2
5 Distinguish Supervised Learning from Unsupervised learning. 2 L2 CO3
6 What is meant by Regression? 2 L2 CO3
7 State the significance of Ensemble technique. 2 L2 CO4
8 Mention the advantages of Bagging over Boosting. 2 L1 CO4
9 Define Perceptron. 2 L1 CO5
10 Write a short note on Unit Saturation. 2 L1 CO5
11.a.i Exemplify the necessary components to define an AI problem with 4 L2 CO1
an example.
ii. Explain about Heuristic function. Write the algorithm for Generate
and Test and simple Hill Climbing with a problem of your choice.
9 L3 CO1
11.b.i Illustrate the DFS algorithm with a graph of your choice. 4 L2 CO1
Discuss the Constraint Satisfaction problem with an algorithm to solve a
Cryptarithmetic problem. 9 L3 CO1
12.a Describe how Bayesian reasoning handles uncertain knowledge in 13 L3 CO2
problem solving.
A doctor is aware that the disease meningitis causes a patient to have a
stiff neck 80% of the time. He is also aware of some further facts, which
are given as follows:
• The known probability that a patient has meningitis is 1/30,000.
• The known probability that a patient has a stiff neck is 2%.
What is the probability that a patient with a stiff neck has meningitis?
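For the answer key, the Bayes’ rule computation for this question can be checked with a short Python sketch; the figures are exactly those given in the question:

```python
# Bayes' rule for Q12.a: P(M | S) = P(S | M) * P(M) / P(S)
p_s_given_m = 0.8        # stiff neck occurs in 80% of meningitis cases
p_m = 1 / 30000          # prior probability of meningitis
p_s = 0.02               # prior probability of a stiff neck

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)       # = 1/750, approximately 0.00133
```

The result is exactly 0.8 / (30000 x 0.02) = 1/750.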
12.b. Harry installed a new burglar alarm at his home to detect burglary. 13 L3 CO2
The alarm responds reliably to a burglary but also responds to minor
earthquakes. Harry has two neighbours, David and Sophia, who have taken
responsibility to inform Harry at work when they hear the alarm. David
always calls Harry when he hears the alarm, but sometimes gets confused
with the phone ringing and calls at that time too. On the other hand,
Sophia likes to listen to loud music, so she sometimes misses the alarm.
Here we would like to compute the probability of the Burglary Alarm.
Calculate the probability that the alarm has sounded, but neither a
burglary nor an earthquake has occurred, and both David and Sophia
called Harry.
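The computation is a chain-rule product over the Bayesian network B → A ← E, A → D, A → S. Note that the conditional probability values below are not given in the question text; they are the values commonly used with this worked example and are assumed here purely for illustration:

```python
# Joint probability for Q12.b: P(S, D, A, not B, not E)
# NOTE: these CPT values are NOT stated in the question; they are the
# figures commonly used with this example and are assumed here.
p_not_b = 1 - 0.001      # assuming P(Burglary)   = 0.001
p_not_e = 1 - 0.002      # assuming P(Earthquake) = 0.002
p_a = 0.001              # assuming P(Alarm | no B, no E) = 0.001
p_d_given_a = 0.91       # assuming P(David calls  | Alarm) = 0.91
p_s_given_a = 0.75       # assuming P(Sophia calls | Alarm) = 0.75

joint = p_not_b * p_not_e * p_a * p_d_given_a * p_s_given_a
print(joint)             # approximately 0.00068
```

With these assumed tables the joint probability comes to about 0.00068.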

13.a.i Construct a decision tree for the expression A=X AND Y OR Z. 6 L3 CO3
ii. Provide an outline of the ID3 algorithm used for inducing a decision tree
from the training tuples. Also list down the different attribute 7
selection measures used in the process.
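One decision tree consistent with A = (X AND Y) OR Z tests Z at the root, then X, then Y (the question asks for a full ID3 construction; this sketch only verifies one consistent tree against the truth table):

```python
from itertools import product

# Target function of Q13.a.i: A = (X AND Y) OR Z
def a(x, y, z):
    return int((x and y) or z)

# One decision tree for this expression: test Z first, then X, then Y.
def tree(x, y, z):
    if z == 1:
        return 1                 # Z = 1            =>  A = 1
    if x == 0:
        return 0                 # Z = 0 and X = 0  =>  A = 0
    return 1 if y == 1 else 0    # Z = 0 and X = 1  =>  A = Y

# The tree reproduces the expression on all 8 input combinations.
for x, y, z in product([0, 1], repeat=3):
    assert tree(x, y, z) == a(x, y, z)
print("tree matches (X AND Y) OR Z on all inputs")
```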
13.b.i Explain the Support Vector machine from the perspective of a non- 6 L3 CO3
linear kernel by means of an algorithm.
ii. Derive the Margin of the support vectors with an example and depict
it with necessary diagrams. 7
14.a. Explain the steps in the k-means algorithm. Cluster the following set of 4 13 L3 CO4
objects into two clusters using K-means: A(3,5), B(4,5), C(1,3), D(2,4).
Consider the objects A and C as the initial cluster centers.
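A sketch of the iterations for the answer key. Note that D(2,4) is equidistant from the two initial centers A and C, so the tie-breaking rule decides the final clustering: breaking ties toward the first center (NumPy's argmin convention, used below) yields {A, B, D} and {C}, while breaking toward C yields {A, B} and {C, D}.

```python
import numpy as np

# K-means for Q14.a: points A, B, C, D with A and C as initial centers.
pts = np.array([[3, 5], [4, 5], [1, 3], [2, 4]], dtype=float)  # A, B, C, D
centers = pts[[0, 2]].copy()                                    # A and C

for _ in range(10):                              # a few iterations suffice
    # assign each point to its nearest center (ties -> first center)
    d = np.linalg.norm(pts[:, None] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    # recompute each center as the mean of its assigned points
    new = np.array([pts[labels == k].mean(axis=0) for k in range(2)])
    if np.allclose(new, centers):
        break
    centers = new

print(labels)   # with ties broken toward the first center: A, B, D vs C
```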
14.b. With suitable illustration explain Gaussian mixture model. 13 L3 CO4
15.a.i Design a multilayer perceptron that solves the XOR problem. 7 L3 CO5
ii Write the algorithm for the above and illustrate 6
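One standard hand-set solution for Q15.a uses threshold units: a hidden layer computing OR and AND of the inputs, and an output computing OR AND (NOT AND). The weights below are a sketch of that construction, not the only valid answer:

```python
import numpy as np

# A multilayer perceptron that solves XOR with hand-set threshold units.
step = lambda v: (v > 0).astype(int)

W_h = np.array([[1.0, 1.0],     # hidden unit 1: x1 + x2 - 0.5 > 0  -> OR
                [1.0, 1.0]])    # hidden unit 2: x1 + x2 - 1.5 > 0  -> AND
b_h = np.array([-0.5, -1.5])
w_o = np.array([1.0, -1.0])     # output: OR - AND - 0.5 > 0  -> XOR
b_o = -0.5

def xor_mlp(x):
    h = step(W_h @ x + b_h)                     # hidden activations
    return int(step(np.array([w_o @ h + b_o]))[0])

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, xor_mlp(np.array(x)))
# (0,0)->0, (0,1)->1, (1,0)->1, (1,1)->0
```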
15.b.i. Suppose that we want to build a neural network that classifies two- 7 L3 CO5
dimensional data (i.e. X = [x1, x2]) into two classes: diamonds and
crosses. We have a set of training data that is plotted as follows.

ii. Draw a network that can solve this classification problem.


Justify your choice of the number of nodes and the architecture.
Draw the decision boundary that your network can find on the 6
diagram.
16.a. Solve the given problem. Describe the operators involved. 15 L3 CO1
Consider the Water Jug problem: you are given two jugs, a 4-gallon
one and a 3-gallon one.
Neither has any measuring markers on it. There is a pump that can
be used to fill the jugs with water. How can you get exactly 2 gallons
of water into the 4-gallon jug? Explicit assumptions: a jug can be
filled from the pump, water can be poured out of a jug onto the
ground, water can be poured from one jug to another and that there
are no other measuring devices available.
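A breadth-first search over jug states (x, y) finds a shortest operator sequence; this sketch encodes exactly the operators stated in the question (fill from the pump, empty onto the ground, pour jug to jug):

```python
from collections import deque

# BFS solver for the water jug problem of Q16.a:
# state (x, y) = gallons in the 4-gallon and 3-gallon jugs.
def successors(x, y):
    yield 4, y                             # fill 4-gallon from the pump
    yield x, 3                             # fill 3-gallon from the pump
    yield 0, y                             # empty 4-gallon onto the ground
    yield x, 0                             # empty 3-gallon onto the ground
    t = min(x, 3 - y); yield x - t, y + t  # pour 4-gallon into 3-gallon
    t = min(y, 4 - x); yield x + t, y - t  # pour 3-gallon into 4-gallon

def solve(start=(0, 0)):
    parent = {start: None}
    q = deque([start])
    while q:
        s = q.popleft()
        if s[0] == 2:                      # goal: 2 gallons in 4-gallon jug
            path = []
            while s is not None:
                path.append(s); s = parent[s]
            return path[::-1]
        for n in successors(*s):
            if n not in parent:
                parent[n] = s; q.append(n)

print(solve())   # a shortest 6-move state sequence ending with 2 gallons
                 # in the 4-gallon jug
```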

16.b. Nowadays, data stored in medical databases are growing in an 15 L3 CO2
increasingly rapid way. Analyzing the data is crucial for medical CO3
decision making and management. There is a huge requirement for the CO4
support of specific knowledge-based problem-solving activities through
the analysis of patients’ raw data collected during diagnosis. There is an
increasing demand for the discovery of new knowledge to be extracted by
the analysis of representative collections of example cases, described by
symbolic or numeric descriptors. Explain how machine learning can deal
with the problem of finding interesting regularities and patterns in data
for the above scenario. Choose an appropriate model and explain it for
the application.


ASSESSMENT PAPER QUALITY MATRIX (APQM)

PART | BTL1 | BTL2 | BTL3 | BTL4 | BTL5 | BTL6
A | 1,4,5 | 2,8 | 3,6,7,9,10 | | |
B | 11a,11b | 12a,12b; 15a,15b | 13a,13b; 14a,14b; 16a,16b | | |
TOTAL | 19 | 30 | 51 | | |
Distribution | 49% (BTL1+BTL2) | | 51% (BTL3) | | |

Prepared by Verified By
(Mrs.B.SANGEETHA) HOD/CSE

Approved by
PRINCIPAL

