
Nitin ML Assignment 1

The document is a lab manual for Data Visualization and Machine Learning, submitted for a BCA degree. It includes various algorithms such as K-means clustering, Linear Regression, Decision Trees, and others, along with their tasks and sample Python code. Each section provides a brief description of the algorithm's purpose and the corresponding implementation in Python.


LAB MANUAL OF DATA VISUALIZATION

AND MACHINE LEARNING


Submitted to
Dr. Raj Kumar

Quantum University, Roorkee


Towards the partial fulfilment of the degree of
BCA, 2nd Year

ROORKEE, UTTARAKHAND

ACADEMIC SESSION : 2024-25

Candidate
Name : Nitin
Qid : 23120005
Section : B

Faculty In Charge
Dr. Raj Kumar
Index Page

• K-means Clustering Algorithm
  Task: Making images smaller by grouping similar colours

• Linear Regression
  Task: Predicting how much waste a company will sell based on how much they advertise

• Decision Tree Algorithm
  Task: A scientist classifying plants and animals

• Logistic Regression
  Task: Predicting if a customer will cancel a service

• Support Vector Machine (SVM)
  Task: Recognizing faces (Cats and Dogs)

• K-NN
  Task: Recommending movies or products

• Random Forest Algorithm
  Task: Diagnosing a medical condition

• Naive Bayes Algorithm
  Task: Classifying news articles
K-means Clustering Algorithm
Task: Making images smaller by grouping similar colours

Python Code

# Import required libraries
import numpy as np
from sklearn.cluster import KMeans

# Generate random data
np.random.seed(0)
X = np.random.rand(100, 2)  # 100 points in 2D

# Apply KMeans clustering
k = 3  # number of clusters
kmeans = KMeans(n_clusters=k, random_state=0)
labels = kmeans.fit_predict(X)

print(kmeans.cluster_centers_)  # coordinates of the 3 cluster centres

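The stated task is essentially colour quantization: replace every pixel with its nearest cluster centre so the image needs only a small palette. A minimal sketch of that idea with the same KMeans class is shown below; the file name sample.jpg, the choice of 16 colours, and the RGB scaling are illustrative assumptions, not part of the original code.

# Colour quantization sketch: rebuild an image from a small palette of colours
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

img = plt.imread('sample.jpg') / 255.0       # placeholder file name; scale RGB to [0, 1]
pixels = img.reshape(-1, 3)                  # one row per pixel: [R, G, B]

kmeans = KMeans(n_clusters=16, random_state=0).fit(pixels)   # 16 representative colours
compressed = kmeans.cluster_centers_[kmeans.labels_].reshape(img.shape)

plt.imshow(compressed)
plt.title('Image rebuilt from 16 colours')
plt.show()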


Linear Regression
Task: Predicting how much waste a company will sell based on how much they advertise

Python Code

# Importing required libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Sample data (advertising spend in $1000s and waste sales in tons)
advertising_spend = np.array([1, 2, 3, 4, 5, 6, 7]).reshape(-1, 1)  # X
waste_sales = np.array([1.5, 2.0, 2.5, 3.5, 3.8, 4.2, 5.0])         # y

# Create linear regression model
model = LinearRegression()
model.fit(advertising_spend, waste_sales)

# Predict sales for the training inputs
predicted_sales = model.predict(advertising_spend)

# Plotting
plt.scatter(advertising_spend, waste_sales, color='blue', label='Actual Sales')
plt.plot(advertising_spend, predicted_sales, color='red', label='Predicted Sales')
plt.xlabel('Advertising Spend ($1000s)')
plt.ylabel('Waste Sales (Tons)')
plt.title('Linear Regression: Advertising vs Waste Sales')
plt.legend()
plt.grid(True)
plt.show()
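As a quick follow-up check, the fitted model can also estimate sales for a budget outside the table above; the $8,000 figure below is just an assumed example input.

# Predict waste sales for a new advertising budget (assumed example value)
new_spend = np.array([[8]])        # $8,000 of advertising
print(model.predict(new_spend))    # estimated waste sales in tons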
Decision Tree Algorithm
Task: A scientist classifying plants and animals

Python Code

# Import necessary libraries
from sklearn import tree
import matplotlib.pyplot as plt

# Sample dataset: [has leaves, can move, has eyes]
# 0 = No, 1 = Yes
X = [
    [1, 0, 0],  # Plant
    [1, 0, 0],  # Plant
    [0, 1, 1],  # Animal
    [0, 1, 1],  # Animal
    [1, 0, 0],  # Plant
    [0, 1, 1],  # Animal
]

# Labels: 0 = Plant, 1 = Animal
y = [0, 0, 1, 1, 0, 1]

# Train Decision Tree Classifier
clf = tree.DecisionTreeClassifier()
clf.fit(X, y)

# Plot the tree
plt.figure(figsize=(8, 6))
tree.plot_tree(clf,
               feature_names=["has_leaves", "can_move", "has_eyes"],
               class_names=["Plant", "Animal"])
plt.title('Decision Tree: Plant vs Animal Classification')
plt.show()
Output

(Plotted decision tree; root node: has_leaves <= 0.5, gini = 0.5, samples = 6, value = [3, 3], class = Plant, with True/False branches below.)
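For a quick usage check, the trained classifier can label a new observation; the sample organism below is an assumed input, not part of the original dataset.

# Classify a new organism: no leaves, can move, has eyes (assumed example)
new_sample = [[0, 1, 1]]
print(clf.predict(new_sample))   # expected: [1] -> Animal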
Logistic Regression
Task: Predicting if a customer will cancel a service

Python Code
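A minimal sketch of the churn-prediction idea, using scikit-learn's LogisticRegression on made-up customer data (the tenure and monthly-bill features and their values are assumptions) and reporting the cancellation probability with predict_proba:

# Import required libraries
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up customer data: [months subscribed, monthly bill in $]
X = np.array([
    [2, 80],   # cancelled
    [3, 75],   # cancelled
    [24, 40],  # stayed
    [36, 35],  # stayed
    [5, 70],   # cancelled
    [30, 45],  # stayed
])
# Labels: 1 = cancelled the service, 0 = stayed
y = np.array([1, 1, 0, 0, 1, 0])

# Train the logistic regression model
model = LogisticRegression(max_iter=1000)
model.fit(X, y)

# Probability that a new customer (6 months, $65/month) will cancel
X_test = np.array([[6, 65]])
print(model.predict_proba(X_test))   # columns: [P(stay), P(cancel)]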


Support Vector Machine (SVM)
Task: Recognizing faces (Cats and Dogs)

Python Code

# Import necessary libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

# Sample feature data: [ear size, snout length]
# Values are made-up for demonstration
X = np.array([
    [2, 1],      # Cat
    [1.5, 1],    # Cat
    [2, 1.2],    # Cat
    [8, 6],      # Dog
    [9, 6.5],    # Dog
    [8.5, 5.8],  # Dog
])

# Labels: 0 = Cat, 1 = Dog
y = [0, 0, 0, 1, 1, 1]

# Train SVM model
clf = svm.SVC(kernel='linear')
clf.fit(X, y)

# Plotting decision boundary
plt.figure(figsize=(8, 6))

# Plot points
plt.scatter(X[:, 0], X[:, 1], c=y, cmap='coolwarm', s=100, edgecolors='k')

# Plot decision boundary
ax = plt.gca()
xlim = ax.get_xlim()
ylim = ax.get_ylim()

# Create grid to evaluate model
xx = np.linspace(xlim[0], xlim[1])
yy = np.linspace(ylim[0], ylim[1])
YY, XX = np.meshgrid(yy, xx)
xy = np.vstack([XX.ravel(), YY.ravel()]).T
Z = clf.decision_function(xy).reshape(XX.shape)

# Plot margins and boundary
plt.contour(XX, YY, Z, colors='k', levels=[-1, 0, 1], linestyles=['--', '-', '--'])
plt.xlabel('Ear Size')
plt.ylabel('Snout Length')
plt.title('SVM: Cat vs Dog Face Classification')
plt.grid(True)
plt.show()

Output

(Scatter plot with the linear decision boundary and margins; title: "SVM: Cat vs Dog Face Classification")
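As a quick usage check, the trained classifier can label a new animal; the measurements below are assumed values.

# Classify a new animal: small ears, short snout (assumed example input)
new_animal = np.array([[2.2, 1.1]])
print(clf.predict(new_animal))   # expected: [0] -> Cat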
K-NN
Task: Recommending Movies or Products

Python Code

# Import libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import KNeighborsClassifier

# Sample data: [Action rating, Romance rating]
X = np.array([
    [5, 1],  # Likes action only
    [4, 1],  # Mostly action
    [1, 5],  # Likes romance only
    [2, 4],  # Mostly romance
    [3, 3],  # Balanced
    [4, 2],  # More action
    [1, 4],  # More romance
])

# Labels: 0 = Recommend Action Movie, 1 = Recommend Romance Movie
y = np.array([0, 0, 1, 1, 1, 0, 1])

# Train K-NN model
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)

# Plotting decision boundary
h = 0.1
x_min, x_max = 0, 6
y_min, y_max = 0, 6
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))
Z = knn.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.figure(figsize=(8, 6))
plt.contourf(xx, yy, Z, cmap=plt.cm.Pastel1, alpha=0.6)
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.coolwarm, edgecolors='k', s=100)
plt.xlabel('Action Rating')
plt.ylabel('Romance Rating')
plt.title('K-NN: Movie Recommendation System')
plt.grid(True)
plt.show()
Output

(Decision-region plot; title: "K-NN: Movie Recommendation System")
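For a quick usage check, the fitted model can recommend a genre for an unseen profile; the new user's ratings below are assumed values.

# New user: high action rating, low romance rating (assumed input)
new_user = np.array([[4.5, 1.5]])
print(knn.predict(new_user))   # expected: [0] -> recommend an action movie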
Random Forest Algorithm
Task: Diagnosing a medical condition

Python Code

# Import required libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestClassifier

# Sample feature data: [body temperature (°C), heart rate (bpm)]
X = np.array([
    [36.6, 70],  # Healthy
    [37.2, 72],  # Healthy
    [39.0, 95],  # Sick
    [38.5, 90],  # Sick (illustrative values)
    [36.7, 68],  # Healthy
    [38.0, 85],  # Sick
    [37.1, 75],  # Healthy
    [39.5, 98],  # Sick
])

# Labels: 0 = Healthy, 1 = Sick
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])

# Train Random Forest model
clf = RandomForestClassifier(n_estimators=10, random_state=42)
clf.fit(X, y)

# Plotting decision boundary
h = 0.05
x_min, x_max = 36, 40
y_min, y_max = 60, 105
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.figure(figsize=(8, 6))
plt.contourf(xx, yy, Z, cmap=plt.cm.RdYlBu, alpha=0.6)
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.bwr, edgecolors='k', s=100)
plt.xlabel('Temperature (°C)')
plt.ylabel('Heart Rate (bpm)')
plt.title('Random Forest: Medical Diagnosis')
plt.grid(True)
plt.show()

Output

(Decision-region plot; title: "Random Forest: Medical Diagnosis")
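As a quick usage check, the trained forest can diagnose a new patient; the readings below are assumed values.

# Diagnose a new patient: high temperature and elevated heart rate (assumed input)
new_patient = np.array([[38.8, 92]])
print(clf.predict(new_patient))   # likely output: [1] -> Sick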


Naive Bayes Algorithm
Task: Classifying news articles

Python Code

# Import necessary libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.naive_bayes import GaussianNB

# Sample feature data: [sports word count, politics word count]
X = np.array([
    [5, 1],  # Sports
    [6, 2],  # Sports
    [1, 6],  # Politics
    [2, 7],  # Politics
    [4, 1],  # Sports
    [1, 5],  # Politics
    [6, 2],  # Sports
    [2, 6],  # Politics
])

# Labels: 0 = Sports, 1 = Politics
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])

# Train Naive Bayes model
model = GaussianNB()
model.fit(X, y)

# Plotting decision boundary
h = 0.1
x_min, x_max = 0, 8
y_min, y_max = 0, 8
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))
Z = model.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.figure(figsize=(8, 6))
plt.contourf(xx, yy, Z, cmap=plt.cm.Pastel2, alpha=0.6)
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.coolwarm, edgecolors='k', s=100)
plt.xlabel('Sports Word Count')
plt.ylabel('Politics Word Count')
plt.title('Naive Bayes: News Article Classification')
plt.grid(True)
plt.show()
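As a quick usage check, the trained model can classify a new article; the word counts below are assumed values.

# Classify a new article: many sports words, few politics words (assumed input)
new_article = np.array([[5, 2]])
print(model.predict(new_article))   # expected: [0] -> Sports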
