Identify the unstructured data from the following
Both Image and Video clip
Which preprocessing technique is used for dimensionality reduction?
SVD
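A minimal sketch of SVD-based dimensionality reduction using scikit-learn's TruncatedSVD; the digits dataset and the choice of 10 components are assumptions made only for illustration.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import TruncatedSVD

    X, _ = load_digits(return_X_y=True)      # 1797 samples x 64 pixel features
    svd = TruncatedSVD(n_components=10)      # keep the 10 largest singular components
    X_reduced = svd.fit_transform(X)         # project onto the reduced space
    print(X_reduced.shape)                   # (1797, 10)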
True Positive is when
The predicted instance and the actual instance are both positive
Technique used to evaluate a classifier by dividing the data set into a train set to
train the classifier and a test set to test it.
Cross Validation
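A minimal sketch of cross-validation with scikit-learn's cross_val_score; the iris data and the KNN classifier are assumptions used only for illustration.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    clf = KNeighborsClassifier(n_neighbors=3)
    scores = cross_val_score(clf, X, y, cv=5)    # 5 train/test splits
    print(scores.mean())                         # average accuracy over the folds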
True Negative is when
The predicted instance and the actual instance are both negative
Which one of the following is not a classification technique?
StratifiedShuffleSplit
SIFT computes the gradient histogram only for patches, whereas HOG is computed for
an entire image.
False
Select the correct option that directly achieves multi-class classification
(without support of binary classifiers).
K Nearest Neighbor
PCA stands for
Principal component analysis
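A minimal sketch of PCA with scikit-learn; the random data and the choice of 2 components are illustrative assumptions.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.RandomState(0)
    X = rng.normal(size=(100, 5))            # 100 samples, 5 features
    pca = PCA(n_components=2)
    X_pca = pca.fit_transform(X)             # project onto the top 2 principal components
    print(pca.explained_variance_ratio_)     # explained variance decreases from PC1 onward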
In Supervised learning, class labels of the training samples are ___________
Known
High classification accuracy always indicates a good classifier.
False
Which of the following is not a performance evaluation measure?
Confusion matrix
Which algorithm can be used for matching local regions in two images?
SIFT
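A minimal sketch of matching local regions between two images with SIFT, assuming opencv-python (>= 4.4, which provides cv2.SIFT_create); the file names img1.png and img2.png are placeholders, not from the original question.

    import cv2

    img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints and 128-d descriptors
    kp2, des2 = sift.detectAndCompute(img2, None)

    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    # Lowe's ratio test keeps only distinctive matches
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    print(len(good), "good matches")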
Choose the right options based on Pooling.
All the above options
Pruning is a technique associated with ______________.
Decision tree
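A minimal sketch of pruning a decision tree via scikit-learn's cost-complexity pruning; the breast-cancer dataset and the ccp_alpha value are illustrative assumptions.

    from sklearn.datasets import load_breast_cancer
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    unpruned = DecisionTreeClassifier(random_state=0).fit(X, y)
    pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X, y)
    # the pruned tree has far fewer nodes
    print(unpruned.tree_.node_count, "->", pruned.tree_.node_count)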
The variation present in the PCs decreases as we move from the 1st PC to the last
one.
True
Which of the following is not an example of a CNN architecture?
InceptionResnet
Which classification technique involves finding the eigenvalues and eigenvectors?
PCA -- wrong
Clustering is a supervised classification.
False
Classification where each data point is mapped to more than one class is called
____________
Multi-label classification
A classifier that can compute using numeric as well as categorical values is
_________
Decision Tree Classifier -- wrong
What is the function that converts a K-dimensional vector of real values to a
same-shaped vector of real values in the range (0, 1), whose sum is 1?
Softmax
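A minimal NumPy sketch of the softmax function described above.

    import numpy as np

    def softmax(z):
        """Map a K-dimensional real vector to values in (0, 1) that sum to 1."""
        z = z - np.max(z)        # subtract the max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    scores = softmax(np.array([2.0, 1.0, 0.1]))
    print(scores, scores.sum())  # e.g. [0.659 0.242 0.099], sum = 1.0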
Which of the given hyperparameter(s), when increased, may cause a random forest to
overfit the data?
Depth of Tree
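A minimal sketch showing how increasing tree depth can push a random forest toward overfitting; the synthetic dataset and the depth values are assumptions made for illustration.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for depth in (2, None):                  # None lets each tree grow fully
        rf = RandomForestClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
        print(depth, rf.score(X_tr, y_tr), rf.score(X_te, y_te))
    # a widening gap between train and test accuracy signals overfitting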
The scale-invariant feature transform can be used to detect and describe local
features in images.
True
The process of changing the pixel intensity values to achieve consistency in
dynamic range for images is ___________
Image normalisation
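A minimal NumPy sketch of min-max image normalisation; the synthetic grayscale patch is an illustrative assumption.

    import numpy as np

    img = np.random.randint(30, 200, size=(4, 4)).astype(np.float64)  # fake grayscale patch
    normalised = (img - img.min()) / (img.max() - img.min())          # rescale to [0, 1]
    print(normalised.min(), normalised.max())                         # 0.0 1.0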
Images and documents are examples of ___________.
Unstructured Data
The dimensionality reduction technique that efficiently represents interesting
parts of an image as a compact feature vector.
Feature extraction -- wrong
Which of the following is not a characteristic of HOG?
Computes a simple histogram of oriented gradients -- wrong
A technique used to depict performance in a tabular form with two dimensions,
namely the “actual” and “predicted” sets of data, is called ___________.
Confusion Matrix
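A minimal sketch of building a confusion matrix with scikit-learn; the actual and predicted label vectors are made up for illustration.

    from sklearn.metrics import confusion_matrix

    y_actual    = [1, 0, 1, 1, 0, 0, 1, 0]
    y_predicted = [1, 0, 0, 1, 0, 1, 1, 0]
    # rows = actual classes, columns = predicted classes
    print(confusion_matrix(y_actual, y_predicted))
    # [[3 1]
    #  [1 3]]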
Unsupervised classification identifies a larger number of spectrally distinct classes
than supervised classification.
True
HOG refers to _________.
Histogram of Oriented Gradients
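A minimal sketch of computing a HOG descriptor, assuming scikit-image is available; the sample image and parameter values are used only for illustration.

    from skimage import data
    from skimage.feature import hog

    image = data.camera()                    # built-in sample grayscale image
    features = hog(image,
                   orientations=9,
                   pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2))
    print(features.shape)                    # one long, compact feature vector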
HOG is a simplified version of SIFT
False
SVM is a __________.
supervised learning algorithm.
In SVD, the matrix A of dimension m x n can be decomposed into A = USVᵀ, where V is
a ___________.
n x n orthonormal matrix
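A minimal NumPy sketch of the decomposition A = USVᵀ and the shapes of its factors; the 4 x 3 matrix is an illustrative assumption.

    import numpy as np

    A = np.random.rand(4, 3)                     # m = 4, n = 3
    U, S, Vt = np.linalg.svd(A, full_matrices=True)
    print(U.shape, S.shape, Vt.shape)            # (4, 4) (3,) (3, 3): V is n x n
    Sigma = np.zeros((4, 3))
    np.fill_diagonal(Sigma, S)
    print(np.allclose(A, U @ Sigma @ Vt))        # True: the factors reconstruct A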
Supervised learning differs from unsupervised learning. Supervised learning
requires ____________.
Labeled data
Which of the following is a feature extraction technique?
All the options
The most widely used package for machine learning in Python is ____________.
sklearn
The improvement of image data that suppresses distortions or enhances image
features is called ____________.
Image pre-processing
SIFT stands for ________________.
Scale Invariant Feature Transform
Which among the following is True? A. SIFT is used for identification of specific
objects B. HOG is used for classification
Both A and B
Model tuning helps to increase accuracy.
True
A higher value of which of the following hyperparameters is better for the decision
tree algorithm?
Cannot say
Which type of cross validation is used for an imbalanced dataset?
Stratified K-Fold
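A minimal sketch of stratified k-fold cross-validation, which preserves the class ratio in every fold and is therefore preferred for imbalanced data; the 18:2 imbalanced labels are an illustrative assumption.

    import numpy as np
    from sklearn.model_selection import StratifiedKFold

    X = np.arange(20).reshape(-1, 1)
    y = np.array([0] * 18 + [1] * 2)             # heavily imbalanced labels
    skf = StratifiedKFold(n_splits=2)
    for train_idx, test_idx in skf.split(X, y):
        print(np.bincount(y[test_idx]))          # each test fold keeps one minority sample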
The fit(X, y) method is used to ___________
Train the Classifier
Which of the following is not a preprocessing technique used for image processing?
Noise filtering
Select the correct statements about nonlinear classification.
The kernel trick is used by nonlinear classifiers to achieve maximum-margin
hyperplanes.
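A minimal sketch of a nonlinear classifier that relies on the kernel trick: an SVM with an RBF kernel fit on data that is not linearly separable; the concentric-circles dataset is an illustrative assumption.

    from sklearn.datasets import make_circles
    from sklearn.svm import SVC

    X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)
    clf = SVC(kernel="rbf").fit(X, y)        # kernel trick: no explicit feature map is built
    print(clf.score(X, y))                   # close to 1.0 on this toy data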
Choose the correct sequence from the following:
Image Analysis -> Pre-Processing -> Model Building -> Predict
Gradient descent is one of the backward-propagation techniques used to find the best
set of parameters of the network.
True
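A minimal NumPy sketch of gradient descent on a one-parameter least-squares problem; the data, learning rate and iteration count are assumptions made for illustration.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = 2.0 * x                                  # true parameter w = 2
    w, lr = 0.0, 0.05                            # initial guess and learning rate
    for _ in range(100):
        grad = -2.0 * np.mean(x * (y - w * x))   # derivative of the mean squared error
        w -= lr * grad                           # step against the gradient
    print(round(w, 3))                           # approximately 2.0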
TF-IDF is a common methodology used in pre-processing of images
False
Choose the correct sequence for classifier building from the following:
Initialize -> Train -> Predict -> Evaluate
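A minimal sketch of the Initialize -> Train -> Predict -> Evaluate sequence with scikit-learn; the iris data and the KNN classifier are illustrative assumptions.

    from sklearn.datasets import load_iris
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = KNeighborsClassifier()             # 1. Initialize
    clf.fit(X_tr, y_tr)                      # 2. Train
    y_pred = clf.predict(X_te)               # 3. Predict
    print(accuracy_score(y_te, y_pred))      # 4. Evaluate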
The first layer in a CNN is never a Convolutional Layer.
False
Which classifier involves finding an optimal hyperplane for linearly separable
patterns?
SVM
The normalized linear combination of the original predictors in a data set is
called ____________.
Principal component