NNML Mid 1 Objective

The document contains multiple-choice questions (MCQs) related to Artificial Neural Networks (ANN) and Unsupervised Learning Networks, covering topics such as basic models of ANN, activation functions, learning algorithms, and specific types of networks like Perceptron, Hopfield, and Kohonen. It addresses key concepts like weight adjustment, training objectives, and the characteristics of various neural network architectures. The questions aim to assess understanding of fundamental principles and applications of neural networks.

NNML Mid 1 MCQ's

UNIT I: Artificial Neural Networks

1) Which of the following is a basic model of an Artificial Neural Network (ANN)?
A) Support Vector Machine
B) Perceptron
C) Decision Tree
D) Random Forest
2) In an ANN, __________ is responsible for determining the output based on input signals
and weights.
A) Node
B) Synapse
C) Activation Function
D) Learning Rate
3) The Perceptron Network is primarily used for _______ type of tasks.
A) Regression
B) Clustering
C) Binary Classification
D) Dimensionality Reduction
4) ________ is the primary purpose of the Back-propagation algorithm.
A) To initialize weights
B) To adjust weights to minimize error
C) To calculate output
D) To add more layers to the network
5) Which of the following networks is used in pattern recognition applications?
A) Hopfield Network
B) Kohonen Network
C) Back-Propagation Network
D) Perceptron Network
6) ________ neural network is known for storing patterns as stable states.
A) Perceptron
B) Hopfield Network
C) Kohonen Network
D) Back-Propagation Network
7) BAM in neural networks stands for _________.
A) Backward Associative Memory
B) Bidirectional Associative Memory
C) Bifocal Associative Memory
D) Binary Associative Memory
8) In an artificial neuron, weights are initialized to ________.
A) Zero
B) Random values
C) One
D) Negative values
9) The Perceptron model has ______ type of activation function.
A) Linear
B) ReLU
C) Sigmoid
D) Step
10) ________ refers to the process of updating weights in a neural network.
A) Normalization
B) Optimization
C) Training
D) Testing
11) _______ (learning) algorithm is commonly used in supervised learning networks.
A) Kohonen Network
B) Back-propagation
C) Hebbian Learning
D) Maxnet
12) ______ is the main goal of pattern association in neural networks.
A) Classification
B) Memory Retrieval
C) Weight Adjustment
D) Decision Making
13) A Hopfield network is an example of _______?
A) Feed-forward Network
B) Recurrent Network
C) Layered Network
D) Hybrid Network
14) _____ neural network is primarily used for auto-associative memory tasks.
A) Back-propagation Network
B) Perceptron
C) Hopfield Network
D) Kohonen Network
15) ______ component of an ANN adjusts during the training process.
A) Activation Function
B) Weights
C) Input
D) Output
16) ______ network is often used as a model for binary classification.
A) Perceptron
B) Kohonen Network
C) Hopfield Network
D) Back-Propagation Network
17) ______ is the primary function of the activation function in a neural network.
A) Adjust weights
B) Initialize network
C) Control output based on input
D) Normalize data
18) Which learning rule is used by the Perceptron network?
A) Delta rule
B) Hebbian learning
C) Competitive learning
D) Threshold logic
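
Reference sketch for Questions 9, 17 and 18 (not part of the original question bank): a minimal perceptron with a step activation and the perceptron weight-update rule, written in Python with NumPy. The AND-gate data, learning rate and number of epochs are illustrative assumptions.

    import numpy as np

    def step(z):
        # Step activation: output 1 when the weighted sum reaches the threshold (0 here).
        return 1 if z >= 0 else 0

    # AND-gate training data; the constant 1 in each row acts as a bias input.
    X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
    t = np.array([0, 0, 0, 1])

    rng = np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, size=3)   # small random initial weights (cf. Question 8)
    lr = 0.1                             # learning rate (illustrative value)

    for epoch in range(20):              # one epoch = one pass over the whole training set
        for x_i, t_i in zip(X, t):
            y = step(np.dot(w, x_i))     # forward pass through the step activation
            w += lr * (t_i - y) * x_i    # perceptron rule: adjust weights only on errors

    print([step(np.dot(w, x_i)) for x_i in X])   # typically reproduces the AND targets
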
19) In a neural network, the term “epoch” refers to __________.
A) One complete forward pass
B) A single update to a weight
C) One full training cycle over all training data
D) Number of hidden layers
20) ________ is the primary objective of training an ANN.
A) To maximize the number of neurons
B) To minimize error
C) To increase layers
D) To maximize hidden units
21) The Perceptron can be considered as a special case of which other neural model?
A) Hopfield Network
B) Kohonen Network
C) ADALINE
D) Back-Propagation Network
22) Which neural network model is characterized by a feedback loop structure?
A) Feed-forward Network
B) Hopfield Network
C) Perceptron
D) Kohonen Network
23) _______ network was designed to handle linearly inseparable data using multiple
layers.
A) Perceptron
B) ADALINE
C) Multilayer Perceptron (MLP)
D) Hopfield Network
24) The Back-Propagation algorithm adjusts weights based on ___.
A) Sum of inputs
B) Output gradient
C) Error gradient
D) Constant factor
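
Reference sketch for Questions 4, 11 and 24 (not part of the original question bank): one back-propagation step for a tiny one-hidden-layer network with sigmoid units, showing that the weights are adjusted along the descending error gradient. The layer sizes, sample input, target and learning rate are illustrative assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    x = np.array([0.5, -0.2])                  # one training input (illustrative)
    t = np.array([1.0])                        # its target output

    W1 = rng.normal(scale=0.5, size=(3, 2))    # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(1, 3))    # hidden -> output weights
    lr = 0.5

    # Forward pass.
    h = sigmoid(W1 @ x)                        # hidden activations
    y = sigmoid(W2 @ h)                        # network output

    # Backward pass: propagate the gradient of the error E = 0.5 * (t - y)^2.
    delta_out = (y - t) * y * (1 - y)              # error gradient at the output layer
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # error gradient at the hidden layer

    # Gradient descent: move the weights against the error gradient to reduce E.
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
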
25) _____ network type is typically used for associative memory.
A) Perceptron
B) Hopfield Network
C) Kohonen Network
D) Feed-forward Network
26) Bidirectional Associative Memory (BAM) networks are designed to ______.
A) Adjust weights dynamically
B) Store input-output patterns in both directions
C) Classify patterns
D) Perform clustering
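
Reference sketch for Questions 7 and 26 (not part of the original question bank): a Bidirectional Associative Memory stores input-output pattern pairs in one weight matrix built from Hebbian outer products, and the same matrix (or its transpose) recalls the association in either direction. The bipolar pattern pairs are illustrative assumptions.

    import numpy as np

    # Bipolar (+1/-1) pattern pairs to associate (illustrative examples).
    pairs = [
        (np.array([ 1, -1,  1, -1]), np.array([ 1,  1, -1])),
        (np.array([-1, -1,  1,  1]), np.array([-1,  1,  1])),
    ]

    # Hebbian storage: accumulate the outer product of every (x, y) pair.
    W = sum(np.outer(y, x) for x, y in pairs)

    def recall_forward(x):
        return np.sign(W @ x)       # x -> y direction

    def recall_backward(y):
        return np.sign(W.T @ y)     # y -> x direction through the same weights

    x0, y0 = pairs[0]
    print(recall_forward(x0))       # recovers y0 for this small example
    print(recall_backward(y0))      # recovers x0 for this small example
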
27) The main limitation of the Perceptron network is that it cannot handle __________.
A) Linearly separable data
B) Multi-layer structures
C) Non-linear data
D) Supervised learning
28) ADALINE uses ______ learning rule for training.
A) Delta rule
B) Competitive learning
C) Hebbian learning
D) Linear threshold rule
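
Reference sketch for Question 28 (not part of the original question bank): the ADALINE delta (LMS) rule, which differs from the perceptron rule shown earlier in that the error is measured on the linear net input before any thresholding. The bipolar AND-style data and learning rate are illustrative assumptions.

    import numpy as np

    X = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
    t = np.array([-1., -1., -1., 1.])          # bipolar targets (illustrative)

    w = np.zeros(3)
    lr = 0.1

    for epoch in range(100):
        for x_i, t_i in zip(X, t):
            net = np.dot(w, x_i)               # linear output: no step function in the update
            w += lr * (t_i - net) * x_i        # delta / LMS rule: reduce the squared error (t - net)^2

    pred = np.sign(X @ w)                      # a step is applied only when classifying afterwards
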
29) ______ is the fundamental building block of artificial neural networks (ANNs).
A) Neurons
B) Weights
C) Activation functions
D) Data points
30) In a neural network, the term "weights" refers to ______.
A) The size of the input data
B) Parameters that determine the strength of connections between neurons
C) The learning rate of the network
D) The number of hidden layers in the network
31) ___ of the following is not a basic model of ANN.
A) Perceptron
B) Recurrent Neural Network (RNN)
C) Long Short-Term Memory (LSTM)
D) Support Vector Machine (SVM)
32) The Perceptron is a type of neural network model that is primarily used for _______.
A) Regression tasks
B) Image recognition
C) Binary classification
D) Natural language processing
33) ______ is commonly used to describe the process of adjusting the weights in a neural
network to minimize the error.
A) Training
B) Validation
C) Testing
D) Inference
34) The Adaptive Linear Neuron (Adaline) is an improvement over the Perceptron that
uses ______ activation function.
A) Step function
B) Sigmoid function
C) Rectified Linear Unit (ReLU)
D) Hyperbolic Tangent (tanh)
35) Back-propagation is a supervised learning algorithm used to train ______.
A) Convolutional Neural Networks (CNNs)
B) Recurrent Neural Networks (RNNs)
C) Multilayer Perceptrons (MLPs)
D) Support Vector Machines (SVMs)
36) ______ is the primary goal of back-propagation in neural networks.
A) Minimize the number of neurons in the network
B) Maximize the training data size
C) Minimize the error between predicted and actual output
D) Maximize the number of hidden layers
37) Associative Memory Networks are used for _______.
A) Regression tasks
B) Unsupervised learning
C) Pattern association and retrieval
D) Reinforcement learning

38) ___ of the following is NOT a training algorithm used for pattern association in neural
networks.
A) Gradient Descent
B) Bidirectional Associative Memory (BAM)
C) Hopfield Network
D) Stochastic Gradient Descent

39) The Hopfield Network is a type of recurrent neural network primarily used for _____.
A) Image segmentation
B) Image classification
C) Solving optimization problems and pattern recognition
D) Natural language processing

40) ______ training algorithm is typically used to store and retrieve patterns in Hopfield
Networks.
A) Hebbian learning
B) Q-learning
C) Boltzmann learning
D) Backpropagation
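
Reference sketch for Question 40 and the two questions that follow (not part of the original question bank): bipolar patterns are stored as synaptic weights with a Hebbian outer-product rule, and a corrupted input is recalled by repeated asynchronous updates until the state settles. The stored patterns and the noisy probe are illustrative assumptions.

    import numpy as np

    # Bipolar patterns to store (illustrative examples).
    patterns = np.array([
        [ 1, -1,  1, -1],
        [-1, -1,  1,  1],
    ])

    # Hebbian storage: the patterns end up encoded in the synaptic weight matrix.
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)                       # a Hopfield network has no self-connections

    def recall(x, sweeps=5):
        x = x.astype(float).copy()
        for _ in range(sweeps):
            for i in range(len(x)):              # asynchronous updates, one neuron at a time
                net = W[i] @ x
                if net != 0:
                    x[i] = np.sign(net)
        return x

    noisy = np.array([1, 1, 1, -1])              # corrupted copy of the first stored pattern
    print(recall(noisy))                         # settles back to the stored pattern [1, -1, 1, -1]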

41) In Hopfield Networks, patterns are stored as ___.
A) Synaptic weights
B) Loss functions
C) Hidden layers
D) Convolutional filters

42) _____ of the following is a common limitation of Hopfield Networks.
A) Limited capacity to store patterns
B) Inability to solve classification problems
C) Unstable weight updates
D) High computational complexity
43) Special Networks are a category of neural networks that _______.
A) Can only be used for specific, specialized tasks
B) Do not require any training
C) Are not suitable for pattern recognition
D) Are commonly used for linear regression

44) ______ of the following is an example of a Special Network often used for face recognition.
A) Hopfield Network
B) Long Short-Term Memory (LSTM)
C) Radial Basis Function (RBF) Network
D) Perceptron

45) Artificial Neural Networks (ANN) are computational models inspired by the structure and
function of the _Human brain______.

46) In ANNs, the fundamental building block is the __Neuron_____, which processes and
transmits information.

47) The weights in a neural network represent the ___Parameters____ that determine the strength of connections between neurons.

48) In supervised learning networks, the network is trained using __Labeled_____ data, where
both input and target outputs are provided.

49) The __ Perceptron _____ is a basic model of a neural network used for binary classification
tasks.

50) The Adaptive Linear Neuron (abbreviated as __Adaline_____) is an improvement over the Perceptron.

51) Back-propagation is a popular supervised learning algorithm used to train multilayer perceptrons (MLPs) by minimizing the error or __Loss_____.

52) Associative Memory Networks are used for pattern ___Association____ and retrieval tasks.

53) One training algorithm used for pattern association is __ Hebbian _____ learning, which
strengthens connections between neurons when they fire together.

54) __BAM (Bidirectional Associative Memory)_____ and Hopfield Networks are examples of neural networks used for associative memory tasks.
UNIT II: Unsupervised Learning Networks

1) _____ type of learning is used in Kohonen Self-Organizing Maps.

A) Supervised
B) Unsupervised
C) Reinforcement
D) Semi-supervised

2) Maxnet is a type of _______.

A) Competitive Network
B) Recurrent Network
C) Convolutional Network
D) Supervised Network
Answer: A) Competitive Network

3) In Kohonen’s Self-Organizing Map, neurons are arranged in _________.

A) A random structure
B) A fully connected layer
C) A lattice structure
D) A linear arrangement

4) ______ network architecture is used to solve clustering problems without supervision.

A) Back-propagation Network
B) Perceptron
C) Self-Organizing Map
D) ADALINE
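
Reference sketch for Questions 1, 3 and 4 (not part of the original question bank): one unsupervised training step of a Kohonen Self-Organizing Map, where the best-matching unit on a small lattice and its lattice neighbours are pulled toward the input. The lattice size, learning rate and neighbourhood radius are illustrative assumptions; in practice both the learning rate and the radius are decayed over time.

    import numpy as np

    rng = np.random.default_rng(0)
    grid_h, grid_w, dim = 5, 5, 3                  # 5 x 5 lattice of neurons, 3-dimensional inputs
    weights = rng.random((grid_h, grid_w, dim))    # one weight vector per lattice node

    def som_step(x, lr=0.5, radius=1.0):
        # 1. Best-matching unit (BMU): the node whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)

        # 2. Pull the BMU and its lattice neighbours toward x, weighted by a Gaussian
        #    neighbourhood over lattice distance (no labels are used anywhere).
        for i in range(grid_h):
            for j in range(grid_w):
                lattice_dist2 = (i - bmu[0]) ** 2 + (j - bmu[1]) ** 2
                h = np.exp(-lattice_dist2 / (2 * radius ** 2))
                weights[i, j] += lr * h * (x - weights[i, j])

    som_step(np.array([0.2, 0.7, 0.1]))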

5) Learning Vector Quantization (LVQ) is based on _____ learning principle.

A) Competitive learning
B) Supervised learning
C) Reinforcement learning
D) Hebbian learning

6) The Hamming Network is used for ______.

A) Supervised learning tasks
B) Clustering
C) Pattern recognition
D) Reinforcement learning

7) ______ is the primary characteristic of Adaptive Resonance Theory (ART) networks.

A) It requires labeled data
B) It automatically adjusts clusters
C) It uses back-propagation for learning
D) It has a static structure

8) In Competitive Learning, the neuron with the strongest response is referred to as the
_________.

A) Perceptron
B) Winner
C) Receptor
D) Feature extractor

9) Counter Propagation Networks are a combination of ____ and _____ types of networks.

A) Supervised and Competitive
B) Self-organizing and Supervised
C) Hebbian and Unsupervised
D) Competitive and Reinforcement

10) _____ of the following networks can adjust to new data without disrupting previously
learned patterns.

A) Hopfield Network
B) Perceptron
C) Adaptive Resonance Theory (ART) Network
D) Kohonen Network

11) In ______ type of learning, there are no target output values.

A) Supervised learning
B) Unsupervised learning
C) Reinforcement learning
D) Semi-supervised learning

12) ______ network is used to perform clustering by arranging similar patterns close to each
other.

A) Perceptron
B) Self-Organizing Map (SOM)
C) Hopfield Network
D) Recurrent Neural Network

13) Maxnet is a network where _________.

A) All neurons are fully connected
B) Only the neuron with maximum input is activated
C) Weights are adjusted by back-propagation
D) The network is used for pattern association
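
Reference sketch for Questions 2 and 13 (not part of the original question bank): Maxnet's fixed-weight, winner-take-all competition, in which every unit inhibits the others by a small fixed amount until only the unit with the largest initial activation remains positive. The inhibition constant and the initial activations are illustrative assumptions; no weights are learned.

    import numpy as np

    def maxnet(a, epsilon=0.15, max_iter=100):
        # Fixed weights: self-excitation of 1, mutual inhibition of -epsilon (nothing is trained).
        a = np.maximum(a.astype(float), 0)
        for _ in range(max_iter):
            a_new = np.maximum(a - epsilon * (a.sum() - a), 0)   # each unit subtracts the others
            if np.count_nonzero(a_new) <= 1:                     # only the winner is still active
                return a_new
            a = a_new
        return a

    print(maxnet(np.array([0.2, 0.5, 0.9, 0.4])))   # only the unit that started at 0.9 survives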

14) ______ type of learning is used in Kohonen’s Self-Organizing Maps.

A) Supervised
B) Unsupervised
C) Reinforcement
D) Semi-supervised

15) ______ is the primary application of Hamming Networks.

A) Clustering
B) Pattern recognition
C) Reinforcement learning
D) Memory retention

16) Adaptive Resonance Theory (ART) networks are known for their ability to __________.

A) Forget old patterns
B) Retain stability with new data
C) Reinforce incorrect patterns
D) Forget past clusters

17) In competitive learning, only the “winner” neuron _______.

A) Gets its weights updated
B) Has weights set to zero
C) Is removed from the network
D) Classifies patterns into clusters

18) The Kohonen Network focuses on ________.

A) Pattern retrieval
B) Feature mapping and clustering
C) Recurrent memory
D) Temporal sequence learning

19) Learning Vector Quantization (LVQ) combines competitive learning with ___________.

A) Hebbian learning
B) Error correction
C) Delta rule
D) Back-propagation
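
Reference sketch for Questions 5 and 19 (not part of the original question bank): one LVQ1 update, combining competitive selection of the nearest prototype with supervised error correction; the winning prototype is attracted to the input when the class labels agree and repelled when they do not. The prototypes, labels and learning rate are illustrative assumptions.

    import numpy as np

    # Labelled prototype (codebook) vectors, two per class (illustrative).
    prototypes = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
    proto_labels = np.array([0, 0, 1, 1])

    def lvq1_step(x, label, lr=0.1):
        # Competitive part: the nearest prototype wins.
        winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
        # Supervised correction: attract on a label match, repel on a mismatch.
        if proto_labels[winner] == label:
            prototypes[winner] += lr * (x - prototypes[winner])
        else:
            prototypes[winner] -= lr * (x - prototypes[winner])

    lvq1_step(np.array([0.1, 0.2]), label=0)
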
20) ______ network is considered a competitive learning model.

A) ADALINE
B) Kohonen SOM
C) Back-Propagation Network
D) Hopfield Network

21) The Hamming distance in neural networks is used for measuring _________.

A) Weight adjustment
B) Neuron activation
C) Similarity between patterns
D) Learning rate
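
Reference sketch for Questions 6, 15 and 21 (not part of the original question bank): the decision a Hamming Network implements amounts to picking the stored exemplar with the smallest Hamming distance to the input, i.e. the one that disagrees in the fewest positions. The exemplar patterns are illustrative assumptions.

    import numpy as np

    # Stored bipolar exemplar patterns, one per class (illustrative).
    exemplars = np.array([
        [ 1,  1, -1, -1],
        [-1,  1, -1,  1],
        [ 1, -1,  1, -1],
    ])

    def hamming_classify(x):
        # Hamming distance = number of positions in which the patterns disagree.
        distances = np.sum(exemplars != x, axis=1)
        return np.argmin(distances), distances

    cls, d = hamming_classify(np.array([1, 1, -1, -1]))
    print(cls, d)       # the class of the exemplar that differs in the fewest positions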

22) In Kohonen’s Self-Organizing Map, neurons are arranged in ______ structure.

A) A single layer
B) A grid or lattice
C) A fully connected network
D) A recurrent loop

23) Adaptive Resonance Theory (ART) networks address the ________ problem common in clustering.

A) Catastrophic forgetting
B) Overfitting
C) Activation function saturation
D) Slow convergence

24) ____ feature is unique to the Kohonen Self-Organizing Map.

A) Supervised learning
B) Back-propagation
C) Weight sharing
D) Feature mapping in a topological order

25) Unsupervised learning networks are primarily used for ________.

A) Classification tasks
B) Regression tasks
C) Finding hidden patterns and structures in data
D) Reinforcement learning

26) Fixed Weight Competitive Networks are used for __________.

A) Clustering and competitive learning
B) Predicting future values of time series data
C) Solving optimization problems
D) Image recognition tasks

27) ________ of the following is an example of a Fixed Weight Competitive Network.

A) Hopfield Network
B) Maxnet
C) Perceptron
D) Long Short-Term Memory (LSTM)

28) Maxnet is a type of competitive network used for ________.

A) Clustering and finding cluster centers
B) Image segmentation
C) Time series prediction
D) Principal Component Analysis (PCA)

29) The Hamming Network is mainly employed for __________.

A) Principal Component Analysis (PCA)
B) Pattern classification and recognition
C) Solving optimization problems
D) Natural language processing

31) Kohonen Self-Organizing Feature Maps (SOM) are commonly used for __________.
A) Classification tasks
B) Dimensionality reduction and clustering
C) Time series prediction
D) Speech recognition

32) In Kohonen's SOM, what is the key objective during training?
A) Minimize the mean squared error
B) Maximize the number of neurons
C) Self-organize the weight vectors based on input data
D) Increase the learning rate

33) Learning Vector Quantization (LVQ) is a type of neural network used for ________.
A) Reinforcement learning
B) Binary classification
C) Vector quantization and classification
D) Image generation

34) Counter Propagation Networks (CPN) are known for ________.

A) Regressing output values
B) Combining supervised and unsupervised learning
C) Solving optimization problems
D) Natural language processing

40) In Adaptive Resonance Theory (ART) Networks, ______ is the main idea behind the
"resonance" concept.
A) Networks adapt to new information without forgetting previously learned patterns
B) Networks undergo continuous weight updates
C) Networks resonate with noise in the input data
D) Networks resonate with their own output values

41) In Kohonen's SOM, the primary goal during training is to self-organize the __Weight_____ vectors based on input data.

42) Learning Vector Quantization (LVQ) is a type of neural network that combines vector quantization and ___Classification____ tasks.

43) Counter Propagation Networks (CPN) are neural networks that blend elements of both __Supervised_____ and unsupervised learning.

44) In Adaptive Resonance Theory (ART) Networks, the concept of "resonance" ensures that the networks adapt to new information without __Forgetting_____ previously learned patterns.

45) Special Networks are neural network architectures designed for ___ Specialized ____ tasks.

46) Unsupervised learning networks are used to discover __Hidden _____ patterns or structures
in data without labeled outputs.

47) Fixed Weight Competitive Nets are neural networks commonly employed for data __Clustering_____ tasks.

48) Maxnet is a type of competitive network that is used to find __ Cluster_____ centers.

49) The Hamming Network is primarily used for __Pattern_____ classification and recognition tasks.

50) Kohonen Self-Organizing Feature Maps (SOM) are known for dimensionality reduction and
___ Clustering ____.
UNIT III: Deep Learning

1) A Feed-Forward Network is a type of _________.

A) Unsupervised Network
B) Recurrent Network
C) Supervised Network
D) Competitive Network

2) The process of error calculation and back-propagation is repeated until ___________.

A) Weights reach zero
B) The learning rate becomes negative
C) Error is minimized to an acceptable level
D) No adjustments are needed

3) In Deep Learning, _______ network is suitable for sequence prediction tasks.

A) Feed-Forward Network
B) Recurrent Neural Network (RNN)
C) Hopfield Network
D) Kohonen Network

4) The Back-propagation algorithm uses _______ type of gradient for optimization.

A) Positive
B) Descending
C) Ascending
D) Parallel

5) Deep networks typically require _______ type of learning strategy to optimize large-scale
applications.

A) Unsupervised learning
B) Supervised learning
C) Hybrid learning
D) Self-organizing learning

6) _______ neural network is widely used in deep learning for sequence prediction.

A) Feed-forward Neural Network
B) Recurrent Neural Network (RNN)
C) Hopfield Network
D) Kohonen Network

7) In deep learning, Convolutional Neural Networks (CNNs) are primarily used for _______.
A) Image recognition
B) Text processing
C) Time series analysis
D) Dimensionality reduction

8) _______ type of gradient is commonly used in back-propagation for deep networks.

A) Ascending gradient
B) Stochastic gradient
C) Descending gradient
D) Adaptive gradient

9) _______ is a common issue faced when training deep networks due to many layers.

A) Data sparsity
B) Vanishing gradients
C) High bias
D) Limited capacity

10) A common activation function for hidden layers in deep learning is ___________.

A) Linear
B) ReLU
C) Sigmoid
D) Step function
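
Reference sketch for Questions 9 and 10 (not part of the original question bank): a small numeric illustration of why deep stacks of sigmoid layers suffer from vanishing gradients while ReLU hidden units do not. The sigmoid derivative is at most 0.25, so chaining it across many layers multiplies the back-propagated gradient toward zero; the ReLU derivative is exactly 1 for positive pre-activations. The depth and the pre-activation value are illustrative assumptions, and the weight factors are ignored for simplicity.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoid_deriv(z):
        s = sigmoid(z)
        return s * (1 - s)                 # never larger than 0.25

    def relu_deriv(z):
        return (z > 0).astype(float)       # exactly 1 for positive pre-activations

    depth = 20
    z = np.array(0.5)                      # the same illustrative pre-activation at every layer

    # Product of the activation derivatives a back-propagated gradient passes through.
    print(np.prod([sigmoid_deriv(z) for _ in range(depth)]))   # about 3e-13: vanishing gradient
    print(np.prod([relu_deriv(z) for _ in range(depth)]))      # 1.0: the gradient is preserved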

Fill in the blanks

Unit - 1

1) Artificial Neural Networks (ANN) are computational models inspired by the structure and
function of the ___Human brain____.

OR

1) Artificial Neural Networks (ANN) are inspired from the ___Human brain____.

2) In ANNs, the fundamental building block is the __Neuron_____, which processes and
transmits information.

OR

2) In ANNs, the fundamental building block is the ___ Neuron_________.


3) The weights in a neural network represent the __Parameters_____ that determine the strength of connections between neurons.

4) In supervised learning networks, the network is trained using __ Labeled_____ data, where
both input and target outputs are provided.

OR

4) In supervised learning networks, the network is trained using ___ Labeled ____ data.

5) The __ Perceptron _____ is a basic model of a neural network used for binary classification
tasks.

6) The Adaptive Linear Neuron (abbreviated as __Adaline_____) is an improvement over the Perceptron.

7) Back-propagation is a popular supervised learning algorithm used to train multilayer perceptrons (MLPs) by minimizing the error or __Loss_____.

8) Associative Memory Networks are used for pattern __ Association _____ and retrieval tasks.

9) One training algorithm used for pattern association is __ Hebbian _____ learning, which
strengthens connections between neurons when they fire together.

10) __BAM (Bidirectional Associative Memory)_____ and Hopfield Networks are examples of neural networks used for associative memory tasks.

Unit - 2

1) In Kohonen's SOM, the primary goal during training is to self-organize the __Weight _____
vectors based on input data.

2) Learning Vector Quantization (LVQ) is a type of neural network that combines vector
quantization and __Classification_____ tasks.

3) Counter Propagation Networks (CPN) are neural networks that blend elements of both
_supervised______ and unsupervised learning.

4) In Adaptive Resonance Theory (ART) Networks, the concept of "resonance" ensures that the networks adapt to new information without _Forgetting______ previously learned patterns.

5) Special Networks are neural network architectures designed for ___Specialized____ tasks.

6) Unsupervised learning networks are used to discover _Hidden______ patterns or structures in data without labeled outputs.

7) Fixed Weight Competitive Nets are neural networks commonly employed for data _Clustering______ tasks.

8) Maxnet is a type of competitive network that is used to find __Cluster_____ centers.

9) The Hamming Network is primarily used for __Pattern_____ classification and recognition tasks.

10) Kohonen Self-Organizing Feature Maps (SOM) are known for dimensionality reduction and
__ Clustering _____.
