Indian sign language alphabet recognition system using CNN with diffGrad optimizer and stochastic pooling
India has the largest deaf population in the world and sign language is the principal medium
for such persons to share information with normal people and among themselves. Yet, …
DiffMoment: an adaptive optimization technique for convolutional neural network
Stochastic Gradient Descent (SGD) is a very popular basic optimizer applied in the learning
algorithms of deep neural networks. However, it has fixed-sized steps for every epoch without …
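The abstract above notes that plain SGD takes fixed-size steps: the update depends only on a constant learning rate and the raw gradient, with nothing adapting over epochs. A minimal sketch of this behavior on a toy quadratic (the objective and learning rate are illustrative assumptions, not from the paper):

```python
def sgd_step(w, grad, lr=0.1):
    """Plain SGD update: the step size is the fixed learning rate
    times the raw gradient; nothing adapts from epoch to epoch."""
    return w - lr * grad

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w = 0.0
for _ in range(100):
    w = sgd_step(w, 2 * (w - 3))
# w ends very close to the minimizer 3.0
```

Because the step size never adapts, the same learning rate that converges here can diverge on a steeper objective, which is the limitation the adaptive optimizers in this list aim to address.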
emapDiffP: A novel learning algorithm for convolutional neural network optimization
Deep neural networks (DNNs) with multiple hidden layers are very efficient at learning
large-volume datasets and are applied in a wide range of applications. The DNNs are trained on …
SWOSBC: A novel optimizer for learning Convolutional Neural Network
Deep Neural Networks (DNNs) that aim to maximize accuracy and decrease loss can be
trained using optimization algorithms. One of the most significant fields of research is the …
sqFm: a novel adaptive optimization scheme for deep learning model
S Bhakta, U Nandi, M Mondal, KR Mahapatra… - Evolutionary …, 2024 - Springer
For deep model training, an optimization technique is required that minimizes loss and
maximizes accuracy. The development of an effective optimization method is one of the most …
Angularparameter: a novel optimization technique for deep learning models
S Bhakta, U Nandi, C Changdar… - … Techniques for Data …, 2023 - Springer
Training of deep learning models requires an optimization algorithm that minimizes error and
maximizes accuracy. The development of an efficient optimization algorithm is one of the …
aMacP: An adaptive optimization algorithm for Deep Neural Network
Stochastic gradient-based optimizers are used to train convolutional neural networks (CNNs).
Due to its adaptable momentum, the Adam optimizer has recently gained a lot of attention …
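The aMacP abstract highlights Adam's adaptable momentum. A bare-bones single-parameter version of the standard Adam update rule, shown on a toy quadratic (the objective and iteration count are assumptions for illustration; the hyperparameters are the commonly published defaults):

```python
import math

def adam_step(w, grad, state, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and its square (v) yield an adaptive, per-parameter step size."""
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad * grad
    m_hat = state["m"] / (1 - b1 ** state["t"])  # bias correction
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (math.sqrt(v_hat) + eps)

# Minimize f(w) = (w - 3)^2 from w = 0.
state = {"t": 0, "m": 0.0, "v": 0.0}
w = 0.0
for _ in range(5000):
    w = adam_step(w, 2 * (w - 3), state)
```

Unlike the fixed-step SGD update, the effective step here rescales itself by the running second-moment estimate, which is the "adaptable momentum" property these papers build on and modify.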
Numerical study of cavity-based flame-holder with slot injection for supersonic combustion
S Bhakta, B Singh - … and Development (IJMPERD) 8.2, Apr …, 2005 - researchgate.net
The current paper focuses on finding the optimum cavity properties to enhance the mixing
phenomenon and combustion processes in supersonic flow using Computational Fluid …
ATCBBC: A Novel Optimizer for Neural Network Architectures
For deep neural networks, gradient descent (GD) is the backbone. Slow convergence is an
issue with GD. Using momentum is the well-known method of overcoming delayed …
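The ATCBBC abstract cites momentum as the well-known remedy for GD's slow convergence. A minimal heavy-ball momentum sketch on a toy quadratic (the objective, learning rate, and momentum coefficient are illustrative assumptions, not the paper's method):

```python
def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    """Heavy-ball momentum: the velocity v accumulates a decaying sum
    of past gradients, accelerating progress along consistent directions."""
    v = beta * v - lr * grad
    return w + v, v

# Minimize f(w) = (w - 3)^2 from w = 0.
w, v = 0.0, 0.0
for _ in range(200):
    w, v = momentum_step(w, v, 2 * (w - 3))
```

With beta = 0, this reduces to plain gradient descent; a nonzero beta lets successive gradients in the same direction compound, which is how momentum mitigates GD's delayed convergence.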
ATCBBC: A Novel Optimizer for Neural Network Architectures
S Bhakta, U Nandi, KR Mahapatra… - Proceedings of the …, 2024 - books.google.com
Deep learning [1, 2] simulates the way the human brain makes decisions, creates patterns,
and processes data, using artificial neural networks (ANNs) to analyze the data. To accomplish it…