Review
Artificial Intelligence-Driven Diagnosis of Pancreatic Cancer
Bahrudeen Shahul Hameed 1,2 and Uma Maheswari Krishnan 1,2,3, *
1 Centre for Nanotechnology & Advanced Biomaterials (CeNTAB), Shanmugha Arts, Science,
Technology and Research Academy, Deemed University, Thanjavur 613401, India
2 School of Chemical & Biotechnology (SCBT), Shanmugha Arts, Science, Technology and Research Academy,
Deemed University, Thanjavur 613401, India
3 School of Arts, Sciences, Humanities & Education (SASHE), Shanmugha Arts, Science, Technology and
Research Academy, Deemed University, Thanjavur 613401, India
* Correspondence: umakrishnan@sastra.edu; Tel.: +91-4362264101 (ext. 2677); Fax: +91-436226412
Simple Summary: Pancreatic cancer poses a grave threat to mankind, due to its poor prognosis
and aggressive nature. An accurate diagnosis is critical for implementing a successful treatment
plan given the risk of exacerbation. The diagnosis of pancreatic cancer relies on medical imaging,
which provides inaccurate information about the prognosis of the patient and makes it difficult for
clinicians to select the optimal treatment. Data derived from medical imaging has been integrated
with artificial intelligence, an emerging technology, to facilitate clinical decision making. This review
explores the implementation of artificial intelligence for various imaging modalities to obtain a precise
cancer diagnosis.
Abstract: Pancreatic cancer is among the most challenging forms of cancer to treat, owing to its late diagnosis and aggressive nature that reduces the survival rate drastically. Pancreatic cancer diagnosis has been primarily based on imaging, but the current state-of-the-art imaging provides poor prognostic information, thus limiting clinicians' treatment options. Cancer diagnosis has been enhanced through the integration of artificial intelligence with imaging modalities to make better clinical decisions. In this review, we examine how AI models can improve the diagnosis of pancreatic cancer using different imaging modalities, along with a discussion on the emerging trends in AI-driven diagnosis based on cytopathology and serological markers. Ethical concerns regarding the use of these tools have also been discussed.
Citation: Hameed, B.S.; Krishnan, U.M. Artificial Intelligence-Driven Diagnosis of Pancreatic Cancer. Cancers 2022, 14, 5382. https://doi.org/10.3390/cancers14215382
cancer include abdominal pain, changes in the consistency of faeces, nausea, bloated body,
co-morbidities, such as diabetes and jaundice, abnormal liver function parameters, loss
of weight, etc. [5]. These symptoms usually become prominent only during the advanced
stage of the disease and are often missed during the early stages. Further, serological mark-
ers for pancreatic cancer, such as CA-19-9 (Carbohydrate antigen), are not highly specific
and indicate only the advanced stage of the disease, thereby increasing the mortality risk
of the affected individual. Several imaging tools, including magnetic resonance imaging
(MRI), computed tomography (CT), endoscopic ultrasound (EUS), etc., have also been
explored for the diagnosis of pancreatic cancer. Due to rapid advances in recent years,
imaging technology has emerged at the forefront for the diagnosis, staging, and prognosis
of pancreatic cancer [6]. However, distinction of a cancerous lesion from other pancreatic
disorders, such as pancreatitis, a chronic inflammation of the pancreas, remains a major
roadblock in the accurate and early diagnosis of pancreatic cancer. Despite the availability
of advanced imaging equipment, pancreatic cancer is still confirmed through biopsy after
imaging. Not only is this time-consuming, but the inordinate delay also increases the
probability of mortality in the affected individual. A study reported that nearly 90% of
misdiagnoses of pancreatic cancer were due to the inability to identify vascular invasion
and the difficulty in spotting the underlying tumour mass because of inflammation [7].
Table 1 lists some of the common imaging techniques used for the clinical diagnosis of
pancreatic cancer, along with their merits and limitations.
Table 1. Major imaging techniques employed for the diagnosis of pancreatic cancer and their
limitations.
Several approaches to improve the sensitivity and prediction accuracy of these imaging
techniques have been reported in the literature. These include the use of image contrast
agents to improve the resolution and sensitivity and the use of image processing software
for better diagnostic accuracy. In recent years, the emergence of artificial intelligence and
deep learning has transformed the landscape of image-driven diagnosis of pancreatic
cancer, with a dramatic improvement in prediction accuracy. The various attempts to
integrate artificial intelligence for the diagnosis of pancreatic cancer are discussed in the
following sections.
support vector machine, linear regression analysis, ensemble methods, decision tree,
K-mode, hidden Markov model, hierarchical, Gaussian mixture, and neural networks
have all been explored with different imaging data sets for distinguishing cancerous tissue
from non-cancerous tissues [22]. The work flow in the detection of cancer using ML is
depicted in Figure 1.
Figure 1. Work-flow of the stages during the training of the ML models for the diagnosis of
cancer lesions.
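To make the workflow in Figure 1 concrete, the sketch below strings together the typical stages (extracted image features, train/test split, normalisation, classifier training, evaluation) using scikit-learn; the random feature matrix, the labels, and the choice of a random-forest classifier are placeholders rather than any setup used in the cited studies.

```python
# Minimal sketch of the ML workflow in Figure 1 (assumed stages: extracted
# image features -> train/test split -> scaling -> classifier -> evaluation).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))        # placeholder features (16 per image)
y = rng.integers(0, 2, size=200)      # placeholder labels: 1 = cancerous

# Split the labelled images into training and testing sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Normalise features using statistics from the training set only.
scaler = StandardScaler().fit(X_train)

# Train the classifier and evaluate it on the held-out images.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print(classification_report(y_test, clf.predict(scaler.transform(X_test))))
```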
The classification of images for diagnosis using various AI models can be broadly
divided into one-stage and two-stage methods. The one-stage method segments the
medical image into grids and applies the model for classification while the two-stage
method demarcates several candidate zones that are used for classification during the
training. Though time-consuming, the two-stage object method identifies and screens
regions of interest, resulting in more accurate predictions. Region-based convolution
network (R-CNN), Fast R-CNN, and Faster R-CNN have been employed in the two-stage
method as an integrated network for discriminative feature extraction, segmentation, and
classification for an improved cancer detection without compromising the spatial
structures [6].
3. AI Models for the Diagnosis of Pancreatic Cancer
3. AIMedical
models imaging has been widely
for the Diagnosis used for
of Pancreatic locating and diagnosing cancerous tissue
Cancer
in the gastrointestinal tract. Current analysis is largely dependent upon the expertise and
Medical imaging has been widely used for locating and diagnosing cancerous tissue
experience of the clinician. The quality of the images also influences the diagnosis through
in the gastrointestinal tract. Current analysis is largely dependent upon the expertise and
conventional methods [23]. The field of digital pathology continues to evolve from the first
experience of the clinician. The quality of the images also influences the diagnosis
generation of image processing that involved the use of image processing tools to analyse
athrough conventional
single slide, to much methods [23]. The
more advanced field of digital pathology
second-generation continues
tools that could scan,toanalyse,
evolve
fromstore
and the first generation
records of wholeof tissue
image samples.
processingThethatcurrent
involved the use in
paradigm of digital
image processing
pathology
involves the use of AI-based algorithms to analyse images, diagnose the tools
tools to analyse a single slide, to much more advanced second-generation that could
condition with
scan, analyse, and store records of whole tissue samples. The current
a high accuracy, and even predict the possibility of developing the disease even paradigm in before
digital
pathology
the onset ofinvolves the use
the disease [24].ofThe
AI-based algorithms
development to analysetools
of AI-based images, diagnosethe
has enabled therapid
con-
dition with a high accuracy, and even predict the possibility of developing
and high precision diagnosis of cancer using different medical images [25]. In the context the disease
even
of before the
pancreatic onsetAI-based
cancer, of the disease [24]. The
diagnostic development
tools of AI-basedfor
have been employed tools
riskhas enabled
prediction,
the rapid and high precision diagnosis of cancer using different medical
survival prediction, and the distinction of cancer masses from other pancreatic lesions images [25]. as
In
the context of pancreatic cancer, AI-based diagnostic
well as for the evaluation of the response post-therapy. tools have been employed for risk
Machine learning tools, such as the K-nearest neighbour (k-NN), ANN, and SVM, have
been extensively investigated for their ability to extract unique signatures from medical
images that could be used for the identification of abnormalities [26] in different types of
digestive system cancers, including pancreatic cancer [27]. The k-NN algorithm,
first introduced in 1967 by Cover and Hart, calculates the distance between the values of
the specified features in the sample data and the training data. Based on the calculated
distance, the sample data is grouped with its nearest neighbour class [28]. The
k-NN concept was employed by Kilic et al. [29] to identify colonic polyps using region
covariance in CT-colonography images as the distinguishing features. In another report
employing k-NN [30], the gray level co-occurrence matrix was employed as the classifying
feature in medical images of the brain and pancreatic cancers. However, k-NN is limited
by issues pertaining to local structure sensitivity and the possibility of over-fitting, leading
to errors.
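As a rough illustration of the neighbour-grouping idea described above, the sketch below assigns a new sample to the majority class of its nearest training samples; the feature vectors, labels, and the choice of three neighbours are placeholders rather than settings from the cited studies.

```python
# Minimal k-NN sketch: group a sample with its nearest neighbour class
# based on distances in feature space (here, Euclidean distance).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Placeholder training data: rows are feature vectors (e.g., texture
# statistics per lesion), labels are 1 = cancerous, 0 = non-cancerous.
X_train = np.array([[0.2, 1.1], [0.3, 0.9], [2.1, 3.0], [2.4, 2.8]])
y_train = np.array([0, 0, 1, 1])

knn = KNeighborsClassifier(n_neighbors=3, metric="euclidean")
knn.fit(X_train, y_train)

# A new, unseen feature vector is assigned to the majority class of its
# three nearest training samples.
print(knn.predict([[2.0, 2.9]]))   # -> [1]
```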
Artificial Neural Networks (ANNs), the concept of which was first proposed in the
early 1940s by McCulloch and Pitts, attempt to mimic the human neuronal network. The
input layer receives the input signal, which is then passed on to each of the inner hidden
layers that understand and transform it and pass it on to the next layer, until it reaches
the final output layer [31], as shown in Figure 2. Unlike k-NN models that can only
handle limited data, the ANN model is adaptive and can be trained using large volumes of
data to become more robust and accurate. Progress in ANNs has been accelerated by
advances in big data, affordable graphics processing units (GPUs), and the development
of novel algorithms [32]. The ANN method used in diagnosing digestive cancers is the
back-propagation (BP) network, first introduced in 1986 by Rumelhart [33]. This
strategy enables error correction: if the output is found erroneous, it is sent back to the
inner layers to refine the network parameters during the training period. This iterative
process minimizes errors and improves accuracy. In the context of pancreatic cancer
diagnosis, Săftoiu et al. [34] successfully employed ANNs to differentiate chronic
pancreatitis and pancreatic adenocarcinoma using endoscopic ultrasound images, with a
sensitivity of 94%. The ANN method has the advantage of being able to handle large data
sets and of capturing all types of interactions and inter-relationships between dependent
and independent variables [35]. However, ANN algorithms are slow when large numbers
of inputs are provided during the training period and require a large computational load,
apart from adopting a black-box approach that makes it challenging to achieve accuracy
in multi-layer networks [36].
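The following is a minimal sketch of such a feed-forward network trained with back-propagation, using scikit-learn's MLPClassifier; the two hidden layers, the synthetic features, and the labels are illustrative assumptions, not the architectures used in the cited work.

```python
# Minimal ANN (multi-layer perceptron) sketch trained with back-propagation.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))             # placeholder image-derived features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # synthetic labels for illustration

# Input layer -> two hidden layers -> output layer; errors on the output
# are back-propagated to adjust the weights over repeated iterations.
ann = MLPClassifier(hidden_layer_sizes=(32, 16), activation="relu",
                    max_iter=500, random_state=1)
ann.fit(X, y)
print("training accuracy:", ann.score(X, y))
```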
To overcome some of the limitations of ANNs, Vapnik et al. [37] developed a super-
vised learning algorithm, in 1995, known as the support vector machine (SVM) algorithm,
that defines the boundaries known as support vectors to construct a hyperplane, which is
used to classify data [38]. The negative and positive boundaries and the maximum margin
are defined, based upon the training set of data fed as inputs. The SVM is capable of pattern
recognition and regression analysis, in addition to the classification of data [39]. Zhang
et al. [40] effectively applied the SVM to identify pancreatic cancers from EUS images
by classifying textural features, achieving a detection accuracy of 99.07%. Though SVM
models display a high accuracy and work with remarkable efficiency when there is a
clear demarcation between the data classes, their efficiency decreases when the size of the
data set increases or when there is extensive overlap between the classes. In addition,
despite being memory efficient, SVM algorithms are slow during both the training and
the testing phases.
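A minimal sketch of the idea that the support vectors define the separating hyperplane is given below; the two toy feature clusters standing in for non-cancerous and cancerous image regions are invented purely for illustration.

```python
# Minimal SVM sketch: support vectors define the separating hyperplane.
import numpy as np
from sklearn.svm import SVC

# Two toy feature clusters standing in for texture descriptors of
# non-cancerous (0) and cancerous (1) image regions.
X = np.array([[1.0, 1.2], [1.1, 0.8], [0.9, 1.0],
              [3.0, 3.2], [3.1, 2.8], [2.9, 3.0]])
y = np.array([0, 0, 0, 1, 1, 1])

svm = SVC(kernel="linear", C=1.0)
svm.fit(X, y)

# The hyperplane is w.x + b = 0; only the support vectors determine it.
print("support vectors:\n", svm.support_vectors_)
print("w =", svm.coef_[0], " b =", svm.intercept_[0])
print("prediction for [2.0, 2.0]:", svm.predict([[2.0, 2.0]]))
```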
Figure 2. Schematic representation of the process flow in a sample ANN model for the diagnosis of
pancreatic cancer.
Deep learning networks exhibit superior diagnostic abilities when compared to ML
models, as they extract all features from the medical images rather than selected ones, as
in the case of ML. As a result, DL models are preferred for the detection of digestive
cancers and image segmentation [41]. Convolutional neural networks (CNNs) are among
the most extensively employed supervised DL techniques. These consist of input layers
where different clusters of nodes, each for a specific feature, interact with the hidden layers
that have the same weightage and bias and perform convolutional operations on these
inputs. These are then pooled and transformed to give the final output [42]. A typical
CNN comprises the input, convolutional, activating, pooling, fully connected, and output
layers [43]. CNNs are computationally efficient but still consume a lot of computational
power and can be slow. CNNs provide a probabilistic depiction of the complete image and
are therefore preferably employed for image classification rather than segmentation [44].
Among the various types of CNNs, U-Net algorithms that use fewer convolutional layers
have also been commonly employed for the diagnosis of digestive cancers, including
pancreatic cancer, by classifying and segmenting specific features in the medical images [45].
The LeNet, proposed by Lecun et al. [46] in 1989, is considered the basic structure of CNNs.
Several other variants, such as AlexNet, VGGNet (visual geometry group), Inception Net,
and ResNet, introduced between 2012 and 2015, vary in the number of convolutional and
pooling layers employed [47]. In the context of digestive cancers, Sharma et al. [48]
classified and detected necrosis in medical images of gastric carcinoma using the AlexNet
architecture, with a classification accuracy of 69.9% and a detection accuracy of 81%.
Colonic polyps were automatically detected by Shin et al. [49] from colonoscopy images
using the Inception-Resnet network. Long et al. [50] proposed a fully convolutional
network (FCN) model, in 2015, for semantic segmentation, where each pixel of an image is
classified. As the final fully connected layer is substituted by a convolutional layer in the
FCN, resulting in superior segmentation effects, it has been extensively studied for the
diagnosis of digestive cancers. Oda et al. [51] employed a three-dimensional FCN model
to segment the pancreas automatically using CT images, and an average Dice score
of 89.7 ± 3.8, was obtained. The Dice score indicates the precision of the segmentation
model employed by eliminating false positives and is computed as follows:
Dice score = 2 × (area of overlap between the two image sets) / (total number of pixels in both images)    (1)
Generally, a Dice score above 88% is considered highly precise. In another study, Guo
et al. [52] employed a Gaussian mixture model and used morphological operations on a
three-dimensional U-Net segmentation technique, to achieve an improved segmentation
accuracy with a Dice score of 83.2 ± 7.8%. It is also evident from the various reports, that the
type of AI tool employed will be different for various imaging techniques. The following
sections highlight some recent AI-based strategies for different imaging modalities.
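As a concrete reading of Equation (1), the snippet below computes the Dice score between a predicted and a reference binary segmentation mask, taking "pixels" to mean the foreground (segmented) pixels of each mask; the two small masks are invented for illustration only.

```python
# Dice score between two binary segmentation masks, as in Equation (1):
# 2 x |overlap| / (total number of foreground pixels in both masks).
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    overlap = np.logical_and(pred, truth).sum()
    return 2.0 * overlap / (pred.sum() + truth.sum())

# Toy 4x4 masks standing in for a predicted and a ground-truth pancreas mask.
pred  = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
truth = np.array([[0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(round(dice_score(pred, truth), 3))   # 0.857
```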
calculated, based on the deviations from the expected output and these are used to modify
the layers to reduce the error during the training period. In another study [66], both one
and two hidden layers were employed that exhibited a 97% accuracy with the training
data set and a 95% accuracy with the testing data set for discriminating the malignant and
non-malignant samples in the different age categories. The high accuracy was achieved
for the data sets that were initially segregated into different age groups when compared to
their uncategorized counterparts.
Yet another independent study employed MNNs for identifying pancreatic cancer
from images of cell clusters, obtained from individuals using fine needle aspiration (FNA).
Post-training, the MNN model was found to match the accuracy of an experienced cy-
topathologist. Additionally, the MNN model was able to make accurate predictions even
for inconclusive images, with 80% sensitivity, clearly demonstrating the promise of this
tool for providing a conclusive diagnosis when screening FNA specimens for pancreatic
cancer, especially those deemed inconclusive by cytopathologists. In an interesting study,
a computer-assisted diagnosis (CAD) system was developed to analyse EUS images, using
deep learning models (EUS-CAD) to identify PDAC, CP, and a normal pancreas (NP). The
training set used 920 EUS images and the testing set used 470 EUS images. The detection
efficiency was 92% and 94% in the validation and testing phases, respectively. Errors in
diagnosis were identified only using the multivariate analysis of non-PDAC cases that was
attributed to mass formation resulting in an over diagnosis of tumours [67].
EUS images of intraductal papillary mucinous neoplasms (IPMNs), that are precursors
of PDAC, were analysed using deep learning algorithms to predict malignancy, using EUS
images of patients acquired before a pancreatectomy. A total of 3970 images were used
for the study and the malignant probability was calculated. The probability of the deep
learning algorithm to diagnose malignant IPMN was 0.98 (p < 0.001), with a sensitivity,
specificity, and accuracy calculated to be 95.7%, 92.6%, and 94.0%, respectively. The
accuracy was significantly superior to the corresponding human diagnosis (56.0%) [68].
A comparison of the literature on pancreatic cancer discrimination from EUS images using
AI tools revealed that deep learning and ANN techniques exhibited the greatest accuracy,
followed by CNNs and the SVM. However, some of the literature reports chosen for that
comparison used images of a normal pancreas versus pancreatic cancer, while others tried
to differentiate pancreatic cancer from CP. Similarly, the size of the cancerous tissues varied
between the studies [69]. Therefore, additional studies are required to address whether
these differences are reflected in the prediction accuracy of the AI tool employed.
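Since these studies are compared mainly through sensitivity, specificity, and accuracy, the short sketch below shows how such figures are derived from a binary confusion matrix; the counts are invented purely for illustration.

```python
# Sensitivity, specificity, and accuracy from a binary confusion matrix.
# tp/fn/tn/fp counts are illustrative, not taken from any cited study.
tp, fn = 95, 5     # malignant cases: correctly / incorrectly classified
tn, fp = 93, 7     # benign cases:    correctly / incorrectly classified

sensitivity = tp / (tp + fn)            # true-positive rate
specificity = tn / (tn + fp)            # true-negative rate
accuracy = (tp + tn) / (tp + fn + tn + fp)

print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, "
      f"accuracy={accuracy:.1%}")
```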
5. MRI
MRI is used to visualise thin two-dimensional slices or three-dimensional volumes of
soft tissue, owing to the abundance of water molecules in the body. The shift in the
precessional frequency and alignment of the proton nuclei of the water molecules, in the
presence of an externally applied magnetic field and a radiofrequency pulse, is used for
acquiring the image. The technique measures the relaxation times T1 and T2, which denote
the spin-lattice and spin-spin relaxation, respectively, back to the original equilibrium
position [70]. Relaxivities (r1 and r2), which are the inverse of the respective relaxation
times, are also measured.
Most of the cases employ positive or negative contrast agents, such as gadolinium-based
chelates or iron oxide, respectively, to significantly enhance the ratio of the relaxivities for
an improved resolution and sensitivity [71].
Early detection of pancreatic cancer is essential to provide the affected individual
with a fair chance of survival beyond five years. However, most imaging techniques,
including MRI, fail to identify conclusively subtle changes observed in the pre-malignant
stages, such as the pancreatic intraepithelial neoplasia, which is commonly associated with
the tumorigenesis of PDAC [72]. Even an individual with stage I (localized) pancreatic
cancer has only a 39% survival rate over a five-year period [73]. In a typical example
of the use of AI for diagnosis using MRI images, a supervised machine learning (ML)
algorithm was developed to predict the overall survival rates in PDAC affected patients,
using a cohort of 102 MRI images during training and a further 30 images during the testing
period [74]. The algorithm was used to segment the images and extract features. The sensitivity
of the ML algorithm was 87%, while the specificity was determined to be 80%. The
considerable overlap between the clinical histopathological conclusions and the ML-driven
predictions indicates the promise of this strategy for classifying pancreatic cancer sub-types
and diagnosis.
Another study [75] had investigated the ability of deep learning to distinguish be-
tween different pancreatic diseases from magnetic resonance (MR) images that were
contrast-enhanced, using the T1 contrast agent gadopentetate dimeglumine. The gen-
erative adversarial network (GAN) form of machine learning can generate new sets of
data which resemble/mimic the data used for training. GAN was employed to generate
synthetic images that augmented the T1 contrast enhanced MRI data of 398 subjects within
the age range of 16 and 85 years, acquired before the commencement of any treatment from
a single hospital centre. The Inception-V4 network, a type of CNN with multiple hidden
layers, was trained on the GAN augmented data set. Following the training, the MRI
images acquired from two different hospital centres, comprising 50 images from subjects
in the age group 24–85 years, and 56 images from patients aged between 26–80 years,
were used for validating the performance of the Inception-V4 network towards the disease
classification. The results were compared with the predictions made by the radiologist.
To augment the diagnostic accuracy of MRI for paediatric pancreatic cancer, Zhang
et al. [76] used a quantum genetic algorithm to optimize the parameters of a traditional
SVM classification model for improved prediction accuracy. In addition, this study
acquired test samples from real life cases, and assessed the image processing performance
of the algorithm for an efficient detection. The results revealed that the model distinguished
clearly the cancer features with a high accuracy when compared with the conventional
detection algorithm. Another study had employed a robust and intelligent method of
ANNs combined with the SVM for the classification of pancreatic cancer to improve the
diagnostic process, in terms of both accuracy and time [77]. Here, features of the MR images
of the pancreas were extracted using the GLCM (gray-level co-occurrence matrix) method, a
second order image texture analysis technique, that defines the spatial relationships among
pixels in the region of interest. The best features extracted, using the JAFER algorithm, were
analysed using five classification techniques: ANN BP (back propagation ANN), ANN RBF
(radial basis function ANN), SVM Linear, SVM Poly (polynomial kernel), and SVM RBF
(radial basis function SVM). The two best features, selected using the ANN BP technique,
were used for the classification of pancreatic cancer with a 98% accuracy.
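A brief sketch of GLCM-based texture feature extraction with scikit-image is given below; the random patch stands in for a pancreatic region of interest, and the chosen distances, angles, and properties are illustrative defaults rather than the exact features used in the cited studies.

```python
# Sketch of GLCM (gray-level co-occurrence matrix) texture features from a
# 2D image region; the random patch is a placeholder for a pancreatic
# region of interest taken from an MR image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder ROI

# Co-occurrence of gray levels between pixel pairs at distance 1 and
# angles 0, 45, 90, 135 degrees captures spatial relationships among pixels.
glcm = graycomatrix(roi, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)   # second-order texture features fed to ANN/SVM classifiers
```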
Corral et al. [78] employed a deep learning tool to identify neoplasia in intraductal
papillary mucinous neoplasia (IPMN), using CNNs for the classification of the MRI scans
of the pancreas. The classification was based on the guidelines issued by the American
Gastroenterology Association, as well as the Fukuoka guidelines. When tested on 139 MRI
scans of individuals, among which 22% were of a normal pancreas, 34% had a low-grade
dysplasia while 14% were diagnosed with a high-grade dysplasia and the remaining 29%
had adenocarcinoma, the model exhibited a detection sensitivity of 92% and a specificity
of 52% for the detection of dysplasia. The deep learning technique exhibited an accu-
racy of 78%, in comparison to the 76% obtained by the classification using the American
Gastroenterology Association guidelines.
For improving the accuracy, reliability, and efficiency of diagnosis, Chen et al. [79]
developed an automated deep learning model (ALAMO) for the segmentation of multiple
organs-at-risk (OARs) from the clinical MR images of the abdomen. The model had in-
cluded training procedures, such as Multiview, deep connection, and auxiliary supervision.
The model used multislice MR images as the input and generated segmented images as
the output. The model was investigated using ten different OARs, such as the pancreas,
liver, spleen, stomach, duodenum, small intestine, kidneys, spinal cord, and vertebral
bodies. The results from the model correlated well with those obtained using the manual
techniques. However, further studies integrating AI-based algorithms with these ALAMO
generated segmented MR images of the pancreas are required for the extraction of features
to confirm the onset or progression of PC.
6. Computed Tomography
A computed tomography (CT) scan is a non-invasive clinical imaging technique that
employs X-rays to obtain images at different angles. The resultant images are processed
using customized software to obtain a reconstructed 3D image, which provides valuable
anatomical information [80]. This technique is widely employed in healthcare centres for
the diagnosis of tumours or internal injuries [81,82]. Despite its merits, CT scan images pose
a challenge to clinicians for the accurate diagnosis of cancers, owing to irregular contours
presented by regions with lesions, vasculature, bony structures, and soft tissues that display
a mosaic of densities and intensities [83]. Additional challenges involved in the precise
prediction of the disease from the CT scans are associated with fuzzy and noisy images that
lack adequate contrast [84]. AI-driven methods that enable image segmentation, contour
identification, and disease classification will therefore be invaluable in improving the
prediction efficiency for pancreatic diseases from CT images [85]. The currently employed
conventional image segmentation models consume considerable computational time and
power, as they perform every operation for each pixel in the image [86]. Further, the
resultant processed images also lack quality, thereby necessitating the development of
more robust AI-driven tools for image segmentation and processing that may
provide a better diagnostic accuracy [87]. In an interesting study [88], about 19,500 non-
contrast CT scan images, acquired from 469 scans, were segmented using CNNs and
the mean pancreatic tissue density, in terms of the Hounsfield unit (HU), as well as the
pancreatic volume, were computed using the CNN algorithm. The comparison of the
results of the pre-diagnostic scans from individuals who later developed PDAC and those
that remained cancer-free, revealed that there was a significant reduction in the mean whole
gland pancreatic HU of 0.2 vs. 7.8 in individuals who developed PDAC. This suggests
that the attenuation of the HU intensity in the CT images of the pancreas could imply
a risk of PDAC. This study has opened new avenues for employing CNNs as a tool for the
pre-diagnosis/very early diagnosis of PDAC from CT scan images.
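To illustrate the kind of quantity reported in that study, the snippet below computes the mean attenuation (in HU) and the volume of a segmented pancreas from a CT volume and a binary mask; the arrays and voxel spacing are placeholders, not data from the cited work.

```python
# Sketch: mean pancreatic attenuation (Hounsfield units) and volume from a
# CT volume and a CNN-derived pancreas mask.
import numpy as np

ct_volume = np.random.randint(-1000, 400, size=(40, 128, 128))  # HU values
pancreas_mask = np.zeros_like(ct_volume, dtype=bool)
pancreas_mask[15:25, 40:80, 40:90] = True        # stand-in segmentation

voxel_spacing_mm = (2.5, 0.8, 0.8)               # slice thickness, row, col

mean_hu = ct_volume[pancreas_mask].mean()
volume_ml = pancreas_mask.sum() * np.prod(voxel_spacing_mm) / 1000.0

print(f"mean pancreatic density: {mean_hu:.1f} HU")
print(f"pancreatic volume: {volume_ml:.1f} mL")
```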
In another attempt to classify PDAC, a regular CNN algorithm with four hidden layers
was trained using CT images obtained from 222 affected individuals and 190 non-cancerous
individuals. Though a diagnostic accuracy of 95% was achieved using CNNs, it was not
superior to the predictions made by human experts indicating the need for an appropriate
AI architecture for the classification of pancreatic cancer [89]. Zhang et al. [90] employed
feature pyramid networks with a recurrent CNN (R-CNN) that could identify the sequential
patterns and predict the subsequent patterns of a given data set for classifying PDAC from CT
scan images. A dataset of 2890 CT images was employed for training the network to achieve
a classification accuracy of about 94.5%. Though this method proved to be superior to the
existing methods, it was limited by the input uncertainty that is generally associated with
closed-source data. This drawback could be eliminated by using a public data set for training.
In a more advanced variant, a 16-layer VGG16 CNN model was employed along with R-
CNN to diagnose PDAC from 6084 enhanced CT scans obtained from 338 PDAC-affected
individuals. The combination of VGG16 and R-CNN exhibited a high prediction accuracy of
about 96%. Each CT image was processed by the R-CNN within 0.2 s, which was considerably
faster than a clinical imaging expert [6]. Additionally, a deep learning algorithm has been
developed by Chen et al. [91] for detecting pancreatic cancer that is smaller than 2 cm on
CT scans. The study showed that the CNN was effective in distinguishing patients
with pancreatic cancer from individuals with a normal pancreas, achieving an 89.7% sensitivity
and a 92.8% specificity. It also showed a higher sensitivity of 74.7% for the identification of
pancreatic malignancies smaller than 2 cm.
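A hedged sketch of transfer learning with a pretrained VGG16 backbone, in the spirit of the VGG16-based classifier mentioned above, is given below using PyTorch/torchvision; the two-class head, the frozen feature extractor, and the dummy batch are assumptions for illustration only.

```python
# Sketch of transfer learning with a pretrained VGG16 backbone for a
# binary PDAC vs. non-PDAC CT-slice classifier.
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)

# Freeze the convolutional feature extractor and replace the last
# fully connected layer with a two-class head (PDAC / non-PDAC).
for param in model.features.parameters():
    param.requires_grad = False
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)

# One illustrative training step on a dummy batch of 3-channel CT slices.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("loss:", float(loss))
```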
An attempt to employ CNN models to distinguish different kinds of pancreatic cysts
was made using CT images from 206 patients. Among these individuals, 64 suffered from
intraductal papillary mucinous neoplasms (IPMNs), 66 had been diagnosed with serous
cystic neoplasms (SCN), 35 had mucinous cystic neoplasms (MCNs) while 41 individuals
suffered from solid pseudopapillary epithelial neoplasms (SPENs). The feature extraction
from the CT images and classification of the type of pancreatic cyst, was accomplished using
a densely connected convolutional network (Dense-Net) architecture, which uses dense layers
that receive inputs from all preceding neurons/nodes and dense blocks connecting all layers by the
shortest route. The Dense-Net algorithm performed better than the conventional CNN model
in discriminating between the different types of cysts with the highest accuracy of 81.3%
observed for IPMNs followed by 75.8% for the SCNs and 61% for the SPENs [92]. Though the
Dense-Net model outperformed the CNNs in all categories, the study lacked information on
the tumour size and failed to provide reasons for the positive and negative errors encountered
in the identification of the type of pancreatic cysts. The model needs to be tested rigorously
with a wider range of cysts to understand its capability for discriminating between different
types of pancreatic cysts if it is to be adopted in the clinics.
survival, while the total lesion glycolysis ranked second. This information was used to stratify
individuals into poor prognosis groups with a high risk of mortality [101].
It is thus evident that every imaging technique will require customized robust algo-
rithms to extract the subtle but distinctive features of pancreatic cancer for the accurate
identification and stratification. The evolution of new ML algorithms continues to improve
the sensitivity and selectivity of the diagnosis of pancreatic cancer at an early stage, thereby
improving the survival chances of the affected individual. Table 2 lists some of the major
studies, using various AI driven models for the diagnosis of pancreatic cancer.
Table 2. Summary of the AI-driven models for pancreatic cancer diagnosis.

Modality | AI Model | Study Population | Purpose | Sensitivity (%) | Specificity (%) | Accuracy (%) | Reference
CT | CNN | 27 | Pancreatic cystic neoplasm malignancy prediction | - | - | 92.9 | Watson et al., 2021 [102]
CT | Naïve Bayes classifier | 72 | PDAC identification | - | - | 86 | Ahamed et al., 2022 [103]
CT | CNN | 1006 | Pancreas segmentation | - | - | - | Lim et al., 2022 [104]
CT | CNN | 68 | Serum tumor marker analysis | 89.31 | 92.31 | - | Qiao et al., 2022 [105]
CT | CNN | 513 | Pancreaticoenteric anastomotic fistula prediction after a pancreatoduodenectomy | 86.7 | 87.3 | 87.1 | Mu et al., 2020 [106]
CT | ANN | 62 | Acute pancreatitis risk prediction | - | - | - | Keogan et al., 2002 [107]
CT | Support vector machine | 56 | PDAC histopathological grade discrimination | 78 | 95 | 86 | Qiu et al., 2019 [108]
CT | CNN | 370 patients, 320 controls | PC detection | 97.3 (Test set 1), 99 (Test set 2) | 100 (Test set 1), 98.9 (Test set 2) | 98.6 (Test set 1), 98.9 (Test set 2) | Liu et al., 2020 [109]
CT | Deep learning | 750 patients, 575 controls | PDAC detection | - | - | 87.8 | Chu et al., 2019 [110]
CT | CNN | 222 patients, 190 controls | PC diagnosis | 91.58 | 98.27 | 95.47 | Ma et al., 2020 [89]
CT | DCNN | 2890 CT images | Pancreatic cancer detection | 83.76 | 91.79 | 94 | Zhang et al., 2020 [90]
CT | Deep learning | 319 | Preoperative pancreatic cancer diagnosis | 86.8 | 69.5 | 87.1 | Si et al., 2021 [111]
CT | ANN | 898 | Cancer risk prediction | 80.7 | 80.7 | - | Muhammad et al., 2019 [112]
CT | CNN | 669 patients, 804 controls | PC differentiation | 89.7 | 92.8 | - | Chen et al., 2022 [91]
MRI | CNN | 139 | Identification of intraductal papillary mucinous neoplasia | 75 | 78 | - | Juan et al., 2019 [78]
MRI | CNN | 27 | Automatic image segmentation | - | - | - | Liang et al., 2020 [113]
MRI | ANN | 168 | PDAC differentiation | - | - | 96 | Devi et al., 2018 [114]
EUS | CNN | 583 | Autoimmune pancreatitis from PDAC | 90 | 85 | - | Marya et al., 2021 [115]
EUS | CAD | 920 (validation) + 470 (test) | PDAC detection | - | - | - | Tonozuka et al., 2021 [67]
EUS | ANN | 202 (cancerous) & 130 (non-cancerous) | Computer-aided pancreatic cancer diagnosis using image processing | 83.3 | 93.3 | 87.5 | Ozkan et al., 2019 [65]
EUS | ANN | 258 | Pancreatic lesion characterization | - | - | 91 | Saftoiu et al., 2012 [34]
EUS | ANN | 388 | PDAC and CP differentiation | 96 | 93 | 94 | Zhu et al., 2013 [63]
EUS | ANN | 167 | PDAC and CP differentiation | 94 | 94 | - | Saftoiu et al., 2015 [116]
EUS | ANN | 56 | Normal, CP and PDAC differentiation | - | - | 93 | Das et al., 2008 [64]
EUS | ANN | 21 | PDAC and CP differentiation | - | - | 89 | Norton et al., 2001 [62]
PET/CT | SVM | 80 | Pancreatic cancer segmentation | 95.23 | 97.51 | 96.47 | Li et al., 2018 [100]
detection of subtle textural and morphological changes in CT and MRI scans of the pan-
creas could also be facilitated through customized AI algorithms [117]. Several attempts
have also been reported to employ AI tools to predict the risk of developing pancreatic
cancer from biomarker measurements, as well as abdominal scans to discern pre-cancerous
abnormalities [117].
anatomical pathological laboratory and this, coupled with insufficient skilled pathologists,
leads to long turn-around-times [125]. Additionally, cytopathology requires accurate
slide preparation and optimal staining of the tissue slices. However, the staining intensity of
biopsy slides exhibits analyst-based, sample thickness-based, and laboratory protocol-based
variations [125]. In this context, deep learning algorithms, such as VGG,
DenseNet, ResNet etc., and machine learning algorithms, based on SVM and the random
forest, can be employed to extract specific tumour features from the tissue slices to improve
the speed of detection and reduce the burden on clinical pathologists. Similarly, algorithms
such as SA-GAN (stain acclimatization generative adversarial network) employ a generator
that imports the input source image and generates a target image incorporating the features
of the input sample and the colour intensity of a training sample.
Two discriminators are also incorporated into this deep learning model, which ensure that
the colour intensity of the desired training image and textural features of the source image
are maintained in the generated image, thus ensuring the stain colour normalization across
the different images [126]. Such approaches have been attempted, to identify various types
of gastrointestinal and breast cancer, using mammograms and tissue biopsies [127]. Using
a similar concept, a deep learning-based spiral algorithm was employed to transform 3D
MRI images of the pancreatic tissue into 2D images without compromising the original
image texture and edge parameters. The CNN-based models were employed for the feature
extraction and the bilinear pooling module was used to improve the prediction accuracy.
Parameters, such as size, shape, volume, texture, and intensity, were employed to classify
the image as pancreatic cancer with TP53 gene mutation or otherwise. The prediction
results agreed well with the actual mutation status. This approach overcomes the drawback
of the need for painful biopsies for classifying a tumour as TP53 positive. In addition,
this novel method offers a non-invasive approach for predicting gene mutations, using
AI-driven cytopathology that may also be extended for other forms of cancer or gene
mutations [128]. Similarly, ResNet and DenseNet models have been employed to identify
Helicobacter pylori, a key causative pathogen in different gastric cancers from stained tissue
biopsy specimens [129]. The advantage of using machine learning models in this case over
conventional cytopathology, is the ability of the model to identify even small numbers of the
bacteria, which is very tedious and time-consuming in the conventional mode. Abnormal
goblet cells have been identified with an 86% accuracy in tissue samples of individuals with
Barrett’s esophagus, using VGG algorithms [130]. AI-driven algorithms can be useful in
detecting microsatellite instabilities in the biopsy samples, that are a hallmark of many forms
of cancers [125]. These studies clearly demonstrate that the integration of machine learning
in cytopathology can be useful for the faster, efficient, and early diagnosis of pancreatic
cancer. This field is slowly gaining prominence and may soon lead to the establishment of
digital cytopathology as a mainstay in the detection and stratification of cancers.
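A hedged sketch of the general pattern described in this section, pairing a pretrained CNN feature extractor (here ResNet-18) with a random-forest classifier on stained-tissue patches, is given below; the patch tensors, labels, and model choices are illustrative assumptions, not the pipelines used in the cited studies.

```python
# Sketch: a pretrained CNN as a fixed feature extractor for stained-tissue
# patches, with a random forest making the final tumour/non-tumour call.
import torch
import torch.nn as nn
from torchvision import models
from sklearn.ensemble import RandomForestClassifier

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Identity()          # drop the ImageNet classification head
backbone.eval()

patches = torch.randn(40, 3, 224, 224)       # stand-in tissue patches
labels = torch.randint(0, 2, (40,)).numpy()  # 1 = tumour, 0 = non-tumour

with torch.no_grad():
    features = backbone(patches).numpy()      # 512-dim feature per patch

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```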
AI-based tools, that may result in disastrous consequences, is another facet that is being
debated as a negative aspect of all AI-driven cancer diagnoses.
Author Contributions: Conceptualization, U.M.K.; resources, B.S.H. and U.M.K.; data curation,
B.S.H.; writing—original draft preparation, B.S.H. and U.M.K.; writing—review and editing, B.S.H.
and U.M.K.; visualization, U.M.K.; supervision, U.M.K.; project administration, U.M.K.; funding
acquisition, U.M.K. All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by Nano Mission, Department of Science & Technology grant num-
ber (SR/NM/NS-1095/2012) and FIST, Department of Science & Technology (SR/FST/LSI-622/2014).
Acknowledgments: The authors gratefully acknowledge the financial support from Nano Mission,
Department of Science & Technology and FIST, Department of Science & Technology. The authors
also thank SASTRA Deemed University for financial and infrastructural support.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Rawla, P.; Sunkara, T.; Gaduputi, V. Epidemiology of Pancreatic Cancer: Global Trends, Etiology and Risk Factors. World J. Oncol.
2019, 10, 10–27. [CrossRef] [PubMed]
2. Hu, C.; Li, M. In Advanced Pancreatic Cancer: The Value and Significance of Interventional Therapy. J. Interv. Med. 2020, 3,
118–121. [CrossRef] [PubMed]
3. Gordon-Dseagu, V.L.; Devesa, S.S.; Goggins, M.; Stolzenberg-Solomon, R. Pancreatic Cancer Incidence Trends: Evidence from
the Surveillance, Epidemiology and End Results (SEER) Population-Based Data. Int. J. Epidemiol. 2018, 47, 427–439. [CrossRef]
[PubMed]
4. Maisonneuve, P.; Lowenfels, A.B. Epidemiology of Pancreatic Cancer: An Update. Dig. Dis. 2010, 28, 645–656. [CrossRef]
5. Kamisawa, T.; Wood, L.D.; Itoi, T.; Takaori, K. Pancreatic Cancer. Lancet 2016, 388, 73–85. [CrossRef]
6. Liu, S.-L.; Li, S.; Guo, Y.-T.; Zhou, Y.-P.; Zhang, Z.-D.; Li, S.; Lu, Y. Establishment and Application of an Artificial Intelligence
Diagnosis System for Pancreatic Cancer with a Faster Region-Based Convolutional Neural Network. Chin. Med. J. 2019, 132,
2795–2803. [CrossRef]
7. Kang, J.D.; Clarke, S.E.; Costa, A.F. Factors Associated with Missed and Misinterpreted Cases of Pancreatic Ductal Adenocarci-
noma. Eur. Radiol. 2021, 31, 2422–2432. [CrossRef]
8. Lee, D.; Yoon, S.N. Application of Artificial Intelligence-Based Technologies in the Healthcare Industry: Opportunities and
Challenges. Int. J. Environ. Res. Public Health 2021, 18, 271. [CrossRef]
9. González García, C.; Núñez-Valdez, E.; García-Díaz, V.; Pelayo G-Bustelo, C.; Cueva-Lovelle, J.M. A Review of Artificial
Intelligence in the Internet of Things. Int. J. Interact. Multimed. Artif. Intell. 2019, 5, 9–20. [CrossRef]
10. Cohen, S. The Evolution of Machine Learning: Past, Present, and Future. In Artificial Intelligence and Deep Learning in Pathology;
Elsevier: Amsterdam, The Netherlands, 2021; pp. 1–12. ISBN 978-0-323-67538-3.
11. Luchini, C.; Pea, A.; Scarpa, A. Artificial Intelligence in Oncology: Current Applications and Future Perspectives. Br. J. Cancer
2022, 126, 4–9. [CrossRef]
12. Induja, S.N.; Raji, C.G. Computational Methods for Predicting Chronic Disease in Healthcare Communities. In Proceedings of the
2019 International Conference on Data Science and Communication (IconDSC), Bangalore, India, 1–2 March 2019; pp. 1–6.
13. Kumar, U. Applications of Machine Learning in Disease Pre-screening. In Research Anthology on Artificial Intelligence Applications
in Security; Information Resources Management Association, Ed.; IGI Global: Hershey, PA, USA, 2020; pp. 1052–1084. ISBN
978-1-79987-705-9.
14. Noori, A.; Alfi, A.; Noori, G. An Intelligent Control Strategy for Cancer Cells Reduction in Patients with Chronic Myelogenous
Leukaemia Using the Reinforcement Learning and Considering Side Effects of the Drug. Expert Syst. 2021, 38, e12655. [CrossRef]
15. Zhao, Y.; Kosorok, M.R.; Zeng, D. Reinforcement Learning Design for Cancer Clinical Trials. Statist. Med. 2009, 28, 3294–3315.
[CrossRef] [PubMed]
16. Zhu, W.; Xie, L.; Han, J.; Guo, X. The Application of Deep Learning in Cancer Prognosis Prediction. Cancers 2020, 12, 603.
[CrossRef] [PubMed]
17. Vial, A.; Stirling, D.; Field, M.; Ros, M.; Ritz, C.; Carolan, M.; Holloway, L.; Miller, A.A. The Role of Deep Learning and Radiomic
Feature Extraction in Cancer-Specific Predictive Modelling: A Review. Transl. Cancer Res. 2018, 7, 803–816. [CrossRef]
18. Ghosh, P.; Azam, S.; Hasib, K.M.; Karim, A.; Jonkman, M.; Anwar, A. A Performance Based Study on Deep Learning Algorithms
in the Effective Prediction of Breast Cancer. In Proceedings of the 2021 International Joint Conference on Neural Networks
(IJCNN), Shenzhen, China, 18–22 July 2021; pp. 1–8.
19. Lin, C.-Y.; Chien, T.-W.; Chen, Y.-H.; Lee, Y.-L.; Su, S.-B. An App to Classify a 5-Year Survival in Patients with Breast Cancer Using
the Convolutional Neural Networks (CNN) in Microsoft Excel: Development and Usability Study. Medicine 2022, 101, e28697.
[CrossRef] [PubMed]
20. Bakasa, W.; Viriri, S. Pancreatic Cancer Survival Prediction: A Survey of the State-of-the-Art. Comput. Math. Methods Med. 2021,
2021, 1188414. [CrossRef] [PubMed]
21. Capobianco, E. High-Dimensional Role of AI and Machine Learning in Cancer Research. Br. J. Cancer 2022, 126, 523–532.
[CrossRef]
22. Hussain, L.; Saeed, S.; Awan, I.A.; Idris, A.; Nadeem, M.S.A.; Chaudhry, Q.-A. Detecting Brain Tumor Using Machines Learning
Techniques Based on Different Features Extracting Strategies. CMIR 2019, 15, 595–606. [CrossRef]
23. Gassenmaier, S.; Afat, S.; Nickel, D.; Mostapha, M.; Herrmann, J.; Othman, A.E. Deep Learning–Accelerated T2-Weighted Imaging
of the Prostate: Reduction of Acquisition Time and Improvement of Image Quality. Eur. J. Radiol. 2021, 137, 109600. [CrossRef]
24. Iqbal, M.J.; Javed, Z.; Sadia, H.; Qureshi, I.A.; Irshad, A.; Ahmed, R.; Malik, K.; Raza, S.; Abbas, A.; Pezzani, R.; et al. Clinical
Applications of Artificial Intelligence and Machine Learning in Cancer Diagnosis: Looking into the Future. Cancer Cell Int. 2021,
21, 270. [CrossRef]
25. Xu, J.; Jing, M.; Wang, S.; Yang, C.; Chen, X. A Review of Medical Image Detection for Cancers in Digestive System Based on
Artificial Intelligence. Expert Rev. Med. Devices 2019, 16, 877–889. [CrossRef] [PubMed]
26. Davatzikos, C.; Sotiras, A.; Fan, Y.; Habes, M.; Erus, G.; Rathore, S.; Bakas, S.; Chitalia, R.; Gastounioti, A.; Kontos, D. Precision
Diagnostics Based on Machine Learning-Derived Imaging Signatures. Magn. Reson. Imaging 2019, 64, 49–61. [CrossRef] [PubMed]
27. Chen, W.; Chen, Q.; Parker, R.A.; Zhou, Y.; Lustigova, E.; Wu, B.U. Risk Prediction of Pancreatic Cancer in Patients with Abnormal
Morphologic Findings Related to Chronic Pancreatitis: A Machine Learning Approach. Gastro Hep Adv. 2022, 1, 1014–1026.
[CrossRef]
28. Avuçlu, E.; Elen, A. Evaluation of Train and Test Performance of Machine Learning Algorithms and Parkinson Diagnosis with
Statistical Measurements. Med. Biol. Eng. Comput. 2020, 58, 2775–2788. [CrossRef]
29. Kilic, N.; Kursun, O.; Ucan, O.N. Classification of the Colonic Polyps in CT-Colonography Using Region Covariance as Descriptor
Features of Suspicious Regions. J. Med. Syst. 2010, 34, 101–105. [CrossRef]
30. Reddy, D.J.; Arun Prasath, T.; PallikondaRajasekaran, M.; Vishnuvarthanan, G. Brain and Pancreatic Tumor Classification Based
on GLCM—K-NN Approaches. In International Conference on Intelligent Computing and Applications; Bhaskar, M.A., Dash, S.S., Das,
S., Panigrahi, B.K., Eds.; Advances in Intelligent Systems and Computing; Springer: Singapore, 2019; Volume 846, pp. 293–302.
ISBN 9789811321818.
31. Jamshidi, M.; Zilouchian, A. Intelligent Control Systems Using Soft Computing Methodologies; CRC Press: Boca Raton, FL, USA, 2001;
ISBN 978-0-8493-1875-7.
32. Lee, J.-G.; Jun, S.; Cho, Y.-W.; Lee, H.; Kim, G.B.; Seo, J.B.; Kim, N. Deep Learning in Medical Imaging: General Overview. Korean
J. Radiol. 2017, 18, 570. [CrossRef]
33. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Representations by Back-Propagating Errors. Nature 1986, 323, 533–536.
[CrossRef]
34. Săftoiu, A.; Vilmann, P.; Gorunescu, F.; Janssen, J.; Hocke, M.; Larsen, M.; Iglesias–Garcia, J.; Arcidiacono, P.; Will, U.; Giovannini,
M.; et al. Efficacy of an Artificial Neural Network–Based Approach to Endoscopic Ultrasound Elastography in Diagnosis of Focal
Pancreatic Masses. Clin. Gastroenterol. Hepatol. 2012, 10, 84–90.e1. [CrossRef]
35. Tu, J.V. Advantages and Disadvantages of Using Artificial Neural Networks versus Logistic Regression for Predicting Medical
Outcomes. J. Clin. Epidemiol. 1996, 49, 1225–1231. [CrossRef]
36. Hakkoum, H.; Idri, A.; Abnane, I. Assessing and Comparing Interpretability Techniques for Artificial Neural Networks Breast
Cancer Classification. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2021, 9, 587–599. [CrossRef]
37. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297. [CrossRef]
38. Reeves, D.M.; Jacyna, G.M. Support Vector Machine Regularization. WIREs Comp. Stat. 2011, 3, 204–215. [CrossRef]
39. Huang, C.-H. A Reduced Support Vector Machine Approach for Interval Regression Analysis. Inf. Sci. 2012, 217, 56–64. [CrossRef]
40. Zhang, M.-M.; Yang, H.; Jin, Z.-D.; Yu, J.-G.; Cai, Z.-Y.; Li, Z.-S. Differential Diagnosis of Pancreatic Cancer from Normal Tissue
with Digital Imaging Processing and Pattern Recognition Based on a Support Vector Machine of EUS Images. Gastrointest. Endosc.
2010, 72, 978–985. [CrossRef] [PubMed]
41. Du, W.; Rao, N.; Liu, D.; Jiang, H.; Luo, C.; Li, Z.; Gan, T.; Zeng, B. Review on the Applications of Deep Learning in the Analysis
of Gastrointestinal Endoscopy Images. IEEE Access 2019, 7, 142053–142069. [CrossRef]
42. Kim, P. Convolutional Neural Network. In MATLAB Deep Learning; Apress: Berkeley, CA, USA, 2017; pp. 121–147. ISBN
978-1-4842-2844-9.
43. Liu, Y.H. Feature Extraction and Image Recognition with Convolutional Neural Networks. J. Phys. Conf. Ser. 2018, 1087, 062032.
[CrossRef]
44. Kumar, A.; Kim, J.; Lyndon, D.; Fulham, M.; Feng, D. An Ensemble of Fine-Tuned Convolutional Neural Networks for Medical
Image Classification. IEEE J. Biomed. Health Inform. 2017, 21, 31–40. [CrossRef]
45. Siddique, N.; Paheding, S.; Elkin, C.P.; Devabhaktuni, V. U-Net and Its Variants for Medical Image Segmentation: A Review of
Theory and Applications. IEEE Access 2021, 9, 82031–82057. [CrossRef]
46. Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-Based Learning Applied to Document Recognition. Proc. IEEE 1998, 86,
2278–2324. [CrossRef]
47. Özyurt, F. A Fused CNN Model for WBC Detection with MRMR Feature Selection and Extreme Learning Machine. Soft Comput.
2020, 24, 8163–8172. [CrossRef]
48. Sharma, H.; Zerbe, N.; Klempert, I.; Hellwich, O.; Hufnagl, P. Deep Convolutional Neural Networks for Automatic Classification
of Gastric Carcinoma Using Whole Slide Images in Digital Histopathology. Comput. Med. Imaging Graph. 2017, 61, 2–13. [CrossRef]
[PubMed]
49. Shin, Y.; Qadir, H.A.; Aabakken, L.; Bergsland, J.; Balasingham, I. Automatic Colon Polyp Detection Using Region Based Deep
CNN and Post Learning Approaches. IEEE Access 2018, 6, 40950–40962. [CrossRef]
50. Long, J.; Shelhamer, E.; Darrell, T. Fully Convolutional Networks for Semantic Segmentation. In Proceedings of the IEEE
Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015. [CrossRef]
51. Oda, M.; Shimizu, N.; Oda, H.; Hayashi, Y.; Kitasaka, T.; Fujiwara, M.; Misawa, K.; Mori, K.; Roth, H.R. Towards Dense Volumetric
Pancreas Segmentation in CT Using 3D Fully Convolutional Networks. In Proceedings of the Medical Imaging 2018: Image
Processing, Houston, TX, USA, 2 March 2018; p. 10.
52. Guo, Z.; Zhang, L.; Lu, L.; Bagheri, M.; Summers, R.M.; Sonka, M.; Yao, J. Deep LOGISMOS: Deep Learning Graph-Based 3D
Segmentation of Pancreatic Tumors on CT Scans. In Proceedings of the 2018 IEEE 15th International Symposium on Biomedical
Imaging (ISBI 2018), Washington, DC, USA, 4–7 April 2018; pp. 1230–1233.
53. Luthra, A.K.; Evans, J.A. Review of Current and Evolving Clinical Indications for Endoscopic Ultrasound. World J. Gastrointest.
Endosc. 2016, 8, 157. [CrossRef] [PubMed]
54. Gonzalo-Marin, J. Role of Endoscopic Ultrasound in the Diagnosis of Pancreatic Cancer. World J. Gastrointest. Oncol. 2014, 6, 360.
[CrossRef] [PubMed]
55. Munroe, C.A.; Fehmi, S.M.A.; Savides, T.J. Endoscopic Ultrasound in the Diagnosis of Pancreatic Cancer. Expert Opin. Med. Diagn.
2013, 7, 25–35. [CrossRef] [PubMed]
56. Bhutani, M.; Koduru, P.; Joshi, V.; Saxena, P.; Suzuki, R.; Irisawa, A.; Yamao, K. The Role of Endoscopic Ultrasound in Pancreatic
Cancer Screening. Endosc. Ultrasound 2016, 5, 8. [CrossRef] [PubMed]
57. DeWitt, J.; Devereaux, B.M.; Lehman, G.A.; Sherman, S.; Imperiale, T.F. Comparison of Endoscopic Ultrasound and Computed
Tomography for the Preoperative Evaluation of Pancreatic Cancer: A Systematic Review. Clin. Gastroenterol. Hepatol. 2006, 4,
717–725. [CrossRef]
58. Pausawasdi, N.; Hongsrisuwan, P.; Chalermwai, W.V.; Butt, A.S.; Maipang, K.; Charatchareonwitthaya, P. The Diagnostic
Performance of Combined Conventional Cytology with Smears and Cell Block Preparation Obtained from Endoscopic Ultrasound-
Guided Fine Needle Aspiration for Intra-Abdominal Mass Lesions. PLoS ONE 2022, 17, e0263982. [CrossRef]
59. Hayashi, H.; Uemura, N.; Matsumura, K.; Zhao, L.; Sato, H.; Shiraishi, Y.; Yamashita, Y.; Baba, H. Recent Advances in Artificial
Intelligence for Pancreatic Ductal Adenocarcinoma. World J. Gastroenterol. 2021, 27, 7480–7496. [CrossRef]
60. Herth, F.J.F.; Rabe, K.F.; Gasparini, S.; Annema, J.T. Transbronchial and Transoesophageal (Ultrasound-Guided) Needle Aspira-
tions for the Analysis of Mediastinal Lesions. Eur. Respir. J. 2006, 28, 1264–1275. [CrossRef]
61. Cazacu, I.; Udristoiu, A.; Gruionu, L.; Iacob, A.; Gruionu, G.; Saftoiu, A. Artificial Intelligence in Pancreatic Cancer: Toward
Precision Diagnosis. Endosc. Ultrasound 2019, 8, 357. [CrossRef] [PubMed]
62. Norton, I.D.; Zheng, Y.; Wiersema, M.S.; Greenleaf, J.; Clain, J.E.; DiMagno, E.P. Neural Network Analysis of EUS Images to
Differentiate between Pancreatic Malignancy and Pancreatitis. Gastrointest. Endosc. 2001, 54, 625–629. [CrossRef] [PubMed]
63. Zhu, M.; Xu, C.; Yu, J.; Wu, Y.; Li, C.; Zhang, M.; Jin, Z.; Li, Z. Differentiation of Pancreatic Cancer and Chronic Pancreatitis Using
Computer-Aided Diagnosis of Endoscopic Ultrasound (EUS) Images: A Diagnostic Test. PLoS ONE 2013, 8, e63820. [CrossRef]
64. Das, A.; Nguyen, C.C.; Li, F.; Li, B. Digital Image Analysis of EUS Images Accurately Differentiates Pancreatic Cancer from
Chronic Pancreatitis and Normal Tissue. Gastrointest. Endosc. 2008, 67, 861–867. [CrossRef] [PubMed]
65. Ozkan, M.; Cakiroglu, M.; Kocaman, O.; Kurt, M.; Yilmaz, B.; Can, G.; Korkmaz, U.; Dandil, E.; Eksi, Z. Age-Based Computer-
Aided Diagnosis Approach for Pancreatic Cancer on Endoscopic Ultrasound Images. Endosc. Ultrasound 2016, 5, 101. [CrossRef]
[PubMed]
66. Săftoiu, A.; Vilmann, P.; Gorunescu, F.; Gheonea, D.I.; Gorunescu, M.; Ciurea, T.; Popescu, G.L.; Iordache, A.; Hassan, H.;
Iordache, S. Neural Network Analysis of Dynamic Sequences of EUS Elastography Used for the Differential Diagnosis of Chronic
Pancreatitis and Pancreatic Cancer. Gastrointest. Endosc. 2008, 68, 1086–1094. [CrossRef]
67. Tonozuka, R.; Itoi, T.; Nagata, N.; Kojima, H.; Sofuni, A.; Tsuchiya, T.; Ishii, K.; Tanaka, R.; Nagakawa, Y.; Mukai, S. Deep Learning
Analysis for the Detection of Pancreatic Cancer on Endosonographic Images: A Pilot Study. J. Hepato-Biliary-Pancreat. Sci. 2021,
28, 95–104. [CrossRef]
68. Kuwahara, T.; Hara, K.; Mizuno, N.; Okuno, N.; Matsumoto, S.; Obata, M.; Kurita, Y.; Koda, H.; Toriyama, K.; Onishi, S.; et al.
Usefulness of Deep Learning Analysis for the Diagnosis of Malignancy in Intraductal Papillary Mucinous Neoplasms of the
Pancreas. Clin. Transl. Gastroenterol. 2019, 10, e00045. [CrossRef]
69. Dumitrescu, E.A.; Ungureanu, B.S.; Cazacu, I.M.; Florescu, L.M.; Streba, L.; Croitoru, V.M.; Sur, D.; Croitoru, A.; Turcu-Stiolica, A.;
Lungulescu, C.V. Diagnostic Value of Artificial Intelligence-Assisted Endoscopic Ultrasound for Pancreatic Cancer: A Systematic
Review and Meta-Analysis. Diagnostics 2022, 12, 309. [CrossRef]
70. Viard, A.; Eustache, F.; Segobin, S. History of Magnetic Resonance Imaging: A Trip Down Memory Lane. Neuroscience 2021, 474,
3–13. [CrossRef]
71. Mao, X.; Xu, J.; Cui, H. Functional Nanoparticles for Magnetic Resonance Imaging. WIREs Nanomed. Nanobiotechnol. 2016, 8,
814–841. [CrossRef] [PubMed]
72. Hanada, K.; Shimizu, A.; Kurihara, K.; Ikeda, M.; Yamamoto, T.; Okuda, Y.; Tazuma, S. Endoscopic Approach in the Diagnosis of
High-Grade Pancreatic Intraepithelial Neoplasia. Dig. Endosc. 2022, 34, 927–937. [CrossRef] [PubMed]
73. Enriquez, J.S.; Chu, Y.; Pudakalakatti, S.; Hsieh, K.L.; Salmon, D.; Dutta, P.; Millward, N.Z.; Lurie, E.; Millward, S.; McAllister,
F.; et al. Hyperpolarized Magnetic Resonance and Artificial Intelligence: Frontiers of Imaging in Pancreatic Cancer. JMIR Med.
Inform. 2021, 9, e26601. [CrossRef] [PubMed]
74. Kaissis, G.; Ziegelmayer, S.; Lohöfer, F.; Algül, H.; Eiber, M.; Weichert, W.; Schmid, R.; Friess, H.; Rummeny, E.; Ankerst, D.;
et al. A Machine Learning Model for the Prediction of Survival and Tumor Subtype in Pancreatic Ductal Adenocarcinoma from
Preoperative Diffusion-Weighted Imaging. Eur. Radiol. Exp. 2019, 3, 41. [CrossRef]
75. Gao, X.; Wang, X. Performance of Deep Learning for Differentiating Pancreatic Diseases on Contrast-Enhanced Magnetic
Resonance Imaging: A Preliminary Study. Diagn. Interv. Imaging 2020, 101, 91–100. [CrossRef]
76. Zhang, Y.; Wang, S.; Qu, S.; Zhang, H. Support Vector Machine Combined with Magnetic Resonance Imaging for Accurate
Diagnosis of Paediatric Pancreatic Cancer. IET Image Process. 2020, 14, 1233–1239. [CrossRef]
77. Balasubramanian, A.D.; Murugan, P.R.; Thiyagarajan, A.P. Analysis and Classification of Malignancy in Pancreatic Magnetic
Resonance Images Using Neural Network Techniques. Int. J. Imaging Syst. Technol. 2019, 29, 399–418. [CrossRef]
78. Corral, J.E.; Hussein, S.; Kandel, P.; Bolan, C.W.; Bagci, U.; Wallace, M.B. Deep Learning to Classify Intraductal Papillary Mucinous
Neoplasms Using Magnetic Resonance Imaging. Pancreas 2019, 48, 805–810. [CrossRef]
79. Chen, Y.; Ruan, D.; Xiao, J.; Wang, L.; Sun, B.; Saouaf, R.; Yang, W.; Li, D.; Fan, Z. Fully Automated Multiorgan Segmentation in
Abdominal Magnetic Resonance Imaging with Deep Neural Networks. Med. Phys. 2020, 47, 4971–4982. [CrossRef]
80. Brooks, S.L. Computed Tomography. Dent. Clin. N. Am. 1993, 37, 575–590. [CrossRef]
81. Raman, S.P.; Horton, K.M.; Fishman, E.K. Multimodality Imaging of Pancreatic Cancer—Computed Tomography, Magnetic
Resonance Imaging, and Positron Emission Tomography. Cancer J. 2012, 18, 511–522. [CrossRef] [PubMed]
82. Múnera, F.; Cohn, S.; Rivas, L.A. Penetrating Injuries of the Neck: Use of Helical Computed Tomographic Angiography. J. Trauma
Inj. Infect. Crit. Care 2005, 58, 413–418. [CrossRef] [PubMed]
83. Miller, T.T.; Sofka, C.M.; Zhang, P.; Khurana, J.S. Systematic Approach to Tumors and Tumor-Like Conditions of Soft Tissue. In
Diagnostic Imaging of Musculoskeletal Diseases; Bonakdarpour, A., Reinus, W.R., Khurana, J.S., Eds.; Humana Press: Totowa, NJ,
USA, 2009; pp. 313–349. ISBN 978-1-58829-947-5.
84. Willemink, M.J.; Persson, M.; Pourmorteza, A.; Pelc, N.J.; Fleischmann, D. Photon-Counting CT: Technical Principles and Clinical
Prospects. Radiology 2018, 289, 293–312. [CrossRef] [PubMed]
85. Litjens, G.; Kooi, T.; Bejnordi, B.E.; Setio, A.A.A.; Ciompi, F.; Ghafoorian, M.; van der Laak, J.A.W.M.; van Ginneken, B.; Sánchez,
C.I. A Survey on Deep Learning in Medical Image Analysis. Med. Image Anal. 2017, 42, 60–88. [CrossRef] [PubMed]
86. Razzak, M.I.; Naz, S.; Zaib, A. Deep Learning for Medical Image Processing: Overview, Challenges and the Future. In Classification
in BioApps; Dey, N., Ashour, A.S., Borra, S., Eds.; Lecture Notes in Computational Vision and Biomechanics; Springer International
Publishing: Cham, Switzerland, 2018; Volume 26, pp. 323–350. ISBN 978-3-319-65980-0.
87. Dhruv, B.; Mittal, N.; Modi, M. Early and Precise Detection of Pancreatic Tumor by Hybrid Approach with Edge Detection and
Artificial Intelligence Techniques. EAI Endorsed Trans. Pervasive Health Technol. 2021, 7, e1. [CrossRef]
88. Drewes, A.M.; van Veldhuisen, C.L.; Bellin, M.D.; Besselink, M.G.; Bouwense, S.A.; Olesen, S.S.; van Santvoort, H.; Vase, L.;
Windsor, J.A. Assessment of Pain Associated with Chronic Pancreatitis: An International Consensus Guideline. Pancreatology
2021, 21, 1256–1284. [CrossRef]
89. Ma, H.; Liu, Z.-X.; Zhang, J.-J.; Wu, F.-T.; Xu, C.-F.; Shen, Z.; Yu, C.-H.; Li, Y.-M. Construction of a Convolutional Neural
Network Classifier Developed by Computed Tomography Images for Pancreatic Cancer Diagnosis. World J. Gastroenterol. 2020,
26, 5156–5168. [CrossRef]
90. Zhang, Z.; Li, S.; Wang, Z.; Lu, Y. A Novel and Efficient Tumor Detection Framework for Pancreatic Cancer via CT Images. In
Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC),
Montreal, QC, Canada, 20–24 July 2020; pp. 1160–1164.
91. Chen, P.-T.; Wu, T.; Wang, P.; Chang, D.; Liu, K.-L.; Wu, M.-S.; Roth, H.R.; Lee, P.-C.; Liao, W.-C.; Wang, W. Pancreatic Cancer Detection on CT Scans with Deep Learning: A Nationwide Population-Based Study. Radiology 2022, 220152. [CrossRef]
92. Barat, M.; Chassagnon, G.; Dohan, A.; Gaujoux, S.; Coriat, R.; Hoeffel, C.; Cassinotto, C.; Soyer, P. Artificial Intelligence: A Critical
Review of Current Applications in Pancreatic Imaging. Jpn. J. Radiol. 2021, 39, 514–523. [CrossRef]
93. Sandland, J.; Malatesti, N.; Boyle, R. Porphyrins and Related Macrocycles: Combining Photosensitization with Radio- or
Optical-Imaging for next Generation Theranostic Agents. Photodiagnosis Photodyn. Ther. 2018, 23, 281–294. [CrossRef]
94. Kumar, K.; Ghosh, A. 18F-AlF Labeled Peptide and Protein Conjugates as Positron Emission Tomography Imaging Pharmaceuticals. Bioconjug. Chem. 2018, 29, 953–975. [CrossRef] [PubMed]
95. Jacobson, O.; Chen, X. PET Designated Flouride-18 Production and Chemistry. Curr. Top. Med. Chem. 2010, 10, 1048–1059.
[CrossRef] [PubMed]
96. Buchmann, I.; Ganten, T.; Haberkorn, U. [18F]-FDG-PET in the Diagnosis of Gastrointestinal Tumors. Z. Gastroenterol. 2008, 46, 367–375. [CrossRef] [PubMed]
97. Rosenbaum, S.J.; Lind, T.; Antoch, G.; Bockisch, A. False-Positive FDG PET Uptake—The Role of PET/CT. Eur. Radiol. 2006, 16,
1054–1065. [CrossRef] [PubMed]
98. Pakzad, F.; Groves, A.M.; Ell, P.J. The Role of Positron Emission Tomography in the Management of Pancreatic Cancer. Semin.
Nucl. Med. 2006, 36, 248–256. [CrossRef] [PubMed]
99. Rankin, S. [18F]2-Fluoro-2-Deoxy-D-Glucose PET/CT in Mediastinal Masses. Cancer Imaging 2010, 10, S156–S160. [CrossRef]
100. Li, S.; Jiang, H.; Wang, Z.; Zhang, G.; Yao, Y. An Effective Computer Aided Diagnosis Model for Pancreas Cancer on PET/CT
Images. Comput. Methods Programs Biomed. 2018, 165, 205–214. [CrossRef]
101. Toyama, Y.; Hotta, M.; Motoi, F.; Takanami, K.; Minamimoto, R.; Takase, K. Prognostic Value of FDG-PET Radiomics with Machine
Learning in Pancreatic Cancer. Sci. Rep. 2020, 10, 17024. [CrossRef]
102. Watson, M.D.; Lyman, W.B.; Passeri, M.J.; Murphy, K.J.; Sarantou, J.P.; Iannitti, D.A.; Martinie, J.B.; Vrochides, D.; Baker, E.H. Use
of Artificial Intelligence Deep Learning to Determine the Malignant Potential of Pancreatic Cystic Neoplasms with Preoperative
Computed Tomography Imaging. Am. Surg. 2021, 87, 602–607. [CrossRef]
103. Qureshi, T.A.; Gaddam, S.; Wachsman, A.M.; Wang, L.; Azab, L.; Asadpour, V.; Chen, W.; Xie, Y.; Wu, B.; Pandol, S.J.; et al.
Predicting Pancreatic Ductal Adenocarcinoma Using Artificial Intelligence Analysis of Pre-Diagnostic Computed Tomography
Images. Cancer Biomark. 2022, 33, 211–217. [CrossRef]
104. Lim, S.-H.; Kim, Y.J.; Park, Y.-H.; Kim, D.; Kim, K.G.; Lee, D.-H. Automated Pancreas Segmentation and Volumetry Using Deep
Neural Network on Computed Tomography. Sci. Rep. 2022, 12, 4075. [CrossRef] [PubMed]
105. Qiao, Z.; Ge, J.; He, W.; Xu, X.; He, J. Artificial Intelligence Algorithm-Based Computerized Tomography Image Features
Combined with Serum Tumor Markers for Diagnosis of Pancreatic Cancer. Comput. Math. Methods Med. 2022, 2022, 8979404.
[CrossRef] [PubMed]
106. Mu, W.; Liu, C.; Gao, F.; Qi, Y.; Lu, H.; Liu, Z.; Zhang, X.; Cai, X.; Ji, R.Y.; Hou, Y.; et al. Prediction of Clinically Relevant Pancreatico-
Enteric Anastomotic Fistulas after Pancreatoduodenectomy Using Deep Learning of Preoperative Computed Tomography.
Theranostics 2020, 10, 9779–9788. [CrossRef] [PubMed]
107. Keogan, M.T.; Lo, J.Y.; Freed, K.S.; Raptopoulos, V.; Blake, S.; Kamel, I.R.; Weisinger, K.; Rosen, M.P.; Nelson, R.C. Outcome
Analysis of Patients with Acute Pancreatitis by Using an Artificial Neural Network. Acad. Radiol. 2002, 9, 410–419. [CrossRef]
108. Qiu, W.; Duan, N.; Chen, X.; Ren, S.; Zhang, Y.; Wang, Z.; Chen, R. Pancreatic Ductal Adenocarcinoma: Machine Learning–Based
Quantitative Computed Tomography Texture Analysis for Prediction of Histopathological Grade. Cancer Manag. Res. 2019, 11,
9253–9264. [CrossRef] [PubMed]
109. Liu, K.-L.; Wu, T.; Chen, P.-T.; Tsai, Y.M.; Roth, H.; Wu, M.-S.; Liao, W.-C.; Wang, W. Deep Learning to Distinguish Pancreatic
Cancer Tissue from Non-Cancerous Pancreatic Tissue: A Retrospective Study with Cross-Racial External Validation. Lancet Digit.
Health 2020, 2, e303–e313. [CrossRef]
110. Chu, L.C.; Park, S.; Kawamoto, S.; Wang, Y.; Zhou, Y.; Shen, W.; Zhu, Z.; Xia, Y.; Xie, L.; Liu, F.; et al. Application of Deep Learning
to Pancreatic Cancer Detection: Lessons Learned from Our Initial Experience. J. Am. Coll. Radiol. 2019, 16, 1338–1342. [CrossRef]
111. Si, K.; Xue, Y.; Yu, X.; Zhu, X.; Li, Q.; Gong, W.; Liang, T.; Duan, S. Fully End-to-End Deep-Learning-Based Diagnosis of Pancreatic
Tumors. Theranostics 2021, 11, 1982–1990. [CrossRef]
112. Muhammad, W.; Hart, G.R.; Nartowt, B.; Farrell, J.J.; Johung, K.; Liang, Y.; Deng, J. Pancreatic Cancer Prediction Through an
Artificial Neural Network. Front. Artif. Intell. 2019, 2, 2. [CrossRef]
113. Liang, Y.; Schott, D.; Zhang, Y.; Wang, Z.; Nasief, H.; Paulson, E.; Hall, W.; Knechtges, P.; Erickson, B.; Li, X.A. Auto-Segmentation
of Pancreatic Tumor in Multi-Parametric MRI Using Deep Convolutional Neural Networks. Radiother. Oncol. 2020, 145, 193–200.
[CrossRef]
114. Aruna Devi, B.; Pallikonda Rajasekaran, M. Performance Evaluation of MRI Pancreas Image Classification Using Artificial Neural
Network (ANN). In Smart Intelligent Computing and Applications; Smart Innovation, Systems and Technologies; Satapathy, S.C.,
Bhateja, V., Das, S., Eds.; Springer: Singapore, 2019; Volume 104, pp. 671–681. ISBN 9789811319204.
115. Marya, N.B.; Powers, P.D.; Chari, S.T.; Gleeson, F.C.; Leggett, C.L.; Abu Dayyeh, B.K.; Chandrasekhara, V.; Iyer, P.G.; Majumder,
S.; Pearson, R.K.; et al. Utilisation of Artificial Intelligence for the Development of an EUS-Convolutional Neural Network Model
Trained to Enhance the Diagnosis of Autoimmune Pancreatitis. Gut 2021, 70, 1335–1344. [CrossRef] [PubMed]
116. Săftoiu, A.; Vilmann, P.; Dietrich, C.F.; Iglesias-Garcia, J.; Hocke, M.; Seicean, A.; Ignee, A.; Hassan, H.; Streba, C.T.; Ioncică,
A.M.; et al. Quantitative Contrast-Enhanced Harmonic EUS in Differential Diagnosis of Focal Pancreatic Masses (with Videos).
Gastrointest. Endosc. 2015, 82, 59–69. [CrossRef] [PubMed]
117. Qureshi, T.A.; Javed, S.; Sarmadi, T.; Pandol, S.J.; Li, D. Artificial Intelligence and Imaging for Risk Prediction of Pancreatic
Cancer: A Narrative Review. Chin. Clin. Oncol. 2022, 11, 1. [CrossRef] [PubMed]
118. Yu, Y.; Chen, S.; Wang, L.-S.; Chen, W.-L.; Guo, W.-J.; Yan, H.; Zhang, W.-H.; Peng, C.-H.; Zhang, S.-D.; Li, H.-W.; et al. Prediction of
Pancreatic Cancer by Serum Biomarkers Using Surface-Enhanced Laser Desorption/Ionization-Based Decision Tree Classification.
Oncology 2005, 68, 79–86. [CrossRef] [PubMed]
119. Brezgyte, G.; Shah, V.; Jach, D.; Crnogorac-Jurcevic, T. Non-Invasive Biomarkers for Earlier Detection of Pancreatic Cancer—A
Comprehensive Review. Cancers 2021, 13, 2722. [CrossRef] [PubMed]
120. Wang, Y.; Liu, K.; Ma, Q.; Tan, Y.; Du, W.; Lv, Y.; Tian, Y.; Wang, H. Pancreatic Cancer Biomarker Detection by Two Support Vector
Strategies for Recursive Feature Elimination. Biomark. Med. 2019, 13, 105–121. [CrossRef] [PubMed]
121. Wu, H.; Ou, S.; Zhang, H.; Huang, R.; Yu, S.; Zhao, M.; Tai, S. Advances in Biomarkers and Techniques for Pancreatic Cancer
Diagnosis. Cancer Cell Int. 2022, 22, 220. [CrossRef]
122. Yang, J.; Xu, R.; Wang, C.; Qiu, J.; Ren, B.; You, L. Early Screening and Diagnosis Strategies of Pancreatic Cancer: A Comprehensive
Review. Cancer Commun. 2021, 41, 1257–1274. [CrossRef]
123. Ko, J.; Bhagwat, N.; Yee, S.S.; Ortiz, N.; Sahmoud, A.; Black, T.; Aiello, N.M.; McKenzie, L.; O’Hara, M.; Redlinger, C.; et al.
Combining Machine Learning and Nanofluidic Technology to Diagnose Pancreatic Cancer Using Exosomes. ACS Nano 2017, 11,
11182–11193. [CrossRef]
124. Patel, H.Y.; Mukherjee, I. A Novel Neural Network to Predict Locally Advanced Pancreatic Cancer Using 4 Urinary Biomarkers:
REG1A/1B, LYVE1, and TFF1. J. Am. Coll. Surg. 2022, 235, S144–S145. [CrossRef]
125. Wong, A.N.N.; He, Z.; Leung, K.L.; To, C.C.K.; Wong, C.Y.; Wong, S.C.C.; Yoo, J.S.; Chan, C.K.R.; Chan, A.Z.; Lacambra, M.D.;
et al. Current Developments of Artificial Intelligence in Digital Pathology and Its Future Clinical Applications in Gastrointestinal
Cancers. Cancers 2022, 14, 3780. [CrossRef] [PubMed]
126. Kausar, T.; Kausar, A.; Ashraf, M.A.; Siddique, M.F.; Wang, M.; Sajid, M.; Siddique, M.Z.; Haq, A.U.; Riaz, I. SA-GAN: Stain
Acclimation Generative Adversarial Network for Histopathology Image Analysis. Appl. Sci. 2021, 12, 288. [CrossRef]
127. Hamidinekoo, A.; Denton, E.; Rampun, A.; Honnor, K.; Zwiggelaar, R. Deep Learning in Mammography and Breast Histology, an
Overview and Future Trends. Med. Image Anal. 2018, 47, 45–67. [CrossRef] [PubMed]
128. Chen, X.; Lin, X.; Shen, Q.; Qian, X. Combined Spiral Transformation and Model-Driven Multi-Modal Deep Learning Scheme for
Automatic Prediction of TP53 Mutation in Pancreatic Cancer. IEEE Trans. Med. Imaging 2021, 40, 735–747. [CrossRef] [PubMed]
129. Zhou, S.; Marklund, H.; Blaha, O.; Desai, M.; Martin, B.; Bingham, D.; Berry, G.J.; Gomulia, E.; Ng, A.Y.; Shen, J. Deep Learning
Assistance for the Histopathologic Diagnosis of Helicobacter Pylori. Intell. Based Med. 2020, 1–2, 100004. [CrossRef]
130. Gehrung, M.; Crispin-Ortuzar, M.; Berman, A.G.; O’Donovan, M.; Fitzgerald, R.C.; Markowetz, F. Triage-Driven Diagnosis of
Barrett’s Esophagus for Early Detection of Esophageal Adenocarcinoma Using Deep Learning. Nat. Med. 2021, 27, 833–841.
[CrossRef]
131. Carter, S.M.; Rogers, W.; Win, K.T.; Frazer, H.; Richards, B.; Houssami, N. The Ethical, Legal and Social Implications of Using
Artificial Intelligence Systems in Breast Cancer Care. Breast 2020, 49, 25–32. [CrossRef]
132. Shreve, J.T.; Khanani, S.A.; Haddad, T.C. Artificial Intelligence in Oncology: Current Capabilities, Future Opportunities, and
Ethical Considerations. Am. Soc. Clin. Oncol. Educ. Book 2022, 42, 842–851. [CrossRef]