Srujana Documentation
A DISSERTATION SUBMITTED TO
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY, KAKINADA
A Project Report submitted in partial fulfillment of the requirements for the award of the
Degree of
BACHELOR OF TECHNOLOGY
in
COMPUTER SCIENCE AND ENGINEERING
Submitted by
Y. SRUJANA 18NG1A0559
USHA RAMA COLLEGE OF ENGINEERING AND TECHNOLOGY
AUTONOMOUS
(Affiliated to JNTUK Kakinada, Approved by A.I.C.T.E., New Delhi)
TELAPROLU, UNGUTURU MANDAL, KRISHNA DISTRICT - 521109
2018-2022
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
USHA RAMA COLLEGE OF ENGINEERING AND TECHNOLOGY
(Affiliated to JNTU Kakinada, Approved by A.I.C.T.E., New Delhi)
TELAPROLU, UNGUTURU MANDAL, KRISHNA DISTRICT - 521109
2018-2022
CERTIFICATE
This is to certify that this project entitled “BRAIN TUMOR IMAGE SEGMENTATION
USING DEEP NETWORKS” is the bonafide work of Y. Srujana (18NG1A0559) and
submitted in partial fulfillment of the requirements for the award of the Degree in Bachelor of
Technology in Computer Science & Engineering, during the academic year 2018-22.
BY
Y. Srujana (18NG1A0559)
ACKNOWLEDGEMENT
We are pleased to acknowledge our sincere thanks to our Honorable Chairman SRI. S. RAMABRAHMAM for his guidance and advice and for providing sufficient resources.
We take this opportunity to express our gratitude to Dr. S M ROY CHOUDRI, Head of the Department, and to our guide Dr. K P N V SATYA SREE, Professor in Computer Science and Engineering, for their valuable support and motivation at every point in the successful completion of the project.
We also express our sincere gratitude to all the other teaching staff and lab technicians for their constant support and advice throughout the project.
Project Associate
Y. Srujana (18NG1A0559)
BRAIN TUMOR IMAGE SEGMENTATION
USING
DEEP NETWORKS
ABSTRACT
CONTENTS
1. INTRODUCTION
1.1. Literature Survey
1.1.1. Machine Learning
1.1.2. Features of Machine Learning
1.1.3. Existing System
1.1.4. Proposed System
2. AIM & SCOPE
2.1. Requirement Analysis
2.1.1. Functional Requirement Analysis
2.1.2. User Requirement Analysis
2.1.3. Non-Functional Requirement Analysis
2.2. Module Description
2.3. Feasibility Study
2.3.1. Technical Feasibility
2.3.2. Operational Feasibility
2.3.3. Behavioural Feasibility
CHAPTER - 1
1. INTRODUCTION
The cells in the body grow and divide in an orderly manner and form new cells. These new cells help to keep the human body healthy and working properly. When some cells lose their capability to regulate their growth, they grow without any order. The extra cells form a mass of tissue that is called a tumor. Tumors can be benign or malignant: malignant tumors lead to cancer, while benign tumors are not cancerous. An important factor in diagnosis is the medical image data obtained from various biomedical devices that use different imaging techniques such as X-ray, CT scan and MRI. Magnetic resonance imaging (MRI) is a technique that depends on the measurement of magnetic flux vectors generated after an appropriate excitation by strong magnetic fields and radio-frequency pulses in the nuclei of hydrogen atoms present in the water molecules of a patient's body. The MRI scan is better than the CT scan for diagnosis as it does not use any radiation. Radiologists can evaluate the brain using MRI, and the technique can determine the presence of tumors within the brain. However, MRI also contains noise caused by operator intervention, which can lead to inaccurate classification. The volume of MRI data to analyze is large; thus, automated systems are needed because they are less expensive. Automated detection of tumors in MR images is important, as high accuracy is required when handling human life. Supervised and unsupervised machine learning techniques can be employed for the classification of a brain MR image as either normal or abnormal. In this paper, an efficient automated classification technique for brain MRI is proposed using machine learning algorithms. A supervised machine learning algorithm is used for classification of the brain MR image.
Simonyan & Zisserman (2014) investigated the effect of convolutional network depth on accuracy in the large-scale image recognition setting. These findings formed the basis of their ImageNet Challenge 2014 submission, where their team secured first and second place in the localisation and classification tracks respectively. Their main contribution was a thorough evaluation of networks of increasing depth using an architecture with very small (3×3) convolution filters, which showed that a significant improvement over prior-art configurations can be achieved by pushing the depth to 16–19 weight layers after training smaller versions of VGG with fewer weight layers. Pan & Yang (2010) surveyed, categorized and reviewed the progress on transfer learning for classification, regression and clustering problems. In that survey, they discussed the relationship between transfer learning and other related machine learning techniques such as domain adaptation, multitask learning, sample selection bias and covariate shift, and they also explored potential future issues and reviewed several current trends in transfer learning research.
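To make the idea of depth with small filters concrete, the following is a minimal Keras sketch (not taken from the report; the block sizes and two-class output are illustrative assumptions) of stacking 3×3 convolutions in VGG-style blocks:

from tensorflow.keras import layers, models

def vgg_block(filters):
    # One VGG-style block: two small 3x3 convolutions followed by pooling.
    # Two stacked 3x3 layers cover the same receptive field as one 5x5 layer,
    # but with fewer parameters and an extra non-linearity.
    return [
        layers.Conv2D(filters, (3, 3), padding="same", activation="relu"),
        layers.Conv2D(filters, (3, 3), padding="same", activation="relu"),
        layers.MaxPooling2D((2, 2)),
    ]

model = models.Sequential(
    [layers.Input(shape=(224, 224, 3))]   # standard VGG input size
    + vgg_block(64)                        # depth grows block by block
    + vgg_block(128)
    + [layers.Flatten(), layers.Dense(2, activation="softmax")]
)
model.summary()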
Tom Mitchell defines machine learning as follows: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E." Machine learning is a combination of correlations and relationships; most machine learning algorithms in existence are concerned with finding and/or exploiting relationships between datasets. Once a machine learning algorithm can pinpoint certain correlations, the model can either use these relationships to predict future observations or generalize the data to reveal interesting patterns. In machine learning there are various types of algorithms, such as Regression, Linear Regression, Logistic Regression, the Naive Bayes classifier, Bayes' theorem, KNN (K-Nearest Neighbour classifier), Decision Trees, Entropy, ID3, SVM (Support Vector Machines), the K-means algorithm, Random Forest, and so on.
The name machine learning was coined in 1959 by Arthur Samuel. Machine
learning explores the study and construction of algorithms that can learn from and make predictions on data. Machine learning is closely related to (and often overlaps with)
computational statistics, which also focuses on prediction-making through the use of
computers. It has strong ties to mathematical optimization, which delivers methods, theory and
application domains to the field. Machine learning is sometimes conflated with data mining,
where the latter subfield focuses more on exploratory data analysis and is known as
unsupervised learning.
Within the field of data analytics, machine learning is a method used to devise
complex models and algorithms that lend themselves to prediction; in commercial use, this is
known as predictive analytics. These analytical models allow researchers, data scientists,
engineers, and analysts to "produce reliable, repeatable decisions and results" and uncover
"hidden insights" through learning from historical relationships and trends in the data.
Supervised learning: When an algorithm learns from example data and associated
target responses that can consist of numeric values or string labels, such as classes or tags, in
order to later predict the correct response when posed with new examples comes under the
category of Supervised learning. This approach is indeed similar to human learning under the
supervision of a teacher. The teacher provides good examples for the student to memorize,
and the student then derives general rules from these specific examples.
Unsupervised learning: When an algorithm learns from plain examples without any associated response, leaving the algorithm to determine the data patterns on its own, it comes under the category of unsupervised learning. This type of algorithm tends to restructure the data into something else, such as new features that may represent a class or a new series of uncorrelated values. Such algorithms are quite useful in providing humans with insights into the meaning of data and new useful inputs to supervised machine learning algorithms. As a kind of learning, it resembles the methods humans use to figure out that certain objects or events are from the same class, such as by observing the degree of similarity between objects. Some recommendation systems that you find on the web in the form of marketing automation are based on this type of learning.
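As a concrete illustration (a minimal scikit-learn sketch, not part of the report, with made-up points), a clustering algorithm such as K-means groups unlabeled data purely by similarity:

import numpy as np
from sklearn.cluster import KMeans

# Four unlabeled 2-D points; no target responses are provided.
X = np.array([[1.0, 2.0], [1.2, 1.9], [8.0, 8.1], [7.9, 8.3]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)   # two groups discovered from similarity alone, e.g. [0 0 1 1]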
Reinforcement learning: When you present the algorithm with examples that lack labels, as in unsupervised learning, but accompany each example with positive or negative feedback according to the solution the algorithm proposes, it comes under the category of reinforcement learning. It is connected to applications for which the algorithm must make decisions (so the product is prescriptive, not just descriptive, as in unsupervised learning), and the decisions bear consequences. In the human world, it is just like learning by trial and error. Errors help you learn because they have a penalty attached (cost, loss of time, regret, pain, and so on), teaching you that a certain course of action is less likely to succeed than others.
The majority of practical machine learning uses supervised learning. Supervised learning is where you have input variables (X) and an output variable (Y) and you use an algorithm to learn the mapping function from the input to the output:
Y = f(X)
The goal is to approximate the mapping function so well that when you have new input data (X), you can predict the output variable (Y) for that data. It is called supervised learning because the process of an algorithm learning from the training dataset can be thought of as a teacher supervising the learning process. We know the correct answers; the algorithm iteratively makes predictions on the training data and is corrected by the teacher. Learning stops when the algorithm achieves an acceptable level of performance.
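A minimal scikit-learn sketch (illustrative only; the toy data and model choice are assumptions, not the report's method) of learning the mapping Y = f(X) from labelled examples:

from sklearn.linear_model import LogisticRegression

X_train = [[0.1], [0.35], [0.4], [0.8], [0.9]]   # input variables (X)
y_train = [0, 0, 0, 1, 1]                        # known output labels (Y)

model = LogisticRegression().fit(X_train, y_train)  # approximate f from examples
print(model.predict([[0.85]]))                       # predict Y for a new X, e.g. [1]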
Classification:
Unlike regression, the output variable of classification is a category, not a continuous value, such as "green or blue" or "fruit or animal". Since the classification algorithm is a supervised learning technique, it takes labelled input data, which means the input comes with the corresponding output. In a classification algorithm, a discrete output function (y) is mapped to an input variable (x). The main goal of a classification algorithm is to identify the category of a given dataset, and these algorithms are mainly used to predict the output for categorical data.
Association rule mining and classification are two major techniques of data mining. Association rule mining is an unsupervised learning method for discovering interesting patterns and their associations in large databases.
Classification is a supervised learning method used to find class labels for unknown samples. Classification is the task of assigning an object to one of several predefined categories. It is a pervasive problem that encompasses many applications.
Classification is defined as the task of learning a target function F that maps each attribute set A to one of the predefined class labels C. The target function is also known as the classification model, and it can serve two purposes:
1) Descriptive modeling.
2) Predictive modeling.
Classification algorithms in machine learning use input training data to predict the
likelihood that subsequent data will fall into one of the predetermined categories. One of the
most common uses of classification is filtering emails into “spam” or “non-spam.”
Lazy Learners: A lazy learner first stores the training dataset and waits until it receives the test dataset. In the lazy learner's case, classification is done on the basis of the most closely related data in the stored training dataset, so it takes less time in training but more time in prediction. Example: K-Nearest Neighbours.
Eager Learners: Eager learners develop a classification model from a training dataset before receiving a test dataset. Opposite to lazy learners, an eager learner takes more time in learning and less time in prediction. Examples: Decision Trees, Naïve Bayes, ANN.
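The contrast can be sketched with scikit-learn (illustrative only, with toy data): the lazy k-NN learner simply stores the data at fit time, while the eager decision tree builds its model up front.

from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X = [[0], [1], [2], [3]]
y = [0, 0, 1, 1]

lazy = KNeighborsClassifier(n_neighbors=3).fit(X, y)   # fit() only stores the data
eager = DecisionTreeClassifier().fit(X, y)             # fit() builds the tree eagerly

print(lazy.predict([[1.6]]), eager.predict([[1.6]]))   # both decide at query time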
Classification model: A classification model tries to draw some conclusion from the input
values given for training. It will predict the class labels/categories for the new data.
Binary classification: A classification task with two possible outcomes. E.g., gender classification (male / female).
Multi-class classification: Classification with more than two classes. In multi class
classification each sample is assigned to one and only one target label. E.g., An animal can be
cat or dog but not both at the same time.
Multi-label classification: Classification task where each sample is mapped to a set of target
labels (more than one class). E.g., A news article can be about sports, a person, and location
at the same time.
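A short scikit-learn sketch (toy data, purely illustrative) showing how the three task types differ in the shape of the target labels:

from sklearn.tree import DecisionTreeClassifier
from sklearn.multioutput import MultiOutputClassifier

X = [[0, 1], [1, 1], [2, 0], [3, 0]]

# Binary classification: one label, two possible classes.
DecisionTreeClassifier().fit(X, [0, 0, 1, 1])

# Multi-class classification: one label, more than two classes.
DecisionTreeClassifier().fit(X, [0, 1, 2, 2])

# Multi-label classification: each sample has a set of labels,
# encoded here as one 0/1 indicator column per label.
MultiOutputClassifier(DecisionTreeClassifier()).fit(X, [[1, 0], [1, 1], [0, 1], [0, 1]])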
• Machine learning models involve machines learning from data without the help of humans or any kind of human intervention.
• Machine learning is the science of making computers learn and act like humans by feeding them data and information without being explicitly programmed.
• Machine learning is quite different from traditional programming: here the data and the output are given to the computer, and in return it gives us the program which provides solutions to various problems, as shown in the figure below.
• There are many algorithms in machine learning through which the system can provide an exact solution for predicting the disease of a patient.
• Machine learning works by taking in data, finding relationships within that data and then giving the output.
Some real-world applications of machine learning include:
Traffic Alerts
Social Media
Transportation and Commuting
Products Recommendations
Virtual Personal Assistants
Self Driving Cars
Dynamic Pricing
Google Translate
Online Video Streaming
As per the literature survey, it was found that automation of brain tumor detection is essential, as high accuracy is needed when human life is involved. Automated detection of tumors in MR images involves feature extraction and classification using a machine learning algorithm. In this paper, a system to automatically detect a tumor in MR images is proposed, as shown in the figure.
2. Accurate results.
CHAPTER-2
AIM & SCOPE
The process of gathering the software requirements from clients and analyzing and documenting them is known as requirements engineering or requirements analysis. The goal of requirements engineering is to develop and maintain a sophisticated and descriptive 'System/Software Requirements Specification' document. It is generally a four-step process, which includes:
• Feasibility Study
• Requirements Gathering
• Python installed
• Research Papers
• Datasets
• Accuracy calculation
User Requirements Analysis is the process of determining user expectations for a new
or modified product. These features must be quantifiable, relevant and detailed. The main
user requirements of our project are as follows:
• RAM 8 or 16 GB
• Memory 1GB
Performance:
Response Time: Response time is the time a system or functional unit takes to react
to a given input.
The following modules are required for the system to work effectively. They are:
• Physical Data Acquisition: Acquiring the physical image of any device means
extracting an exact bit-by-bit copy of the original device's flash memory. In contrast
to logical acquisition, physically acquired images hold unallocated space, files, and
the volume stack, in addition to the extraction of data remnants present in the
memory.
• Data Segmentation: Segmenting the network or data into subnets enables network administrators to control the flow of traffic between subnets based on granular policies.
• Data Post Processing: Post processing procedures usually include various pruning
routines, rule quality processing, rule filtering, rule combination, model
combination, or even knowledge integration. All these procedures provide a kind of
symbolic filter for noisy, imprecise, or non-user-friendly knowledge derived by an
inductive algorithm.
Feasibility Study is a high level capsule version of the entire process intended to
answer a number of questions like: What is the problem? Is there any feasible solution to the
given problem? Is the problem even worth solving? Feasibility study is conducted once the
problem is clearly understood. Feasibility study is necessary to determine that the proposed
system is Feasible by considering the technical, Operational, and Economical factors. By
having a detailed feasibility study the management will have a clear-cut view of the
proposed system. A well designed feasibility study should provide a historical background
of the business or project, the operations and management, marketing research and policies,
financial data, legal requirements and tax obligations. The following feasibilities are considered for the project in order to ensure that the project is viable and does not have any major obstructions. The feasibility study encompasses the following:
In this step, we verify whether the proposed system is technically feasible or not, i.e., whether all the technologies required to develop the system are readily available.
Technical Feasibility determines whether the organization has the technology and skills
necessary to carry out the project and how this should be obtained. The system can be
feasible because of the following grounds.
All necessary technology exists to develop the system
This system is flexible and it can be expanded further
This system can give guarantee of accuracy, ease of use, and reliability
Our project is technically feasible because all the technology needed for our project is readily available.
The clients have been involved in the planning and development of the system.
The proposed system will not cause any problem under any circumstances.
If a system already exists and modification or addition of a new module is needed, analysis of the present system can be used as a basic model. The design starts after the requirement analysis is complete, and the coding begins after the design is complete. Once the programming is completed, the testing is done.
The output of each phase is to be consistent with the overall requirements of the system. Some qualities of the spiral model are also incorporated, such as a review by the people concerned with the project after the completion of each phase of the work.
The WATERFALL MODEL was chosen because all requirements were known beforehand and the objective of our software development is the computerization/automation of an already existing manual working system.
SCALABILITY:
PORTABILITY:
VALIDATION:
It is the process of checking that a software system meets specifications and that it fulfills its intended purpose. It may also be referred to as software quality control. It is normally the responsibility of software testers as part of the software development lifecycle. Software validation checks that the software product satisfies or fits the intended use (high-level checking), i.e., the software meets the user requirements, not as specification artefacts or as the needs of those who will operate the software only, but as the needs of all the stakeholders.
CHAPTER - 3
3. DESIGN PHASE
Design is a multi-step process that focuses on data structure, software architecture, procedural details and the interfaces between modules. The design process also translates the requirements into a representation of the software that can be assessed for quality before coding begins.
Computer software design changes continuously as new methods, better analysis and broader understanding evolve. Software design is at a relatively early stage in its evolution. Therefore, software design methodology lacks the depth, flexibility and quantitative nature that are normally associated with more classical engineering disciplines. However, techniques for software design do exist, criteria for design quality are available and design notation can be applied.
The purpose of the design phase is to plan a solution of the problem specified by the
requirements document. The design of a system is perhaps the most critical factor affecting
the quality of the software. It has a major impact on the project during later phases,
particularly during testing and maintenance.
Software design sits at the technical kernel of the software engineering process and
is applied regardless of the development paradigm and area of application. Design is the
first step in the development phase for any engineered product or system. The designer’s
goal is to produce a model or representation of an entity that will later be built. Beginning once the system requirements have been specified and analyzed, system design is the first of the three technical activities (design, code and test) required to build and verify the software.
The importance of design can be stated with a single word: "quality". Design is the place where quality is fostered in software development. Design provides a representation of software that can be assessed for quality. Design is the only way by which we can accurately translate a customer's view into a finished software product or system. Software design serves as the foundation for all the software engineering steps that follow. Without a strong design we risk building an unstable system, one that will be difficult to test and whose quality cannot be assessed until the last stage.
Abstraction:
The lower level of abstraction provides a more detailed description of the solution. A
sequence of instruction that contains a specific and limited function refers to a procedural
abstraction. A collection of data that describes a data object is a data abstraction.
Architecture:
The complete structure of the software is known as software architecture. Structure
provides conceptual integrity for a system in a number of ways. The architecture is the
structure of program modules where they interact with each other in a specialized way. The
aim of the software design is to obtain an architectural framework of a system.
Patterns:
In software engineering, a design pattern is a general repeatable solution to a
commonly occurring problem in software design. A design pattern isn't a finished design
that can be transformed directly into code. It is a description or template for how to solve a
problem that can be used in many different situations and problems. A design pattern describes a design structure, and that structure solves a particular design problem in a specified context.
Modularity:
Information hiding:
Modules must be specified and designed so that the information like algorithm and data
presented in a module is not accessible for other modules not requiring that
information.
Functional independence:
Refactoring:
Refactoring is the process of changing the software system in a way that it does not
change the external behavior of the code and still improves its internal structure.
Design classes:
The model of software is defined as a set of design classes. Every class describes the
elements of the problem domain and that focus on features of the problem which are
user visible.
Design Constraints are generally the limitations on a design. They include imposed
limitations that you don't control and limitations that are self-imposed as a way to improve a
design. The following are common types of design constraints:
Commercial Constraints:
Basic commercial constraints such as time and budget come under commercial constraints
Requirements:
Requirements specify the basic needs of a project. Ex: Functional requirements.
Non-Functional Requirements:
Non-Functional requirements are the requirements that specify intangible elements
of a design.
Compliance:
Compliance refers to applicable laws, regulations and standards.
Style:
Usability:
Usability principles imply frameworks and standards. Ex: The principle of least
astonishment.
Principles:
Integration:
A design that needs to work with other things such as products, services,
systems, processes, controls, partners and information.
Conceptual Design is an early phase of the design process, in which the broad
outlines of function and form of something are articulated. It includes the design of
interactions, experiences, processes and strategies. It involves an understanding of people's
needs - and how to meet them with products, services, & processes. Common artifacts of
conceptual design are concept sketches and models.
The unified modeling language allows the software engineer to express an analysis
model using the modeling notation that is governed by a set of syntactic, semantic and
pragmatic rules.
A UML system is represented using five different views that describe the system
from a distinctly different perspective. Each view can be defined by a set of diagrams. UML
is specifically constructed through two different domains. They are:
UML analysis modeling, this focuses on the user model and structural model views
of the system.
UML design modeling, which focuses on the behavioral modeling, implementation
modeling and environment model views.
Use case diagram at its simplest is a representation of a user's interaction with the
system that shows the relationship between the user and the different use cases in which the
user is involved. A use case diagram can identify the different types of users of a system and
the different use cases and will often be accompanied by other types of diagrams as well.
Actors are the external entities that interact with the system. The use cases are represented
by either circles or ellipses.
The use case diagram of the project, based on machine learning, consists of all the aspects a normal use case diagram requires. This use case diagram shows how the model flows from one step to another: the user enters the system, enters all the general information along with the symptoms, the system compares this input with the prediction model and, if it matches, predicts the appropriate results; otherwise it shows where the user went wrong while entering the information, and it also shows the appropriate precautionary measures for the user to follow. Here the use cases of all the entities are linked to each other, and the user gets started with the system.
An activity diagram is another important diagram in UML for describing the dynamic aspects of the system. An activity diagram is basically a flowchart that represents the flow from one activity to another; an activity can be described as an operation of the system, and the control flow is drawn from one operation to another. Here the activity starts from the user, who registers into the system and then logs in using the credentials; the credentials are matched in the system and, if they are correct, the user proceeds to the prediction phase where the prediction happens. Finally, after processing the data from the datasets, the analysis takes place and the correct result, which is the output, is displayed.
The class diagram of the project, based on deep learning, contains the basic elements that any application's class diagram requires; here the class diagram is the basic entity needed in order to carry on with the project. The class diagram holds information about all the classes that are used, all the related datasets, and all the other necessary attributes and their relationships with other entities. All this information is necessary in order to use the prediction concept, where the user enters the necessary information such as username, email, phone number and other attributes required to log into the system; using the file concept, the system stores the information of the users who register and retrieves that information later when they log into the system.
The sequence diagram of the project, based on deep learning, consists of all the aspects a normal sequence diagram requires. This sequence diagram shows how the model flows from one step to another: the user enters the system, enters all the general information along with the symptoms, the system compares this input with the prediction model and, if it matches, predicts the appropriate results; otherwise it shows where the user went wrong while entering the information, and it also shows the appropriate precautionary measures for the user to follow. Here the sequence of all the entities is linked to each other, and the user gets started with the system.
CHAPTER - 4
4. IMPLEMENTATION
4.1.1. ANACONDA
Anaconda Individual Edition contains conda and Anaconda Navigator, as well as Python
and hundreds of scientific packages. When you installed Anaconda, you installed all these
too.
Conda works on your command line interface such as Anaconda Prompt on Windows and
terminal on macOS and Linux.
Navigator is a desktop graphical user interface that allows you to launch applications
and easily manage conda packages, environments, and channels without using
command-line commands.
You can try both conda and Navigator to see which is right for you to manage your
packages and environments. You can even switch between them, and the work you do with
one can be viewed in the other.
ANACONDA NAVIGATOR
To get Navigator, get the Navigator Cheat Sheet and install Anaconda. The Getting started
with Navigator section shows how to start Navigator from the shortcuts or from a terminal
window.
In order to run, many scientific packages depend on specific versions of other packages. Data
scientists often use multiple versions of many packages and use multiple environments to
separate these different versions.
The command-line program conda is both a package manager and an environment manager.
This helps data scientists ensure that each version of each package has all the dependencies it
requires and works correctly.
Navigator is an easy, point-and-click way to work with packages and environments without
needing to type conda commands in a terminal window. You can use it to find the packages
you want, install them in an environment, run the packages, and update them – all inside
Navigator.
The following applications are available by default in Navigator:
JupyterLab
Jupyter Notebook
Spyder
PyCharm
VSCode
Glueviz
Orange 3 App
RStudio
Anaconda Prompt (Windows only)
Anaconda PowerShell (Windows only)
Advanced conda users can also build their own Navigator applications.
ANACONDA PROMPT
Anaconda Prompt is a command line shell (a program where you type in commands instead
of using a mouse). The black screen and text that makes up the Anaconda Prompt doesn't
look like much, but it is really helpful for problem solvers using Python.
If you prefer using a command line interface (CLI), you can use conda to verify the
installation using Anaconda Prompt on Windows or terminal on Linux and macOS.
• Windows: Click Start, search, or select Anaconda Prompt from the menu.
Sublime Text is a shareware cross-platform source code editor with a Python application
programming interface (API).
It natively supports many programming languages and markup languages, and functions can
be added by users with plugins, typically community-built and maintained under free-
software licenses.
import os
from flask import Flask, render_template, request
from predictor import check

APP_ROOT = os.path.dirname(os.path.abspath(__file__))
app = Flask(__name__)

@app.route('/')
@app.route('/index')
def index():
    return render_template('upload.html')

# Reconstructed upload handler (route, folder and template names are assumptions;
# the report shows only fragments of this function).
@app.route('/upload', methods=['POST'])
def upload():
    target = os.path.join(APP_ROOT, 'images')
    if not os.path.isdir(target):
        os.mkdir(target)
    uploaded = request.files['file']
    filename = os.path.join(target, uploaded.filename)
    uploaded.save(filename)
    status = check(filename)
    return render_template('upload.html', status=status)

if __name__ == "__main__":
    app.run(port=4555, debug=True)
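Assuming the reconstructed routes above match the original application, running this file starts the Flask development server on port 4555, and opening http://localhost:4555/ in a browser shows the upload page rendered from upload.html.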
import numpy as np
from tensorflow.keras.preprocessing import image
from tensorflow.keras.models import load_model

# Load the trained VGG-based classifier once, when the module is imported.
saved_model = load_model("model/VGG_model.h5")

def check(input_img):
    print("your image is: " + input_img)
    # Load the MR image, resize it to the network input size (224x224 assumed,
    # matching the standard VGG input) and add a batch dimension.
    img = image.load_img(input_img, target_size=(224, 224))
    img = np.expand_dims(image.img_to_array(img), axis=0)
    output = saved_model.predict(img)
    print(output)
    # The first output unit is interpreted as the tumor / no-tumor decision.
    status = bool(output[0][0] == 1)
    print(status)
    return status
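The report does not show how model/VGG_model.h5 was produced. The following is only a hedged sketch, in line with the VGG and transfer-learning papers discussed in the literature survey, of how such a model could be trained; the data/train folder layout, image size and hyperparameters are assumptions:

from tensorflow.keras.applications import VGG16
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras import layers, models

# Re-use VGG16 convolutional features pre-trained on ImageNet (transfer learning).
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(2, activation="softmax"),   # tumor / no-tumor
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Assumed folder layout: data/train/<class name>/<images>.
train = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train", target_size=(224, 224), batch_size=16, class_mode="categorical")
model.fit(train, epochs=5)
model.save("model/VGG_model.h5")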
A deployment diagram shows the configuration of run-time processing nodes and the components that live on them. Deployment diagrams are a kind of structure diagram used in modeling the physical aspects of an object-oriented system. Here the deployment diagram shows the final stage of the project and how the model looks after completing all the processes and deploying it on the machine: starting from how the system processes the user-entered information, then comparing that information with the help of the datasets, then training and testing those data using algorithms such as decision tree, naïve Bayes and random forest. Finally, after processing all the data and information, the system gives the desired result in the interface.
CHAPTER -5
SCREEN SHOTS
In the above screenshot, select the address bar and type cmd in it. On typing cmd and pressing the "Enter" key, it will take us to the command prompt.
Later on, in the above screen, select "Browse files" to give the desired input of the user's choice in order to check the tumor status.
The above screenshot shows the set of sample images. Choose any one of them.
Now, after selecting the 0th image as the filename, click on "check tumor status". It will redirect you to the brain tumor website, which is shown below.
In the above screen we can see that the system reports that the input image does not have any brain tumor.
In the above screen we can see that the system reports that a brain tumor was detected for the selected input image.
CHAPTER -6
6. TESTING
Testing is a process of executing a program with the intent of finding an error. A good test case is one that has a high probability of finding an as-yet-undiscovered error. System testing is the stage of implementation aimed at ensuring that the system works accurately and efficiently as expected before live operation commences. It verifies that the whole set of programs hangs together. System testing requires a test plan that consists of several key activities and steps for running the program, string and system tests, and it is important in adopting a successful new system.
TYPES OF TESTING
Unit testing involves the design of test cases that validate that the internal program logic is
functioning properly, and that program inputs produce valid outputs. All decision branches
and internal code flow should be validated. It is the testing of individual software units of the
application. It is done after the completion of an individual unit before integration. This is a
structural testing, that relies on knowledge of its construction and is invasive. Unit tests
perform basic tests at component level and test a specific business process, application,
and/or system configuration.
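As an illustration of unit testing at component level, a pytest sketch (not part of the report; it assumes the Flask code from the implementation chapter is saved as app.py and exposes the Flask object app) could exercise the index route in isolation:

from app import app   # assumed module name for the Flask application

def test_index_returns_upload_page():
    client = app.test_client()          # Flask's built-in test client
    response = client.get('/')
    assert response.status_code == 200  # the upload page renders successfully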
Integration tests are designed to test integrated software components to determine if they
actually run as one program. Testing is event driven and is more concerned with the basic
outcome of screens or fields. Integration tests demonstrate that although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.
Verification and validation help ensure that the product meets its requirements and is delivered on time and within budget. Too often, product design and performance problems are not detected until late in the product development cycle, when the product is ready to be shipped. The old adage holds true: it costs a penny to make a change in engineering, a dime in production and a dollar after a product is in the field.
Verification is a Quality control process that is used to evaluate whether or not a product,
service, or system complies with regulations, specifications, or conditions imposed at the start of
a development phase. Verification can be in development, scale-up, or production. This is often
an internal process.
System testing falls within the scope of black box testing and, as such, should require no knowledge of the inner design of the code or logic.
As a rule, system testing takes, as its input, all of the "integrated" software components that
have successfully passed integration testing and also the software system itself integrated
with any applicable hardware system. System testing is a more limited type of testing; it seeks to
detect defects both within the "inter-assemblages" and also within the system as a whole.
System testing is performed on the entire system in the context of a Functional Requirement
Specification (FRS) or System Requirement Specification (SRS).
Sample test output: Tumor not detected
CHAPTER -7
7. CONCLUSION
So, finally, I conclude by saying that in this project medical imaging is gaining importance with an increase in the demand for automated, reliable, fast and efficient diagnosis, which can provide insight into the image better than human eyes. The brain tumor is the second leading cause of cancer-related deaths in men aged 20 to 39 and the leading cause of cancer among women in the same age group. Brain tumors are painful and may result in various diseases if not cured properly. The diagnosis of the tumor is a very important part of its treatment, and identification plays an important part in the diagnosis of benign and malignant tumors. A prime reason behind the rise in the number of cancer patients worldwide is ignorance of the treatment of a tumor in its early stages. This paper discusses a machine learning approach that can brief the user about the details of the tumor using brain MRI. These methods include noise removal and sharpening of the image along with basic morphological functions, erosion and dilation, to obtain the background. Subtraction of the background and its negative from different sets of images results in the extracted image. Plotting the contour and c-label of the tumor and its boundary provides information related to the tumor that can help in better visualization for diagnosing cases. This process helps in identifying the size, shape and position of the tumor. It helps the medical staff as well as the patient to understand the seriousness of the tumor with the help of different colour labelling for different levels of elevation. A GUI for the contour of the tumor and its boundary can provide information to the medical staff at the click of user-choice buttons.
Keywords: classification, convolutional neural network, feature extraction, machine learning, magnetic resonance imaging, segmentation, texture features.
CHAPTER -8
8. FUTURE ENHANCEMENT
In this proposed work, different medical images such as MRI brain cancer images are taken for detecting tumors. The proposed approach for brain tumor detection is based on a convolutional neural network combined with a multi-layer perceptron neural network. The proposed approach utilizes a mixture of these neural network techniques and consists of several steps, including training the system, pre-processing, implementation with TensorFlow, and classification. In the future, we will take a larger database and try to achieve higher accuracy so that the approach can work on any sort of MRI brain tumor.
CHAPTER-9
9. BIBLIOGRAPHY