Online Banking Project
 INTRODUCTION
The project entitled "Online Banking" is a computerized telecommunications device that provides the customers of a financial institution with access to financial transactions in a public space without the need for a human clerk or bank teller. On most modern ATMs, the customer is identified by inserting a plastic ATM card with a magnetic stripe or a plastic smartcard with a chip that contains a unique card number and some security information, such as an expiration date or card verification code (CVC/CVV). Security is provided by the customer entering a personal identification number (PIN).
Using an ATM, customers can access their bank accounts to make cash withdrawals (or credit card cash advances), check their account balances, and purchase prepaid mobile phone credit. ATMs are known by various other names, including automated banking machine, money machine, bank machine, cash machine, hole-in-the-wall, cashpoint, Bancomat (in various countries in Europe and Russia), Multibanco (after a registered trademark, in Portugal), and Any Time Money (in India).
 SYNOPSIS
 "Online Banking " is a computerized telecommunications device that provides the customers
 of a financial institution with access to financial transactions in a public space without the
 need for a human clerk or bank teller. On most modern ATMs, the customer is identified by
 inserting a plastic ATM card with a magnetic stripe or a plastic smartcard with a chip, that
 contains a unique card number and some security information, such as an expiration date or
 CVC (CVV). Security is provided by the customer entering a personal identification number
 (PIN).
 AIM
In the existing system, transactions are carried out manually; in the proposed system, all banking transactions are computerized using the Online Banking software. The system consists of the following modules:
 User Module
 ADMINISTRATIVE MODULE
This is the main module, and it performs all the major operations in the system. The major operations, for which a sketch of a service interface is given after the list, are:
• Cash withdrawal (savings/current account)
• Balance inquiry
• Statement report
• Cash transfer
• PIN change
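The following is a minimal sketch of how these operations might be grouped in Java (the report's stated implementation language). All names here, such as AccountService, AccountType and withdraw, are illustrative assumptions rather than part of the original design.

import java.math.BigDecimal;
import java.util.List;

// Hypothetical service interface grouping the major administrative operations.
public interface AccountService {

    enum AccountType { SAVINGS, CURRENT }

    // Cash withdrawal from a savings or current account.
    void withdraw(String accountNo, AccountType type, BigDecimal amount);

    // Balance inquiry.
    BigDecimal getBalance(String accountNo);

    // Statement report: the most recent transactions as printable lines.
    List<String> getStatement(String accountNo, int lastNTransactions);

    // Cash transfer between two accounts.
    void transfer(String fromAccountNo, String toAccountNo, BigDecimal amount);

    // PIN change after verifying the old PIN.
    boolean changePin(String accountNo, String oldPin, String newPin);
}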
SYSTEM STUDY AND ANALYSIS
SYSTEM ANALYSIS
System analysis is a process of gathering and interpreting facts, diagnosing problems and using the information to recommend improvements to the system. It is a problem-solving activity that requires intensive communication between the system users and system developers. System analysis or study is an important phase of any system development process. The system is studied to the minutest detail and analyzed. The system analyst plays the role of the interrogator and delves deep into the working of the present system. The system is viewed as a whole and the inputs to the system are identified. The outputs from the organization are traced to the various processes. System analysis is concerned with becoming aware of the problem, identifying the relevant and decisional variables, analyzing and synthesizing the various factors and determining an optimal, or at least a satisfactory, solution or program of action.
A detailed study of the process must be made using various techniques such as interviews and questionnaires. The data collected from these sources must be scrutinized to arrive at a conclusion. The conclusion is an understanding of how the system functions. This system is called the existing system. The existing system is then subjected to close study and the problem areas are identified. The designer now functions as a problem solver and tries to sort out the difficulties that the enterprise faces. The solutions are given as proposals. Each proposal is weighed against the existing system analytically and the best one is selected. The proposal is presented to the user for endorsement. The proposal is reviewed on user request and suitable changes are made. This is a loop that ends as soon as the user is satisfied with the proposal.
Preliminary study is the process of gathering and interpreting facts and using the information for further studies on the system. Preliminary study is a problem-solving activity that requires intensive communication between the system users and system developers. It involves various feasibility studies. From these studies a rough picture of the system activities can be obtained, from which decisions about the strategies to be followed for effective system study and analysis can be taken.
EXISTING SYSTEM
In the existing system, transactions are carried out manually; in the proposed system, all banking transactions are computerized using the Online Banking software.
2.1.1 PROBLEMS WITH THE EXISTING SYSTEM
• Lack of data security.
• More manpower required.
• Time consuming.
• Consumes a large volume of paperwork.
• Needs manual calculations.
• No direct role for the higher officials.
• Damage to machines due to lack of attention.
To avoid all these limitations and to make the system work more accurately, it needs to be computerized.
PROPOSED SYSTEM
The aim of proposed system is to develop a system of improved facilities. The proposed
system can overcome all the limitations of the existing system. The system provides proper
security and reduces the manual work.
2.2.1 ADVANTAGES OF THE PROPOSED SYSTEM
The system is very simple in design and easy to implement. The system requires very low system resources and will work in almost all configurations. It has the following features:
• Security of data.
• Ensures data accuracy.
• Proper control by the higher officials.
• Reduces damage to the machines.
• Minimizes manual data entry.
• Minimum time needed for the various processing tasks.
• Greater efficiency.
• Better service.
• User-friendly and interactive.
• Minimum time required.
2.3. FEASIBILITY STUDY
A feasibility study is made to see whether the project, on completion, will serve the purpose of the organization for the amount of work, effort and time spent on it. The feasibility study lets the developer foresee the future of the project and its usefulness. A feasibility study of a system proposal examines its workability, its impact on the organization, its ability to meet user needs and its effective use of resources. Thus, when a new application is proposed, it normally goes through a feasibility study before it is approved for development.
This document provides the feasibility of the project being designed and lists the various areas that were considered very carefully during the feasibility study of this project, namely Technical, Economic and Operational feasibility. The following are its features:
2.3.1. TECHNICAL FEASIBILITY
The system must first be evaluated from the technical point of view. The assessment of this feasibility must be based on an outline design of the system requirements in terms of input, output, programs and procedures. Having identified an outline system, the investigation must go on to suggest the type of equipment and the methods required for developing the system and for running it once it has been designed.
Technical issues raised during the investigation are:
Is the existing technology sufficient for the suggested system?
Can the system be expanded once developed?
The project should be developed such that the necessary functions and performance are achieved within the constraints. The project is developed with the latest technology. Though the technology may become obsolete after some period of time, newer versions of the same software support older versions, so the system may still be used. Thus there are minimal constraints involved with this project. Since the system has been developed using Java, the project is technically feasible for development.
2.3.2. ECONOMIC FEASIBILITY
The system being developed must be justified by cost and benefit, to ensure that effort is concentrated on the project that will give the best return at the earliest. One of the factors that affect the development of a new system is the cost it would require.
The following are some of the important financial questions asked during preliminary
investigation:
• The cost of conducting a full system investigation.
• The cost of the hardware and software.
• The benefits in the form of reduced costs or fewer costly errors.
Since the system is developed as part of project work, there is no additional cost to be spent on the proposed system. Also, since all the resources are already available, this indicates that the system is economically feasible to develop.
2.3.3. BEHAVIORAL FEASIBILITY
This includes the following questions:
> Is there sufficient support for the users?
> Will the proposed system cause harm?
The project would be beneficial because it satisfies the objectives when developed and installed. All behavioral aspects are considered carefully, and it is concluded that the project is behaviorally feasible.
SYSTEM DESIGN
3.1 INTRODUCTION
Design is the first step in the development phase for any engineered product or system. Design is a creative process, and a good design is the key to an effective system. The term "design" is defined as "the process of applying various techniques and principles for the purpose of defining a device, a process or a system in sufficient detail to permit its physical realization". Software design sits at the technical kernel of the software engineering process and is applied regardless of the development paradigm that is used. The system design develops the architectural detail required to build a system or product. As in the case of any systematic approach, this software too has undergone the best possible design phase, fine tuning all efficiency, performance and accuracy levels. The design phase is a transition from a user-oriented document to a document oriented towards the programmers or database personnel. System design goes through two phases of development: logical design and physical design.
LOGICAL DESIGN:
Logical design describes the logical flow of a system and defines the boundaries of the system. It includes the following steps:
• Reviews the current physical system - its data flows, file content, volumes, frequencies, etc.
• Prepares output specifications - that is, determines the format, content and frequency of reports.
• Prepares input specifications - format, content and most of the input functions.
• Prepares edit, security and control specifications.
• Specifies the implementation plan.
• Prepares a logical design walkthrough of the information flow, output, input, controls and implementation plan.
• Reviews benefits, costs, target dates and system constraints.
PHYSICAL DESIGN:
Physical design produces the working system by defining the design specifications that tell the programmers exactly what the candidate system must do. It includes the following steps:
• Design the physical system.
• Specify input and output media.
• Design the database and specify backup procedures.
• Design the physical information flow through the system and conduct a physical design walkthrough.
• Plan the system implementation.
• Prepare a conversion schedule and target date.
• Determine training procedures, courses and timetable.
• Devise a test and implementation plan and specify any new hardware/software.
• Update benefits, costs, conversion date and system constraints.
Design/Specification activities:
• Concept formulation.
• Problem understanding.
• High level requirements proposals.
• Feasibility study.
• Requirements engineering.
• Architectural design.
MODULE DESIGN
Admin
The Administrator logs in using the admin login. In this module two operations are performed. During login, the username and password are verified against the values stored in the database.
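A minimal sketch of this verification step is shown below, assuming a JDBC connection and a hypothetical admin_login table with username and password_hash columns; the table and column names are assumptions for illustration only.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public final class AdminLogin {

    // Returns true if a matching admin record exists in the database.
    // Table and column names (admin_login, username, password_hash) are illustrative.
    public static boolean verify(Connection conn, String username, String passwordHash)
            throws SQLException {
        String sql = "SELECT 1 FROM admin_login WHERE username = ? AND password_hash = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, username);
            ps.setString(2, passwordHash);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next(); // a row means the credentials matched
            }
        }
    }
}

A PreparedStatement binds the login values as parameters instead of concatenating them into the SQL string, which protects against SQL injection; in practice the password would be stored and compared as a salted hash rather than as plain text.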
INPUT DESIGN
The design of input focuses on controlling the amount of input required, controlling errors, avoiding delay, avoiding extra steps and keeping the process simple. The input is designed in such a way that it provides security and ease of use while retaining privacy. Input design considers the following:
o What data should be given as input.
o How the data should be arranged or coded.
o The dialog to guide the operating personnel in providing input.
o Methods for preparing input validations and the steps to follow when errors occur.
OBJECTIVES
Input Design is the process of converting a user-oriented description of the input into a
computer-based system. This design is important to avoid errors in the data input process
and show the correct direction to the management for getting correct information from the
computerized system.
This is achieved by creating user-friendly screens for data entry that can handle a large volume of data. The goal of designing input is to make data entry easier and free from errors. The data entry screens are designed in such a way that all data manipulations can be performed. They also provide record viewing facilities.
When data is entered, it is checked for validity. Data can be entered with the help of screens. Appropriate messages are provided as and when needed, so that the user is never left confused. Thus the objective of input design is to create an input layout that is easy to follow.
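As a sketch of the kind of validation described above, the helper below checks a PIN and a withdrawal amount before they are accepted. The exact rules (a four-digit PIN and a positive amount in multiples of 100) are assumptions chosen for illustration, not requirements stated in the original design.

import java.math.BigDecimal;

public final class InputValidator {

    // A PIN is accepted only if it consists of exactly four digits (assumed rule).
    public static boolean isValidPin(String pin) {
        return pin != null && pin.matches("\\d{4}");
    }

    // A withdrawal amount is accepted only if it is positive and a
    // multiple of 100 (assumed rule for cash dispensing).
    public static boolean isValidWithdrawalAmount(BigDecimal amount) {
        if (amount == null || amount.signum() <= 0) {
            return false;
        }
        return amount.remainder(BigDecimal.valueOf(100)).signum() == 0;
    }
}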
OUTPUT DESIGN
A quality output is one which meets the requirements of the end user and presents the information clearly. In output design it is determined how the information is to be displayed for immediate need and also as hard copy output. It is the most important and direct source of information for the user. Efficient and intelligent output design improves the system's relationship with the user and supports decision-making.
Designing computer output should proceed in an organized, well thought out manner; the right output must be developed while ensuring that each output element is designed so that people will find the system easy and effective to use. When analysts design computer output, they should:
Identify the specific output that is needed to meet the requirements.
Select methods for presenting information.
Create documents, reports, or other formats that contain information produced by the system (a small formatting sketch follows).
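The sketch below illustrates one way such a report might be formatted in Java, as a fixed-width mini-statement suitable for screen display or hard copy. The class and field names are assumptions, not part of the original design.

import java.time.LocalDate;
import java.util.List;

public final class StatementReport {

    // One line of a statement; the field names are illustrative.
    public record Entry(LocalDate date, String description, double amount, double balance) { }

    // Formats a fixed-width mini-statement for screen display or hard copy.
    public static String format(String accountNo, List<Entry> entries) {
        StringBuilder sb = new StringBuilder();
        sb.append("Statement for account ").append(accountNo).append('\n');
        sb.append(String.format("%-12s %-20s %12s %12s%n",
                "Date", "Description", "Amount", "Balance"));
        for (Entry e : entries) {
            sb.append(String.format("%-12s %-20s %12.2f %12.2f%n",
                    e.date(), e.description(), e.amount(), e.balance()));
        }
        return sb.toString();
    }
}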
3.3 DATABASE DESIGN
A database is an organized mechanism that has the capability of storing information through
which a user can retrieve stored information in an effective and efficient manner. The data is
the purpose of any database and must be protected.
The database design is a two level process. In the first step, user requirements are gathered
together and a database is designed which will meet these requirements as clearly as
possible. This step is called Information Level Design and it is taken independent of any
individual DBMS.
In the second step, this information level design is transformed into a design for the specific DBMS that will be used to implement the system in question. This step is called Physical Level Design and is concerned with the characteristics of the specific DBMS that will be used. A database
design runs parallel with the system design. The organization of the data in the database is
aimed to achieve the following two major objectives.
• Data integrity
• Data independence
Normalization is the process of decomposing the attributes in an application, which results in a set of tables with a very simple structure. The purpose of normalization is to make tables as simple as possible. Normalization is carried out in this system for the following reasons:
• To structure the data so that there is no repetition of data; this helps in saving storage.
• To permit simple retrieval of data in response to query and report requests.
• To simplify the maintenance of the data through updates, insertions and deletions.
• To reduce the need to restructure or reorganize data when new application requirements arise.
RELATIONAL DATABASE MANAGEMENT SYSTEM (RDBMS):
A relational model represents the database as a collection of relations. Each relation resembles a table of values or a file of records. In formal relational model terminology, a row is called a tuple, a column header is called an attribute and the table is called a relation. A relational database consists of a collection of tables, each of which is assigned a unique name. A row in a table represents a set of related values.
RELATIONS, DOMAINS & ATTRIBUTES:
A table is a relation. The rows in a table are called tuples. A tuple is an ordered set of n elements. Columns are referred to as attributes. Relationships have been set between every table in the database; this ensures both referential and entity integrity. A domain D is a set of atomic values. A common method of specifying a domain is to specify a data type from which the data values forming the domain are drawn. It is also useful to specify a name for the domain to help in interpreting its values. Every value in a relation is atomic, that is, not decomposable.
RELATIONSHIPS:
Table relationships are established using keys. The two main keys of prime importance are the Primary Key and the Foreign Key. Entity integrity and referential integrity relationships can be established with these keys. Entity integrity enforces that no Primary Key can have null values. Referential integrity enforces that, for each distinct Foreign Key value, there must exist a matching Primary Key value in the same domain. Other keys are Super Keys and Candidate Keys. Relationships have been set between every table in the database; this ensures both referential and entity integrity.
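As a minimal sketch of how such a primary key / foreign key relationship could be set up through JDBC, the snippet below creates two illustrative tables. The table and column names are assumptions and not the project's actual schema.

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public final class SchemaSetup {

    // Creates two illustrative tables linked by a primary key / foreign key pair.
    public static void createTables(Connection conn) throws SQLException {
        String accounts =
            "CREATE TABLE account (" +
            "  account_no  VARCHAR(20) PRIMARY KEY," +   // entity integrity: key cannot be null
            "  holder_name VARCHAR(50) NOT NULL," +
            "  balance     DECIMAL(12,2) NOT NULL" +
            ")";
        String transactions =
            "CREATE TABLE txn (" +
            "  txn_id     INTEGER PRIMARY KEY," +
            "  account_no VARCHAR(20) NOT NULL," +
            "  amount     DECIMAL(12,2) NOT NULL," +
            // referential integrity: every txn must reference an existing account
            "  FOREIGN KEY (account_no) REFERENCES account(account_no)" +
            ")";
        try (Statement st = conn.createStatement()) {
            st.executeUpdate(accounts);
            st.executeUpdate(transactions);
        }
    }
}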
NORMALIZATION:
As the name implies, normalization denotes putting things into a normal form. Through normalization, the application developer tries to achieve a sensible organization of data into proper tables and columns, where names can be easily correlated to the data by the user. Normalization eliminates repeating groups of data and thereby avoids data redundancy, which would prove to be a great burden on computer resources. This includes:
Normalizing the data.
Choosing proper names for the tables and columns.
Choosing the proper name for the data.
First Normal Form:
The First Normal Form states that the domain of an attribute must include only atomic values
and that the value of any attribute in a tuple must be a single value from the domain of that
attribute. In other words 1NF disallows "relations within relations" or "relations as attribute
values within tuples". The only attribute values permitted by 1NF are single atomic or
indivisible values.
The first step is to put the data into First Normal Form. This can be done by moving data into separate tables, where the data in each table is of a similar type. Each table is given a Primary Key or Foreign Key as required by the project. In this step we form new relations for each nonatomic attribute or nested relation. This eliminates repeating groups of data.
A relation is said to be in first normal form if and only if every attribute value it contains is atomic.
Second Normal Form:
According to Second Normal Form, for relations where the primary key contains multiple attributes, no non-key attribute should be functionally dependent on a part of the primary key.
In this step we decompose and set up a new relation for each partial key with its dependent attributes, making sure to keep a relation with the original primary key and any attributes that are fully functionally dependent on it. This step helps in taking out data that is only dependent on a part of the key.
A relation is said to be in second normal form if and only if it satisfies all the first normal form conditions and every non-primary-key attribute of the relation is fully dependent on its primary key alone.
Third Normal Form:
According to Third Normal Form, a relation should not have a non-key attribute functionally determined by another non-key attribute or by a set of non-key attributes. That is, there should be no transitive dependency on the primary key.
In this step we decompose and set up a relation that includes the non-key attributes that functionally determine other non-key attributes. This step is taken to get rid of anything that does not depend entirely on the Primary Key.
A relation is said to be in third normal form if and only if it is in second normal form and, moreover, no non-key attribute of the relation depends on another non-key attribute.
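As a small worked example of this decomposition (the tables and columns below are assumptions, not the project's actual schema), consider a single flattened transaction table and its third normal form equivalent.

public final class NormalizationExample {

    // Unnormalized: the account holder is repeated on every transaction row, and
    // branch_address depends on branch_code rather than on the key txn_id.
    static final String UNNORMALIZED =
        "CREATE TABLE txn_flat (" +
        "  txn_id INTEGER PRIMARY KEY, account_no VARCHAR(20)," +
        "  holder_name VARCHAR(50), branch_code VARCHAR(10)," +
        "  branch_address VARCHAR(100), amount DECIMAL(12,2))";

    // 3NF decomposition: each non-key attribute depends only on its own table's key.
    static final String ACCOUNT =
        "CREATE TABLE account (account_no VARCHAR(20) PRIMARY KEY," +
        "  holder_name VARCHAR(50), branch_code VARCHAR(10))";

    static final String BRANCH =
        "CREATE TABLE branch (branch_code VARCHAR(10) PRIMARY KEY," +
        "  branch_address VARCHAR(100))";

    static final String TXN =
        "CREATE TABLE txn (txn_id INTEGER PRIMARY KEY," +
        "  account_no VARCHAR(20) REFERENCES account(account_no)," +
        "  amount DECIMAL(12,2))";
}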
5.2 SYSTEM IMPLEMENTATION AND TESTING
Implementation is the stage of the project where the theoretical design is turned into a working system. It can be considered the most crucial stage in achieving a successful new system and in giving the users confidence that the new system will work and will be effective and accurate. It is primarily concerned with user training and documentation. Conversion usually takes place at about the same time the user is being trained, or later. Implementation simply means converting the new system design into operation; it is the process of converting a new or revised system design into an operational one.
5.2.1. SYSTEM TESTING
Software testing is the process of executing software in a controlled manner in order to answer the question "Does the software behave as specified?". Software testing is often used in association with the terms verification and validation. Verification is the checking or testing of items, including software, for conformance and consistency with an associated specification. Software testing is just one kind of verification, which also uses techniques such as reviews, analysis, inspections and walkthroughs. Validation is the process of checking that what has been specified is what the user actually wanted.
Validation: Are we doing the right job? Verification: Are we doing the job right?
Software testing should not be confused with debugging. Debugging is the process of
analyzing and localizing bugs when software does not behave as expected. Although the
identification of some bugs will be obvious from playing with the software, a methodical
approach to software testing is a much more thorough means for identifying bugs. Debugging
is therefore an activity which supports testing, but cannot replace testing. Other activities
which are often associated with software testing are static analysis and dynamic analysis.
Static analysis investigates the source code of software, looking for problems and gathering
metrics without actually executing the code. Dynamic analysis looks at the behavior of
software while it is executing, to provide information such as execution traces, timing
profiles, and test coverage information.
Testing is a set of activities that can be planned in advance and conducted systematically. Testing begins at the module level and works towards the integration of the entire computer-based system. Nothing is complete without testing, as it is vital to the success of the system. Regarding testing objectives, there are several rules that can serve as objectives. They are:
Testing is a process of executing a program with the intent of finding an error. A good test case is one that has a high probability of finding an undiscovered error. A successful test is one that uncovers an undiscovered error.
If testing is conducted successfully according to the objectives stated above, it will uncover errors in the software. Testing also demonstrates that the software functions appear to be working according to the specification and that the performance requirements appear to have been met.
There are three ways to test a program:
• For correctness
• For implementation efficiency
• For computational complexity
Tests for correctness are supposed to verify that a program does exactly what it was designed to do. This is much more difficult than it may at first appear, especially for large programs.
TEST PLAN
A test plan implies a series of desired courses of action to be followed in accomplishing the various testing methods. The test plan acts as a blueprint for the actions that are to be followed. The software engineers create a computer program, its documentation and the related data structures. The software developers are always responsible for testing the individual units of the programs, ensuring that each performs the function for which it was designed. There is also an independent test group (ITG), which removes the inherent problems associated with letting the builder test the thing that has been built. The specific objectives of testing should be stated in measurable terms, so that the mean time to failure, the cost to find and fix defects, the remaining defect density or frequency of occurrence, and the test work-hours per regression test can all be stated within the test plan.
The levels of testing include:
• Unit testing
• Integration testing
• Data validation testing
• Output testing
UNIT TESTING
Unit testing focuses verification effort on the smallest unit of software design - the software component or module. Using the component-level design description as a guide, important control paths are tested to uncover errors within the boundary of the module. The relative complexity of the tests and the errors they uncover is limited by the constrained scope established for unit testing. Unit testing is white-box oriented, and the step can be conducted in parallel for multiple components. The module interface is tested to ensure that information properly flows into and out of the program unit under test. The local data structure is examined to ensure that data stored temporarily maintains its integrity during all steps in an algorithm's execution. Independent paths are exercised to ensure that all statements in a module have been executed at least once, boundary conditions are tested, and finally, all error handling paths are tested.
Tests of data flow across a module interface are required before any other test is initiated. If
data do not enter and exit properly, all other tests are moot. Selective testing of execution
paths is an essential task during the unit test. Good design dictates that error conditions be
anticipated and error handling paths set up to reroute or cleanly terminate processing when
an error does occur. Boundary testing is the last task of the unit testing step; software often fails at its boundaries.
Unit testing was done on the Online Banking system by treating each module as a separate entity and testing each of them with a wide spectrum of test inputs. Some flaws in the internal logic of the modules were found and rectified.
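A minimal sketch of such a module-level test is shown below, assuming a hypothetical Account class with withdraw and getBalance methods and using JUnit 5; the class, its API and the boundary values are assumptions for illustration.

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import java.math.BigDecimal;
import org.junit.jupiter.api.Test;

class AccountWithdrawalTest {

    // Hypothetical unit under test; the real module's name and API may differ.
    static class Account {
        private BigDecimal balance;
        Account(BigDecimal opening) { this.balance = opening; }
        BigDecimal getBalance() { return balance; }
        void withdraw(BigDecimal amount) {
            if (amount.signum() <= 0 || amount.compareTo(balance) > 0) {
                throw new IllegalArgumentException("invalid withdrawal amount");
            }
            balance = balance.subtract(amount);
        }
    }

    @Test
    void withdrawalReducesBalance() {
        Account a = new Account(new BigDecimal("1000.00"));
        a.withdraw(new BigDecimal("400.00"));
        assertEquals(new BigDecimal("600.00"), a.getBalance());
    }

    @Test
    void overdraftIsRejected() { // boundary condition: amount just above the balance
        Account a = new Account(new BigDecimal("100.00"));
        assertThrows(IllegalArgumentException.class,
                () -> a.withdraw(new BigDecimal("100.01")));
    }
}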
INTEGRATION TESTING
Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested components and build a program structure that has been dictated by the design. The entire program is tested as a whole. Correction is difficult because isolation of causes is complicated by the vast expanse of the entire program. Once these errors are corrected, new ones appear and the process continues in a seemingly endless loop.
After unit testing, all the modules of the Online Banking system were integrated to test for any inconsistencies in the interfaces. Moreover, differences in program structures were removed and a unified program structure was evolved.
VALIDATION TESTING OR SYSTEM TESTING
This is the final step in testing. In this the entire system was tested as a whole with all forms,
code, modules and class modules. This form of testing is popularly known as Black Box testing
or System testing.
Black Box testing method focuses on the functional requirements of the software. That is,
Black Box testing enables the software engineer to derive sets of input conditions that will
fully exercise all functional requirements for a program.
Black Box testing attempts to find errors in the following categories: incorrect or missing functions, interface errors, errors in data structures or external data access, performance errors, and initialization and termination errors.
OUTPUT TESTING OR USER ACCEPTANCE TESTING
The system is tested for user acceptance; here it should satisfy the firm's needs. The developers should keep in touch with the prospective system users at the time of development and make changes whenever required. This is done with respect to the following points:
Input Screen Designs,
Output Screen Designs,
Online message to guide the user and the like.
The above testing is done using various kinds of test data. Preparation of test data plays a vital role in system testing. After preparing the test data, the system under study is tested using that test data. While testing the system with this test data, errors are again uncovered and corrected using the above testing steps, and the corrections are noted for future use.
5.3. TRAINING
Once the system is successfully developed, the next important step is to ensure that the administrators are well trained to handle the system. This is because the success of a system invariably depends on how it is operated and used. The implementation depends upon the right people being at the right place at the right time. Education involves creating the right atmosphere and motivating the user. The administrators are familiarized with the run procedures of the system, working through the sequence of activities on an ongoing basis.
Implementation is the stage in the project where the theoretical design is turned into a working system. By this, the users gain confidence that the system will work effectively. The system can be implemented only after thorough testing.
The systems personnel check the feasibility of the system. Actual data were input to the system and the working of the system was closely monitored. The master option was selected from the main menu and the actual data were input through the corresponding input screens. The data movement was studied and found to be correct. The queries option was then selected; this contains various reports. The various data needed were input through the utilities option and the module was test run. Satisfactory results were obtained, and reports related to these processes were also successfully generated. Various input screen formats are listed in the appendix.
Implementation walkthroughs ensure that the completed system actually solves the original
problem. This walkthrough occurs just before the system goes into use, and it should include
careful review of all manuals, training materials and system documentation. Again, users, the
analyst and the members of the computer services staff may attend this meeting.
CONCLUSION
Using "Online Banking " , customers can access their bank accounts in order to make cash
withdrawals (or credit card cash advances) and check their account balances as well as
purchasing mobile cell phone prepaid credit. In this the customer is identified by inserting a
plastic ATM card with a magnetic stripe or a plastic smartcard with a chip, that contains a
unique card number and some security information, such as an expiration date or CVC (CVV).
Security is provided by the customer entering a personal identification number (PIN).