What is Total Quality Management?
Total Quality Management is an extensive and structured organization management
approach that focuses on continuous quality improvement of products and services by
using continuous feedback. Joseph Juran and W. Edwards Deming were among the
founders of total quality management.
Total quality management originated in the industrial sector of Japan (1954). Since that time the
concept has been developed and can be used for almost all types of organizations such as
schools, motorway maintenance, hotel management and churches. Nowadays, Total Quality
Management is also used within the e-business sector and it perceives quality management
entirely from the point of view of the customer. The objective of TQM is doing things right the
first time over and over again. This saves the organization the time that is needed to correct poor
work and failed product and service implementations (such as warranty repairs).
Total Quality Management can be set up separately for an organization or as part of a set of
standards that must be followed, for instance those of the International Organization for
Standardization (ISO) in the ISO 9000 series. Total Quality Management uses strategy, data and
communication channels to integrate the required quality principles into the organization’s
activities and culture.
Total Quality Management principles
TQM is built on a number of basic principles, summarized in the list below.
Focus on customer
Employee involvement
Process centered
Integrated system
Strategic and systematic approach
Decision-making based on facts
Communication
Continuous improvement
Focus on customer
When using TQM it is of crucial importance to remember that only customers determine the
level of quality. Whatever efforts are made with respect to training employees or improving
processes, only customers determine, for example through evaluation or satisfaction
measurement, whether your efforts have contributed to the continuous improvement of the
quality of products and services.
Employee involvement
Employees are an organization’s internal customers. Employee involvement in the development
of products or services of an organization largely determines the quality of these products or
services. Ensure that you have created a culture in which employees feel they are involved with
the organization and its products and services.
Process centered
Process thinking and process handling are a fundamental part of total quality management.
Processes are the guiding principle, and people support these processes based on objectives
that are linked to the mission, vision and strategy.
Integrated system
Following the process-centered principle, it is important to have an integrated organization
system that can be modelled on, for example, ISO 9000 or a company quality system, for
understanding and managing the quality of an organization’s products or services.
Strategic and systematic approach
A strategic plan must embrace the integration of quality development with the development of
an organization’s products and services.
Decision-making based on facts
Decision-making within the organization must only be based on facts and not on opinions
(emotions and personal interests). Data should support this decision-making process.
Communication
A communication strategy must be formulated in such a way that it is in line with the mission,
vision and objectives of the organization. This strategy comprises the stakeholders, the level
within the organization, the communications channels, the measurability of effectiveness,
timeliness, etc.
Continuous improvement
By using the right measuring tools and innovative and creative thinking, continuous
improvement proposals are initiated and implemented so that the organization can develop
to a higher level of quality.
A supporting Total Quality Management tool that could be used is the Deming cycle (Plan-Do-
Check-Act) or the DMAIC process.
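To make the cycle concrete, the following is a minimal sketch in Python; it is not taken from any TQM standard, and all function names and numbers are hypothetical placeholders. It shows how one pass through Plan-Do-Check-Act can be expressed as a repeating improvement loop.

```python
# Hypothetical sketch of the Plan-Do-Check-Act (Deming) cycle as a loop.
# The functions and the numbers are illustrative placeholders only.

def plan(target_defect_rate):
    """Plan: define the goal and the process change to try."""
    return {"goal": target_defect_rate, "change": "proposed process change"}

def do(change, cycle):
    """Do: apply the change on a small scale and measure the outcome.
    Here the measurement is a made-up value that improves each cycle."""
    return 0.9 - 0.2 * cycle

def check(measured, goal):
    """Check: compare the measured outcome against the goal."""
    return measured <= goal

def act(change):
    """Act: standardize the change that worked."""
    print("Standardizing:", change["change"])

target = 0.5                      # desired defect rate (illustrative)
for cycle in range(5):            # each iteration is one PDCA pass
    p = plan(target)
    measured = do(p, cycle)
    print(f"Cycle {cycle + 1}: measured defect rate {measured:.1f}")
    if check(measured, p["goal"]):
        act(p)
        break                     # goal met; standardize and stop
    # otherwise, re-plan in the next cycle with what was learned
```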
Practical approach to Total Quality Management / TQM
When you implement TQM, you implement a concept. It is not a system that can be
implemented but a line of reasoning that must be incorporated into the organization and its
culture.
Practice has proven that there are a number of basic assumptions that contribute to a successful
roll-out of TQM within an organization.
These basic assumptions are:
Train senior management on TQM principles and ask for their commitment with respect to its roll-out.
Assess the current culture, customer satisfaction and the quality system.
Senior management determines the desired core values and principles and communicates these within
the organization.
Develop a basic TQM plan using the basic starting principles mentioned above.
Identify and prioritize customer needs and the market and determine the organization’s products and
services to meet those needs.
Determine the critical processes that can make a substantial contribution to the products and services.
Create teams that can work on process improvement, for example quality circles.
Managers support these teams with planning and resources, and by providing time and training.
Management integrates the desired changes for improvement in daily processes. After the
implementation of improved processes, standardization takes place.
Evaluate progress continuously and adjust the planning or other issues if necessary.
Stimulate employee involvement. Awareness and feedback lead to an overall improvement of the
entire process. Support this, for example, by means of a reward model (e.g. Management by Objectives)
and recognition.
Kaizen Technique
Kaizen offers a variety of tools and strategies a business can use to continually improve. Below are just a few
techniques and tools that will aid in the implementation of Kaizen:
PDCA Cycle: The PDCA cycle is often implemented when carrying out Kaizen strategies. It can be used by
employees at all levels of the organization and is an effective introduction to Lean manufacturing. The
cycle's four phases, Plan > Do > Check > Act, provide a continuous structure in which Kaizen strategies can
be implemented and assessed, along with a framework for continuous improvement.
Gemba: Gemba means "the real place". Managers and supervisors can often get valuable information by
actually going down to the production line and talking with employees. Gemba is often practised in the form
of Gemba walks, which are scheduled walks during which managers and supervisors observe processes up
close and talk with frontline employees.
Jishuken: Jishuken can be translated as an autonomous study group or self-study. This concept
encourages managers to be more directly involved in the processes they are responsible for, to learn about
them, and to work out how they can be improved.
5 Whys: This is an important tool when it comes to identifying the root cause of an issue. It is difficult to
make an impactful change in the workplace if the root cause has not been considered. The 5 Whys is
exactly what the name implies: after a problem arises, you ask "why" five times, each time about the
previous answer. A small sketch of this technique follows the list of tools below.
Value Stream Mapping: Mapping out processes and streams in a facility can be very beneficial for a
business practicing Kaizen. These maps are usually hand-drawn and include a diagram of materials moving
through the different areas of the workplace. Value stream maps aim to identify wastes in the manufacturing
process and find areas where improvement within a process is possible; these potential improvements can
be the subject of future Kaizen activities and events.
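As referenced under the 5 Whys item above, the following is a small illustrative sketch in Python; the problem chain is a made-up example, not from the source. It shows how asking "why" up to five times walks from a symptom down to a candidate root cause.

```python
# Illustrative 5 Whys walk-through: each answer becomes the subject of the
# next "why" until no deeper cause is recorded or five questions are asked.
# The chain of causes below is a hypothetical example.

why_chain = {
    "The machine stopped": "A fuse blew because of an overload",
    "A fuse blew because of an overload": "The bearing was not lubricated",
    "The bearing was not lubricated": "The lubrication pump was not pumping",
    "The lubrication pump was not pumping": "The pump shaft was worn out",
    "The pump shaft was worn out": "There is no maintenance schedule for the pump",
}

def five_whys(problem, causes, max_whys=5):
    current = problem
    for i in range(max_whys):
        deeper = causes.get(current)
        if deeper is None:
            break                            # no deeper cause recorded
        print(f"Why {i + 1}: {current} -> {deeper}")
        current = deeper
    return current                           # candidate root cause

root = five_whys("The machine stopped", why_chain)
print("Root cause to address:", root)
```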
Statistical Quality Assurance (SQA)
As brands and retailers experience growing demand for the latest
consumer products, the resulting increase in production and batch sizes
makes quality control more challenging for companies.
Traditional compliance testing techniques can sometimes provide only limited pass/fail information, which
gives insufficient insight into the batch's quality control, the identification of the root cause of failures,
and overall quality assurance (QA) in the production process.
Intertek combines legal, customer and essential safety requirements to customize a workable QA process,
called Statistical Quality Assurance (SQA). SQA is used to identify the potential variations in the
manufacturing process and predict potential defects on a parts-per-million (PPM) basis. It provides a
statistical description of the final product and addresses quality and safety issues that arise during
manufacturing.
SQA consists of three major methodologies:
1. Force Diagram - A Force Diagram describes how a product should be tested. Intertek engineers base
the creation of Force Diagrams on their knowledge of foreseeable use, critical manufacturing processes
and critical components that have a high potential to fail.
2. Test-to-Failure (TTF) - Unlike legal compliance testing, TTF tells manufacturers how many defects they are
likely to find in every million units of output. This information is incorporated into the process and
indicates whether a product needs improvement in quality or whether it is being over-engineered, which can
eventually lead to cost savings.
3. Intervention - Products are separated into groups according to the total production quantity and
production lines. Each group then undergoes an intervention. The end result is measured by a Z-value,
which is an indicator of the quality and consistency of a product relative to a specification (a sketch of this
calculation follows the list). Intervention allows manufacturers to pinpoint a defect to a specific lot and
production line, thus saving time and money in corrective actions.
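The relationship between a Z-value and a predicted defect rate can be illustrated with a short calculation. The sketch below is not Intertek's proprietary method; it simply shows, under a normal-distribution assumption and with made-up numbers, how many standard deviations a specification limit sits from the measured mean and what defects-per-million figure that implies.

```python
# Hypothetical Z-value / PPM calculation under a normal-distribution assumption.
from statistics import NormalDist

sample_mean = 102.0       # measured mean of a product characteristic (made up)
sample_std = 1.5          # measured standard deviation (made up)
upper_spec_limit = 106.0  # specification limit the characteristic must not exceed

# Z-value: how many standard deviations the spec limit lies above the mean.
z_value = (upper_spec_limit - sample_mean) / sample_std

# Predicted fraction of units beyond the limit, expressed in parts per million.
fraction_defective = 1.0 - NormalDist().cdf(z_value)
predicted_ppm = fraction_defective * 1_000_000

print(f"Z-value: {z_value:.2f}")                      # about 2.67
print(f"Predicted defects: {predicted_ppm:.0f} PPM")  # roughly 3800 PPM
```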
McCall quality factors
A quality factor represents a behavioural characteristic of a system. The following is the list of quality
factors:
1. Correctness:
Definition: Extent to which a program satisfies its specifications and fulfills the user’s mission
objectives.
A software system is expected to meet the explicitly specified functional requirements and
the implicitly expected non-functional requirements.
If a software system satisfies all the functional requirements, the system is said to be correct.
2. Reliability
Definition: Extent to which a program can be expected to perform its intended function with
required precision.
Customers may still consider an incorrect system to be reliable if the failure rate is very small
and it does not adversely affect their mission objectives.
Reliability is a customer perception, and an incorrect software can still be considered to be
reliable.
3. Efficiency:
Definition: Amount of computing resources and code required by a program to perform a
function.
Efficiency concerns the extent to which a software system utilizes resources, such as computing
power, memory, disk space, communication bandwidth, and energy.
A software system should utilize as few resources as possible to perform its functions.
4. Integrity:
Definition: Extent to which access to software or data by unauthorized persons can be
controlled.
A system’s integrity refers to its ability to withstand attacks to its security.
In other words, integrity refers to the extent to which access to software or data by
unauthorized persons or programs can be controlled.
5. Usability:
Definition: Effort required to learn, operate, prepare input for, and interpret output of a program.
Software is considered usable if human users find it easy to use.
Without a good user interface a software system may fizzle out even if it possesses many
desired qualities.
6. Maintainability:
Definition: Effort required to locate and fix a defect in an operational program.
Maintenance refers to the upkeep of products in response to the deterioration of their
components due to continuous use of the products.
Maintainability refers to how easily and inexpensively the maintenance tasks can be
performed.
For software products, there are three categories of maintenance activities: corrective,
adaptive and perfective maintenance.
7. Testability:
Definition: Effort required to test a program to ensure that it performs its intended functions.
Testability means the ability to verify requirements. At every stage of software development,
it is necessary to consider the testability aspect of a product.
To make a product testable, designers may have to instrument a design with functionalities
not available to the customer.
8. Flexibility:
Definition: Effort required to modify an operational program.
Flexibility is reflected in the cost of modifying an operational system.
In order to measure the flexibility of a system, one has to answer the question:
how easily can one add a new feature to the system?
9. Portability
Definition: Effort required to transfer a program from one hardware and/or software
environment to another.
Portability of a software system refers to how easily it can be adapted to run in a different
execution environment.
Portability gives customers an option to easily move from one execution environment to
another to best utilize emerging technologies in furthering their business.
10. Reusability
Definition: Extent to which parts of a software system can be reused in other applications.
Reusability means that a significant portion of one product can be reused, perhaps with minor
modifications, in another product.
Reusability saves the cost and time to develop and test the component being reused.
11. Interoperability :
Definition: Effort required to couple one system with another.
Interoperability concerns whether the output of one system is acceptable as input to
another system; the two systems are likely to run on different computers interconnected by
a network.
An example of interoperability is the ability to roam from one cellular phone network in one
country to another cellular network in another country.
Quality Criteria
A quality criterion is an attribute of a quality factor that is related to software development. For
example, modularity is an attribute of the architecture of a software system.
List of Quality Criteria :
1. Access Audit: Ease with which the software and data can be checked for compliance with
standards.
2. Access Control: Provisions for the control and protection of the software.
3. Accuracy: Precision of computations and output.
4. Completeness: Degree to which full implementation of the required functionalities has been
achieved.
5. Communicativeness: Ease with which the inputs and outputs can be assimilated.
6. Conciseness: Compactness of the source code, in terms of lines of code.
7. Consistency: Use of uniform design and implementation techniques.
8. Data commonality: Use of standard data representation.
9. Error tolerance: Degree to which continuity of operation is ensured under adverse conditions.
10. Execution efficiency: Run time efficiency of the software.
11. Expandability: Degree to which storage requirements or software functions can be expanded.
12. Hardware independence: Degree to which the software is decoupled from the underlying
hardware.
13. Modularity: Provision of highly independent modules.
14. Operability: Ease of operation of the software.
15. Simplicity: Ease with which the software can be understood.
16. Storage efficiency: Run-time storage requirements of the software.
17. Traceability: Ability to link software components to requirements.
18. Training: Ease with which new users can use the system.