AI-Integrated Fabric Inspection
Mechatronics
Jury Assignment
Submitted By:
Jishnu (BFT/22/359)
Vibha S. Vinod (BFT/22/724)
Yatik A.M. (BFT/22/354)
AUTOMATED FABRIC DEFECT DETECTION
SYSTEM USING IoT AND REAL-TIME
VISUAL ANALYTICS
Vibha Vinod¹, Jishnu S.², Yatik A.M.³
¹National Institute of Fashion Technology (NIFT), India
²National Institute of Fashion Technology (NIFT), India
³National Institute of Fashion Technology (NIFT), India
Abstract
The textile industry faces persistent challenges in manual fabric inspection and color
consistency under variable lighting. This study introduces an automated system
integrating ESP32, RGB LED lighting, TCS34725 color sensor, VL53L0X
Time-of-Flight (ToF) sensor, and a phone camera for defect detection. Lighting
conditions are normalized in real-time using sensor feedback. A Python-OpenCV
algorithm identifies defects and triggers the ESP32 via serial communication. The ESP32
actuates a servo motor and buzzer to mark defective fabric segments. The ToF sensor
ensures consistent camera-to-fabric distance, improving detection reliability. This
prototype demonstrates an efficient, cost-effective system with industrial applications.
Keywords
Smart Inspection, Real-time Monitoring, IoT in Textiles, ESP32, Fabric Defect
Detection, Heatmap Visualization
Highlights of the Project
• Real-time fabric defect detection with camera and sensors
• Smart Quality Dashboard with heatmaps and defect logs
• Automatic defect marking system using servo motor
• Cloud-based monitoring and analytics system.
Introduction
The fabric inspection process plays a critical role in textile manufacturing by identifying
defects such as stains, slubs, holes, and shade mismatches before the material proceeds to
garment production. Traditional inspection methods rely heavily on human judgment and
are subject to fatigue, inconsistency, and slow throughput. In recent years, the demand for
automation in quality control has increased substantially to improve reliability,
consistency, and productivity [1].
Technological advancements in microcontrollers and sensors have made it feasible to
develop intelligent inspection systems that combine embedded electronics with computer
vision. The ESP32 microcontroller offers an optimal platform for such applications due
to its cost-effectiveness, built-in wireless communication, and processing capabilities [2].
Moreover, integrating RGB color sensors and time-of-flight (ToF) sensors enables
adaptive lighting control and fabric-to-sensor distance calibration, which are essential for
achieving consistent and reliable results under variable lighting and movement conditions
[1][3].
To further automate the detection process, a Python-based computer vision algorithm
using OpenCV can process video feeds from a mobile or IP camera to identify defects in
real-time. Upon detecting a defect, the system sends a command to the ESP32, which
then actuates a servo motor and buzzer for marking the defect location. This closed-loop
interaction enables real-time feedback and marking without human involvement [4].
In this study, we propose a fully automated, low-cost defect detection and shade
correction system that combines ESP32-based hardware, RGB and ToF sensors, adaptive
LED lighting, and computer vision. This system addresses the core challenges in manual
inspection by enhancing speed, reliability, and lighting consistency across various fabric
types and textures.
Literature Survey
Numerous studies have addressed the inefficiencies of traditional fabric inspection, which
is labor-intensive, subjective, and prone to error. Manual processes often miss subtle
defects and cannot ensure consistent quality across large fabric volumes (Kumar &
Mehta, 2020).
Recent research has explored the use of computer vision and AI algorithms for defect
detection, demonstrating significant improvements in accuracy and speed. For example,
studies have shown that AI-powered systems can detect defects with up to 85%
accuracy, and automated inspection systems can reduce inspection time by nearly 60%
compared to manual methods (Kumar & Mehta, 2020).
Further, IoT integration has been explored for real-time data transmission and
centralized monitoring. Systems using microcontrollers like ESP32-CAM have proven
effective for low-cost, scalable solutions (Espressif Systems, 2022). However, many of
these systems lack the ability to process images at the edge (locally), instead relying on
cloud servers, which can introduce latency and limit responsiveness (Smith, 2019).
Additionally, while some models provide basic defect logging, few offer comprehensive
features like smart dashboards, heatmaps, or automated marking systems, which are
essential for practical implementation in industrial settings (Sharma & Rao, 2021).
This project builds on existing work by combining edge-based image analysis,
real-time IoT communication, and automated defect marking—bridging the gap
between accuracy, speed, and operational scalability.
Research Gap
Despite innovations in fabric inspection, most existing systems are either too costly or
lack real-time analytics and scalable cloud features. There is a need for an integrated,
cost-effective solution that detects, logs, and visualizes fabric defects in real-time using
embedded IoT.
Problem Statement
Fabric inspection is vital in textile manufacturing, yet traditional manual methods are
slow, error-prone, and inconsistent due to human fatigue and subjectivity. Small defects
like stains or misprints often go unnoticed, especially during high-speed production.
These inefficiencies lead to quality issues and higher operational costs.
With the rise of Industry 4.0, there is a clear need for an affordable, automated system that
can detect, mark, and log defects in real time. Such a solution must be scalable, integrate
seamlessly into existing workflows, and provide actionable insights via cloud analytics.
Objective
This project is driven by three primary objectives that aim to improve the efficiency,
accuracy, and visibility of fabric inspection in textile manufacturing:
1. Automate Fabric Defect Detection and Marking:
Develop a vision-based inspection system using an ESP32-CAM that can
automatically detect surface-level defects such as stains, cuts, holes, or weaving
faults and physically mark their positions using a servo motor.
2. Enable Real-time Monitoring and Logging:
Integrate IoT capabilities to continuously capture and analyze fabric images in
real-time, while logging defects with time-stamped data for traceability and future
review.
3. Visualize Quality Data for Analysis and Trends:
Build a smart quality dashboard that represents defect distribution using heatmaps
and charts, helping quality managers to identify recurring issues, defective zones,
and trends across batches.
Design Methodology
This project uses an automated process to detect fabric defects using the ESP32-CAM.
The system captures images, checks for defects, marks faulty areas, logs data, and
updates a smart dashboard. The diagram below outlines the complete workflow of the
inspection process.
Figure 1: Flowchart of the Automated Fabric Inspection Process
This flowchart represents the automated fabric inspection process using an ESP32-CAM module. Each stage is explained step by step below:
1. Start Fabric Movement
● The inspection process begins when the fabric starts moving, typically on a
motorized conveyor belt or roller system.
● This movement ensures a continuous flow of fabric under the camera, mimicking
real-time conditions on a production line.
● A sensor (e.g., IR or proximity) can optionally detect the fabric's presence and
trigger system activation.
2. Capture Image via ESP32-CAM
● As the fabric moves, the ESP32-CAM module captures images at fixed intervals
or in real time.
● The ESP32-CAM is a compact, Wi-Fi-enabled microcontroller with a built-in
camera, ideal for edge-based visual inspection tasks.
● Each image (frame) is sent for on-board or off-board processing depending on the
system architecture.
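A minimal Python-OpenCV sketch of this capture stage is given below. The stream URL, the use of a phone/IP-camera feed rather than the ESP32-CAM's own stream, and the capture interval are assumptions made only for illustration.

```python
import time
import cv2

# Placeholder address: DroidCam and similar phone-camera apps expose an HTTP/MJPEG
# stream; an ESP32-CAM web-server URL could be substituted here instead.
STREAM_URL = "http://192.168.1.50:4747/video"
CAPTURE_INTERVAL = 0.2  # seconds between analysed frames (illustrative value)

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("Could not open the camera stream")

last_capture = 0.0
while True:
    ok, frame = cap.read()              # keep draining the stream
    if not ok:
        break                           # stream dropped or roll finished
    now = time.monotonic()
    if now - last_capture >= CAPTURE_INTERVAL:
        last_capture = now
        print("frame captured:", frame.shape)  # stand-in for the defect check (step 3)

cap.release()
```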
3. Check for Defect (Image Analysis)
● Captured images are analyzed using image processing algorithms (e.g.,
thresholding, edge detection, contour analysis) or machine learning models.
● This step involves:
○ Identifying irregular patterns or features in the fabric.
○ Comparing detected patterns against a trained model or predefined criteria
for defects.
● Decision Point:
○ ✅ Yes — Defect Detected
○ ❌ No — Fabric is Clean
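A sketch of this decision step using adaptive thresholding and contour analysis is shown below; the blur kernel, threshold block size, and minimum defect area are illustrative values rather than tuned project parameters.

```python
import cv2

MIN_DEFECT_AREA = 50  # square pixels; contours smaller than this are treated as noise

def find_defects(frame):
    """Return bounding boxes (x, y, w, h) of candidate defects in a fabric frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Adaptive thresholding highlights local anomalies (dots, holes, stains)
    # even when illumination varies across the frame.
    mask = cv2.adaptiveThreshold(blur, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY_INV, 31, 7)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= MIN_DEFECT_AREA]
```

A non-empty result corresponds to the "Yes" branch of the decision point; an empty list corresponds to "No".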
4A. If Defect is Detected
a. Trigger Servo to Mark Defect
● A servo motor is activated to physically mark the fabric at the location of the
defect.
● This marking can be done with:
○ A small ink dot
○ A tag
○ A mechanical punch
● The mark allows for easy identification during downstream processing (e.g.,
quality check, cutting, or rejection).
b. Log Defect Data
● The system logs detailed information about the defect, such as:
○ 📅 Timestamp
○ 📍 Position on the fabric
○ 🧵 Type or category of defect
○ 🎞️ Image snippet or frame number (optional)
● This data can be stored locally (on SD card/EEPROM) or sent to a central
database for quality analysis.
c. Update Smart Dashboard
● A connected dashboard interface (e.g., on a PC, tablet, or web portal) is updated
in real-time.
● The dashboard displays:
○ Current defect count and rate
○ Defect types and frequency
○ Fabric color or GSM readings
○ Historical logs and trends for performance monitoring
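The PC-side logging and dashboard update (steps 4A.b and 4A.c) could be prototyped as in the sketch below: each detection is appended to a CSV log with a timestamp and position, and the same positions are accumulated into a coarse grid that is rendered as a heatmap. The file name, grid size, and frame resolution are assumptions.

```python
import csv
import datetime
import cv2
import numpy as np

LOG_FILE = "defect_log.csv"      # assumed local log; could also be pushed to the cloud
GRID_ROWS, GRID_COLS = 20, 20    # coarse heatmap resolution (illustrative)
FRAME_W, FRAME_H = 640, 480      # assumed camera frame size

heat = np.zeros((GRID_ROWS, GRID_COLS), dtype=np.float32)

def log_defect(x, y, w, h, defect_type="unclassified"):
    """Append one defect record and update the heatmap accumulator."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat(), x, y, w, h, defect_type])
    row = min(int(y / FRAME_H * GRID_ROWS), GRID_ROWS - 1)
    col = min(int(x / FRAME_W * GRID_COLS), GRID_COLS - 1)
    heat[row, col] += 1

def render_heatmap():
    """Colorize the accumulator so high-defect zones stand out on the dashboard."""
    norm = cv2.normalize(heat, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    scaled = cv2.resize(norm, (FRAME_W, FRAME_H), interpolation=cv2.INTER_NEAREST)
    return cv2.applyColorMap(scaled, cv2.COLORMAP_JET)
```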
4B. If No Defect is Detected
● No action is needed for marking or logging.
● The system skips to the next frame and continues inspection, optimizing
performance and storage.
5. Continue Inspection (Loop)
● The system continues looping through steps 2 to 4 as the fabric keeps moving.
● This loop is continuous and synchronized with the fabric movement speed,
ensuring no segment is missed.
● Inspection continues until the entire fabric roll or batch is processed.
Design and Simulation
• Block Diagram
Figure 2: Block Diagram of the Automated Fabric Defect Detection System
The block diagram outlines the logical flow of the automated fabric defect detection
system and is divided into three main sections:
1. Input Section:
o Fabric & Conveyor System: Moves the fabric consistently under the
camera.
o ESP32-CAM: Captures real-time images of the moving fabric.
o IR Sensors & LED Lights: Ensure accurate positioning and provide
optimal lighting for clear image capture.
2. Processing Section:
o Microcontroller (ESP32): Receives images from the ESP32-CAM.
o Image Processing Algorithm: Detects defects using methods like edge
detection, contrast analysis, and pattern recognition to identify visual
anomalies.
3. Output Section:
o Servo Motor: Marks the defect location on the fabric for manual
verification or rework.
o Smart Dashboard: Logs defect data with timestamps and generates
heatmaps to visualize high-defect zones.
o Cloud Storage: Stores inspection data for long-term analysis and remote
monitoring.
• Circuit Diagram
Figure 3: Circuit Diagram of the System (Placeholder)
Explanation of Components and Connections
1. ESP32-CAM Module
o The heart of the system. It captures real-time images of the fabric as it
passes below.
o It connects to various GPIO pins for controlling and reading external
components.
2. Servo Motor (Top Left)
o Used to mark the fabric when a defect is detected.
o It is connected to one of the GPIO pins of the ESP32-CAM and powered
via the 5V rail.
3. Infrared Sensor / Proximity Sensor (Bottom Center)
o Detects the presence or movement of the fabric.
o This helps synchronize image capturing and ensures defects are marked at
the right moment.
4. LED (Top Center)
o Acts as a status indicator (e.g., blinks when a defect is detected or image
is being processed).
5. Buzzer (Top Right)
o Provides an audible alert when a defect is found.
o This can be useful for operator awareness or manual intervention when
necessary.
6. VL53L0X (Right)
o A Time-of-Flight (ToF) distance sensor, used here on an Adafruit breakout board.
o It’s used to measure the exact distance between the fabric and the camera
for accurate focusing and image clarity.
7. Push Button (Bottom Right)
o Used to reset the system or to manually trigger a test capture.
Power and Connections
● All components are powered through the breadboard, with 5V and GND lines
properly distributed.
● Communication lines like SDA/SCL (I2C) are used to talk to the VL53L0X
sensor.
● Signal lines from the ESP32-CAM control the servo, LED, and buzzer.
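As a quick check of the SDA/SCL wiring described above, the I2C bus can be scanned from the ESP32. The sketch below assumes MicroPython firmware and the commonly used I2C pins (GPIO 21/22); the actual project firmware and pin assignment may differ.

```python
# MicroPython sketch (assumption: the ESP32 runs MicroPython; pins are illustrative)
from machine import Pin, I2C

i2c = I2C(0, scl=Pin(22), sda=Pin(21), freq=400000)

# The VL53L0X normally responds at address 0x29, so seeing it in the scan
# confirms the SDA/SCL connections before distance readings are attempted.
print("I2C devices found:", [hex(addr) for addr in i2c.scan()])
```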
• List of Components & Justification:
| Sr. No. | Component | Quantity | Purpose / Justification | Cost (INR) |
|---------|-----------|----------|--------------------------|------------|
| 1 | ESP32 Development Board | 1 | Main processing unit | 450 |
| 2 | Display and Camera | 1 | Senses defects on the fabric and monitors the activity | Mobile phone (existing) |
| 3 | RGB Sensor (TCS34725) | 1 | Detects fabric color and sets appropriate lighting | 250 |
| 4 | VL53L0X (Time-of-Flight Distance Sensor) | 1 | Detects fabric thickness inconsistencies | 162 |
| 5 | Servo Motor | 1 | Marks defective fabric sections | 230 |
| 6 | Buzzer | 1 | Alerts for detected defects | 39 |
| 7 | Power Supply | nil | Provides power to the system | nil |

Table 1: List of Components and Justification
Key Features
1. Live Real-Time Fabric Inspection
● A live video stream is captured using a camera (e.g., DroidCam or USB/IP
camera).
● The fabric moves continuously on a conveyor while frames are analyzed in real
time using OpenCV in Python.
● Detected defects are visually emphasized using heatmap-style contour
highlighting.
● A display window offers continuous visual feedback for operators during
inspection.
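A condensed sketch of this live inspection view is shown below, assuming the same placeholder stream address as in the earlier capture sketch and a simple adaptive-threshold mask; the blend weights are illustrative.

```python
import cv2

STREAM_URL = "http://192.168.1.50:4747/video"   # placeholder stream address

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY_INV, 31, 7)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Heatmap-style emphasis: filled red contours blended over the live frame
    # so the operator sees candidate defects highlighted in place.
    overlay = frame.copy()
    cv2.drawContours(overlay, contours, -1, (0, 0, 255), cv2.FILLED)
    view = cv2.addWeighted(overlay, 0.4, frame, 0.6, 0)
    cv2.imshow("Fabric inspection", view)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```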
2. Defect Detection Using Computer Vision
● OpenCV techniques such as adaptive thresholding and contour detection are used.
● Detects a variety of fabric defects, including:
○ Black dots
○ Holes
○ Stains
○ Lint or slubs
● Capable of determining both the size and location of defects.
● On detection, a signal ‘D’ is transmitted via Serial to the ESP32 microcontroller
for action.
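On the PC side, the 'D' signal could be sent with pyserial, as in the sketch below; the port name and baud rate are assumptions and must match the ESP32 firmware configuration.

```python
import serial  # pyserial

# The port name is an assumption ("COM3" on Windows, "/dev/ttyUSB0" on Linux);
# the baud rate must match whatever the ESP32 firmware configures.
esp32 = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1)

def notify_defect():
    """Send the single-character 'D' signal that triggers marking on the ESP32."""
    esp32.write(b"D")
    esp32.flush()
```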
3. Automated Defect Marking (ESP32)
● The ESP32 receives the ‘D’ signal and initiates a dual response:
○ Buzzer: Audio alert to notify of the defect.
○ Servo Motor: Automatically marks the defect on the fabric at the
identified location.
● The system requires no human intervention during this process.
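The ESP32-side response is sketched below in MicroPython purely for illustration (the actual firmware may be written in Arduino C++); the GPIO pins, pulse timings, and marking angle are assumptions, while the single-character 'D' trigger follows the protocol described above.

```python
# MicroPython sketch (assumption: the ESP32 runs MicroPython; the real firmware
# may equally be written in Arduino C++). GPIO pins and timings are illustrative.
import sys
import time
from machine import Pin, PWM

SERVO_PIN = 13          # assumed GPIO driving the marking servo
BUZZER_PIN = 12         # assumed GPIO driving the buzzer

servo = PWM(Pin(SERVO_PIN), freq=50)     # standard 50 Hz hobby-servo signal
buzzer = Pin(BUZZER_PIN, Pin.OUT)

def set_servo_angle(angle):
    # Map 0-180 degrees onto roughly a 0.5-2.5 ms pulse (duty out of 1023 at 50 Hz).
    servo.duty(int(26 + (angle / 180) * 102))

def mark_defect():
    buzzer.value(1)          # audible alert for the operator
    set_servo_angle(90)      # swing the marker arm onto the fabric
    time.sleep(0.5)
    set_servo_angle(0)       # retract the marker
    buzzer.value(0)

while True:
    if sys.stdin.read(1) == "D":   # the trigger character sent by the Python script
        mark_defect()
```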
4. Fabric Color Detection Using TCS34725 RGB Sensor
● Measures fabric color before the inspection phase.
● Detects values for Red, Green, Blue (RGB) and Clear (C) channels.
● Functional with both static and moving fabrics.
● Minimizes detection errors due to shade or lighting variations.
5. Adaptive Lighting Control Using WS2812 RGB Strip
● Provides intelligent lighting control based on fabric color:
○ Cool white for red fabrics
○ Neutral light for green fabrics
○ Warm tones for blue/gray fabrics
● Automatically adjusts contrast for better visibility via camera.
● Controlled by ESP32 with no manual adjustment required.
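Features 4 and 5 could be combined on the ESP32 as in the MicroPython sketch below. The read_rgbc() function is a hypothetical placeholder for the TCS34725 driver, and the strip length, data pin, and exact LED colour values are assumptions; the colour-to-lighting mapping follows the rules listed above.

```python
# MicroPython sketch (assumption: the ESP32 runs MicroPython; the real firmware
# may be Arduino C++). Strip length, data pin, and colour values are illustrative.
from machine import Pin
from neopixel import NeoPixel

NUM_LEDS = 30                                  # assumed WS2812 strip length
strip = NeoPixel(Pin(4, Pin.OUT), NUM_LEDS)    # GPIO 4 is an assumed data pin

def read_rgbc():
    """Hypothetical placeholder for the TCS34725 driver; should return (r, g, b, c)."""
    raise NotImplementedError("replace with real TCS34725 reads over I2C")

def set_strip(color):
    for i in range(NUM_LEDS):
        strip[i] = color
    strip.write()

def adapt_lighting():
    r, g, b, c = read_rgbc()
    if r >= g and r >= b:
        set_strip((255, 255, 255))   # cool white for predominantly red fabric
    elif g >= r and g >= b:
        set_strip((255, 244, 229))   # neutral light for green fabric
    else:
        set_strip((255, 210, 160))   # warm tone for blue/grey fabric
```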
Conclusion
The developed system significantly enhances fabric quality inspection by automating
defect detection, marking, and logging. It eliminates human error and subjectivity,
ensuring consistent and accurate inspection across all fabric types.
By using real-time data from multiple sensors, the system identifies defects early,
marks them instantly, and logs key parameters like color variation and GSM deviations.
This reduces material waste, lowers rework costs, and improves overall product quality.
In addition, real-time dashboards provide supervisors with instant feedback and
actionable insights, enabling quicker decision-making and long-term process
improvements. As a result, the system not only boosts operational efficiency but also
supports sustainable, data-driven manufacturing.
Future Scope
A) Lay & Spreading Guide with Heatmapping + Defect Logging
● Generate fabric-wide heatmaps to visualize defect concentration across rolls.
● Assist laying/spreading machines by aligning fabric zones with fewer defects.
● Store defect location data for traceability, pattern cutting, and post-inspection
analysis.
B) Industry 4.0 Standardization (Defect Detection 4.0)
● Integrate with MES/ERP systems for real-time quality feedback loops.
● Enable machine-to-machine communication for automatic fabric rejection or
sorting.
● Ensure compatibility with Industry 4.0 protocols (OPC-UA, MQTT, digital twin models).
C) Faster Detection Speeds
● Upgrade processing to handle ≥30 FPS using multithreading or GPU acceleration.
● Implement hardware-based edge processing (e.g., Jetson Nano, Coral TPU) for
rapid image analysis.
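A common way to approach ≥30 FPS is to decouple capture from analysis with a worker thread and a small frame queue, as in the hedged sketch below (the stream URL is a placeholder and the analysis step is left as a comment).

```python
import queue
import threading
import cv2

STREAM_URL = "http://192.168.1.50:4747/video"  # placeholder camera stream address
frames = queue.Queue(maxsize=2)                # tiny buffer so stale frames are dropped

def capture_loop():
    """Producer thread: keeps reading frames so the analysis loop never waits on I/O."""
    cap = cv2.VideoCapture(STREAM_URL)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        try:
            frames.put_nowait(frame)
        except queue.Full:
            pass                               # analysis is behind; drop this frame

threading.Thread(target=capture_loop, daemon=True).start()

while True:
    frame = frames.get()                       # block until a fresh frame is available
    # The defect-detection routine would run on `frame` here; with capture
    # decoupled, this loop alone determines the achievable frame rate.
```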
D) Printed & Patterned Fabric Defect Detection
● Train the system to detect defects in non-solid or printed fabrics using:
○ ML classification models (SVM, CNN, Teachable Machine)
○ Pattern consistency checks using template matching or feature comparison
● Support for multi-color fabrics and tolerances for design-based irregularities.
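A pattern-consistency check based on OpenCV template matching might look like the sketch below; the similarity threshold and the file names in the usage comment are assumptions.

```python
import cv2

MATCH_THRESHOLD = 0.7  # illustrative similarity cutoff

def pattern_consistent(frame_gray, template_gray):
    """Return (is_consistent, best_match_location) for a printed motif in the frame."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_val >= MATCH_THRESHOLD, max_loc

# Usage with assumed file names:
# frame = cv2.imread("fabric_frame.png", cv2.IMREAD_GRAYSCALE)
# motif = cv2.imread("reference_motif.png", cv2.IMREAD_GRAYSCALE)
# ok, location = pattern_consistent(frame, motif)
```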
The current system effectively automates fabric defect detection, but several
enhancements can elevate its industrial applicability. Future developments may include
integrating Edge AI models for on-device classification of defect types, reducing latency
and reliance on cloud infrastructure (Kumar & Mehta, 2020). Additionally, implementing
histogram-based color analysis could help detect shade variations—a frequent issue in
dyeing and finishing stages (Gupta & Singh, 2023). Enhancing the smart dashboard UI
with multi-line and batch-wise visualization will assist quality managers in identifying
defect patterns across production shifts (Sharma & Rao, 2021). Integration with
ERP/MES platforms can further improve traceability and enable automated quality
reporting, making the system suitable for large-scale textile operations (Smith, 2019). In
the long term, embedding predictive analytics can offer insights into recurring defects,
enabling factories to proactively adjust processes or schedule maintenance, thus reducing
downtime and improving efficiency (Espressif Systems, 2022).
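The histogram-based colour analysis mentioned above could be prototyped as follows; the bin counts and correlation threshold are illustrative values, not validated settings.

```python
import cv2

SHADE_THRESHOLD = 0.95  # illustrative: lower correlation suggests a shade shift

def shade_matches(reference_bgr, sample_bgr):
    """Compare hue/saturation histograms of a reference swatch and a sample frame."""
    ref_hsv = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2HSV)
    smp_hsv = cv2.cvtColor(sample_bgr, cv2.COLOR_BGR2HSV)
    ref_hist = cv2.calcHist([ref_hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    smp_hist = cv2.calcHist([smp_hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    cv2.normalize(ref_hist, ref_hist)
    cv2.normalize(smp_hist, smp_hist)
    score = cv2.compareHist(ref_hist, smp_hist, cv2.HISTCMP_CORREL)
    return score >= SHADE_THRESHOLD, score
```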
Bibliography
Espressif Systems. (2022). ESP32-CAM technical reference manual. Retrieved from https://www.espressif.com/en/products/socs/esp32-cam
Kumar, R., & Mehta, P. (2020). AI-based fabric fault identification using deep learning. International Journal of Textile Science, 9(3), 45-52.
Sharma, A., & Rao, D. (2021). IoT-based textile quality monitoring system. IEEE Internet of Things Journal, 8(10), 7802-7809.
Smith, J. (2019). Smart manufacturing and Industry 4.0 in textile industries. Journal of Industrial Automation, 7(1), 12-2