Image Processing PDF

The document explains the processes of quantization and sampling in image processing, detailing how quantization digitizes image amplitude values into distinct levels and how sampling affects spatial resolution by determining pixel density. It also covers the concept of pixel neighborhoods, types of adjacency, and connectivity, which are crucial for image analysis tasks such as filtering, feature extraction, and segmentation. Additionally, it discusses various distance measures used in image processing for applications like segmentation, pattern recognition, and object tracking.


2.

Quantization is the process of transforming a real-valued sampled image into one that takes only a finite number of distinct values. During quantization, the amplitude values of the image are digitized. In simple words, when you quantize an image, you divide a signal into quanta (partitions).

Now let’s see how quantization is done. Here we assign levels to the values generated by the sampling process. In the image shown in the sampling explanation, although the samples had been taken, they still spanned a continuous range of gray-level values vertically. In the image shown below, these vertically ranging values have been quantized into 5 different levels or partitions, ranging from 0 (black) to 4 (white). The number of levels can vary according to the type of image you want.
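The 5-level quantization described above can be sketched in a few lines of NumPy. The sample values here are made up for illustration; they stand in for one row of continuous gray-level samples in [0, 1].

```python
import numpy as np

# Hypothetical continuous gray-level samples in [0, 1],
# standing in for one row of a sampled image.
samples = np.array([0.05, 0.30, 0.47, 0.62, 0.81, 0.99])

# Quantize into 5 levels (0 = black ... 4 = white) by uniform binning.
levels = 5
quantized = np.floor(samples * levels).astype(int)
quantized = np.clip(quantized, 0, levels - 1)  # keep a sample of exactly 1.0 in level 4

print(quantized)  # each sample is now one of the 5 discrete levels
```

Each continuous value is mapped to the partition it falls into; changing `levels` changes how coarse or fine the quantization is.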

------------------------------------------------------------------------------------------

4.

1. Sampling → Spatial Resolution

Sampling refers to how often we measure (sample) the image signal in space — i.e., how many pixels we use to represent the image.

➤ Relation to Resolution:

 Higher sampling rate → more pixels → higher spatial resolution.

 Lower sampling rate → fewer pixels → lower spatial resolution, leading to blurring or pixelation.

Example:

A photo sampled at:

 300×300 pixels will be sharp.

 30×30 pixels will be very blurry or blocky.
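The effect of a lower sampling rate can be sketched by keeping only every n-th sample of an image. The 6×6 gradient here is a made-up stand-in for a real photo.

```python
import numpy as np

# A hypothetical 6x6 "image" with a simple gradient pattern.
img = np.arange(36).reshape(6, 6)

# Naive downsampling: keep every 3rd sample in each direction (6x6 -> 2x2).
# Fewer samples means fewer pixels, hence lower spatial resolution.
low_res = img[::3, ::3]

print(img.shape, "->", low_res.shape)  # (6, 6) -> (2, 2)
```

Real resizing would low-pass filter before subsampling to avoid aliasing; this sketch only illustrates the pixel-count side of the story.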

2. Quantization → Color or Intensity Resolution

Quantization refers to how many intensity or color levels each pixel can have — it maps continuous color values to discrete ones.

➤ Relation to Color Depth:

 More quantization levels (bits) → more color detail (e.g., 8-bit = 256 levels per channel).

 Fewer levels → lower color or grayscale resolution → can cause banding or posterization.

Example:

 8-bit grayscale: 256 shades → smooth gradients.

 2-bit grayscale: 4 shades → visible bands or sharp steps in brightness.
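The 8-bit versus 2-bit comparison above can be demonstrated directly: requantizing a smooth 256-level ramp down to 4 levels produces the sharp steps that appear as banding.

```python
import numpy as np

# Reduce an 8-bit grayscale ramp (256 levels) to 2-bit (4 levels)
# to illustrate how fewer quantization levels cause banding.
ramp = np.arange(256, dtype=np.uint8)  # hypothetical smooth gradient row

bits = 2
step = 256 // (2 ** bits)          # 64 input values map to each output level
quantized = (ramp // step) * step  # 0, 64, 128, 192: only 4 distinct shades

print(len(np.unique(ramp)), "->", len(np.unique(quantized)))  # 256 -> 4
```

Viewed as an image, the quantized ramp shows four flat bands instead of a smooth gradient.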

----------------------------------------------------------------------------------------------------

5.

1. Definition: Neighbors of a pixel are the adjacent pixels that surround it in a defined neighborhood. The neighborhood can vary in size and shape depending on the specific image processing algorithm or task.

2. Types of Neighborhoods:

o 4-neighborhood: Includes the pixel directly above, below, left, and right of the central pixel.

o 8-neighborhood: Includes the 4-neighborhood plus the diagonal pixels (top-left, top-right, bottom-left, bottom-right).

3. Spatial Arrangement: Neighbors are typically identified based on their relative positions to the central pixel within the image grid. This arrangement helps in defining local structures and patterns within the image.

4. Importance in Image Processing:

o Filtering: Neighboring pixels are often used in spatial filters (e.g., averaging, edge detection) to compute new values for the central pixel based on its surroundings.

o Feature Extraction: Local features such as texture, edges, and corners are often characterized by the intensity variations among neighboring pixels.

o Segmentation: Neighboring pixel properties can help delineate object boundaries or regions of interest within an image.

5. Neighborhood Size: The size of the neighborhood (e.g., 3x3, 5x5) determines the extent of influence neighboring pixels have on each other. Larger neighborhoods capture more spatial context but can increase computational complexity.

6. Boundary Handling: Special considerations are required for pixels near the image boundaries, where not all neighbors may be available. Techniques such as zero-padding or mirror-padding are commonly used to handle edge cases.

7. Connectivity: The notion of connectivity in image processing refers to how neighbors are defined and used. Higher connectivity (e.g., 8-neighborhood) captures more spatial relationships but may require more computational resources.

12. An image is denoted by f(x,y), and p, q are used to represent individual pixels of the image.

Neighbours of a pixel

A pixel p at (x,y) has 4 horizontal/vertical neighbours at (x+1,y), (x-1,y), (x,y+1) and (x,y-1). These are called the 4-neighbours of p: N4(p).

A pixel p at (x,y) has 4 diagonal neighbours at (x+1,y+1), (x+1,y-1), (x-1,y+1) and (x-1,y-1). These are called the diagonal neighbours of p: ND(p).

The 4-neighbours and the diagonal neighbours of p together are called the 8-neighbours of p: N8(p).
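The three neighbour sets N4(p), ND(p) and N8(p) can be written out directly from the coordinate definitions above; the function names here are hypothetical helpers, not a standard API.

```python
# Neighbour sets of a pixel p at (x, y), following the definitions above.
def n4(x, y):
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(x, y):
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(x, y):
    # The 8-neighbours are the union of the 4-neighbours and diagonal neighbours.
    return n4(x, y) | nd(x, y)

print(sorted(n8(1, 1)))  # the 8 pixels surrounding (1, 1)
```

Note that the pixel itself is not a member of any of its own neighbour sets, and that near image borders some of these coordinates fall outside the image.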

Adjacency between pixels

Let V be the set of intensity values used to define adjacency.

In a binary image, V = {1} if we are referring to adjacency of pixels with value 1. In a gray-scale image, the idea is the same, but set V typically contains more elements.

For example, for the adjacency of pixels with a range of possible intensity values 0 to 255, set V could be any subset of these 256 values.

We consider three types of adjacency:

a) 4-adjacency: Two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).

b) 8-adjacency: Two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).

c) m-adjacency (mixed adjacency): Two pixels p and q with values from V are m-adjacent if

1. q is in N4(p), or

2. q is in ND(p) and the set N4(p)∩N4(q) has no pixels whose values are from V.
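The two-part m-adjacency rule above can be sketched as a predicate. This is a minimal sketch, assuming the image is a dict mapping (x, y) to intensity and V is a set of qualifying values; both names are illustrative.

```python
# m-adjacency test, following the two conditions above.
def n4(p):
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def m_adjacent(p, q, img, V):
    if img.get(p) not in V or img.get(q) not in V:
        return False
    if q in n4(p):                      # condition 1
        return True
    if q in nd(p):                      # condition 2
        common = n4(p) & n4(q)
        return not any(img.get(r) in V for r in common)
    return False

# 2x2 binary patch: p=(0,0) and q=(1,1) are diagonal, but (0,1) has value 1,
# so N4(p) ∩ N4(q) contains a V-pixel -> p and q are NOT m-adjacent.
img = {(0, 0): 1, (1, 0): 0, (0, 1): 1, (1, 1): 1}
print(m_adjacent((0, 0), (1, 1), img, {1}))  # False
```

This is exactly the ambiguity m-adjacency resolves: the diagonal link is dropped whenever a 4-connected path through a shared neighbour already exists.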

Connectivity between pixels

It is an important concept in digital image processing.

It is used for establishing boundaries of objects and components of regions in an image.

Two pixels are said to be connected:

 if they are adjacent in some sense (neighbour pixels, 4/8/m-adjacency)

 if their gray levels satisfy a specified criterion of similarity (e.g., equal intensity level)

There are three types of connectivity on the basis of adjacency. They are:

a) 4-connectivity: Two or more pixels are said to be 4-connected if they are 4-adjacent to each other.

b) 8-connectivity: Two or more pixels are said to be 8-connected if they are 8-adjacent to each other.

c) m-connectivity: Two or more pixels are said to be m-connected if they are m-adjacent to each other.
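How connectivity establishes object regions can be sketched with a small flood fill over a binary image: all pixels reachable from a seed through 4-adjacent equal-valued pixels form one connected component. The image and function name are illustrative.

```python
from collections import deque

# Flood fill: collect the 4-connected region containing `seed`
# in a binary image given as a list of lists.
def region_4connected(img, seed):
    h, w = len(img), len(img[0])
    target = img[seed[0]][seed[1]]
    seen, queue = {seed}, deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen \
                    and img[ny][nx] == target:
                seen.add((ny, nx))
                queue.append((ny, nx))
    return seen

img = [[1, 1, 0],
       [0, 1, 0],
       [0, 0, 1]]
# The bottom-right 1 touches the others only diagonally, so under
# 4-connectivity it is NOT part of the seed's region.
print(len(region_4connected(img, (0, 0))))  # 3
```

Swapping the neighbour list for the 8-neighbour offsets would merge the diagonal pixel into the region, which is exactly the difference between 4- and 8-connectivity.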
--------------------------------------------------------------------------------------------------------

14. Types of Distance Measures

1. Spatial (Geometric) Distance

Used to measure distance between pixel positions in the image. For pixels p = (x₁, y₁) and q = (x₂, y₂):

Euclidean Distance (most common): De(p, q) = √((x₁ − x₂)² + (y₁ − y₂)²)

Manhattan Distance (City Block): D4(p, q) = |x₁ − x₂| + |y₁ − y₂|

Chessboard Distance: D8(p, q) = max(|x₁ − x₂|, |y₁ − y₂|)
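The three spatial distances can be computed side by side; the helper names are illustrative, and the pixel pair matches the example used later in this section.

```python
import math

# The three spatial distances for pixels p = (x1, y1) and q = (x2, y2).
def euclidean(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def city_block(p, q):  # D4 / Manhattan
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def chessboard(p, q):  # D8
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (10, 10), (12, 13)
print(euclidean(p, q))   # sqrt(13) ≈ 3.606
print(city_block(p, q))  # 2 + 3 = 5
print(chessboard(p, q))  # max(2, 3) = 3
```

Note how the three measures order the same pair differently: D8 ≤ De ≤ D4 always holds, since chessboard counts diagonal steps as 1 and city block counts them as 2.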

2. Feature-Based Distance

Used to compare pixel values or features, such as color vectors or texture descriptors.

 Color Distance (e.g., Euclidean distance in RGB space)

 Histogram Distance (e.g., Bhattacharyya, Chi-square)

 Cosine Similarity, Correlation, etc.

Why is Distance Important in Image Processing?

Application | Role of Distance
Image Segmentation | Group similar pixels based on color or texture distances.
Pattern Recognition | Compare features (e.g., face descriptors) to find matches.
Object Tracking | Measure motion by calculating distances between object locations over frames.
Morphology | Distance transforms are used to define shapes and skeletons.
Clustering (e.g., K-means) | Assign pixels to clusters based on color or intensity distance.
Edge Detection | Calculate gradients or differences between neighboring pixels.
Simple Example

Imagine a grayscale image:

Pixel A (10, 10) = 50
Pixel B (12, 13) = 150

 Spatial distance: Euclidean distance between (10,10) and (12,13) = √(2² + 3²) = √13 ≈ 3.61

 Intensity difference: |150 − 50| = 100

If you're doing region growing, you might accept B into a region if its intensity distance from A is small.
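That region-growing acceptance test can be sketched in one line; the threshold value of 30 here is an arbitrary illustration, not a standard choice.

```python
# Region-growing acceptance: admit a candidate pixel if its intensity is
# close enough to the seed's (threshold is a hypothetical tuning value).
def accept(seed_intensity, candidate_intensity, threshold=30):
    return abs(candidate_intensity - seed_intensity) <= threshold

print(accept(50, 150))  # False: |150 - 50| = 100 exceeds the threshold
print(accept(50, 70))   # True: a difference of 20 is within the threshold
```

With the pixels from the example above, B (intensity 150) would be rejected from A's region despite being spatially close.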

15.

1. Euclidean Distance Between Pixel Coordinates (Spatial)

➤ Formula:

If you have two pixels located at P₁ = (x₁, y₁) and P₂ = (x₂, y₂), then the Euclidean distance D between them is:

D = √((x₂ − x₁)² + (y₂ − y₁)²)

Example:

Let’s say:

 Pixel A is at (10, 20)

 Pixel B is at (13, 24)

Then the Euclidean distance is:

D = √((13 − 10)² + (24 − 20)²) = √(9 + 16) = √25 = 5

This is used when:

 Calculating proximity in region growing

 Tracking objects or pixels over time

 Distance transforms (e.g., skeletonization)


2. Euclidean Distance in Feature Space (e.g., RGB Values)

This version of Euclidean distance is used when comparing color or intensity values between two pixels.

➤ Formula (for RGB):

Let:

 Pixel A = (R₁, G₁, B₁)

 Pixel B = (R₂, G₂, B₂)

Then:

D = √((R₂ − R₁)² + (G₂ − G₁)² + (B₂ − B₁)²)

Example:

 Pixel A = (100, 150, 200)

 Pixel B = (110, 140, 220)

D = √((110 − 100)² + (140 − 150)² + (220 − 200)²) = √(100 + 100 + 400) = √600 ≈ 24.49

Used in:

 Color-based segmentation

 K-means clustering

 Image comparison and retrieval
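The RGB-space formula above generalizes to any number of channels; this sketch checks the worked example's result.

```python
import math

# Euclidean distance in RGB feature space for the example pixels above.
def rgb_distance(a, b):
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(a, b)))

pixel_a = (100, 150, 200)
pixel_b = (110, 140, 220)
print(round(rgb_distance(pixel_a, pixel_b), 2))  # 24.49
```

In practice, perceptual color spaces such as CIELAB are often preferred over raw RGB for this comparison, because equal Euclidean distances in RGB do not correspond to equal perceived color differences.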
