Chapter 4: Filtering in the Frequency Domain
Notes from Digital Image Processing by Gonzalez and Woods
1 Overview
What: Chapter 4 focuses on filtering techniques in the frequency domain, where images are
transformed from the spatial domain to a frequency-based representation to manipulate specific
frequency components.
How: The Fourier Transform, particularly the Discrete Fourier Transform (DFT) and Fast
Fourier Transform (FFT), is used to convert images into the frequency domain, where filtering
is performed by multiplying the transform with a filter function.
Why: Frequency domain filtering enables targeted enhancement or suppression of image features (e.g., edges, noise) and is computationally efficient for certain operations, such as convolution, making it essential for image enhancement, restoration, and analysis.
2 Preliminary Concepts
2.1 Fourier Series and Transform
What: The Fourier Series and Fourier Transform decompose a signal into sums of sinusoids
or a continuous spectrum of frequencies, respectively.
How:
• Fourier Series: Represents periodic functions as:
\[
f(x) = \sum_{n=-\infty}^{\infty} c_n e^{j2\pi nx/T}, \qquad c_n = \frac{1}{T}\int_{0}^{T} f(x)\, e^{-j2\pi nx/T}\, dx
\]
• Continuous Fourier Transform: For a function f (x), defined as:
\[
F(u) = \int_{-\infty}^{\infty} f(x)\, e^{-j2\pi ux}\, dx, \qquad f(x) = \int_{-\infty}^{\infty} F(u)\, e^{j2\pi ux}\, du
\]
• Discrete Fourier Transform (DFT): For a discrete signal f (x) of length N:
\[
F(u) = \sum_{x=0}^{N-1} f(x)\, e^{-j2\pi ux/N}, \qquad f(x) = \frac{1}{N}\sum_{u=0}^{N-1} F(u)\, e^{j2\pi ux/N}
\]
For 2D images of size M × N:
\[
F(u, v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} f(x, y)\, e^{-j2\pi(ux/M + vy/N)}
\]
\[
f(x, y) = \frac{1}{MN}\sum_{u=0}^{M-1}\sum_{v=0}^{N-1} F(u, v)\, e^{j2\pi(ux/M + vy/N)}
\]
Why: The Fourier Transform enables analysis of frequency components, which is crucial for
filtering operations that target specific image features like edges or smooth regions.
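A minimal NumPy sketch (my own illustration, not from the text) that evaluates the 2D DFT directly from the separable form of the double sum above and checks it against np.fft.fft2; NumPy uses the same sign convention and places the 1/MN factor in the inverse transform, matching these formulas. The helper name dft2_direct is hypothetical.

```python
import numpy as np

def dft2_direct(f):
    """2D DFT from the definition, written separably as F = W_M @ f @ W_N."""
    M, N = f.shape
    u, x = np.meshgrid(np.arange(M), np.arange(M), indexing='ij')
    v, y = np.meshgrid(np.arange(N), np.arange(N), indexing='ij')
    WM = np.exp(-2j * np.pi * u * x / M)   # M x M DFT matrix
    WN = np.exp(-2j * np.pi * v * y / N)   # N x N DFT matrix
    return WM @ f.astype(complex) @ WN

rng = np.random.default_rng(0)
f = rng.random((8, 6))                                        # small test "image"
assert np.allclose(dft2_direct(f), np.fft.fft2(f))            # matches the FFT
assert np.allclose(np.fft.ifft2(np.fft.fft2(f)).real, f)      # inverse recovers f (1/MN in ifft2)
```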
2.2 Fast Fourier Transform (FFT)
What: The FFT is an efficient algorithm for computing the DFT.
How: Reduces computational complexity from O(N²) to O(N log N) for 1D signals and is
applied separably for 2D images.
Why: Enables practical implementation of frequency domain techniques for large images,
making real-time processing feasible.
3 Sampling and the Fourier Transform
What: Sampling converts continuous signals into discrete ones, affecting their frequency domain representation.
How:
• Sampling Theorem: A band-limited signal with maximum frequency umax can be reconstructed exactly from its samples if the sampling interval satisfies ∆x ≤ 1/(2umax), i.e., sampling at or above the Nyquist rate 2umax.
• Sampling creates periodic replicas in the frequency domain:
\[
F_s(u) = \frac{1}{\Delta x}\sum_{k=-\infty}^{\infty} F\!\left(u - \frac{k}{\Delta x}\right)
\]
• Aliasing: Occurs when sampling is insufficient, causing frequency overlap and artifacts
like moiré patterns.
Why: Proper sampling prevents aliasing, ensuring accurate frequency domain representation,
which is critical for effective filtering.
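A short 1D sketch of aliasing (my own example, with an illustrative helper name): a 60 Hz sinusoid sampled above the Nyquist rate shows its true spectral peak, while undersampling folds the peak to a false lower frequency.

```python
import numpy as np

def dominant_frequency(f_signal, fs, duration=1.0):
    """Return the frequency (Hz) of the strongest spectral peak of a sampled sinusoid."""
    t = np.arange(0, duration, 1.0 / fs)            # sampling interval dx = 1/fs
    samples = np.sin(2 * np.pi * f_signal * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

print(dominant_frequency(60, fs=200))   # ~60 Hz: adequately sampled (200 > 2*60)
print(dominant_frequency(60, fs=80))    # ~20 Hz: aliased, since 80 < 2*60
```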
4 Properties of the Discrete Fourier Transform
What: The DFT has properties like periodicity, symmetry, and convolution that facilitate image
processing.
How:
• Periodicity: F(u) = F(u + N).
• Symmetry: For real inputs, F(u) = F*(N − u) (conjugate symmetry).
• Centering: Multiply f (x, y) by (−1)x+y to shift the spectrum’s origin to the center.
• Convolution Theorem: Spatial convolution is equivalent to frequency domain multiplication:
f (x, y) ∗ h(x, y) ↔ F(u, v)H(u, v)
Why: These properties enable efficient computation and intuitive manipulation of image features in the frequency domain.
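A NumPy sketch of my own verifying two of these properties numerically: multiplying by (−1)^(x+y) before the DFT is the same as applying np.fft.fftshift to the plain DFT (for even image sizes), and circular spatial convolution matches frequency-domain multiplication.

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.random((64, 64))

# Centering: (-1)^(x+y) modulation before the DFT equals fftshift of the plain DFT.
x, y = np.indices(f.shape)
F_centered = np.fft.fft2(f * (-1.0) ** (x + y))
assert np.allclose(F_centered, np.fft.fftshift(np.fft.fft2(f)))

# Convolution theorem: IDFT{F.H} equals circular convolution of f with h.
h = np.zeros_like(f)
h[:3, :3] = 1.0 / 9.0                              # 3x3 averaging kernel, zero-padded to f's size
conv_freq = np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(h)).real
conv_spatial = sum(
    h[i, j] * np.roll(f, shift=(i, j), axis=(0, 1))   # circular (wraparound) convolution
    for i in range(3) for j in range(3)
)
assert np.allclose(conv_freq, conv_spatial)
```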
5 2D Fourier Transform
What: Extends the 1D DFT to two dimensions for image processing.
How: The 2D DFT decomposes an image into frequency components along the u (horizontal) and v (vertical) directions and is computed separably via the FFT. After centering, low frequencies lie near the middle of the spectrum and high frequencies toward its edges.
Why: Allows analysis of spatial variations in both directions, enabling targeted filtering of
image features like edges or textures.
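A small sketch (mine, assuming matplotlib is available) that displays the centered log-magnitude spectrum of a synthetic sinusoidal grating; real images are read in the same way, the grating is just a stand-in.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic test image: a sinusoidal grating, whose spectrum is a pair of
# impulses symmetrically offset from the center.
x, y = np.indices((256, 256))
img = 128 + 100 * np.sin(2 * np.pi * x / 16)

F = np.fft.fftshift(np.fft.fft2(img))          # center the spectrum
log_mag = np.log1p(np.abs(F))                  # log scale so weak components remain visible

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].imshow(img, cmap='gray'); axes[0].set_title('image')
axes[1].imshow(log_mag, cmap='gray'); axes[1].set_title('centered log spectrum')
plt.show()
```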
6 Basics of Frequency Domain Filtering
What: Filtering involves modifying the frequency spectrum of an image to achieve desired
effects.
How:
1. Compute the DFT of the image f (x, y) to obtain F(u, v).
2. Multiply it element-wise by a filter transfer function H(u, v).
3. Compute the inverse DFT of H(u, v)F(u, v) and take the real part to obtain the filtered image.
Why: Frequency domain filtering is efficient for operations like convolution and allows precise
control over frequency components.
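A minimal sketch of this three-step pipeline in NumPy (padding, covered in Section 10, is omitted here for brevity; the function and argument names are my own):

```python
import numpy as np

def filter_frequency_domain(img, H):
    """Frequency-domain filtering with a filter H defined on a centered frequency grid."""
    F = np.fft.fftshift(np.fft.fft2(img))    # step 1: DFT, shifted so u = v = 0 is at the center
    G = F * H                                # step 2: element-wise multiplication by the filter
    g = np.fft.ifft2(np.fft.ifftshift(G))    # step 3: inverse DFT
    return np.real(g)                        # imaginary part is numerical round-off for real H
```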
7 Smoothing Filters
7.1 Ideal Low-pass Filter (ILPF)
What: Passes frequencies within a radius D0 and attenuates others.
How: Defined as:
\[
H(u, v) =
\begin{cases}
1 & \text{if } D(u, v) \le D_0 \\
0 & \text{otherwise}
\end{cases}
\qquad D(u, v) = \sqrt{u^2 + v^2}
\]
where u and v are measured from the center of the frequency rectangle, so D(u, v) is the distance from that center and D_0 is the cutoff radius.
Why: Used for blurring and noise reduction, but sharp cutoff causes ringing artifacts.
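A short NumPy sketch (helper names of my own) of the ILPF transfer function on a centered frequency grid; it could be applied with a pipeline like the one sketched in Section 6.

```python
import numpy as np

def distance_grid(P, Q):
    """Distance D(u, v) of each frequency sample from the center of a P x Q grid."""
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    return np.sqrt(U**2 + V**2)

def ideal_lowpass(P, Q, D0):
    """ILPF: 1 inside the cutoff radius D0, 0 outside (the sharp cutoff causes ringing)."""
    return (distance_grid(P, Q) <= D0).astype(float)

H = ideal_lowpass(256, 256, D0=30)             # binary disk of radius 30 around the center
```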
7.2 Butterworth Low-pass Filter (BLPF)
What: Provides a smooth transition between passed and attenuated frequencies.
How: Defined as:
\[
H(u, v) = \frac{1}{1 + \left[ D(u, v)/D_0 \right]^{2n}}
\]
where n controls the filter’s sharpness.
Why: Reduces ringing compared to ILPF, suitable for controlled smoothing.
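A corresponding sketch for the Butterworth low-pass transfer function, using the same centered-grid convention (the function name is illustrative):

```python
import numpy as np

def butterworth_lowpass(P, Q, D0, n):
    """BLPF of order n: smooth roll-off around D0; larger n approaches the ideal filter."""
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    D = np.sqrt(U**2 + V**2)
    return 1.0 / (1.0 + (D / D0) ** (2 * n))

H = butterworth_lowpass(256, 256, D0=30, n=2)   # H = 1 at the center, H = 0.5 at D = D0
```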
7.3 Gaussian Low-pass Filter (GLPF)
What: Uses a Gaussian function for smooth frequency attenuation.
How: Defined as:
\[
H(u, v) = e^{-D^2(u, v)/(2D_0^2)}
\]
Why: Produces natural-looking smoothing without ringing, ideal for noise reduction.
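And a sketch of the Gaussian low-pass transfer function under the same conventions (name illustrative):

```python
import numpy as np

def gaussian_lowpass(P, Q, D0):
    """GLPF: H drops to exp(-1/2) ~ 0.607 at D = D0; no ringing in the spatial domain."""
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    D2 = U**2 + V**2
    return np.exp(-D2 / (2.0 * D0**2))

H = gaussian_lowpass(256, 256, D0=30)
```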
8 Sharpening Filters
8.1 Ideal High-pass Filter (IHPF)
What: Passes frequencies outside a radius D0 .
How: Defined as:
\[
H(u, v) =
\begin{cases}
0 & \text{if } D(u, v) \le D_0 \\
1 & \text{otherwise}
\end{cases}
\]
Why: Enhances edges but causes ringing artifacts.
8.2 Butterworth High-pass Filter (BHPF)
What: Smooth high-pass filter.
How: Defined as:
\[
H(u, v) = \frac{1}{1 + \left[ D_0/D(u, v) \right]^{2n}}
\]
Why: Reduces ringing, suitable for edge enhancement.
8.3 Gaussian High-pass Filter (GHPF)
What: Uses a Gaussian function for high-pass filtering.
How: Defined as:
\[
H(u, v) = 1 - e^{-D^2(u, v)/(2D_0^2)}
\]
Why: Enhances edges without ringing, ideal for sharpening.
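Each of these high-pass filters is the complement of its low-pass counterpart, H_hp = 1 − H_lp, which is also the simplest way to build them in code. A sketch of mine (illustrative names, centered grid as before); note the Butterworth complement is algebraically the same as the formula above and sidesteps the division by zero at D = 0.

```python
import numpy as np

def centered_distance(P, Q):
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    return np.sqrt(U**2 + V**2)

def highpass(P, Q, D0, kind='gaussian', n=2):
    """IHPF / BHPF / GHPF built as 1 minus the corresponding low-pass filter."""
    D = centered_distance(P, Q)
    if kind == 'ideal':
        lp = (D <= D0).astype(float)
    elif kind == 'butterworth':
        lp = 1.0 / (1.0 + (D / D0) ** (2 * n))
    else:  # gaussian
        lp = np.exp(-D**2 / (2.0 * D0**2))
    return 1.0 - lp
```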
8.4 Laplacian Filter
What: Enhances edges by amplifying high frequencies.
How: In the frequency domain:
\[
\nabla^2 f(x, y) \leftrightarrow -4\pi^2 (u^2 + v^2)\, F(u, v)
\]
Why: Useful for edge detection and sharpening.
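A sketch of Laplacian sharpening in the frequency domain, under the assumption that u and v are taken as centered DFT indices; the Laplacian output is rescaled before use because its raw dynamic range far exceeds the image's (names and parameter choices are my own).

```python
import numpy as np

def laplacian_sharpen(img):
    """Sharpen via g = f + c * laplacian(f), with c = -1 for this (negative) transfer function."""
    P, Q = img.shape
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    H = -4.0 * np.pi**2 * (U**2 + V**2)                    # Laplacian transfer function
    F = np.fft.fftshift(np.fft.fft2(img))
    lap = np.real(np.fft.ifft2(np.fft.ifftshift(H * F)))   # spatial Laplacian of the image
    lap = lap / np.abs(lap).max()                          # rescale to [-1, 1] before combining
    f_n = img / img.max()                                  # image scaled to [0, 1]
    return f_n - lap                                       # g = f + c*lap, c = -1
```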
8.5 High-Frequency Emphasis
What: Combines high-pass filtering with the original image.
How: g(x, y) = f (x, y) + k · High-pass( f (x, y)).
Why: Preserves low frequencies while enhancing edges, improving visual quality.
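A brief sketch of this idea carried out entirely in the frequency domain, g = IDFT{(1 + k·H_hp)F}, here with a Gaussian high-pass as the illustrative choice (function name and parameter values are mine):

```python
import numpy as np

def high_frequency_emphasis(img, D0=30.0, k=1.5):
    """g = f + k * highpass(f), computed as IDFT{ (1 + k * H_hp) * F }."""
    P, Q = img.shape
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    H_hp = 1.0 - np.exp(-(U**2 + V**2) / (2.0 * D0**2))    # Gaussian high-pass
    F = np.fft.fftshift(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.fft.ifftshift((1.0 + k * H_hp) * F)))
```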
9 Homomorphic Filtering
What: Separates illumination and reflectance components for dynamic range compression and
detail enhancement.
How:
1. Model image as f (x, y) = i(x, y) · r(x, y).
2. Take logarithm: ln( f (x, y)) = ln(i(x, y)) + ln(r(x, y)).
3. Apply DFT and filter with:
\[
H(u, v) = (\gamma_H - \gamma_L)\left(1 - e^{-c\, D^2(u, v)/D_0^2}\right) + \gamma_L
\]
4. Inverse DFT and exponentiate to recover the image.
Why: Corrects non-uniform illumination, enhancing details in shadowed regions.
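A sketch of the full homomorphic pipeline in NumPy (my own; the default parameter values are illustrative, with γL < 1 to attenuate illumination and γH > 1 to boost reflectance detail):

```python
import numpy as np

def homomorphic_filter(img, gamma_L=0.5, gamma_H=2.0, c=1.0, D0=30.0):
    """Homomorphic filtering: ln -> DFT -> H(u, v) -> inverse DFT -> exp."""
    P, Q = img.shape
    U, V = np.meshgrid(np.arange(P) - P // 2, np.arange(Q) - Q // 2, indexing='ij')
    D2 = U**2 + V**2
    H = (gamma_H - gamma_L) * (1.0 - np.exp(-c * D2 / D0**2)) + gamma_L

    z = np.log1p(img.astype(float))                        # ln(f), shifted by 1 to avoid ln(0)
    Z = np.fft.fftshift(np.fft.fft2(z))
    s = np.real(np.fft.ifft2(np.fft.ifftshift(H * Z)))
    return np.expm1(s)                                     # undo the logarithm
```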
10 Implementation
What: Practical considerations for frequency domain filtering.
How:
• Padding: Zero-pad the image before transforming (e.g., to roughly twice its size in each dimension, or to the next power of two for FFT speed) to avoid wraparound error from the implied circular convolution.
• Centering: Multiply by (−1)x+y to center the spectrum.
• Normalization: Scale filter outputs to maintain intensity ranges.
• Use tools like MATLAB or Python (NumPy, OpenCV) for FFT and filtering.
Why: Ensures accurate and efficient implementation, addressing issues like aliasing and computational cost.
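A small sketch of the padding step, assuming OpenCV is available for its getOptimalDFTSize helper (a real OpenCV function that returns an FFT-friendly size); np.fft.fft2 zero-pads the input when given a larger target shape.

```python
import numpy as np
import cv2

def padded_spectrum(img):
    """Zero-pad to FFT-friendly dimensions, then transform; padding prevents wraparound error."""
    M, N = img.shape
    P = cv2.getOptimalDFTSize(2 * M)       # >= 2M and factorable into small primes
    Q = cv2.getOptimalDFTSize(2 * N)
    return np.fft.fft2(img, s=(P, Q))      # fft2 zero-pads the image to P x Q before transforming
```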
11 Key Takeaways
• Frequency domain filtering leverages the Fourier Transform to manipulate specific fre-
quency components.
• Low-pass filters smooth images, high-pass filters enhance edges, and homomorphic fil-
ters address illumination issues.
• The FFT enables efficient computation, making frequency domain techniques practical.
• Proper sampling and implementation are critical to avoid artifacts and ensure accuracy.