Capacity of Wireless Channels
• Wireless channel capacity is a fundamental concept in
information theory that defines the maximum achievable data rate
for a given channel, considering the effects of noise, fading, and
interference.
Practical Considerations
• Interference: In multi-user environments, interference from other
transmitters reduces the achievable capacity.
• Bandwidth Limitations: Higher capacity requires larger
bandwidths (e.g., mmWave communications).
• Energy Efficiency: Low-power IoT networks optimize capacity
while minimizing energy consumption.
• The capacity of wireless channels varies based on channel
conditions, fading effects, and antenna configurations.
• MIMO and adaptive power control enhance capacity
significantly.
• The capacity of wireless channels is discussed, focusing on the
fundamental limits of data transmission over such channels. It is
based on Shannon's information theory, which defines channel
capacity as the maximum data rate that can be transmitted with
an arbitrarily low probability of error.
Shannon's Capacity Theorem:
• Defines the mutual information between the input and output of
a channel.
• Proves that a code exists to achieve a rate close to channel
capacity with negligible error.
• States that exceeding capacity results in an error probability
approaching 1.
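Stated compactly, with X the channel input and Y the channel output, the theorem defines capacity as the largest mutual information attainable over input distributions; rates below C are achievable with vanishing error probability, while rates above C are not:

\[
C \;=\; \max_{p(x)} I(X;Y) \;=\; \max_{p(x)} \bigl[ H(Y) - H(Y \mid X) \bigr].
\]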
Single-User Wireless Channels:
• Examines time-invariant channels, including Additive White
Gaussian Noise (AWGN) channels.
• Considers flat fading and frequency-selective fading channels,
where the fading distribution impacts capacity.
• The concept of water-filling power allocation is introduced,
which optimally adjusts power over time and frequency to
maximize data rate.
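As a rough illustration of the water-filling idea, the sketch below allocates a power budget across a few parallel subchannels with unequal gains; the gain values, noise power, and power budget are made up for the example and are not taken from the text.

import numpy as np

def water_filling(gains, total_power, noise_power=1.0):
    """Allocate power across parallel subchannels by water-filling.

    gains: channel power gains |h_k|^2 for each subchannel.
    Returns the per-subchannel powers p that maximize
    sum_k log2(1 + gains[k] * p[k] / noise_power) subject to sum(p) = total_power.
    """
    inv_snr = noise_power / np.asarray(gains, dtype=float)  # "floor" of each subchannel
    # Bisection on the water level mu: optimal powers are max(mu - inv_snr, 0).
    lo, hi = 0.0, total_power + inv_snr.max()
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv_snr, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(lo - inv_snr, 0.0)

# Illustrative use: three subchannels with unequal gains and unit noise power.
gains = np.array([2.0, 1.0, 0.2])
p = water_filling(gains, total_power=3.0)
rate = np.sum(np.log2(1.0 + gains * p))   # achievable sum rate, bits per channel use
print(p, rate)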
Fading Channel Capacity:
• Flat fading capacity is discussed, distinguishing cases where the
transmitter does or does not know the channel conditions.
• Frequency-selective fading capacity involves optimal power
allocation over different frequencies.
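In the standard notation (γ for the instantaneous received SNR, p(γ) for its distribution, and \bar{P} for the average power constraint), the two flat-fading cases are usually summarized as follows: with channel knowledge at the receiver only, capacity is the fading-averaged AWGN capacity; with knowledge at both ends, transmit power is water-filled over time above a cutoff SNR γ₀:

\[
C_{\text{RX CSI}} = \int_0^{\infty} B \log_2(1+\gamma)\, p(\gamma)\, d\gamma,
\qquad
\frac{P(\gamma)}{\bar{P}} =
\begin{cases}
\dfrac{1}{\gamma_0} - \dfrac{1}{\gamma}, & \gamma \ge \gamma_0,\\[4pt]
0, & \gamma < \gamma_0.
\end{cases}
\]

For frequency-selective fading the same water-filling runs over frequency instead of time.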
Sampling Considerations:
• Although continuous-time channels are usually analyzed through their
discrete-time equivalents, the sampling rate must be chosen carefully
(at or above the Nyquist rate) so that no capacity is lost, especially
under time-varying conditions.
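One way to see why the sampling rate matters: sampling a channel of bandwidth B at the Nyquist rate of 2B samples per second yields a discrete-time AWGN channel whose per-sample capacity adds back up to the continuous-time figure, whereas sampling more slowly discards part of the band:

\[
C = 2B \cdot \tfrac{1}{2}\log_2(1+\mathrm{SNR}) = B \log_2(1+\mathrm{SNR}) \ \text{bits/s}.
\]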
• This concept is fundamental to understanding wireless
communication limits, efficient coding and modulation
techniques, and strategies to achieve near-optimal data
transmission in practical systems.
Capacity of an AWGN Channel
• In communication systems, the Additive White Gaussian Noise
(AWGN) channel is a widely used model. It represents a system
where the transmitted signal is affected by random Gaussian
noise. The key goal is to determine the maximum data rate
(capacity) at which information can be transmitted reliably.
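In discrete time the model is usually written as below, where N0/2 is the two-sided noise power spectral density, B the bandwidth, and the transmit power is constrained to P:

\[
y[i] = x[i] + n[i], \qquad n[i] \sim \mathcal{N}(0, N_0 B), \qquad \mathbb{E}\!\left[x^2[i]\right] \le P .
\]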
Shannon Capacity Formula
• Higher SNR leads to higher capacity: more signal power improves the achievable data rate.
• Increasing bandwidth increases capacity, but with diminishing returns: noise power also grows with bandwidth, so capacity approaches a finite limit (see the formula below).
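The formula referred to in this section is the Shannon–Hartley expression; the bandwidth remark follows from its wideband limit, since the noise power N0B grows with B:

\[
C = B \log_2\!\left(1 + \frac{P}{N_0 B}\right) \ \text{bits/s},
\qquad
\lim_{B \to \infty} C = \frac{P}{N_0}\,\log_2 e .
\]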
Mutual Information and Channel Capacity
• Capacity is achieved by maximizing mutual information over all possible input distributions p(x).
• For the AWGN channel, the optimal input follows a Gaussian distribution; practical systems approximate Gaussian signaling with dense constellations such as QAM.
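For the AWGN channel Y = X + N with noise variance σ² and power constraint E[X²] ≤ P, the maximization can be carried out explicitly: for a given output power, the differential entropy h(Y) is largest when Y (and hence X) is Gaussian, which gives

\[
C = \max_{p(x)} I(X;Y) = h(Y) - h(N)
  = \tfrac{1}{2}\log_2\!\bigl(2\pi e (P+\sigma^2)\bigr) - \tfrac{1}{2}\log_2\!\bigl(2\pi e \sigma^2\bigr)
  = \tfrac{1}{2}\log_2\!\left(1 + \frac{P}{\sigma^2}\right)
\]

bits per real channel use; over a bandwidth B this becomes the B log2(1 + SNR) rate given above.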
1. Shannon's capacity formula shows the fundamental data rate
limit for reliable communication in an AWGN channel.
2. Mutual information defines how much information is
transferred, and maximizing it gives the best possible capacity.
3. Gaussian-distributed input signals achieve the highest
capacity in an AWGN channel, which practical systems approximate
with dense modulation formats such as QAM.
• At d = 100 m, the capacity is 152.6 kbps.
• At d = 1 km, the capacity drops drastically to 1.4 kbps.
• Reason: with a path-loss exponent of 3, received power falls off as
d^-3, so a tenfold increase in distance cuts the received power by a
factor of 1000 (30 dB), sharply reducing the SNR and hence the capacity.
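The numbers are consistent with a simple link-budget calculation; the sketch below assumes, purely for illustration, a 1 W transmit power, a 10 m reference distance, a noise power spectral density of 1e-9 W/Hz, and a 30 kHz bandwidth, none of which are stated in the text.

import math

# Illustrative parameters (assumptions, not given in the text):
Pt = 1.0       # transmit power, W
d0 = 10.0      # reference distance, m
N0 = 1e-9      # noise power spectral density, W/Hz
B  = 30e3      # bandwidth, Hz
alpha = 3      # path-loss exponent (given in the text)

def capacity(d):
    Pr  = Pt * (d0 / d) ** alpha      # simplified path-loss model
    snr = Pr / (N0 * B)               # received SNR
    return B * math.log2(1 + snr)     # Shannon capacity, bits/s

for d in (100.0, 1000.0):
    print(f"d = {d:6.0f} m  ->  C = {capacity(d) / 1e3:6.1f} kbps")
# With these assumed numbers the capacities come out close to the
# 152.6 kbps and 1.4 kbps figures quoted above.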