Channel Capacity

Channel capacity is the maximum rate at which information can be reliably transmitted over a communication channel. It is limited by the bandwidth and noise power of the channel. The Shannon-Hartley theorem provides a formula for calculating channel capacity based on bandwidth, signal power, and noise power. Channel capacity is measured in bits per second and represents the highest data rate that can be achieved with arbitrarily low error probability.

CHANNEL CAPACITY

INTRODUCTION

• Channel capacity, in electrical engineering, computer
science, and information theory, is the tight upper bound
on the rate at which information can be reliably
transmitted over a communication channel.
• The channel capacity of a given channel is the highest
information rate (in units of information per unit time) that
can be achieved with arbitrarily small error probability.

• Bandwidth and noise power place a restriction upon the
rate of information through a channel for low-error
transmission. The highest bit rate achievable for error-free
transmission is termed the channel capacity.
 BIT RATE -
• Bit rate, as the name implies, describes the rate at which bits are
transferred from one location to another. In other words, it measures how
much data is transmitted in a given amount of time. Bit rate is commonly
measured in bits per second (bps), kilobits per second (Kbps), or megabits
per second (Mbps).

 BANDWIDTH –
Bandwidth describes the maximum data transfer rate of
a network or Internet connection. It measures how much data can
be sent over a specific connection in a given amount of time. For
example, a gigabit Ethernet connection has a bandwidth of
1,000 Mbps. An Internet connection via cable modem may provide
25 Mbps of bandwidth.
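As a quick sketch of the bit-rate arithmetic above (the 100 MB file size is a hypothetical figure; the 25 Mbps and 1,000 Mbps rates are the ones quoted in the text), transfer time is simply data size divided by bit rate:

```python
# Hypothetical example: how long a 100-megabyte file takes at each rate.
file_size_bits = 100 * 8 * 10**6   # 100 MB expressed in bits
cable_bps = 25 * 10**6             # 25 Mbps cable-modem connection
gigabit_bps = 1000 * 10**6         # 1,000 Mbps gigabit Ethernet

print(file_size_bits / cable_bps)    # seconds on the cable modem: 32.0
print(file_size_bits / gigabit_bps)  # seconds on gigabit Ethernet: 0.8
```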
Mathematical Explanation of channel capacity:

If a source gives M equally likely messages (M >> 1), with
information rate R, over a given channel with capacity C:

Then if
R <= C, error-free transmission is possible even in the
presence of noise.
If
R > C, the probability of error is close to unity
(reliable transmission is impossible).
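The R-versus-C condition above can be sketched as a one-line check (the function name and the example rates are hypothetical, for illustration only):

```python
def transmission_feasible(R, C):
    """Shannon's result: reliable (arbitrarily low-error) transmission
    is possible if and only if the information rate R does not exceed
    the channel capacity C."""
    return R <= C

print(transmission_feasible(2_000_000, 3_000_000))  # True
print(transmission_feasible(4_000_000, 3_000_000))  # False
```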
 Shannon-Hartley channel capacity formula:

C = B log2 (1 + S/N)

Here
• C - Channel capacity in bits per sec
• B - Bandwidth of the channel in hertz
• S - Average signal power over the bandwidth (watts)
• N - Average power of the noise and interference over
the bandwidth (watts)
• S/N - Signal to Noise Ratio (SNR) or carrier-to-noise
ratio (CNR)
• The received signal always contains noise: because noise
is present in the channel, we receive the signal and the
noise together.
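A minimal sketch of the Shannon-Hartley formula in Python (the 3 kHz bandwidth and SNR of 1000, i.e. 30 dB, are assumed example values, not from the text):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz telephone channel with S/N = 1000 (30 dB).
C = shannon_capacity(3000, 1000)
print(round(C))  # 29902 bits per second
```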
Noiseless Channels and Nyquist
Theorem
• For a noiseless channel, the Nyquist theorem gives the
relationship between the channel bandwidth and the
maximum data rate that can be transmitted over the
channel.

Nyquist Theorem

C = 2B log2 m

C: channel capacity (bps)
B: RF bandwidth
m: number of finite states in a symbol of the transmitted signal
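A short sketch of the Nyquist formula (the 3 kHz bandwidth and 4-level signal are assumed example values):

```python
import math

def nyquist_capacity(bandwidth_hz, m_levels):
    """Noiseless-channel Nyquist limit: C = 2 * B * log2(m)."""
    return 2 * bandwidth_hz * math.log2(m_levels)

# Hypothetical example: a 3 kHz noiseless channel using 4 signal levels.
print(nyquist_capacity(3000, 4))  # 12000.0 bps
```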
So we receive

Received power = Signal power (S) + Noise power (N)

and the root-mean-square (rms) value of the received
signal is √(S + N), where S = signal power and
N = noise power. Similarly, the noise power is N and
its rms value is √N.

So the number of levels that can be separated without
error is

m = √(S + N) / √N = √(1 + S/N)

Here m is the number of signal levels distinguishable
without error, √(S + N) is the rms value of the received
(signal plus noise) waveform, and √N is the rms value of
the noise. Levels spaced closer than the noise amplitude
cannot be told apart, so √(S + N) > √N sets the limit.

So the digital information per pulse is

I = log2 m
  = log2 √(1 + S/N)
  = ½ log2 (1 + S/N)

Here I is the digital information per pulse and m, the
number of levels without error, is √(1 + S/N). So the
information carried per channel pulse is ½ log2 (1 + S/N).
Now if a channel transmits K pulses per
second, then the channel capacity is

C = IK (information per pulse multiplied by pulses per second)

  = (K/2) log2 (1 + S/N)

• From the Nyquist theorem we know that K = 2B, so we
get the value of the channel capacity C:

C = B log2 (1 + S/N)
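A quick numerical check of the derivation above, assuming hypothetical values for B and S/N: with K = 2B pulses per second, C = (K/2) log2(1 + S/N) reduces to the Shannon-Hartley form B log2(1 + S/N):

```python
import math

B, snr = 4000.0, 255.0           # hypothetical bandwidth (Hz) and linear S/N
K = 2 * B                        # Nyquist: K = 2B independent pulses per second
I = 0.5 * math.log2(1 + snr)     # bits of information per pulse
C_pulses = K * I                 # C = I * K
C_shannon = B * math.log2(1 + snr)

print(C_pulses == C_shannon)     # True - the two forms agree
```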
 Conclusion -
 Here we can see that the channel capacity is
measured as the product of the pulses per second
and the information per pulse. This is how
we can measure the channel capacity.
 Though Shannon's theory was presented with
regard to the problem of transmitting error-free
messages across telephone lines, it is now being
used in such fields as psychology, education,
management decision processes, and information
science. Because of its generality, this theory
became known as information theory.
