Relationship between Bandwidth, Data Rate and Channel Capacity

This post describes the relationship between signal bandwidth, channel bandwidth and the maximum achievable data rate. Before going into detail, it helps to know the definitions of the following terms:

  • Signal Bandwidth – the bandwidth of the transmitted signal or the range of frequencies present in the signal, as constrained by the transmitter.
  • Channel Bandwidth – the range of frequencies that a communication channel passes without significant loss of energy (attenuation).
  • Channel Capacity or Maximum Data Rate – the maximum rate (in bps) at which data can be transmitted over a given communication link or channel.
In general, information is conveyed by changes in the value of a signal over time. Since the frequency of a signal is a direct measure of how quickly its value changes, the higher the frequency of a signal, the higher the achievable data rate or information transfer rate. This can be illustrated with the example of both an analog and a digital signal.

If we take analog transmission (modulation) techniques like Binary ASK, Binary FSK or Binary PSK, information is transferred by altering a property of a high-frequency carrier wave. If we increase the frequency of this carrier wave, the bit interval T (= 1/f) becomes shorter, thereby enabling us to transfer more bits per second.

Similarly, if we take digital transmission techniques like NRZ, Manchester encoding, etc., these signals can be modelled as periodic signals and hence are composed of an infinite number of sinusoids: a fundamental frequency (f) and its harmonics. Here too, the bit interval (T) is the reciprocal of the fundamental frequency (T = 1/f). Hence, increasing the fundamental frequency yields a digital signal with a shorter bit interval, which in turn increases the data rate.

So, whether the transmission is analog or digital, an increase in the bandwidth of the signal implies a corresponding increase in the data rate. For example, if we double the signal bandwidth, the data rate also doubles.
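
As a rough illustration of the T = 1/f relationship described above, here is a minimal Python sketch (the 3000 Hz frequency is just an assumed example value) showing that doubling the signal frequency halves the bit interval and therefore doubles the achievable bit rate:

```python
# Minimal sketch, assuming one bit is carried per signal cycle (T = 1/f).
def bits_per_second(frequency_hz: float) -> float:
    bit_interval = 1.0 / frequency_hz   # T = 1/f, seconds per bit
    return 1.0 / bit_interval           # bits transferred per second

f = 3000.0                              # assumed example frequency in Hz
print(bits_per_second(f))               # 3000.0 bps
print(bits_per_second(2 * f))           # 6000.0 bps: doubling f doubles the data rate
```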

In practice, however, we cannot keep increasing the signal bandwidth indefinitely. The telecommunication link or communication channel acts as a gatekeeper and limits the maximum bandwidth that it will allow. Apart from this, there are transmission constraints in the form of various channel noise sources that strictly limit the usable signal bandwidth. So the achievable data rate is influenced more by the channel’s bandwidth and noise characteristics than by the signal bandwidth.

Nyquist and Shannon have given methods for calculating the channel capacity (C) of bandwidth-limited communication channels.

Nyquist criterion for maximum data rate of noiseless channels

Given a noiseless channel with bandwidth B Hz, Nyquist stated that it can carry at most 2B signal changes (symbols) per second. The converse is also true: to achieve a signal transmission rate of 2B symbols per second over a channel, it is enough if the channel passes frequencies up to B Hz.

Another implication of the above result is the sampling theorem, which states that for a signal whose maximum frequency is f Hz, it is enough to sample the signal at 2f samples per second for the purpose of quantization (A/D conversion) and for reconstruction of the signal at the receiver (D/A conversion). This is because, even if the signal were sampled at a rate higher than 2f (thereby including higher harmonic components), the channel would anyway filter out those higher frequency components.
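
The sampling theorem can be demonstrated numerically. The sketch below uses assumed toy values (a signal with components at 1 Hz and 3 Hz, sampled at 8 samples per second, i.e. above 2f) and reconstructs the signal from its samples by ideal sinc interpolation; the reconstruction error stays small, limited only by the finite sample window.

```python
# Minimal sketch of sampling at a rate above 2f and reconstructing the signal
# (toy values assumed for illustration).
import numpy as np

f_max = 3.0                  # highest frequency present in the signal (Hz)
fs = 8.0                     # sampling rate, chosen above 2 * f_max
T = 1.0 / fs                 # sampling interval in seconds

def signal(t):
    # Band-limited test signal with components at 1 Hz and 3 Hz
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

n = np.arange(-400, 400)     # sample indices over a finite window
samples = signal(n * T)

# Ideal reconstruction: x(t) = sum over n of x(nT) * sinc((t - nT) / T)
t_test = np.linspace(-1.0, 1.0, 101)
reconstructed = np.array([np.sum(samples * np.sinc((t - n * T) / T)) for t in t_test])

print(np.max(np.abs(reconstructed - signal(t_test))))  # small error due to window truncation
```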

Also, symbols can have more than two different values, as is the case in modulation schemes like QAM, QPSK, etc. In such cases, each symbol can represent more than one bit.

Nyquist’s formula for multi-level signalling over a noiseless channel is

C = 2 * B * log M,

where C is the channel capacity in bits per second, B is the maximum bandwidth allowed by the channel (in Hz), M is the number of different signalling values or symbols, and the logarithm is to base 2.

For example, assume a noiseless 3-kHz channel.
  1. If binary signals are used, then M = 2 and the maximum channel capacity or achievable data rate is C = 2 * 3000 * log 2 = 6000 bps.
  2. Similarly, if QPSK is used instead of binary signalling, then M = 4. In that case, the maximum channel capacity is C = 2 * 3000 * log 4 = 2 * 3000 * 2 = 12000 bps.
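
These two calculations can be reproduced with a short Python sketch of Nyquist’s formula (the 3 kHz bandwidth and the values of M are the ones used in the examples above):

```python
# Minimal sketch of Nyquist's formula for a noiseless channel: C = 2 * B * log2(M)
from math import log2

def nyquist_capacity(bandwidth_hz: float, levels: int) -> float:
    """Maximum data rate in bps of a noiseless channel using M signalling levels."""
    return 2 * bandwidth_hz * log2(levels)

B = 3000                          # channel bandwidth in Hz (3 kHz)
print(nyquist_capacity(B, 2))     # binary signalling (M = 2): 6000.0 bps
print(nyquist_capacity(B, 4))     # QPSK (M = 4): 12000.0 bps
```
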
Thus, theoretically, by increasing the number of signalling values or symbols, we could keep increasing the channel capacity C indefinitely. In practice, however, no channel is noiseless, and we cannot simply keep increasing the number of symbols, because the receiver would not be able to distinguish between the different symbols in the presence of channel noise.

It is here that Shannon’s theorem comes in handy, as it specifies a theoretical maximum limit for the channel capacity C of a noisy channel.

Shannon’s channel capacity criterion for noisy channels

Given a communication channel with a bandwidth of B Hz and a signal-to-noise ratio of S/N, where S is the signal power and N is the noise power, Shannon’s formula for the maximum channel capacity C of such a channel is

C = B * log (1 + S/N)

(log is to base 2)

For example, for a channel with a bandwidth of 3 kHz and an S/N value of 1000, like that of a typical telephone line, the maximum channel capacity is

    C = 3000 * log (1 + 1000)  = 30000 bps (approx.)
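
The same number falls out of a minimal Python sketch of Shannon’s formula (note that S/N here is the linear power ratio, not a value in dB):

```python
# Minimal sketch of Shannon's formula for a noisy channel: C = B * log2(1 + S/N)
from math import log2

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bps; snr_linear is the ratio S/N, not a dB value."""
    return bandwidth_hz * log2(1 + snr_linear)

print(shannon_capacity(3000, 1000))   # about 29902 bps, i.e. roughly 30000 bps
```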

Using the previous examples of the Nyquist criterion, we saw that for a channel with a bandwidth of 3 kHz, we could double the data rate from 6000 bps to 12000 bps by using QPSK instead of binary signalling. Using Shannon’s criterion for the same channel, we can conclude that irrespective of the modulation technique used, we cannot increase the capacity of this channel beyond 30000 bps.
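
To see the two limits working together, the hypothetical sketch below (reusing the same assumed 3 kHz channel with S/N = 1000) caps the Nyquist rate for each value of M at the Shannon limit; from around M = 32 symbols onwards, raising M no longer increases the capacity of this channel.

```python
# Hypothetical comparison: Nyquist rate for increasing M, capped by Shannon's limit.
from math import log2

B, snr = 3000, 1000                      # assumed 3 kHz channel with S/N = 1000
shannon_limit = B * log2(1 + snr)        # about 29902 bps

for m in (2, 4, 8, 16, 32, 64):
    rate = min(2 * B * log2(m), shannon_limit)
    print(m, round(rate))                # 6000, 12000, 18000, 24000, then capped near 29902
```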

In practice, however, due to receiver constraints and external noise sources, Shannon’s theoretical limit is never achieved.

Thus, to summarize the relationship between bandwidth, data rate and channel capacity:
  • In general, the greater the signal bandwidth, the higher the information-carrying capacity.
  • But the transmission system and the receiver’s capability limit the bandwidth that can be transmitted.
Hence, the data rate depends on:
  • The bandwidth available for transmission
  • The channel capacity and signal-to-noise ratio
  • The receiver’s capability
The more frequency that is allotted, the greater the channel bandwidth, and the more capable the receiver, the higher the information transfer rate that can be achieved.
