Digital Communication Systems
Master of Electrical Engineering and Information Technology
Lecture 4: Channel Models and Channel Capacity
Channel Models and Channel Capacity
Ref: Digital Communication Systems by J.G. Proakis
Channel capacity is defined for any communication channel and gives a fundamental limit on the amount of information that can be transmitted reliably through the channel.
Types of channels: Binary Symmetric Channel (BSC), Additive White Gaussian Noise (AWGN) channel
Channel Models and Channel Capacity
The transmitted information signal is subject to a variety of changes, both deterministic and probabilistic, e.g. addition of noise, multipath fading, etc.
The mathematical model for a communication channel is the stochastic dependence between the input and the output signal.
A channel can be modeled as a conditional probability relating each output of the channel to its corresponding input.
The general model is called the Discrete Memoryless Channel (DMC):
Input alphabet X, output alphabet Y, channel transition probability p(y|x) for all x ∈ X, y ∈ Y
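A minimal sketch of how a DMC can be represented numerically: the transition probabilities p(y|x) form a matrix, the output distribution follows as p(y) = Σ_x p(x) p(y|x), and the mutual information used later for capacity can be evaluated directly. The alphabet sizes and the example matrix below are illustrative assumptions, not values from the lecture.

```python
import numpy as np

def dmc_output_distribution(p_x, P_y_given_x):
    """p_x: input distribution of length |X|; P_y_given_x: |X| x |Y| matrix
    whose rows sum to 1. Returns the output distribution p(y)."""
    return p_x @ P_y_given_x

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits for a DMC: I = sum_{x,y} p(x) p(y|x) log2( p(y|x) / p(y) )."""
    p_y = dmc_output_distribution(p_x, P_y_given_x)
    I = 0.0
    for x, px in enumerate(p_x):
        for y, pyx in enumerate(P_y_given_x[x]):
            if px > 0 and pyx > 0:
                I += px * pyx * np.log2(pyx / p_y[y])
    return I

# Example: a 2-input, 3-output DMC (transition probabilities are made up)
P = np.array([[0.80, 0.15, 0.05],
              [0.05, 0.15, 0.80]])
print(mutual_information(np.array([0.5, 0.5]), P))
```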
Specific forms of the DMC (shown as transition diagrams):
Binary Symmetric Channel (BSC): p(1|0) = p(0|1) = p, p(0|0) = p(1|1) = 1 - p
Binary Erasure Channel (BEC): each input is erased (output symbol e) with probability p and received correctly with probability 1 - p
Binary Channel (BC): asymmetric crossover probabilities for the two inputs
Binary Error & Erasure Channel (BE&EC): combines crossover errors (probability p) with erasures to the symbol e (probability q)
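The same four channels written as transition matrices p(y|x), one row per input; a sketch only, since the parameter values (and the exact parameterization of the BC and BE&EC diagrams) are assumptions.

```python
import numpy as np

p, q = 0.1, 0.05  # example crossover / erasure probabilities (assumed values)

# BSC: outputs {0, 1}
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# BEC: outputs {0, e, 1}; the middle column is the erasure symbol
bec = np.array([[1 - p, p, 0],
                [0, p, 1 - p]])

# Binary channel: asymmetric crossovers p and q
bc = np.array([[1 - p, p],
               [q, 1 - q]])

# Binary error & erasure channel: crossover p, erasure q
beec = np.array([[1 - p - q, q, p],
                 [p, q, 1 - p - q]])

for name, M in [("BSC", bsc), ("BEC", bec), ("BC", bc), ("BE&EC", beec)]:
    assert np.allclose(M.sum(axis=1), 1.0), name  # each row must sum to 1
```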
DMC with feedback
If feedback is available to the transmitter (the transmitter sees the past channel outputs Y_i before choosing the next input X_i), the channel is a feedback channel.
With feedback you cannot transmit more information than without feedback: for a discrete memoryless channel, using previous outputs to choose future symbols does not increase capacity.
If the channel has memory, feedback can possibly increase the amount of transmissible information.
Continuous- and Discrete-Valued Channels: transmit continuous or discrete information.
AWGN Channel: white Gaussian noise is added.
Continuous time: Y(t) = X(t) + Z(t); discrete time: Y_i = X_i + Z_i.
The autocorrelation function of the noise is a Dirac delta function, i.e. the noise is Gaussian with flat PSD N_0/2, and the channel is memoryless.
Additive Gaussian Noise (AGN) Channel with memory
Transmits continuous/discrete information; the noise Z(t) (or Z_i) is added but is NOT white, it is correlated in time.
The autocorrelation function of the noise is NOT flat in the frequency domain, i.e. the noise is correlated in the time domain, so the channel has memory.
Continuous time: Y(t) = X(t) + Z(t); discrete time: Y_i = X_i + Z_i, with Z(t), Z_i correlated (colored) Gaussian noise.
Linear Filter Channel (LF Ch) with memory
Continuous time: Y(t) = X(t) * h(t) + Z(t); discrete time: Y_i = X_i * h_i + Z_i, where h(t), h_i is a finite/infinite impulse response and Z is white Gaussian noise (* denotes convolution).
If g_i is the impulse response of the inverse filter of h_i at the receiver, we can simply pass the received signal through g_i:
Y_i * g_i = (X_i * h_i + Z_i) * g_i = X_i * (h_i * g_i) + Z_i * g_i
Since the convolution of h_i and g_i is a delta function,
Y_i * g_i = X_i * (delta fn.) + Z_i * g_i = X_i + Z_i * g_i
But the filtered noise Z_i * g_i is not white, it is colored noise.
Conversely, an AGN channel with colored noise can be converted to a linear filter channel by a whitening filter, which makes the noise white.
Hence the AGN channel with memory and the LF channel are equivalent and convertible into each other by use of a whitening filter.
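A small numerical sketch of the inverse-filter argument above: applying the inverse filter g to y = x*h + z returns x plus the noise filtered by g (colored noise). The filter taps, signal length and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
n = 1000
x = rng.choice([-1.0, 1.0], size=n)       # BPSK-like input sequence
h = np.array([1.0, 0.5, 0.25])            # channel impulse response (minimum phase, so 1/h is stable)
z = rng.normal(0.0, 0.1, size=n)          # white Gaussian noise

y = lfilter(h, [1.0], x) + z              # channel output: x*h + z
x_hat = lfilter([1.0], h, y)              # receiver applies the inverse filter g = 1/h
residual = x_hat - x                      # equals z filtered by g, i.e. colored noise

print(np.allclose(residual, lfilter([1.0], h, z)))  # True: x recovered up to colored noise
```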
Binary Symmetric Channel (BSC)
Special case of the DMC: a mathematical model for binary transmission over a Gaussian channel with hard decisions at the output.
X = Y = {0, 1} and p(y=1|x=0) = p(y=0|x=1) = ε, where ε is called the crossover probability of the channel.
Channel Capacity C
Is the maximum rate at which reliable transmission of information over the channel is possible.
Condition for reliable transmission: there exists a sequence of codes with increasing block length for which the error probability tends to 0 as the block length increases.
At rates R < C reliable transmission over the channel is possible; at rates R > C reliable transmission over the channel is not possible.
Channel Capacity for DMC
Shannon's fundamental result of information theory states
C = max_{p(x)} I(X;Y)
where I(X;Y) denotes the mutual information between x (channel input) and y (channel output), and the maximization is carried out over all input probability distributions of the channel.
Mutual information between two random variables:
I(X;Y) = Σ_x Σ_y p(x) p(y|x) log2 [ p(y|x) / p(y) ]
Capacity of the BSC:
C = 1 - H_b(ε) in bits, where the logarithm is in base 2, ε is the crossover probability of the channel, and H_b(.) is the binary entropy
H_b(ε) = -ε log2 ε - (1 - ε) log2 (1 - ε)
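The BSC capacity formula above can be evaluated directly; a minimal sketch, with the example crossover probabilities chosen for illustration.

```python
import numpy as np

def binary_entropy(eps):
    """H_b(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps), in bits."""
    eps = np.asarray(eps, dtype=float)
    out = np.zeros_like(eps)
    mask = (eps > 0) & (eps < 1)          # H_b(0) = H_b(1) = 0 by convention
    e = eps[mask]
    out[mask] = -e * np.log2(e) - (1 - e) * np.log2(1 - e)
    return out

def bsc_capacity(eps):
    """C = 1 - H_b(eps), bits per channel use."""
    return 1.0 - binary_entropy(eps)

print(bsc_capacity(np.array([0.0, 0.11, 0.5])))  # -> [1.0, ~0.5, 0.0]
```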
The Bandlimited Additive White Gaussian Noise Channel with an input power constraint
The channel is band limited to [-W, W], the noise is Gaussian and white with power spectral density N_0/2, and the input power is constrained to P:
C = W log2 (1 + P / (N_0 W)) bits/second
For a discrete-time additive white Gaussian noise channel with input power constraint P and noise variance σ²:
C = (1/2) log2 (1 + P / σ²) bits/transmission
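The two capacity formulas above, written as small functions; the parameter values in the calls are assumptions for illustration only.

```python
import numpy as np

def awgn_capacity_bandlimited(P, N0, W):
    """C = W * log2(1 + P/(N0*W)) in bits/second."""
    return W * np.log2(1.0 + P / (N0 * W))

def awgn_capacity_discrete(P, sigma2):
    """C = 0.5 * log2(1 + P/sigma^2) in bits/transmission."""
    return 0.5 * np.log2(1.0 + P / sigma2)

print(awgn_capacity_bandlimited(P=1.0, N0=1e-4, W=3000.0))  # ~6350 bit/s
print(awgn_capacity_discrete(P=1.0, sigma2=0.25))           # ~1.16 bit/transmission
```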
Ex: Capacity of the BSC
Binary data, AWGN, BPSK signaling, optimal matched-filter detection, hard decision decoding.
Plot the error probability of the channel as a function of E_b/N_0, where E_b is the energy in each BPSK signal and N_0/2 is the noise power spectral density. Assume E_b/N_0 changes from -20 dB to 20 dB, and plot the resulting channel capacity as a function of E_b/N_0.
The error probability of BPSK with optimal detection is given by p = Q(sqrt(2 E_b / N_0)).
Use the relation C = 1 - H_b(p) to obtain a plot of C versus E_b/N_0.
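A sketch of this exercise: hard-decision BPSK over AWGN gives a BSC with crossover probability p = Q(sqrt(2 E_b/N_0)), and the capacity follows as C = 1 - H_b(p). The E_b/N_0 grid and plotting details follow the example statement above and are otherwise assumptions.

```python
import numpy as np
from scipy.special import erfc
import matplotlib.pyplot as plt

def qfunc(x):
    return 0.5 * erfc(x / np.sqrt(2.0))

def hb(p):
    p = np.clip(p, 1e-300, 1.0 - 1e-15)   # avoid log2(0)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

ebn0_db = np.linspace(-20, 20, 200)
ebn0 = 10.0 ** (ebn0_db / 10.0)
p = qfunc(np.sqrt(2.0 * ebn0))            # BPSK error probability with optimal detection
C = 1.0 - hb(p)                           # capacity of the resulting BSC, bits/transmission

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.semilogy(ebn0_db, p); ax1.set_xlabel("Eb/N0 (dB)"); ax1.set_ylabel("crossover probability p")
ax2.plot(ebn0_db, C); ax2.set_xlabel("Eb/N0 (dB)"); ax2.set_ylabel("C = 1 - H_b(p)")
plt.tight_layout(); plt.show()
```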
The Q-Function and Error Function
Q-Function: Q(x) is the probability that a standard normal random variable will take a value larger than x:
Q(x) = (1/sqrt(2π)) ∫_x^∞ e^{-t²/2} dt
It is a simple transformation of the standard normal CDF, Q(x) = 1 - Φ(x).
The Q-function can be expressed in terms of the error function as
Q(x) = (1/2) erfc(x / sqrt(2)) = (1/2) [1 - erf(x / sqrt(2))]
The error function erf(x) (Gauss error function) is a function of sigmoid shape:
erf(x) = (2/sqrt(π)) ∫_0^x e^{-t²} dt
The complementary error function is defined as erfc(x) = 1 - erf(x).
(Figures: plots of Q(x) and erf(x))
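A quick numerical check of the relations above, comparing the integral definition of Q(x) with the erfc form; the test value x = 1.5 is an arbitrary assumption.

```python
import numpy as np
from scipy.special import erf, erfc
from scipy.integrate import quad

def q_by_integration(x):
    """Direct evaluation of Q(x) = (1/sqrt(2*pi)) * integral_x^inf exp(-t^2/2) dt."""
    val, _ = quad(lambda t: np.exp(-t * t / 2.0) / np.sqrt(2.0 * np.pi), x, np.inf)
    return val

x = 1.5
print(q_by_integration(x))              # ~0.0668
print(0.5 * erfc(x / np.sqrt(2.0)))     # same value via the erfc relation
print(erfc(x), 1.0 - erf(x))            # erfc is the complement of erf
```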
Ex: Gaussian Channel Capacity
1. Plot the capacity of an additive white Gaussian noise channel with a bandwidth W = 3000 Hz as a function of P/N_0 for values of P/N_0 between -20 and 30 dB.
2. Plot the capacity of an additive white Gaussian noise channel with P/N_0 = 25 dB as a function of W. In particular, what is the channel capacity when W increases indefinitely?
As seen in the plots, when either P/N_0 or W tends to zero, the capacity of the channel also tends to zero.
When P/N_0 or W tends to infinity, the capacity behaves differently: when P/N_0 tends to infinity the capacity tends to infinity, whereas when W tends to infinity the capacity goes to a finite limit determined by P/N_0, namely C → (P/N_0) log2 e.
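A sketch for part 2 of this exercise: C(W) = W log2(1 + P/(N_0 W)) for fixed P/N_0 = 25 dB, together with its W → ∞ limit (P/N_0) log2 e. The bandwidth grid is an assumption.

```python
import numpy as np
import matplotlib.pyplot as plt

p_over_n0 = 10.0 ** (25 / 10.0)              # P/N0 = 25 dB in linear units
W = np.logspace(1, 6, 400)                   # 10 Hz ... 1 MHz (assumed range)
C = W * np.log2(1.0 + p_over_n0 / W)         # capacity in bit/s
C_limit = p_over_n0 * np.log2(np.e)          # limit as W -> infinity

plt.semilogx(W, C)
plt.axhline(C_limit, linestyle="--")
plt.xlabel("W (Hz)"); plt.ylabel("C (bit/s)")
plt.title("Capacity vs bandwidth at P/N0 = 25 dB")
plt.show()
print(C_limit)                               # ~456 bit/s
```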
Ex: Capacity of the Binary-input AWGN Channel
A binary-input AWGN channel is modeled by two binary input levels A and -A and additive (zero-mean) Gaussian noise with variance σ². In this case X = {A, -A}. Plot the capacity of this channel as a function of the signal-to-noise ratio A/σ.
Due to the symmetry of this problem the capacity is achieved for the uniform input distribution, i.e., for p(X = A) = p(X = -A) = 1/2.
For this input distribution the output distribution is given by
p(y) = (1/2) p(y|X = A) + (1/2) p(y|X = -A)
where p(y|X = ±A) is the Gaussian density with mean ±A and variance σ², and the mutual information between the input and the output is given by
I(X;Y) = Σ_{x = ±A} (1/2) ∫ p(y|x) log2 [ p(y|x) / p(y) ] dy
Simple integration and a change of variable put this in a form that can be evaluated numerically. Using these relations we can calculate I(X;Y) = C for various values of A/σ.
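A numerical sketch of this calculation: with uniform inputs ±A and noise variance σ², the mutual information above is evaluated by integrating over the output mixture density. The A/σ values in the loop are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

def binary_awgn_capacity(A, sigma):
    def pdf(y, m):  # Gaussian density N(m, sigma^2)
        return np.exp(-(y - m) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

    def integrand(y):
        p_pos, p_neg = pdf(y, A), pdf(y, -A)
        p_y = 0.5 * (p_pos + p_neg)        # output mixture density
        total = 0.0
        for p_cond in (p_pos, p_neg):
            if p_cond > 0:
                total += 0.5 * p_cond * np.log2(p_cond / p_y)
        return total

    val, _ = quad(integrand, -np.inf, np.inf)
    return val

for snr_db in (-10, 0, 10):
    A = 10.0 ** (snr_db / 20.0)            # sigma = 1, so 20*log10(A/sigma) is the SNR in dB
    print(snr_db, binary_awgn_capacity(A, 1.0))
```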
Ex: Capacity of the bandlimited AWGN Channel with input power P and bandwidth W
Capacity of the discrete-time AWGN channel as a function of SNR, and of the bandlimited AWGN channel as a function of bandwidth.
(Figure: capacity plots for the AWGN channel)