itc_unit_i

Upload: ankurwidguitar

Post on 07-Apr-2018


  • 8/3/2019 ITC_Unit_I

    Unit 1 PART B

    Note that the answer to each question is indicated by the page number given alongside. Match the page numbers to the scanned pages given below to get the answer.

    1. What is Entropy? Explain the properties and types of Entropy. (1-7 to 1-11)

    2. Explain about the relationship between Joint and Conditional Entropy. (1-7)
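As a numerical companion to questions 1 and 2, the entropy definition and the chain rule H(X,Y) = H(X/Y) + H(Y) can be checked in a few lines of Python. The 2x2 joint distribution below is made up purely for illustration; the final assertion also illustrates the upper bound H = log2 M attained by a source with M equally likely messages.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p*log2(p), with 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical joint distribution P(X, Y); rows index X, columns index Y.
joint = [[0.4, 0.1],
         [0.2, 0.3]]

p_y = [sum(row[j] for row in joint) for j in range(2)]   # marginal P(Y)
h_xy = entropy([p for row in joint for p in row])        # joint entropy H(X,Y)

# Conditional entropy H(X/Y) = sum over y of P(y) * H(X | Y=y)
h_x_given_y = sum(p_y[j] * entropy([row[j] / p_y[j] for row in joint])
                  for j in range(2))

# Chain rule: H(X,Y) = H(X/Y) + H(Y)
assert abs(h_xy - (h_x_given_y + entropy(p_y))) < 1e-9

# Upper bound: M equally likely messages give H = log2(M) exactly.
assert abs(entropy([0.25] * 4) - math.log2(4)) < 1e-12
```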

    3. The Joint Probability Matrix is given as [matrix not reproduced in this transcript]. Find all the Entropies and the Mutual Information.

    4. Prove that the upper bound on Entropy is given as Hmax = log2 M, where M is the number of messages emitted by the source. (1-11)

    5. Prove that H(X,Y) = H(X/Y) + H(Y) = H(Y/X) + H(X) (1-72)

    6. (i) A channel has the following channel matrix: P(Y/X) = [matrix not reproduced in this transcript] (1-70)

    (a) Draw the channel diagram.
    (b) If the source has equally likely outputs, compute the probabilities associated with the channel outputs for P = 0.2. (6 marks)

    (ii) Two BSCs are connected in cascade as shown in the figure.
    (a) Find the channel matrix of the resultant channel. (1-74)
    (b) Find P(Z1) and P(Z2), if P(X1) = 0.6 and P(X2) = 0.4.

    7. (i) Prove that the Mutual Information of the channel is symmetric: I(X, Y) = I(Y, X). (6 marks) (1-87)

    (ii) Prove that the Mutual Information is always positive: I(X, Y) ≥ 0. (6 marks) (1-89)
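Questions 3 and 7 both revolve around computing entropies and mutual information from a joint probability matrix. A minimal sketch, using a made-up 2x2 joint matrix since the one in question 3 is not reproduced here; it evaluates the three standard forms of I(X,Y) and checks that they agree and are non-negative.

```python
import math

def H(probs):
    """Shannon entropy in bits, ignoring zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint probability matrix P(X, Y); rows index X, columns Y.
P = [[0.3, 0.1],
     [0.2, 0.4]]

px = [sum(row) for row in P]                       # marginal P(X)
py = [sum(row[j] for row in P) for j in range(2)]  # marginal P(Y)

h_x, h_y = H(px), H(py)
h_xy = H([p for row in P for p in row])

# Three equivalent forms of the mutual information
# (using H(X/Y) = H(X,Y) - H(Y) and H(Y/X) = H(X,Y) - H(X)):
i_a = h_x + h_y - h_xy    # I = H(X) + H(Y) - H(X,Y)
i_b = h_x - (h_xy - h_y)  # I = H(X) - H(X/Y)
i_c = h_y - (h_xy - h_x)  # I = H(Y) - H(Y/X)

assert abs(i_a - i_b) < 1e-9 and abs(i_a - i_c) < 1e-9  # forms agree
assert i_a >= 0                                         # I(X,Y) >= 0
```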

    8. Prove the following relationships:
    (a) I(X,Y) = H(X) - H(X/Y) (1-87)
    (b) I(X,Y) = H(Y) - H(Y/X) (1-89)

    9. (i) Consider the Binary Symmetric Channel shown in the figure. (1-91)

    P(X1) = P

    P(X2) = 1-P

    Calculate H(X), H(Y), H(Y/X) and I(X,Y).

    (ii) Prove the following: (1-90)
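The BSC calculation asked for above can be sketched numerically. The prior p = 0.6 and crossover probability alpha = 0.1 below are assumed values for illustration, not taken from the figure.

```python
import math

def H(probs):
    """Shannon entropy in bits, ignoring zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed values: input prior P(X1) = p, crossover probability alpha.
p, alpha = 0.6, 0.1
px = [p, 1 - p]
channel = [[1 - alpha, alpha],     # P(Y/X): row i gives P(Y | X = Xi)
           [alpha, 1 - alpha]]

# Output distribution P(Y) = P(X) applied through the channel matrix.
py = [sum(px[i] * channel[i][j] for i in range(2)) for j in range(2)]

h_x = H(px)
h_y = H(py)
# H(Y/X) = sum_i P(Xi) * H(Y | X = Xi); for a BSC every row has the
# same entropy, so this equals H(alpha, 1 - alpha) for any input prior.
h_y_given_x = sum(px[i] * H(channel[i]) for i in range(2))
i_xy = h_y - h_y_given_x   # I(X,Y) = H(Y) - H(Y/X)

assert abs(h_y_given_x - H([alpha, 1 - alpha])) < 1e-9
assert 0 <= i_xy <= 1
```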

    [Figure: two BSCs in cascade, inputs X1, X2, intermediate outputs Y1, Y2 and final outputs Z1, Z2, with transition probabilities 0.8, 0.7, 0.3 and 0.2; a second diagram shows a BSC from X1, X2 to Y1, Y2 whose crossover probability labels are garbled in this transcript.]
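For the cascade in question 6(ii), the resultant channel matrix is the product of the two individual channel matrices, and P(Z) follows by applying the input prior to it. A sketch, assuming the 0.8/0.2 and 0.7/0.3 transition probabilities suggested by the garbled figure above:

```python
def matmul(A, B):
    """Product of two 2x2 row-stochastic matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0.8, 0.2], [0.2, 0.8]]  # first BSC, P(Y/X) (assumed from figure)
B = [[0.7, 0.3], [0.3, 0.7]]  # second BSC, P(Z/Y) (assumed from figure)

cascade = matmul(A, B)        # resultant channel matrix P(Z/X)
px = [0.6, 0.4]               # given P(X1) = 0.6, P(X2) = 0.4
pz = [sum(px[i] * cascade[i][j] for i in range(2)) for j in range(2)]

# Each row of the resultant matrix is still a probability distribution.
assert all(abs(sum(row) - 1) < 1e-9 for row in cascade)
```

With these assumed numbers the resultant matrix comes out as [[0.62, 0.38], [0.38, 0.62]] and P(Z1) ≈ 0.524, P(Z2) ≈ 0.476.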


    I(X,Y) = H(X) + H(Y) - H(X,Y)

    10. (a) Find the Mutual Information and Channel capacity for the channel shown in the figure, given that P(X1) = 0.6 and P(X2) = 0.4. (6 Marks) (1-94)

    (b) A Binary Channel Matrix is given as [matrix not reproduced in this transcript]. Determine H(X), H(X/Y), H(Y/X) and the Mutual Information I(X, Y). (1-98)
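For the capacity part of question 10(a): channel capacity is the maximum of I(X;Y) over all input distributions. A brute-force sketch for a BSC, where the crossover probability alpha = 0.2 is an assumed value since the actual channel matrix is not reproduced here; for a BSC the maximum is attained at p = 0.5 and equals 1 - H2(alpha).

```python
import math

def H2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(p, alpha):
    """I(X;Y) for a BSC with P(X1) = p and crossover probability alpha."""
    py1 = p * (1 - alpha) + (1 - p) * alpha   # P(Y = Y1)
    return H2(py1) - H2(alpha)                # H(Y) - H(Y/X)

alpha = 0.2  # assumed crossover probability

# Sweep the input prior; the capacity should appear at p = 0.5.
best = max(mutual_info(p / 1000, alpha) for p in range(1, 1000))
assert abs(best - (1 - H2(alpha))) < 1e-4
```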

    [Figure: binary channel diagram from X1, X2 to Y1, Y2; the probability labels are garbled in this transcript.]

  [The remaining pages (3/31 to 31/31) of the original are scanned images; no text was extracted.]