Problem 1, part b)

First, select values for the inverse covariance matrix for 5 variables:

function LAMBDA = makeRandomCovarianceMatrix(dims, zeroidxs)
% Returns a random symmetric, positive definite matrix of dimensions
% 'dims' (here used as an inverse covariance matrix), with zeros at the
% linear indexes specified by zeroidxs.
  notposdef = 1;

  % keep drawing until the result is positive definite
  % (chol succeeds only for positive definite matrices)
  while (notposdef > 0)
    LAMBDA = rand(dims);

    % make it symmetric
    LAMBDA = LAMBDA - tril(LAMBDA) + triu(LAMBDA)';

    % zero out the specified entries, then test positive definiteness;
    % the second output of chol is 0 iff LAMBDA is positive definite
    LAMBDA(zeroidxs) = 0;
    [x, notposdef] = chol(LAMBDA);
  end;
end;

Now we can invoke it:


  % invoke it using the structure of the covariance matrix we assumed from
  % part a)
  LAMBDA = makeRandomCovarianceMatrix([5,5],[2 4 5 6 8 10 12 15 16 21 22 23])

  
  LAMBDA =

    0.8569         0    0.0029         0         0
         0    0.7313         0    0.2416         0
    0.0029         0    0.9941    0.2820         0
         0    0.2416    0.2820    0.9203    0.3765
         0         0         0    0.3765    0.5864
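
As a quick check (a sketch; `ind2sub` is standard MATLAB), the linear indices passed to the function can be mapped back to (row, col) subscripts to confirm they form the symmetric zero pattern assumed in part a):

```matlab
% Map the linear indices used above back to (row, col) subscripts to
% confirm the zero pattern is symmetric (every (i,j) has its (j,i)).
zeroidxs = [2 4 5 6 8 10 12 15 16 21 22 23];
[r, c] = ind2sub([5 5], zeroidxs);
disp([r(:) c(:)])
```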


i) Estimate the covariance of the 5 jointly Gaussian random variables and invert it to obtain an estimate of the inverse covariance matrix. How does the result compare with the values implied by the true inverse covariance matrix?

  mu = zeros(size(LAMBDA,1),1)';
  samples = gausssamp(mu,inv(LAMBDA),50000);

  % UNCONSTRAINED ML ESTIMATE OF THE INVERSE COVARIANCE
  INV_MLMIXCOV = inv(cov(samples))
  % ERROR RELATIVE TO THE TRUE INVERSE COVARIANCE
  INV_MIXCOV_DIFFERENCE = INV_MLMIXCOV - LAMBDA
  INV_MIXCOV_ERROR = sum(sum(abs(INV_MIXCOV_DIFFERENCE)))
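
gausssamp here is assumed to be the course-provided sampling routine; if it is unavailable, an equivalent sketch draws the samples from a Cholesky factor of the covariance:

```matlab
% Fallback sampler (a sketch; assumes mu is a 1x5 row vector as above).
% Draws N samples from N(mu, SIGMA) with SIGMA = inv(LAMBDA).
SIGMA = inv(LAMBDA);
R = chol(SIGMA);                      % SIGMA = R' * R
N = 50000;
samples = randn(N, 5) * R + repmat(mu, N, 1);
```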

results:


  INV_MLMIXCOV =

    0.6066   -0.0334    0.1333   -0.1043   -0.0531
   -0.0334    0.6932   -0.0596    0.3723    0.1151
    0.1333   -0.0596    0.8874   -0.0024   -0.1636
   -0.1043    0.3723   -0.0024    1.0861    0.5461
   -0.0531    0.1151   -0.1636    0.5461    0.5437


INV_MIXCOV_DIFFERENCE =

   -0.2503   -0.0334    0.1304   -0.1043   -0.0531
   -0.0334   -0.0382   -0.0596    0.1307    0.1151
    0.1304   -0.0596   -0.1067   -0.2844   -0.1636
   -0.1043    0.1307   -0.2844    0.1658    0.1696
   -0.0531    0.1151   -0.1636    0.1696   -0.0427


INV_MIXCOV_ERROR =

    3.0923

  

ii) Now exploit the known graphical model structure: find the maximum likelihood estimate of the covariance matrix for these 4 pairs of jointly Gaussian random variables: (x1,x3), (x3,x4), (x4,x2), (x4,x5). Assuming these estimates are correct, form another estimate of the inverse covariance of the 5 jointly Gaussian random variables. How does it compare with your estimate in part (i)?

  % estimate the pairwise inverse covariances for each edge, plus the
  % single-node precisions needed to correct for double-counted nodes
  x1x3 = inv(cov(samples(:,[1 3])));
  x2x4 = inv(cov(samples(:,[2 4])));
  x3x4 = inv(cov(samples(:,[3 4])));
  x4x5 = inv(cov(samples(:,[4 5])));
  x3x3 = inv(cov(samples(:,[3])));
  x4x4 = inv(cov(samples(:,[4])));
  x5x5 = inv(cov(samples(:,[5])));
  
  % now assemble the new estimate of the inverse covariance matrix:
  % sum the padded edge precisions and subtract (degree - 1) copies of
  % each shared node's single-variable precision
  MC = zeros(5,5);
  MC(1,1) = x1x3(1,1);
  MC(1,3) = x1x3(1,2);
  MC(2,2) = x2x4(1,1);
  MC(2,4) = x2x4(1,2);
  MC(3,3) = x1x3(2,2) + x3x4(1,1) - x3x3;
  MC(3,4) = x3x4(1,2);
  MC(4,4) = x2x4(2,2) + x3x4(2,2) + x4x5(1,1) - x4x4 - x4x4;
  MC(4,5) = x4x5(1,2);
  MC(5,5) = x4x5(2,2);
  MC = MC - tril(MC) + triu(MC)';

  MODELCONV = MC
  MODELERROR = sum(sum(abs(MODELCONV - LAMBDA)))

results:


  MODELCONV =

    0.5966         0    0.1331         0         0
         0    0.6675         0    0.2592         0
    0.1331         0    0.8443    0.1798         0
         0    0.2592    0.1798    1.0114    0.4856
         0         0         0    0.4856    0.4967


  MODELERROR =

    1.3729

  

Therefore, the estimate that exploits the known model structure yields a total absolute error (1.3729) distinctly less than that of the unconstrained estimate treating the variables as an arbitrary set of jointly Gaussian random variables (3.0923).
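
The node-wise subtractions in part ii) implement the tree factorization p(x) = prod_edges p(xi, xj) / prod_nodes p(xi)^(di - 1), where di is the degree of node i. As a sanity check (a sketch, applied to the true distribution rather than the samples), the identity holds exactly for the true LAMBDA:

```matlab
% Sanity check (sketch): for this tree-structured graph the true precision
% equals the sum of padded pairwise edge precisions minus (degree - 1)
% copies of each node's marginal precision.
SIGMA = inv(LAMBDA);
edges = [1 3; 3 4; 2 4; 4 5];
deg   = [1 1 2 3 1];                  % node degrees in the tree
CHECK = zeros(5);
for k = 1:size(edges,1)
  e = edges(k,:);
  CHECK(e,e) = CHECK(e,e) + inv(SIGMA(e,e));
end
for i = 1:5
  CHECK(i,i) = CHECK(i,i) - (deg(i) - 1) / SIGMA(i,i);
end
max(max(abs(CHECK - LAMBDA)))         % ~0 up to floating point error
```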