In the second part, I consider the problem of estimating directed graphs from observed data. The general problem of estimating directed graphs is computationally NP-hard, and the direction of interactions may not be distinguishable from observations alone. I consider a special case of this problem, in which the nodes (i.e., variables) admit a natural ordering, and propose efficient penalized likelihood methods for estimating the underlying network structure. Consistency of the estimators is established in the high-dimensional setting, where the number of variables exceeds the number of observations. I also propose an extension of the lasso penalty that improves estimation of graphical Granger causality from time-course observations.
The last part of the dissertation is devoted to dimension reduction and efficient computation in networks. I propose a dimension reduction algorithm for networks using Laplacian eigenmaps, discuss the connections of this method to principal component analysis, and formulate the inference problem using a group lasso penalty. I also address computational aspects of estimation in networks by proposing a distributed algorithm based on block-relaxation and deriving conditions under which the algorithm converges to the maximum likelihood estimates. Finally, I present an extension of the block-relaxation algorithm, called approximate block-relaxation, that facilitates the use of iterative algorithms in optimization problems with complex objective functions.
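For readers unfamiliar with Laplacian eigenmaps, the core computation is short: form the graph Laplacian L = D - A from the adjacency matrix, and embed the nodes using the eigenvectors associated with the smallest nonzero eigenvalues. The sketch below is my own illustration of that standard construction, not the dissertation's algorithm; the function name and the cycle-graph example are hypothetical.

```python
import numpy as np

def laplacian_eigenmaps(A, dim=2):
    """Embed the nodes of a graph with adjacency matrix A into `dim`
    dimensions using the eigenvectors of the unnormalized graph
    Laplacian L = D - A with the smallest nonzero eigenvalues
    (the trivial constant eigenvector at eigenvalue 0 is dropped)."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, 1:1 + dim]

# Toy example: a 6-node cycle graph.
p = 6
A = np.zeros((p, p))
for i in range(p):
    A[i, (i + 1) % p] = A[(i + 1) % p, i] = 1.0
Y = laplacian_eigenmaps(A, dim=2)
```

For a cycle graph, the two-dimensional embedding places the nodes evenly on a circle, which is the sense in which the eigenmap preserves local neighborhood structure.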
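The block-relaxation idea itself is simple to demonstrate on a toy problem: partition the parameters into blocks and repeatedly minimize the objective exactly over one block while holding the others fixed. The sketch below is a minimal generic illustration of alternating block updates on a strictly convex quadratic, not the dissertation's distributed algorithm; the objective and function name are my own invention.

```python
def block_relaxation(x0, y0, n_iter=100):
    """Minimize f(x, y) = (x - 1)^2 + (y - 2)^2 + x*y/2 by
    alternating exact minimization over each block (here, the two
    scalar coordinates). Each subproblem has a closed form obtained
    by setting the corresponding partial derivative to zero; since
    f is strictly convex, the alternating updates converge to the
    unique global minimizer (8/15, 28/15)."""
    x, y = x0, y0
    for _ in range(n_iter):
        x = 1.0 - y / 4.0  # argmin of f over x with y held fixed
        y = 2.0 - x / 4.0  # argmin of f over y with x held fixed
    return x, y

x, y = block_relaxation(0.0, 0.0)
```

Each block update only needs the current values of the other blocks, which is what makes this scheme natural to distribute across machines, with convergence guarantees depending on conditions on the objective such as the strict convexity used here.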