About the Event
In this talk, we study the PageRank problem and illustrate it using the Google search engine as a paradigmatic example. Specifically, we introduce PageRank, discussing the so-called random surfer model and the teleportation matrix. Subsequently, we present new distributed randomized algorithms (of Las Vegas type) for its efficient computation, and establish the main properties of these algorithms using results from the theory of positive matrices and Markov chains.
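As background for the discussion, the classic centralized PageRank computation combines the link matrix of the random surfer model with a teleportation term and runs a power iteration. A minimal sketch (the function name, the parameter `m` for the teleportation probability, and the tiny three-page example are illustrative assumptions, not taken from the talk):

```python
import numpy as np

def pagerank(A, m=0.15, tol=1e-10, max_iter=1000):
    """Power-iteration sketch of PageRank.

    A: column-stochastic link matrix (A[i, j] = 1/outdeg(j) if page j links to page i).
    m: teleportation probability (0.15 is the value commonly used in the literature).
    """
    n = A.shape[0]
    # Teleportation: with probability m the surfer jumps to a uniformly random page,
    # which makes the iteration matrix positive and guarantees a unique stationary vector.
    M = (1 - m) * A + m * np.ones((n, n)) / n
    x = np.ones(n) / n  # start from the uniform distribution
    for _ in range(max_iter):
        x_new = M @ x
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Illustrative 3-page web: page 0 links to 1, page 1 links to 2, page 2 links to 0 and 1.
A = np.array([[0.0, 0.0, 0.5],
              [1.0, 0.0, 0.5],
              [0.0, 1.0, 0.0]])
r = pagerank(A)
```

The distributed randomized algorithms presented in the talk avoid forming the dense matrix `M`; by the theory of positive matrices, this sketch nevertheless converges to the same PageRank vector, which sums to one and assigns every page a positive score.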
Finally, we discuss how these ideas relate to consensus of multi-agent systems in uncertain environments, and explain the main differences and similarities between consensus and the PageRank computation. In particular, we analyze agreement problems among mainstream journals in the systems and control area, where cross-citations provide the consensus interactions and lead to the aggregation of journals into different categories.
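To illustrate the contrast with PageRank: a standard consensus iteration uses a row-stochastic weight matrix and drives all agents to a common value, whereas PageRank iterates a column-stochastic matrix toward a stationary distribution. A minimal sketch (the weight matrix and initial values are made-up illustrations, not data from the talk):

```python
import numpy as np

# Row-stochastic weight matrix for a hypothetical 3-agent network:
# each row sums to 1, and every entry is positive, so the network is strongly connected.
W = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

x = np.array([1.0, 5.0, 9.0])  # initial values held by the three agents
for _ in range(200):
    x = W @ x  # each agent replaces its value with a weighted average of its neighbors'

# After many iterations all agents agree on a single value,
# a convex combination of the initial values determined by W's left eigenvector.
```

The same positive-matrix and Markov-chain tools that analyze PageRank apply here, but the conclusion differs: consensus collapses the state to agreement, while PageRank preserves a nontrivial ranking.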