# Courses

**Course 1: Froilán C. Martínez Dopico (Universidad Carlos III de Madrid, Spain)**

Title: Structural global backward stability of polynomial eigenvalue problems solved by linearizations

Abstract: The numerical solution of polynomial eigenvalue problems has received considerable attention in the literature in the last decade. These problems are very often solved via linearizations, i.e., by applying backward stable algorithms for generalized eigenvalue problems to a matrix pencil with the same complete eigenstructure as the original matrix polynomial. However, such algorithms only guarantee backward stability with respect to the pencil, not with respect to the original data, i.e., the original matrix polynomial. The analysis of the "structural" (that is, with respect to the original matrix polynomial) "global" (that is, valid simultaneously for all computed eigenvalues and minimal indices in the singular case) "backward stability" of these methods leads to a highly structured matrix perturbation problem whose solution requires many interesting concepts from the theory of matrix polynomials and matrix equations. The goal of this mini-course is to present a self-contained, friendly treatment of this type of stability analysis that can be applied to many different types of linearizations.
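As a small, hypothetical illustration of the linearization idea (not part of the course materials), the pure-Python sketch below builds the first companion pencil $L(\lambda)=\lambda X+Y$ of a quadratic matrix polynomial $P(\lambda)=\lambda^2A_2+\lambda A_1+A_0$ and checks that $\det L(\lambda)=\det P(\lambda)$ at a few sample points; all coefficient matrices are made up for the example.

```python
def mat_det(M):
    # Determinant by cofactor expansion along the first row (exact for ints).
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * mat_det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def eval_pencil(X, Y, lam):
    # Entries of lam*X + Y.
    n = len(X)
    return [[lam * X[i][j] + Y[i][j] for j in range(n)] for i in range(n)]

def companion_pencil(A2, A1, A0):
    # First companion linearization L(lam) = lam*[[A2,0],[0,I]] + [[A1,A0],[-I,0]]
    # of P(lam) = lam^2 A2 + lam A1 + A0, assembled as a block 2x2 pencil.
    n = len(A2)
    I = [[int(i == j) for j in range(n)] for i in range(n)]
    Z = [[0] * n for _ in range(n)]
    X = [A2[i] + Z[i] for i in range(n)] + [Z[i] + I[i] for i in range(n)]
    Y = [A1[i] + A0[i] for i in range(n)] + \
        [[-I[i][j] for j in range(n)] + Z[i] for i in range(n)]
    return X, Y

# Made-up 2x2 integer coefficients for the example.
A2 = [[1, 0], [0, 1]]
A1 = [[2, 1], [0, 3]]
A0 = [[1, 2], [3, 4]]
X, Y = companion_pencil(A2, A1, A0)
for lam in (0, 1, 2, -3):
    P = [[lam ** 2 * A2[i][j] + lam * A1[i][j] + A0[i][j]
          for j in range(2)] for i in range(2)]
    assert mat_det(eval_pencil(X, Y, lam)) == mat_det(P)
```

The determinant identity is what makes the pencil share the finite eigenvalues of $P$; the structural backward-error question studied in the course is precisely what this equivalence does *not* settle on its own.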
____________________________________________________________________________
**Course 2: Bruno Iannazzo (Università degli Studi di Perugia, Italy)**

Title: Computational methods for matrix geometric means

Abstract: Matrix geometric means have been introduced, on the one hand, out of the mathematical wish to generalize concepts as far as possible and, on the other hand, because applications demand suitable models of matrix averages. These facts motivate the need for efficient algorithms for matrix geometric means.
We review the computational problems related to matrix means. First, we consider the geometric mean of two matrices, both in the dense, moderate-size case and in the large-scale case. Then, we consider the geometric mean of more than two matrices, for which no explicit expression is known and which can be obtained only as the limit of certain sequences.
The computation of matrix geometric means requires a wise application of customary techniques in numerical linear algebra, together with advanced techniques in matrix computation, such as optimization on matrix manifolds and rational Krylov subspace approximation.
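As a toy illustration of the two-matrix case (our own sketch, not taken from the course), one classical scheme is the arithmetic-harmonic mean iteration, which for symmetric positive definite $A$ and $B$ converges to the geometric mean $A \# B$. The example below uses made-up 2x2 data and the fact that $A \# A^{-1}$ is the identity.

```python
def inv2(M):
    # Inverse of a 2x2 matrix.
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def add(M, N):
    return [[M[i][j] + N[i][j] for j in range(2)] for i in range(2)]

def scale(M, s):
    return [[s * M[i][j] for j in range(2)] for i in range(2)]

def geometric_mean(A, B, iters=60):
    # Arithmetic-harmonic mean iteration:
    #   A <- (A + B)/2,  B <- 2 (A^{-1} + B^{-1})^{-1}  (simultaneous update);
    # for SPD A, B both sequences converge to the geometric mean A # B.
    for _ in range(iters):
        A, B = scale(add(A, B), 0.5), scale(inv2(add(inv2(A), inv2(B))), 2.0)
    return A

A = [[2.0, 1.0], [1.0, 2.0]]
B = inv2(A)   # the geometric mean of A and A^{-1} is the identity
G = geometric_mean(A, B)
assert all(abs(G[i][j] - (1.0 if i == j else 0.0)) < 1e-10
           for i in range(2) for j in range(2))
```

For scalars the iteration preserves the product $a_k b_k$, which is why it lands on $\sqrt{ab}$; the matrix case is one of the dense, moderate-size algorithms the course surveys alongside square-root-based formulas.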
____________________________________________________________________________
**Course 3: Federico Poloni (University of Pisa, Italy)**

Title: Monotonic iterations for nonlinear matrix equations

Abstract: We study several nonlinear matrix equations arising in applications, mainly from probability, and fixed-point iterations to solve them. We shall see how convergence of the iterations and properties of the solutions can often be proved by relying on positivity and ordering properties (in either the componentwise or the Löwner ordering). We will start from matrix equations such as the ones coming from binary trees and quasi-birth-death models (e.g., $AX^2+BX+C=0$), and then move on to other Riccati-type equations with a richer linear algebraic structure, depending on time availability.
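To make the flavour of such monotonic iterations concrete, here is a hypothetical pure-Python sketch with invented data: taking entrywise nonnegative $A$, $C$ and $B=-I$, the equation $AX^2+BX+C=0$ reads $X=AX^2+C$, and the natural fixed-point iteration started at $X_0=0$ increases componentwise to a minimal nonnegative solution.

```python
def matmul(M, N):
    n = len(M)
    return [[sum(M[i][k] * N[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def step(A, C, X):
    # One fixed-point step X -> A X^2 + C.
    AX2 = matmul(A, matmul(X, X))
    return [[AX2[i][j] + C[i][j] for j in range(2)] for i in range(2)]

# Invented nonnegative data, small enough that the map is a contraction
# on a neighbourhood of 0; B = -I, so the equation is X = A X^2 + C.
A = [[0.2, 0.1], [0.1, 0.2]]
C = [[0.10, 0.00], [0.05, 0.10]]

X = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(100):
    X_new = step(A, C, X)
    # Componentwise monotonicity of the iterates (up to roundoff).
    assert all(X_new[i][j] >= X[i][j] - 1e-15
               for i in range(2) for j in range(2))
    X = X_new

# At the limit, A X^2 + C - X should vanish.
R = step(A, C, X)
assert max(abs(R[i][j] - X[i][j]) for i in range(2) for j in range(2)) < 1e-12
```

The monotonicity assertion inside the loop is exactly the ordering argument the abstract alludes to: since $A\ge 0$, the map $X\mapsto AX^2+C$ preserves the componentwise order, so the iterates form an increasing, bounded sequence.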
____________________________________________________________________________
**Course 4: Gema Maria Diaz-Toca (Universidad de Murcia, Spain)**

Title: The Bezout Matrix

Abstract: This course is devoted to presenting the Bezout matrix. We will see Barnett's method through Bezoutians, which makes it possible to compute the gcd of several univariate polynomials. Two different uses of this method will be discussed. First, we describe an algorithm for parameterizing the gcd of several polynomials. Secondly, we consider the problem of computing the approximate gcd. The application of the Bezout matrix to the solution of a zero-dimensional bivariate polynomial system will also be presented.
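To fix ideas, the following sketch (our own illustration, not the course's code) builds the Bezout matrix of two univariate polynomials from the Cayley quotient $(p(x)q(y)-p(y)q(x))/(x-y)$ and checks, on a made-up pair with gcd $x-1$, that the matrix is symmetric and singular, its rank deficiency equaling the degree of the gcd.

```python
def bezout_matrix(p, q):
    # p, q: coefficient lists, lowest degree first; padded to common degree n.
    n = max(len(p), len(q)) - 1
    p = p + [0] * (n + 1 - len(p))
    q = q + [0] * (n + 1 - len(q))
    # N(x, y) = p(x)q(y) - p(y)q(x), collected by powers of x: row k = N_k(y).
    N = [[p[k] * q[j] - q[k] * p[j] for j in range(n + 1)]
         for k in range(n + 1)]
    # Exact division by (x - y): B_{n-1} = N_n and B_{k-1} = N_k + y*B_k,
    # keeping only degrees <= n-1 in y (the higher terms cancel exactly).
    B = [None] * n
    B[n - 1] = N[n][:n]
    for k in range(n - 1, 0, -1):
        y_Bk = [0] + B[k][:-1]          # multiply B_k by y, truncate
        B[k - 1] = [N[k][j] + y_Bk[j] for j in range(n)]
    return B

# p = (x-1)(x-2), q = (x-1)(x-3): gcd of degree 1.
p = [2, -3, 1]
q = [3, -4, 1]
B = bezout_matrix(p, q)
assert B == [[-1, 1], [1, -1]]                      # symmetric, rank 1
assert B[0][0] * B[1][1] - B[0][1] * B[1][0] == 0   # det = resultant = 0
```

All arithmetic is exact on integer coefficients, so the rank statement is genuine rather than a floating-point accident; the approximate-gcd problem discussed in the course arises precisely when the coefficients are perturbed.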
____________________________________________________________________________
**Course 5: Bernard Mourrain (Inria, Sophia Antipolis Méditerranée, France)**

Title: Structured matrix computation and polynomial algebra

Abstract: There is a strong relationship between polynomials and structured matrices. In this course, we will consider multivariate polynomials and related structures of matrices, such as Toeplitz, Hankel and Vandermonde matrices. We will show how these types of matrices appear in methods for the solution of polynomial systems, such as resultant constructions, Gröbner basis and border basis computation. We will analyze their properties and the relations between these different structures. We will describe matrix-based methods for solving zero-dimensional systems of equations. Hankel structured matrices are also present in multivariate polynomial algebra. Using duality, we will describe them as operators on polynomials and analyze their properties. In particular, we will detail their connection with Gorenstein algebras. This will lead us to a method for the decomposition of series as sums of polynomial-exponential functions. We will apply this method in different contexts, such as the sparse representation of symbols of convolution operators, sparse interpolation, and tensor decomposition. Explicit computations and examples will illustrate the different notions introduced in the presentation.
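As a miniature example of the series-decomposition theme (our own toy computation, not course material): given a few terms of the made-up sequence $h_k = 2^k + 3^k$, a small Hankel system recovers, in the spirit of Prony's method, the polynomial whose roots are the exponential nodes.

```python
from fractions import Fraction
from math import isqrt

# Toy data: h_k = 2^k + 3^k for k = 0..3.
h = [2, 5, 13, 35]

# Hankel system [[h0, h1], [h1, h2]] [c0, c1]^T = -[h2, h3]^T gives the
# annihilating polynomial x^2 + c1*x + c0; solved exactly by Cramer's rule.
det = Fraction(h[0] * h[2] - h[1] * h[1])
c0 = (Fraction(-h[2]) * h[2] - h[1] * Fraction(-h[3])) / det
c1 = (h[0] * Fraction(-h[3]) - Fraction(-h[2]) * h[1]) / det

# Roots of x^2 + c1*x + c0; the discriminant is a perfect square here.
disc = c1 * c1 - 4 * c0
r = Fraction(isqrt(int(disc)))
roots = sorted([(-c1 - r) / 2, (-c1 + r) / 2])
assert roots == [2, 3]   # the exponential nodes of the sequence
```

The same Hankel-operator viewpoint, pushed through duality and Gorenstein algebras, is what the course develops in the multivariate setting.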