Magdalena Salazar-Palma

Modern Characterization of Electromagnetic Systems and its Associated Metrology


side of (1.28) over to the left side of the equation and equate it to zero as such

      where the singular value decomposition of the augmented matrix has been partitioned as
      \[
      [X \;\, y] = \begin{bmatrix} U_x & u_y \end{bmatrix}
      \begin{bmatrix} \Sigma_x & 0 \\ 0 & \sigma_y \end{bmatrix}
      \begin{bmatrix} V_{xx} & v_{xy} \\ v_{yx} & v_{yy} \end{bmatrix}^T
      \]
      Here Ux has n columns, uy is a column vector, ∑x contains the n largest singular values on its diagonal, σy is the smallest singular value, Vxx is an n × n matrix, and vyy is a scalar. Let us multiply both sides by the matrix V.
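As a concrete numerical sketch of this partitioned SVD (using NumPy; the random X and y and all variable names here are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 3
X = rng.standard_normal((m, n))
y = rng.standard_normal((m, 1))

# SVD of the augmented m x (n+1) matrix [X y]
A = np.hstack([X, y])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# Partition: Ux holds the first n left singular vectors, uy the last one;
# Sigma_x the n largest singular values, sigma_y the smallest;
# Vxx is n x n, vxy is n x 1, vyy is a scalar.
Ux, uy = U[:, :n], U[:, n:]
Sigma_x, sigma_y = np.diag(s[:n]), s[n]
Vxx, vxy = V[:n, :n], V[:n, n:]
vyx, vyy = V[n:, :n], V[n, n]

# Sanity check: the factors reproduce [X y].
assert np.allclose(A, U @ np.diag(s) @ Vt)
```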

      (1.33)\[
      [X \;\, y]\, V = U\, \Sigma = \begin{bmatrix} U_x & u_y \end{bmatrix}
      \begin{bmatrix} \Sigma_x & 0 \\ 0 & \sigma_y \end{bmatrix}
      \]

      From the Eckart‐Young theorem, we know that {[X y] + [ΔX Δy]} is the closest rank‐n approximation to [X y]. The matrix {[X y] + [ΔX Δy]} has the same singular vectors as [X y], but with σy set equal to zero. We can then write the SVD of {[X y] + [ΔX Δy]} using the same U and V, with the singular value σy replaced by zero.
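The Eckart‐Young step can be checked numerically: zeroing σy gives the closest rank‐n matrix, and the Frobenius‐norm error of that approximation equals σy (a NumPy sketch with random stand‐in data):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 8, 3
A = rng.standard_normal((m, n + 1))       # stands in for [X y]

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Zero the smallest singular value sigma_y; by Eckart-Young the result
# is the closest rank-n matrix to A in the Frobenius norm.
s_trunc = s.copy()
s_trunc[-1] = 0.0
A_n = U @ np.diag(s_trunc) @ Vt           # plays the role of [X y] + [dX dy]

# The approximation error is exactly sigma_y.
err = np.linalg.norm(A - A_n, 'fro')
```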

      To obtain [ΔX Δy] we subtract [X y] from this rank‐n approximation, which leaves only the rank‐one term associated with σy

      (1.38)\[
      [\Delta X \;\, \Delta y] = -\,\sigma_y\, u_y \begin{bmatrix} v_{xy}^T & v_{yy} \end{bmatrix}
      \]
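This rank‐one perturbation can be verified numerically: adding −σy·uy·[vxyᵀ vyy] to [X y] carries it exactly onto its closest rank‐n approximation (a NumPy sketch with random stand‐in data):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 8, 3
A = rng.standard_normal((m, n + 1))       # stands in for [X y]

U, s, Vt = np.linalg.svd(A, full_matrices=False)
uy = U[:, -1:]                            # last left singular vector
v_last = Vt[-1:, :]                       # [vxy^T  vyy], last row of V^T
sigma_y = s[-1]

# Rank-one perturbation [dX dy] = -sigma_y * uy * [vxy^T vyy]
D = -sigma_y * (uy @ v_last)

# Adding it reproduces the closest rank-n approximation of A.
s_trunc = s.copy()
s_trunc[-1] = 0.0
A_n = U @ np.diag(s_trunc) @ Vt
```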

      Finally, {[X y] + [ΔX Δy]} can be defined as

      (1.40)\[
      \{[X \;\, y] + [\Delta X \;\, \Delta y]\} = \begin{bmatrix} U_x & u_y \end{bmatrix}
      \begin{bmatrix} \Sigma_x & 0 \\ 0 & 0 \end{bmatrix}
      \begin{bmatrix} V_{xx} & v_{xy} \\ v_{yx} & v_{yy} \end{bmatrix}^T
      \]

      Multiplying both sides on the right by [vxy; vyy], the (n + 1)‐th column of V, the right‐hand side cancels (Vᵀ applied to its own last column gives the unit vector e_{n+1}, which the zeroed singular value annihilates) and we are left with

      (1.42)\[
      \{[X \;\, y] + [\Delta X \;\, \Delta y]\} \begin{bmatrix} v_{xy} \\ v_{yy} \end{bmatrix} = 0
      \]
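Numerically, this cancellation means the perturbed matrix annihilates the (n + 1)‐th right singular vector (a NumPy sketch with random stand‐in data):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 8, 3
A = rng.standard_normal((m, n + 1))       # stands in for [X y]

U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_trunc = s.copy()
s_trunc[-1] = 0.0
A_n = U @ np.diag(s_trunc) @ Vt           # [X y] + [dX dy]

v_last = Vt[-1]                           # [vxy; vyy], last right singular vector
residual = A_n @ v_last                   # should be the zero vector
```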

      The vector vxy consists of the first n elements of the (n + 1)‐th column of the right singular matrix V of [X y], and vyy is the (n + 1)‐th element of that same column. The best approximation of the model is then given by

      (1.43)\[
      \hat{a} = -\,\frac{v_{xy}}{v_{yy}}
      \]

      This completes the total least squares solution.
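Putting the whole derivation together, a minimal total least squares solver following this recipe might look as follows (NumPy; the function name and test data are illustrative assumptions):

```python
import numpy as np

def tls_solve(X, y):
    """Total least squares: a_hat = -vxy / vyy from the SVD of [X y]."""
    n = X.shape[1]
    A = np.hstack([X, y.reshape(-1, 1)])
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]                # (n+1)-th right singular vector [vxy; vyy]
    return -v[:n] / v[n]

# With exact (noise-free) data the true coefficients are recovered.
rng = np.random.default_rng(4)
X = rng.standard_normal((20, 2))
a_true = np.array([1.5, -0.5])
y = X @ a_true
a_hat = tls_solve(X, y)
```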

      This first chapter has provided the mathematical fundamentals that will be utilized in the chapters that follow. The principles presented form the basis of low‐rank modelling, the singular value decomposition, and the method of total least squares.
