
Results found: 4


Search results

1

Equalities for orthogonal projectors and their operations

Open Mathematics | 2010 | Vol. 8 | No. 5 | pp. 855-870 | EN
A complex square matrix A is called an orthogonal projector if A^2 = A = A*, where A* denotes the conjugate transpose of A. In this paper, we present a comprehensive investigation of matrix expressions consisting of orthogonal projectors, and of their properties, through ranks of matrices. We first collect some well-known rank formulas for orthogonal projectors and their operations, and then establish various new rank formulas for matrix expressions composed of orthogonal projectors. As applications, we derive necessary and sufficient conditions for various equalities involving orthogonal projectors and their operations to hold.
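As an illustrative aside (not taken from the paper): the defining properties A^2 = A = A*, and the classical fact that the rank of an idempotent matrix equals its trace, can be checked numerically. The minimal sketch below builds the orthogonal projector P = X X^+ onto the column space of a random complex matrix X; all names and the random data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal projector onto the column space of a complex matrix X:
# P = X X^+ is Hermitian and idempotent, i.e. P^2 = P = P*.
X = rng.standard_normal((6, 3)) + 1j * rng.standard_normal((6, 3))
P = X @ np.linalg.pinv(X)

assert np.allclose(P @ P, P)        # idempotent: P^2 = P
assert np.allclose(P, P.conj().T)   # Hermitian:  P   = P*

# A basic rank formula: for an idempotent matrix, rank equals trace.
assert np.linalg.matrix_rank(P) == round(np.trace(P).real)
```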
2

Matrix rank and inertia formulas in the analysis of general linear models

Open Mathematics | 2017 | Vol. 15 | No. 1 | pp. 126-150 | EN
Matrix mathematics provides a powerful tool set for addressing statistical problems; in particular, the theory of matrix ranks and inertias has been developed as an effective methodology for simplifying various complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear models (GLMs), and how to use these formulas in the statistical analysis of GLMs. We first derive analytical expressions of best linear unbiased predictors/best linear unbiased estimators (BLUPs/BLUEs) of all unknown parameters in the model by solving a constrained quadratic matrix-valued function optimization problem, and present some well-known results on ordinary least-squares predictors/ordinary least-squares estimators (OLSPs/OLSEs). We then establish some fundamental rank and inertia formulas for covariance matrices related to BLUPs/BLUEs and OLSPs/OLSEs, and use the formulas to characterize a variety of equalities and inequalities for covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. As applications, we use these equalities and inequalities to compare the covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. The work on the formulations of BLUPs/BLUEs and OLSPs/OLSEs, and their covariance matrices under GLMs, provides direct access, as a standard example, to a very simple algebraic treatment of predictors and estimators in linear regression analysis, which leads to a deep insight into the linear nature of GLMs and gives an efficient way of summarizing the results.
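A minimal numerical sketch of the OLSE discussed above, assuming the simplest setting (a full-column-rank design matrix and homoscedastic errors); the simulated model and all variable names are illustrative assumptions, not the paper's notation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated general linear model y = X beta + e with iid errors
n, p, sigma = 50, 3, 0.5
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + sigma * rng.standard_normal(n)

# OLSE: beta_hat = (X'X)^{-1} X'y  (X assumed to have full column rank)
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# Covariance matrix of the OLSE under homoscedastic errors: sigma^2 (X'X)^{-1}
cov_olse = sigma**2 * XtX_inv
print(beta_hat)
print(np.linalg.matrix_rank(cov_olse))  # rank is p when X has full column rank
```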
3
Special Matrices | 2016 | Vol. 4 | No. 1 | pp. 130-140 | EN
Least-Squares Solution (LSS) of a linear matrix equation and Ordinary Least-Squares Estimator (OLSE) of unknown parameters in a general linear model are two standard algebraic methods in computational mathematics and regression analysis. Assume that a symmetric quadratic matrix-valued function Φ(Z) = Q − ZPZ′ is given, where Z is taken as the LSS of the linear matrix equation AZ = B. In this paper, we establish a group of formulas for calculating the maximum and minimum ranks and inertias of Φ(Z) subject to the LSS of AZ = B, and derive many quadratic matrix equalities and inequalities for LSSs from the rank and inertia formulas. This work is motivated by some inference problems on OLSEs under general linear models, while the results obtained can be applied to characterize many algebraic and statistical properties of the OLSEs.
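A hedged numerical sketch of the setup described above: take the minimum-norm least-squares solution Z = A^+ B of AZ = B, evaluate Φ(Z) = Q − ZPZ′ for symmetric P and Q, and read off the rank and inertia from the eigenvalues. The random matrices are assumptions for illustration; the paper works with exact symbolic formulas rather than numerical checks.

```python
import numpy as np

rng = np.random.default_rng(2)

# A least-squares solution of AZ = B: Z = A^+ B (the minimum-norm LSS).
A = rng.standard_normal((5, 3))
B = rng.standard_normal((5, 4))
Z = np.linalg.pinv(A) @ B

# Evaluate the symmetric matrix-valued function Phi(Z) = Q - Z P Z'
# (Q and P symmetric; generated at random purely for illustration).
Q = rng.standard_normal((3, 3)); Q = Q + Q.T
P = rng.standard_normal((4, 4)); P = P + P.T
Phi = Q - Z @ P @ Z.T

# Rank and inertia (counts of positive/negative eigenvalues) of Phi(Z).
eig = np.linalg.eigvalsh(Phi)
rank = int(np.sum(np.abs(eig) > 1e-10))
inertia = (int(np.sum(eig > 1e-10)), int(np.sum(eig < -1e-10)))
print(rank, inertia)
```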
4
EN
A general linear model can be given in certain multiple partitioned forms, and there exist submodels associated with the given full model. In this situation, we can make statistical inferences from the full model and the submodels, respectively. It has been realized that links do exist between inference results obtained from the full model and its submodels, and thus it is of interest to establish certain links among estimators of parameter spaces under these models. In this approach, the methodology of additive matrix decompositions plays an important role in obtaining satisfactory conclusions. In this paper, we consider the problem of establishing additive decompositions of estimators in the context of a general linear model with partial parameter restrictions. We demonstrate how to decompose best linear unbiased estimators (BLUEs) under the constrained general linear model (CGLM) as sums of estimators under submodels with parameter restrictions, using a variety of effective tools in matrix analysis. The derivation of our main results is based on heavy algebraic operations on the given matrices and their generalized inverses in the CGLM, and the contributions illustrate various skillful uses of state-of-the-art matrix-analysis techniques in the statistical inference of linear regression models.
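As a loose numerical analogue (not the paper's method, which concerns BLUEs and additive decompositions under a CGLM): the classical restricted least-squares estimator shows how a linear parameter restriction Rβ = r enters an estimator as an additive correction to the unrestricted OLSE. All data and names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Unrestricted model y = X beta + e
n, p = 40, 3
X = rng.standard_normal((n, p))
y = X @ np.array([2.0, 1.0, -1.0]) + 0.3 * rng.standard_normal(n)

# Linear restriction R beta = r (here: beta_1 + beta_2 = 3)
R = np.array([[1.0, 1.0, 0.0]])
r = np.array([3.0])

# Unrestricted OLSE, then the classical restricted LS correction:
# beta_r = beta_hat - (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (R beta_hat - r)
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
M = R @ XtX_inv @ R.T
beta_r = beta_hat - XtX_inv @ R.T @ np.linalg.solve(M, R @ beta_hat - r)

assert np.allclose(R @ beta_r, r)  # the restriction holds exactly
```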