The full-text resources of PLDML and other subject databases are now available in the new Biblioteka Nauki.
Visit https://bibliotekanauki.pl

Results found: 9


Search results

1
Singular M-matrices which may not have a nonnegative generalized inverse

EN
A matrix A ∈ ℝ^(n×n) is a GM-matrix if A = sI − B, where 0 < ρ(B) ≤ s and B ∈ WPF_n, i.e., both B and Bᵀ have ρ(B) as an eigenvalue and the corresponding eigenvectors are entrywise nonnegative. In this article, we consider a generalization of a subclass of GM-matrices having a nonnegative core-nilpotent decomposition and prove a characterization result for such matrices. We also study various notions of splittings of matrices from this new class and obtain sufficient conditions for their convergence.
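A minimal numerical sketch of the GM-matrix definition quoted above (the matrix B and the choice s = ρ(B) are illustrative assumptions, not taken from the paper):

import numpy as np

# Illustrative B: entrywise positive, so by Perron-Frobenius both B and B^T have
# rho(B) as an eigenvalue with an entrywise nonnegative eigenvector, i.e. B is in WPF_n.
B = np.array([[1.0, 2.0],
              [3.0, 1.0]])

rho = max(abs(np.linalg.eigvals(B)))   # spectral radius rho(B)
s = rho                                # with s = rho(B), A = sI - B is a singular GM-matrix
A = s * np.eye(2) - B

def perron_vector(M):
    # eigenvector belonging to the eigenvalue of largest modulus, sign-normalized
    vals, vecs = np.linalg.eig(M)
    v = vecs[:, np.argmax(np.abs(vals))].real
    return v if v.sum() >= 0 else -v

print(np.all(perron_vector(B) >= 0), np.all(perron_vector(B.T) >= 0))  # WPF_n check
print(np.linalg.matrix_rank(A))        # 1: A is singular here because s = rho(B)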
2
A Hadamard product involving inverse-positive matrices

EN
In this paper we study the Hadamard product of inverse-positive matrices. We observe that this class of matrices is not closed under the Hadamard product, but we show that for a particular sign pattern of the inverse-positive matrices A and B, the Hadamard product A ∘ B⁻¹ is again an inverse-positive matrix.
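A small numerical illustration of the objects involved (the matrices A and B below are assumptions chosen for the example; the paper's sign-pattern condition is not reproduced here):

import numpy as np

# A and B are nonsingular M-matrices, a standard family of inverse-positive
# matrices: their inverses are entrywise nonnegative.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])
B = np.array([[ 3.0, -1.0],
              [-1.0,  3.0]])

print(np.all(np.linalg.inv(A) >= 0))   # A is inverse-positive
print(np.all(np.linalg.inv(B) >= 0))   # B is inverse-positive

H = A * np.linalg.inv(B)               # Hadamard (entrywise) product A o B^{-1}
print(np.all(np.linalg.inv(H) >= 0))   # check inverse-positivity of A o B^{-1}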
3
EN
In this paper we characterize Moore-Penrose inverses of Gram matrices leaving a cone invariant in an indefinite inner product space using the indefinite matrix multiplication. This characterization includes the acuteness (or obtuseness) of certain closed convex cones.
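A small numerical sketch of a Gram matrix in an indefinite inner product and its Moore-Penrose inverse (the signature matrix J and the vectors are illustrative assumptions; the paper's indefinite matrix multiplication is not reproduced here):

import numpy as np

# One common convention: the indefinite inner product on R^n is [x, y] = y^T J x,
# where J is a signature matrix (an assumption for this sketch).
J = np.diag([1.0, 1.0, -1.0])

V = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])          # columns are the vectors whose Gram matrix we form

G = V.T @ J @ V                     # Gram matrix in the indefinite inner product
G_pinv = np.linalg.pinv(G)          # its Moore-Penrose inverse

print(G)                                 # here G = [[2, 1], [1, -3]], an indefinite matrix
print(np.allclose(G @ G_pinv @ G, G))    # defining Penrose identity G G^+ G = G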
4
Dual Lattice of ℤ-module Lattice

EN
In this article, we formalize in Mizar [5] the definition of a dual lattice and its properties. We formally prove that the set of all dual vectors in a rational lattice forms a lattice. We show that a dual basis can be calculated from the entries of the inverse of the Gram matrix. We also formalize summation of inner products and its properties. Lattices of ℤ-modules are needed for lattice problems, the LLL (Lenstra, Lenstra and Lovász) basis reduction algorithm, and lattice-based cryptographic systems [20], [10], [19].
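A short numerical sketch of the dual-basis computation mentioned in the abstract (the basis vectors are an arbitrary assumption; for a full-rank basis B, the dual basis D = B(BᵀB)⁻¹ satisfies DᵀB = I):

import numpy as np

# Columns of B form a lattice basis (illustrative choice).
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])

G = B.T @ B                        # Gram matrix of the basis
D = B @ np.linalg.inv(G)           # dual basis: columns d_i with <d_i, b_j> = delta_ij

print(np.allclose(D.T @ B, np.eye(2)))   # biorthogonality check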
5
On the Drazin index of regular elements

EN
It is known that the existence of the group inverse a^# of a ring element a is equivalent to the invertibility of a²a⁻ + 1 − aa⁻, independently of the choice of the von Neumann inverse a⁻ of a. In this paper, we relate the Drazin index of a to the Drazin index of a²a⁻ + 1 − aa⁻. We give an alternative characterization when considering matrices over an algebraically closed field. We close with some questions and remarks.
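A numerical sketch of the invertibility criterion quoted above, using the Moore-Penrose inverse as one particular von Neumann inverse (the test matrices are illustrative assumptions):

import numpy as np

def criterion(a):
    # form a^2 a^- + I - a a^-, with a^- = Moore-Penrose inverse (a von Neumann inverse)
    am = np.linalg.pinv(a)
    return a @ a @ am + np.eye(a.shape[0]) - a @ am

# Group-invertible singular matrix (index 1): rank(a1) == rank(a1 @ a1).
a1 = np.diag([1.0, 2.0, 0.0])
# Nilpotent matrix (index 2): no group inverse exists.
a2 = np.array([[0.0, 1.0],
               [0.0, 0.0]])

print(abs(np.linalg.det(criterion(a1))) > 1e-12)   # True: criterion matrix is invertible
print(abs(np.linalg.det(criterion(a2))) > 1e-12)   # False: criterion matrix is singular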
6
Pairs of k-step reachability and m-step observability matrices

EN
Let V and W be matrices of size n × pk and qm × n, respectively. A necessary and sufficient condition is given for the existence of a triple (A, B, C) such that V is a k-step reachability matrix of (A, B) and W is an m-step observability matrix of (A, C).
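For reference, a minimal sketch of how k-step reachability and m-step observability matrices are usually assembled (the system matrices A, B, C are illustrative assumptions; the dimensions follow the abstract, with B of size n × p and C of size q × n):

import numpy as np

def reachability(A, B, k):
    # k-step reachability matrix [B, AB, ..., A^(k-1) B], of size n x (p*k)
    blocks, M = [], B
    for _ in range(k):
        blocks.append(M)
        M = A @ M
    return np.hstack(blocks)

def observability(A, C, m):
    # m-step observability matrix [C; CA; ...; C A^(m-1)], of size (q*m) x n
    blocks, M = [], C
    for _ in range(m):
        blocks.append(M)
        M = M @ A
    return np.vstack(blocks)

A = np.array([[0.0, 1.0], [-1.0, 0.5]])
B = np.array([[0.0], [1.0]])        # n = 2, p = 1
C = np.array([[1.0, 0.0]])          # q = 1

print(reachability(A, B, k=3).shape, observability(A, C, m=3).shape)   # (2, 3) (3, 2)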
7
EN
A general linear model can be given in certain multiple partitioned forms, and there exist submodels associated with the given full model. In this situation, we can make statistical inferences from the full model and the submodels, respectively. It has been realized that there exist links between inference results obtained from the full model and its submodels, and thus it is of interest to establish certain links among estimators of parameter spaces under these models. In this approach, the methodology of additive matrix decompositions plays an important role in obtaining satisfactory conclusions. In this paper, we consider the problem of establishing additive decompositions of estimators in the context of a general linear model with partial parameter restrictions. We demonstrate how to decompose best linear unbiased estimators (BLUEs) under the constrained general linear model (CGLM) into sums of estimators under submodels with parameter restrictions by using a variety of effective tools in matrix analysis. The derivation of our main results is based on heavy algebraic operations on the given matrices and their generalized inverses in the CGLM, while the contributions as a whole illustrate various skillful uses of state-of-the-art matrix analysis techniques in the statistical inference of linear regression models.
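As background for the estimators discussed in this and the following abstract, a minimal sketch of computing a BLUE in an unconstrained general linear model with known positive definite covariance (a simplification; the paper itself treats restricted models and works with generalized inverses):

import numpy as np

rng = np.random.default_rng(0)

n, p = 20, 3
X = rng.normal(size=(n, p))                    # design matrix, full column rank
beta = np.array([1.0, -2.0, 0.5])
V = np.diag(rng.uniform(0.5, 2.0, size=n))     # known positive definite covariance
y = X @ beta + rng.multivariate_normal(np.zeros(n), V)

Vi = np.linalg.inv(V)
# BLUE of beta (generalized least squares): (X' V^-1 X)^-1 X' V^-1 y
blue = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)
# OLSE for comparison: (X' X)^-1 X' y
olse = np.linalg.solve(X.T @ X, X.T @ y)
print(blue, olse)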
8
Matrix rank and inertia formulas in the analysis of general linear models

Open Mathematics | 2017 | Vol. 15 | No. 1 | 126-150
EN
Matrix mathematics provides a powerful tool set for addressing statistical problems; in particular, the theory of matrix ranks and inertias has been developed as an effective methodology for simplifying various complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating the ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear models (GLMs), and how to use the formulas in the statistical analysis of GLMs. We first derive analytical expressions of best linear unbiased predictors/best linear unbiased estimators (BLUPs/BLUEs) of all unknown parameters in the model by solving a constrained quadratic matrix-valued function optimization problem, and present some well-known results on ordinary least-squares predictors/ordinary least-squares estimators (OLSPs/OLSEs). We then establish some fundamental rank and inertia formulas for covariance matrices related to BLUPs/BLUEs and OLSPs/OLSEs, and use the formulas to characterize a variety of equalities and inequalities for covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. As applications, we use these equalities and inequalities in the comparison of the covariance matrices of BLUPs/BLUEs and OLSPs/OLSEs. The work on the formulations of BLUPs/BLUEs and OLSPs/OLSEs and their covariance matrices under GLMs provides direct access, as a standard example, to a very simple algebraic treatment of predictors and estimators in linear regression analysis, which leads to a deep insight into the linear nature of GLMs and gives an efficient way of summarizing the results.
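A small sketch of the rank and inertia of a Hermitian matrix as used in the abstract above (inertia = counts of positive, negative and zero eigenvalues; the test matrix is an illustrative assumption):

import numpy as np

def inertia(H, tol=1e-10):
    # return (n_plus, n_minus, n_zero) for Hermitian H; rank(H) = n_plus + n_minus
    w = np.linalg.eigvalsh(H)
    return int(np.sum(w > tol)), int(np.sum(w < -tol)), int(np.sum(np.abs(w) <= tol))

H = np.diag([3.0, 1.0, 0.0, -2.0])
n_plus, n_minus, n_zero = inertia(H)
print(n_plus, n_minus, n_zero, n_plus + n_minus)   # 2 1 1 3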
9
Inertias and ranks of some Hermitian matrix functions with applications

EN
Let S be a given set consisting of some Hermitian matrices of the same size. We say that a matrix A ∈ S is maximal if A − W is positive semidefinite for every matrix W ∈ S. In this paper, we consider the maximal and minimal inertias and ranks of the Hermitian matrix function f(X, Y) = P − QXQ* − TYT*, where * denotes the conjugate transpose of a matrix, P = P*, Q, T are known matrices, and X and Y are Hermitian solutions to the consistent matrix equations AX = B and YC = D, respectively. As applications, we derive necessary and sufficient conditions for the existence of maximal matrices of $$H = \{ f(X,Y) = P - QXQ^* - TYT^* : AX = B,\ YC = D,\ X = X^*,\ Y = Y^*\}.$$ The corresponding expressions of the maximal matrices of H are presented when the existence conditions are met. In this case, we further prove that the matrix function f(X, Y) is invariant under changes of the pair (X, Y). Moreover, we establish necessary and sufficient conditions for the system of matrix equations $$AX = B, \quad YC = D, \quad QXQ^* + TYT^* = P$$ to have a Hermitian solution and for the system of matrix equations $$AX = C, \quad BXB^* = D$$ to have a bisymmetric solution. The explicit expressions of such solutions to the systems mentioned above are also provided. In addition, we discuss the range of inertias of the matrix functions P ± QXQ* ± TYT*, where X and Y are a nonnegative definite pair of solutions to some consistent matrix equations. The findings of this paper extend some known results in the literature.
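A minimal numerical sketch of the matrix function studied above (all matrices are illustrative assumptions; B and D are built from chosen Hermitian X and Y so that AX = B and YC = D are consistent by construction):

import numpy as np

rng = np.random.default_rng(1)
n = 3

def hermitian(M):
    return (M + M.conj().T) / 2

P = hermitian(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
Q = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
T = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = rng.normal(size=(n, n))
C = rng.normal(size=(n, n))

X = hermitian(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))   # Hermitian "solution"
Y = hermitian(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
B = A @ X                                  # makes AX = B consistent
D = Y @ C                                  # makes YC = D consistent

F = P - Q @ X @ Q.conj().T - T @ Y @ T.conj().T   # f(X, Y) = P - QXQ* - TYT*
print(np.allclose(F, F.conj().T))                 # f(X, Y) is Hermitian
w = np.linalg.eigvalsh(F)
print(int((w > 1e-10).sum()), int((w < -1e-10).sum()))   # inertia (n_plus, n_minus)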