We show that Discriminant Analysis can be considered a branch of Statistical Decision Theory when viewed from a Bayesian approach. First we present the necessary measure-theoretic results; next we briefly outline the foundations of Bayesian Inference before developing Discriminant Analysis as an application of Bayesian Estimation. Our approach renders Discriminant Analysis more flexible, since it allows classifying an element as belonging to a group of populations. This possibility arises from the introduction of the concept of regions of controlled posterior risk.
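As a minimal sketch of the idea behind regions of controlled posterior risk (the function name, the simple "accumulate posteriors until the risk bound is met" rule, and the parameterization are illustrative assumptions, not the abstract's exact construction): an element is assigned not to a single population but to the smallest group of populations whose combined posterior probability keeps the risk of misclassification below a chosen level.

```python
import numpy as np

def classify_with_risk_region(posteriors, risk=0.05):
    """Assign an element to the smallest group of populations whose
    combined posterior probability is at least 1 - risk.

    Illustrative sketch: populations are added in decreasing order of
    posterior probability until the residual (posterior) risk of the
    group falls below `risk`.
    """
    order = np.argsort(posteriors)[::-1]   # most probable populations first
    group, mass = [], 0.0
    for k in order:
        group.append(int(k))
        mass += posteriors[k]
        if mass >= 1.0 - risk:             # group's posterior risk controlled
            break
    return group

# Posteriors over three populations; no single population is decisive,
# so the element is classed into the group {0, 1}.
print(classify_with_risk_region(np.array([0.55, 0.40, 0.05]), risk=0.1))
```

With a decisive posterior (say one population carrying mass above `1 - risk`), the rule reduces to ordinary Bayesian classification into a single population.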
The model $y = \sum_{j=1}^{w} X_j \beta_j + e$ is generalized orthogonal if the orthogonal projection matrices on the range spaces of the matrices $X_j$, $j = 1, \dots, w$, commute. Unbiased estimators are obtained for the variance components of such models with cross-nesting.
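The defining commutation condition can be checked numerically. The sketch below (function names and the balanced one-way example are illustrative assumptions) builds the orthogonal projector onto each range space from an orthonormal basis and tests all pairs:

```python
import numpy as np

def range_projector(X):
    """Orthogonal projector onto the range (column space) of X,
    built from an orthonormal basis Q obtained by QR factorization."""
    Q, _ = np.linalg.qr(X)
    return Q @ Q.T

def is_generalized_orthogonal(design_matrices, tol=1e-10):
    """Check the defining condition of a generalized orthogonal model:
    the range-space projectors of all X_j must commute pairwise."""
    P = [range_projector(X) for X in design_matrices]
    return all(np.allclose(P[i] @ P[j], P[j] @ P[i], atol=tol)
               for i in range(len(P)) for j in range(i + 1, len(P)))

# Balanced one-way layout: X1 spans the overall mean, X2 the group means.
# Because range(X1) is contained in range(X2), the projectors commute.
X1 = np.ones((4, 1))
X2 = np.kron(np.eye(2), np.ones((2, 1)))
print(is_generalized_orthogonal([X1, X2]))
```

In unbalanced crossed layouts the projectors generally fail to commute, which is exactly where the generalized orthogonal structure breaks down.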
Stable hypotheses are hypotheses that may refer either to the fixed part or to the random part of the model. We will consider such hypotheses for models with balanced cross-nesting. Generalized F tests will be derived. It will be shown how to use Monte Carlo methods to evaluate p-values for these tests.
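A generalized F statistic is typically a ratio of positive linear combinations of independent chi-square variables, whose exact null distribution is intractable; a Monte Carlo p-value is then the simulated null-tail probability. The sketch below assumes this ratio-of-combinations form and an illustrative parameterization (coefficient vectors and degrees of freedom passed explicitly); it is not the paper's specific construction:

```python
import numpy as np

def monte_carlo_p_value(observed, num_coefs, den_coefs, num_dfs, den_dfs,
                        n_sim=100_000, rng=None):
    """Monte Carlo p-value for a generalized F statistic of the form
    (sum_i a_i * chi2(df_i)) / (sum_j b_j * chi2(df_j)).

    Simulates n_sim draws of the statistic under the null and returns
    the proportion at least as large as the observed value.
    """
    rng = np.random.default_rng(rng)
    top = sum(a * rng.chisquare(df, n_sim)
              for a, df in zip(num_coefs, num_dfs))
    bot = sum(b * rng.chisquare(df, n_sim)
              for b, df in zip(den_coefs, den_dfs))
    return np.mean(top / bot >= observed)
```

With single terms scaled by the reciprocal degrees of freedom, the statistic reduces to an ordinary F ratio, which gives a convenient sanity check against the exact F distribution.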
Commutative Jordan algebras are used to express the structure of mixed orthogonal models and to derive complete sufficient statistics. From these statistics, UMVUE (Uniformly Minimum Variance Unbiased Estimators) are derived for the relevant parameters, first for single models and then for several such models. Since these models may correspond to experiments designed separately, our results may be seen as a contribution to meta-analysis.