The problem of estimating unknown parameters of Markov-additive processes from data observed up to a random stopping time is considered. An intermediate approach between the Bayes and the minimax principles is applied, in which it is assumed that only vague prior information on the distribution of the unknown parameters is available. The loss is assumed to consist of the estimation error (measured by a weighted squared loss function) together with a cost of observing the process up to the stopping time. Several classes of optimal sequential procedures are obtained explicitly in the case when the available information on the prior distribution is restricted to a set Γ determined by certain moment-type conditions imposed on the prior distributions.
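As an illustrative formalization only (the symbols θ, d, τ, W, c, R, and the linear observation cost below are assumed notation, not taken from the paper), the loss structure and the Γ-minimax criterion described above may be sketched as follows:

% Hedged sketch of the loss and the Gamma-minimax criterion; all symbols
% (theta, d, tau, W, c, R) and the linear cost c*tau are assumptions.
\[
  L(\theta, d, \tau) \;=\; W(\theta)\,(d - \theta)^{2} \;+\; c\,\tau,
\]
% W(theta) >= 0 weights the squared estimation error, d is the terminal
% estimator, tau is the stopping time, and c > 0 is the cost per unit of
% observation. A sequential procedure (tau*, d*) is Gamma-minimax optimal if
\[
  \sup_{\pi \in \Gamma} R(\pi, \tau^{*}, d^{*})
  \;=\;
  \inf_{(\tau, d)}\;\sup_{\pi \in \Gamma} R(\pi, \tau, d),
  \qquad
  R(\pi, \tau, d) \;=\; \mathbb{E}_{\pi}\!\left[\mathbb{E}_{\theta}\, L(\theta, d, \tau)\right],
\]
% where Gamma is the set of prior distributions pi satisfying the
% moment-type conditions referred to in the abstract.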