In process robustness studies, it is desirable to minimize the influence of noise factors on the system while simultaneously determining the levels of the controllable factors that optimize the overall response. When a random effects model is applicable but a fixed effects model is assumed instead, an increase in the variance of the coefficient vector should be expected. This paper investigates the impact of this assumption on the results of the experiment in the context of robust parameter design. Furthermore, two criteria are considered for determining the optimum settings of the control factors. To illustrate the proposed method and evaluate its performance, a numerical example for the 'smaller the better' case is included.
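As background for the 'smaller the better' case mentioned above, the following sketch computes the standard Taguchi signal-to-noise ratio for that case, SN = -10 log10(mean(y^2)), and selects the control-factor setting with the largest SN. The settings and response values are purely hypothetical illustrations, not data from the paper, and the two criteria studied in the paper are not reproduced here.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi signal-to-noise ratio for the 'smaller the better' case:
    SN = -10 * log10(mean(y^2)). Larger SN is better (responses nearer zero)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical replicated responses at three control-factor settings,
# observed across noise-factor conditions (illustrative numbers only).
settings = {
    "A": [2.1, 2.4, 2.0, 2.6],
    "B": [1.2, 1.5, 1.1, 1.4],
    "C": [1.3, 3.0, 0.8, 2.7],
}

sn = {name: sn_smaller_the_better(y) for name, y in settings.items()}
best = max(sn, key=sn.get)  # setting with the largest S/N ratio
```

Note that the S/N ratio penalizes both the mean level and the dispersion of the response, which is why setting "C", despite a comparable average, scores worse than the more consistent "B".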