Title :
Selection procedures with standardized time series variance estimators
Author :
Goldsman, David ; Marshall, William S.
Author_Institution :
Sch. of Ind. & Syst. Eng., Georgia Inst. of Technol., Atlanta, GA, USA
fDate :
1999
Abstract :
Studies a modification of Y. Rinott's (1978) two-stage procedure for selecting the normal population with the largest (or smallest) mean. The modification, which is appropriate for use in the simulation environment, replaces the usual batch means (BM) variance estimator in the procedure's first stage with variance estimators arising from the method of standardized time series (STS). On the plus side, certain STS estimators have more degrees of freedom than the BM estimator. On the other hand, STS variance estimators tend to require larger sample sizes than the BM estimator in order to converge to their assumed distributions. These considerations result in trade-offs involving the procedure's achieved probability of correct selection as well as its expected sample size.
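Note :
Not part of the original record: a minimal Python sketch contrasting the batch means (BM) variance estimator with a batched STS area estimator of the variance parameter sigma^2 = lim n*Var(Ybar_n), the two first-stage estimators the abstract compares. The constant weight sqrt(12) follows Schruben's area estimator; the function names and the i.i.d. test data are illustrative assumptions, not the paper's experimental setup. With b batches, the batched area estimator contributes one chi-squared degree of freedom per batch (b total) versus b-1 for BM, which is the degrees-of-freedom advantage the abstract mentions.

import numpy as np

def batch_means_variance(y, b):
    """BM estimator of sigma^2 from b nonoverlapping batches; b-1 d.o.f."""
    n = len(y)
    m = n // b                                # batch size (assumes b divides n)
    batches = y[:b * m].reshape(b, m)
    batch_means = batches.mean(axis=1)
    grand_mean = batches.mean()
    return m * np.sum((batch_means - grand_mean) ** 2) / (b - 1)

def sts_area_variance(y, b):
    """Batched STS area estimator (constant weight sqrt(12)); b d.o.f."""
    n = len(y)
    m = n // b
    estimates = []
    for j in range(b):
        batch = y[j * m:(j + 1) * m]
        ybar_m = batch.mean()
        k = np.arange(1, m + 1)
        running_means = np.cumsum(batch) / k
        # sigma * T(k/m) = k * (Ybar_m - Ybar_k) / sqrt(m)
        sts = k * (ybar_m - running_means) / np.sqrt(m)
        area = np.sqrt(12.0) * sts.mean()     # Riemann sum of w(t)*sigma*T(t)
        estimates.append(area ** 2)           # ~ sigma^2 * chi-squared(1) per batch
    return np.mean(estimates)                 # average over b batches

rng = np.random.default_rng(0)
y = rng.normal(size=4096)                     # i.i.d. N(0,1) stand-in, sigma^2 = 1
print(batch_means_variance(y, b=16), sts_area_variance(y, b=16))

Both functions return estimates near 1 for this i.i.d. example; on autocorrelated simulation output the STS estimator typically needs larger batches before its chi-squared approximation holds, as the abstract notes.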
Keywords :
convergence; estimation theory; normal distribution; simulation; time series; batch means; correct selection probability; degrees of freedom; expected sample size; normal population; sample size; standardized time series; statistical selection procedures; tradeoffs; variance estimators; Computer simulation; Diseases; Drugs; Manufacturing; Modeling; Nominations and elections; Probability; Sociotechnical systems; Systems engineering and theory; Terminology;
Conference_Title :
1999 Winter Simulation Conference Proceedings
Conference_Location :
Phoenix, AZ
Print_ISBN :
0-7803-5780-9
DOI :
10.1109/WSC.1999.823099