# Mixture Experiments and Conditional Inference

• Ishwar V. Basawa
• David John Scott
Part of the Lecture Notes in Statistics book series (LNS, volume 17)

## Abstract

Consider a simple mixture experiment conducted in two stages as follows: $V$ is a positive random variable with density $p_V(v)$; in the first stage of the experiment the value $v$ of $V$ is observed; at the second stage, a random sample $X(n) = (X_1, X_2, \ldots, X_n)$ is drawn from a normal density with an unknown mean $\alpha$ and variance $v$. Our aim is to draw inference about $\alpha$. If we consider $(X(n), v)$ as our sample, the joint likelihood is given by
$$\left\{ p_{X(n)\,|\,V}\bigl(x(n)\,|\,V = v;\, \alpha\bigr) \right\} p_V(v)$$
where we assume that $p_V$ is free of $\alpha$. The usual conditionality principle (see e.g. Cox and Hinkley (1974), p. 38) then suggests that inference about $\alpha$ should be based on the density $p_{X(n)\,|\,V}$, where the value of $v$ is known. Suppose now that the value $v$ is unknown to the experimenter, even though it is known that the first stage of the experiment has been performed. We then have only $X(n)$ as our sample, together with the information that the experiment on $V$ has been performed. The conditionality principle remains in force: we may treat $v$ as an unknown nuisance parameter and use the density $p_{X(n)\,|\,V}$ for inference about $\alpha$.
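The factorisation above can be illustrated numerically: because the marginal term $p_V(v)$ does not involve $\alpha$, likelihood comparisons between values of $\alpha$ are identical whether made with the joint or the conditional likelihood. The sketch below assumes, purely for illustration, that $V$ is exponentially distributed (the abstract only requires $V > 0$); the function names are hypothetical.

```python
import math
import random

random.seed(0)

# Two-stage mixture experiment (sketch). The exponential law for V is
# an illustrative assumption; the text only requires V to be positive.
def run_experiment(alpha, n, rate=1.0):
    v = random.expovariate(rate)             # stage 1: observe V = v
    x = [random.gauss(alpha, math.sqrt(v))   # stage 2: X_i ~ N(alpha, v)
         for _ in range(n)]
    return v, x

def log_normal_density(xi, mean, var):
    return -0.5 * math.log(2 * math.pi * var) - (xi - mean) ** 2 / (2 * var)

# Conditional log-likelihood: log p_{X(n)|V}(x(n) | V = v; alpha)
def cond_loglik(alpha, x, v):
    return sum(log_normal_density(xi, alpha, v) for xi in x)

# Joint log-likelihood = conditional log-likelihood + log p_V(v);
# the second term is free of alpha.
def joint_loglik(alpha, x, v, rate=1.0):
    return cond_loglik(alpha, x, v) + (math.log(rate) - rate * v)

v, x = run_experiment(alpha=2.0, n=50)

# Likelihood-ratio comparisons in alpha coincide: the p_V(v) term cancels,
# so inference about alpha can be based on the conditional density alone.
d_joint = joint_loglik(2.0, x, v) - joint_loglik(1.0, x, v)
d_cond = cond_loglik(2.0, x, v) - cond_loglik(1.0, x, v)
assert abs(d_joint - d_cond) < 1e-9
```

The final assertion makes the conditionality argument concrete: any inference procedure built from likelihood ratios in $\alpha$ gives the same answer whether or not the marginal density of $V$ is included.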

## Keywords

Likelihood Function Nuisance Parameter Exponential Family Exponential Random Variable Conditional Likelihood

© Springer-Verlag New York Inc. 1983

## Authors and Affiliations

• Ishwar V. Basawa¹
• David John Scott¹

1. Department of Mathematical Statistics, La Trobe University, Bundoora, Australia