# Mixture Experiments and Conditional Inference

## Abstract

Consider a simple mixture experiment conducted in two stages as follows: V is a positive random variable with density p_V(v); in the first stage of the experiment the value v of V is observed; at the second stage, a random sample X(n) = (X_1, X_2, ..., X_n) is drawn from a normal density with an unknown mean α and variance v. Our aim is to draw inference about α. If we consider (X(n), v) as our sample, the joint likelihood is given by

$$p_{X(n) \mid V}\bigl(x(n) \mid V = v;\ \alpha\bigr)\, p_{V}(v),$$

where p_V is free from α. The usual conditionality principle (see e.g. Cox and Hinkley (1974), p. 38) then suggests that the inference about α should be based on the density p_{X(n) | V}, where the value of v is known. Suppose now that the value v is unknown to the experimenter even though it is known that the first stage of the experiment has been performed. We then have only X(n) as our sample and the information that the experiment on V has been performed. The conditionality principle will still be in force; we may treat v as an unknown nuisance parameter and use the density p_{X(n) | V} for inference about α.

## Keywords
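The two-stage experiment above can be sketched in a short simulation. This is a minimal illustration, not part of the original chapter: the density p_V is here taken, purely as an assumption for the example, to be Exponential(1), and the true mean α is fixed at 2.0. Given V = v, the conditional MLE of α is the sample mean, with standard error sqrt(v / n).

```python
import random
import statistics

# Hypothetical sketch of the two-stage mixture experiment.
# Assumptions (not from the source): p_V is Exponential(1),
# and the true mean alpha is 2.0.
random.seed(0)

ALPHA_TRUE = 2.0
N = 1_000

# Stage 1: draw the variance v of the second stage from p_V.
v = random.expovariate(1.0)

# Stage 2: draw X(n) = (X_1, ..., X_n) from N(alpha, v).
x = [random.gauss(ALPHA_TRUE, v ** 0.5) for _ in range(N)]

# Conditional inference about alpha given V = v: the conditional
# MLE is the sample mean, with standard error sqrt(v / n).
alpha_hat = statistics.fmean(x)
se = (v / N) ** 0.5

print(f"v = {v:.3f}, alpha_hat = {alpha_hat:.3f} (se {se:.3f})")
```

Note that the point estimate (the sample mean) does not depend on v; knowledge of v matters only for assessing its precision, which is exactly where the conditionality principle bears on the problem when v is unobserved.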

Likelihood Function, Nuisance Parameter, Exponential Family, Exponential Random Variable, Conditional Likelihood
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

## Copyright information

© Springer-Verlag New York Inc. 1983