Problems of Stability
Let a real system S be modeled by an equation like (1.3). A particular evolution \(x(t, t_0, x_0)\) of S is completely determined by assigning the Cauchy datum \((t_0, x_0)\). However, this datum is obtained by an experimental procedure and is therefore affected by an error. If a “small” difference in the initial data leads to a new solution that remains “close” to the previous one, \(x(t, t_0, x_0)\) is said to be stable in the sense of Liapunov (see, e.g., , , , ).

To make the notion of Liapunov stability precise, one has to attribute a meaning to terms such as small and close. Since x is a point of \( \Re^n \), the Euclidean norm can be used to evaluate the difference between two solutions or between two initial data.

In particular, the stability property can be referred to an equilibrium solution. By introducing the new unknown \(x - x(t, t_0, x_0)\), the stability analysis of any solution is always reduced to the analysis of equilibrium stability. An equilibrium position that is not stable is called unstable: in that case, arbitrarily near the equilibrium there are initial data whose corresponding solutions move definitively away from it. Finally, the equilibrium is asymptotically stable if it is stable and, in addition, the solutions associated with initial data in a neighborhood of the equilibrium tend to the equilibrium position as the independent variable goes to infinity.

We remark that a stable equilibrium is observable, whereas an unstable one is not, owing to the presence of inevitable perturbations.
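The change of unknown mentioned above can be made explicit; the following is a standard computation, where f denotes the right-hand side of the evolution equation and \(\bar{x}(t)\) the solution whose stability is studied:

\[
y(t) = x(t) - \bar{x}(t), \qquad
\dot{y} = f(t,\, y + \bar{x}(t)) - f(t,\, \bar{x}(t)) \equiv g(t, y),
\qquad g(t, 0) = 0 ,
\]

so that the solution \(\bar{x}(t)\) of the original system corresponds to the equilibrium \(y = 0\) of the transformed one, and it suffices to study the stability of that equilibrium.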
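A minimal numerical sketch of the distinction between an asymptotically stable and an unstable equilibrium, using the hypothetical scalar model \(\dot{x} = a\,x\) (not an equation from the text): for \(a = -1\) the equilibrium \(x = 0\) attracts nearby initial data, while for \(a = +1\) a small error in the Cauchy datum is amplified and the solution moves definitively away.

```python
def integrate(a, x0, dt=0.01, t_end=10.0):
    """Integrate the scalar ODE x' = a*x from x(0) = x0 by forward Euler."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * a * x
        t += dt
    return x

eps = 1e-3  # a small perturbation of the initial datum away from x = 0

stable = integrate(-1.0, eps)    # a < 0: solution decays toward x = 0
unstable = integrate(+1.0, eps)  # a > 0: the same perturbation is amplified

print(abs(stable) < eps)     # the perturbation has shrunk
print(abs(unstable) > 1.0)   # the perturbation has grown by orders of magnitude
```

Repeating the experiment with ever smaller `eps` leaves the qualitative picture unchanged: this is exactly why, as the text remarks, an unstable equilibrium is not observable in practice, since inevitable perturbations play the role of `eps`.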