Differential Equations with Random Delay
The Multiplicative Ergodic Theorem of Oseledets on the Lyapunov spectrum and the Oseledets subspaces is extended to linear random differential equations with random delay, using a recent result of Lian and Lu. Random differential equations with bounded delay are discussed as an example.
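As an illustration of the class of equations in question (the notation here is a sketch, not taken from the paper: the coefficient maps $A$, $B$, the delay $r$, and the driving flow $\theta$ are assumed names), a linear random differential equation with bounded random delay can be written as

```latex
% Linear random delay differential equation, driven by an ergodic flow
% (\theta_t)_{t\in\mathbb{R}} on a probability space (\Omega,\mathcal{F},\mathbb{P}):
\dot{x}(t) = A(\theta_t\omega)\, x(t) + B(\theta_t\omega)\, x\bigl(t - r(\theta_t\omega)\bigr),
% where the random delay is bounded, 0 \le r(\omega) \le h for some constant h > 0,
% so solutions are determined by segments x_t \in C([-h,0],\mathbb{R}^d).
```

In this setting the Multiplicative Ergodic Theorem concerns the Lyapunov exponents and invariant (Oseledets) subspaces of the solution operator acting on the infinite-dimensional state space of segments, which is why a result such as that of Lian and Lu for operators on Banach spaces is needed.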
The authors were supported in part by DFG Emmy Noether Grant Si801/1-3.
Received 4/16/2009; Accepted 2/14/2010