Imagine a large building with many corridors. A robot cleans these corridors in a greedy fashion: the next corridor cleaned is always the dirtiest corridor incident to the robot's current location. We determine bounds on the minimum number s(G) and maximum number S(G) of time steps (over all edge weightings) before every edge of a graph G has been cleaned. We show that Eulerian graphs have a self-stabilizing property that holds for any initial edge weighting: after the initial cleaning of all edges, every subsequent cleaning of all edges requires exactly s(G) time steps. Finally, we show that the only self-stabilizing trees are a subset of the superstars.
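The greedy rule above can be sketched in a few lines of Python. This is an illustrative simulation only: the dirt-accumulation rule used here (every uncleaned corridor gains one unit of dirt per time step) and the function name `greedy_clean_steps` are assumptions for the sake of a runnable example, not the paper's exact model.

```python
def greedy_clean_steps(adj, weight, start):
    """Simulate the greedy cleaning robot on an undirected graph.

    adj    : dict mapping each vertex to a list of its neighbours
    weight : dict mapping each edge (a frozenset of two vertices)
             to its initial dirt level
    start  : the robot's starting vertex

    At each time step the robot traverses the dirtiest edge incident
    to its current vertex (ties broken arbitrarily) and cleans it;
    every other edge then gains one unit of dirt (an assumed rule).
    Returns the number of steps until every edge has been cleaned.
    """
    edges = {frozenset((u, v)) for u in adj for v in adj[u]}
    cleaned = set()
    w = dict(weight)
    pos = start
    steps = 0
    while cleaned != edges:
        # Pick the dirtiest edge incident to the current position.
        e = max((frozenset((pos, v)) for v in adj[pos]),
                key=lambda f: w[f])
        cleaned.add(e)
        w[e] = 0
        # All other edges accumulate one unit of dirt (assumption).
        for f in edges:
            if f != e:
                w[f] += 1
        # Move to the other endpoint of the cleaned edge.
        pos = next(iter(e - {pos}))
        steps += 1
    return steps


# Example: a triangle with distinct initial dirt levels.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
weight = {frozenset((0, 1)): 3,
          frozenset((0, 2)): 2,
          frozenset((1, 2)): 1}
print(greedy_clean_steps(adj, weight, 0))  # -> 3
```

On the triangle every greedy walk happens to clean all three edges in three steps; on general graphs the step count depends on the initial weighting, which is exactly the spread between s(G) and S(G) studied above.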
Keywords: Cleaning process · Searching · Greedy algorithms · Edge traversing