The robot cleans up



Imagine a large building with many corridors. A robot cleans these corridors in a greedy fashion: the next corridor cleaned is always the dirtiest corridor incident to the robot's current location. We determine bounds on the minimum, s(G), and maximum, S(G), number of time steps (over all edge weightings) before every edge of a graph G has been cleaned. We show that Eulerian graphs have a self-stabilizing property that holds for any initial edge weighting: after the initial cleaning of all edges, every subsequent cleaning requires exactly s(G) time steps. Finally, we show that the only self-stabilizing trees are a subset of the superstars.
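The greedy rule can be simulated directly. The sketch below is one plausible reading of the model, not the authors' definition: an edge's weight is its current dirtiness, the robot always traverses the dirtiest edge incident to its current vertex, and each traversal takes one time step during which every other edge gets one unit dirtier. The function `greedy_clean`, the tie-breaking rule (lowest-numbered neighbour), and the `max_steps` cutoff are illustrative assumptions.

```python
from itertools import count

def greedy_clean(adj, weights, start, max_steps=10_000):
    """Simulate the greedy robot on an undirected graph.

    adj     : dict mapping each vertex to a list of its neighbours
    weights : dict mapping each edge frozenset({u, v}) to its initial
              dirtiness (the assumed edge weighting)
    start   : the robot's starting vertex

    Returns the number of time steps until every edge has been cleaned
    at least once, or None if max_steps is exceeded.
    """
    dirt = dict(weights)   # current dirtiness of each edge
    cleaned = set()        # edges cleaned at least once
    v = start
    for t in count(1):
        if t > max_steps:
            return None
        # Greedy step: pick the dirtiest edge incident to v
        # (ties broken by lowest-numbered neighbour -- an assumption).
        w = max(adj[v], key=lambda u: (dirt[frozenset((v, u))], -u))
        e = frozenset((v, w))
        cleaned.add(e)
        # One time step passes: every edge gets dirtier, then the
        # traversed edge is reset to clean.
        for f in dirt:
            dirt[f] += 1
        dirt[e] = 0
        v = w
        if len(cleaned) == len(dirt):
            return t
```

On a triangle with distinct initial weights, for example, the robot cleans all three edges in three steps, which matches the intuition that on an Eulerian graph the greedy walk can trace an Euler tour.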


Keywords: Cleaning process · Searching · Greedy algorithms · Edge traversing



Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  1. Department of Mathematics & Statistics, Dalhousie University, Halifax, Canada
