Learning Rules from Distributed Data

  • Lawrence O. Hall
  • Nitesh Chawla
  • Kevin W. Bowyer
  • W. Philip Kegelmeyer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1759)


In this paper we raise a concern about the accuracy, as a function of the degree of parallelism, of a certain class of distributed learning algorithms, and illustrate one proposed improvement. We focus on learning a single model, a set of rules, from disjoint data sets distributed across a set of computers. The data sets may be disjoint for any of several reasons. In our approach, the first step is to construct a rule set (model) for each of the original disjoint data sets. Rule sets are then merged until a final rule set is obtained which models the aggregate data. We show that this approach is comparable in accuracy to creating a rule set directly from the aggregate data, and promises faster learning. Accuracy can drop off as the degree of parallelism increases; however, we have developed an approach that extends the degree of parallelism achievable before this problem takes over.
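The learn-then-merge scheme described above can be sketched as follows. This is a minimal illustration, not the authors' actual C4.5-based procedure: `learn_rules` and `merge_rule_sets` are hypothetical stand-ins (here a toy one-rule-per-example learner and a simple deduplicating union), while the pairwise merging loop reflects the structure of the approach.

```python
def learn_rules(partition):
    # Toy learner: one rule per distinct (attribute-values, label) pair.
    # A real system would run a rule inducer (e.g. C4.5rules) here.
    return {(tuple(sorted(x.items())), y) for x, y in partition}

def merge_rule_sets(rs_a, rs_b):
    # Toy merge: union with deduplication. A real merge step would also
    # resolve conflicting or redundant rules against the data.
    return rs_a | rs_b

def learn_distributed(partitions):
    # Step 1: learn a rule set on each disjoint partition.
    # These calls are independent and can run in parallel.
    rule_sets = [learn_rules(p) for p in partitions]
    # Step 2: merge rule sets pairwise until one aggregate model remains.
    while len(rule_sets) > 1:
        merged = [merge_rule_sets(rule_sets[i], rule_sets[i + 1])
                  for i in range(0, len(rule_sets) - 1, 2)]
        if len(rule_sets) % 2:          # odd count: carry the last set over
            merged.append(rule_sets[-1])
        rule_sets = merged
    return rule_sets[0]
```

With more partitions the per-partition learning step shrinks (the promised speedup), while the quality of the final model depends on how well the merge step reconciles rules learned from smaller samples.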





Copyright information

© Springer-Verlag Berlin Heidelberg 2000

Authors and Affiliations

  • Lawrence O. Hall (1)
  • Nitesh Chawla (1)
  • Kevin W. Bowyer (1)
  • W. Philip Kegelmeyer (2)

  1. Department of Computer Science and Engineering, ENB 118, University of South Florida, Tampa
  2. Advanced Concepts Department, Sandia National Laboratories, Livermore
