Cooperative Component Testing Architecture in Collaborating Network Environment

  • Gaeil An
  • Joon S. Park
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4610)


In a large distributed enterprise, multiple organizations may collaborate to provide software components that they develop and maintain under their own policies. When a local system downloads a component from a remote system in such an environment, the component should be checked for internal failures or malicious code before it is executed locally. Although the software was tested by the original developer in its own environment, we cannot simply assume that it will work correctly and safely in other organizations’ computing environments. Furthermore, malicious code may have been added to the original component, whether by mistake or intentionally. To address this problem, we propose a cooperative component-testing architecture that consists of three testing schemes: provider-node testing, multiple-aspect testing, and cooperative testing. The proposed architecture detects malicious code in a component both effectively and efficiently. Provider-node testing increases the likelihood of choosing the cleanest (least infected) component among the copies available on multiple remote systems. Multiple-aspect testing improves the ability to detect faults or malicious content. The cooperative testing scheme achieves fast detection by integrating the detection schemes effectively. Finally, we simulate the proposed ideas and provide a performance evaluation.
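The interplay of the three schemes can be sketched in Python. This is a minimal illustration only, not the authors' implementation: the detector functions, the hash-based checks, and the candidate format are all hypothetical, and a real system would use far richer aspect tests.

```python
import hashlib

# Hypothetical "aspect" detectors; names and checks are illustrative only.
def signature_check(component: bytes, known_bad: set) -> bool:
    """Flag the component if its hash matches a known malicious signature."""
    return hashlib.sha256(component).hexdigest() in known_bad

def integrity_check(component: bytes, published_digest: str) -> bool:
    """Flag the component if it no longer matches the provider's published digest."""
    return hashlib.sha256(component).hexdigest() != published_digest

KNOWN_BAD = {hashlib.sha256(b"evil payload").hexdigest()}

def multiple_aspect_test(component: bytes, published_digest: str) -> bool:
    """Multiple-aspect testing: examine the component from several angles.
    Cooperative ordering: run detectors in sequence and stop as soon as any
    aspect flags the component, which keeps detection fast."""
    aspects = (
        lambda c: signature_check(c, KNOWN_BAD),
        lambda c: integrity_check(c, published_digest),
    )
    return any(detect(component) for detect in aspects)

def provider_node_test(candidates):
    """Provider-node testing: among copies of a component offered by multiple
    remote providers, return the first one that passes all aspect tests."""
    for provider, component, digest in candidates:
        if not multiple_aspect_test(component, digest):
            return provider, component
    return None  # no clean copy available

clean = b"legitimate component"
clean_digest = hashlib.sha256(clean).hexdigest()
candidates = [
    ("nodeA", b"evil payload", clean_digest),  # infected copy
    ("nodeB", clean, clean_digest),            # clean copy
]
print(provider_node_test(candidates)[0])  # nodeB
```

In this sketch, nodeA's copy is rejected by the signature aspect, so the local system falls back to nodeB's copy, mirroring how provider-node testing selects the cleanest available component.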


Keywords: Detection Accuracy · Testing Scheme · Trojan Horse · Malicious Code · Malicious User





Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  1. Gaeil An — Electronics and Telecommunications Research Institute (ETRI), 161 Gajeong-Dong, Yuseong-Gu, Daejeon 305-350, Korea
  2. Joon S. Park — The Laboratory for Applied Information Security Technology (LAIST), School of Information Studies, Syracuse University, Syracuse, NY 13244-4100, USA
