
Automatic Modulation Classification Using Induced Class Hierarchies and Deep Learning

  • Toluwanimi Odemuyiwa
  • Birsen Sirkeci-Mergen
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1130)

Abstract

In this work, we contribute to the emerging field of deep learning (DL) methods for automatic modulation classification (AMC) in cognitive radios. Traditional AMC methods rely on expert-based knowledge of the wireless channel and incoming signals. These methods suffer from a lack of generalizability to real-world channels that may be severely impaired or unknown. DL does not require a priori or expert-based knowledge and has seen success in other fields such as image processing and natural language processing. In recent years, DL has been explored as an alternative to traditional methods; however, currently proposed DL AMC methods suffer from long training times due to the many layers used to improve classification accuracy. We propose the use of induced class hierarchies to decompose the AMC task into subcomponents, while still maintaining deep architectures for improved classification accuracy. A publicly available, synthetic radio data set is used, which models severe channel impairments under a range of signal-to-noise ratio (SNR) levels. Three hierarchical convolutional neural network (h-CNN) architectures are developed: a single-level, baseline model; a two-level hierarchical model, termed model A; and a three-level hierarchical model, termed model B. Model A achieves a 4% improvement in classification accuracy over the baseline model, while model B maintains comparable accuracy. Moreover, the training times of both models are reduced from the baseline model: a 50% improvement with model A and a 28.6% improvement with model B.
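The core idea of the two-level hierarchy (model A) is that a sample first passes through a coarse classifier over induced superclasses, and then through a fine classifier that only discriminates among that superclass's members. The sketch below illustrates just this routing on synthetic 2-D features with nearest-centroid classifiers; the class names, centroids, and the analog/digital grouping are illustrative assumptions, and the paper's actual models are CNNs trained on raw I/Q samples with hierarchies induced from a confusion matrix.

```python
import numpy as np

# Hypothetical stand-in: 4 fine modulation classes grouped into 2 induced
# superclasses, mirroring the paper's two-level model A.
HIERARCHY = {"analog": ["AM", "FM"], "digital": ["BPSK", "QPSK"]}

# Synthetic 2-D feature centroids for each fine class (the real features
# would be learned by convolutional layers from I/Q samples).
CENTROIDS = {
    "AM":   np.array([0.0, 0.0]),
    "FM":   np.array([0.0, 4.0]),
    "BPSK": np.array([6.0, 0.0]),
    "QPSK": np.array([6.0, 4.0]),
}

def nearest(x, labels):
    """Nearest-centroid decision restricted to the given labels."""
    return min(labels, key=lambda c: np.linalg.norm(x - CENTROIDS[c]))

def classify(x):
    # Level 1: coarse classifier picks a superclass, here represented by the
    # mean of its member centroids.
    super_centroids = {s: np.mean([CENTROIDS[c] for c in kids], axis=0)
                       for s, kids in HIERARCHY.items()}
    sup = min(super_centroids,
              key=lambda s: np.linalg.norm(x - super_centroids[s]))
    # Level 2: a per-superclass fine classifier only sees that superclass's
    # children, so each stage solves a smaller problem.
    return sup, nearest(x, HIERARCHY[sup])

print(classify(np.array([5.5, 3.8])))  # routed to "digital", then "QPSK"
```

Because each stage decides among fewer classes, the per-stage models can be smaller, which is the intuition behind the reported training-time reductions.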

Keywords

Cognitive radio · Deep learning · Hierarchical classification

References

  1. Dobre, O.A., Abdi, A., Bar-Ness, Y., Su, W.: Survey of automatic modulation classification techniques: classical approaches and new trends. IET Commun. 1(2), 137–156 (2007)
  2. Hornik, K., Stinchcombe, M., White, H.: Multilayer feedforward networks are universal approximators. Neural Netw. 2(5), 359–366 (1989)
  3. O’Shea, T.J., West, N.: Radio machine learning dataset generation with GNU Radio. In: Proceedings of the GNU Radio Conference, vol. 1, no. 1 (2016)
  4. Karra, K., Kuzdeba, S., Petersen, J.: Modulation recognition using hierarchical deep neural networks. In: 2017 IEEE International Symposium on Dynamic Spectrum Access Networks (DySPAN) (2017). https://doi.org/10.1109/dyspan.2017.7920746
  5. Silva-Palacios, D., Ferri, C., Ramírez-Quintana, M.J.: Improving performance of multiclass classification by inducing class hierarchies. Procedia Comput. Sci. 108, 1692–1701 (2017). https://doi.org/10.1016/j.procs.2017.05.218
  6. O’Shea, T.J., Corgan, J., Clancy, T.C.: Convolutional radio modulation recognition networks. In: Engineering Applications of Neural Networks. Communications in Computer and Information Science, pp. 213–226 (2016). https://doi.org/10.1007/978-3-319-44188-7_16
  7. O’Shea, T., Hoydis, J.: An introduction to deep learning for the physical layer. IEEE Trans. Cogn. Commun. Netw. 3(4), 563–575 (2017). https://doi.org/10.1109/tccn.2017.2758370
  8. Datasets: DeepSig Inc. https://www.deepsig.io/datasets
  9. Liu, X., Yang, D., Gamal, A.E.: Deep neural network architectures for modulation classification. In: 2017 51st Asilomar Conference on Signals, Systems, and Computers (2017). https://doi.org/10.1109/acssc.2017.8335483
  10. Ramjee, S., Ju, S., Yang, D., Liu, X., Gamal, A., Eldar, Y.C.: Fast deep learning for automatic modulation classification. J. Sel. Areas Commun. (2019)
  11.
  12. Colaboratory: Frequently asked questions. https://research.google.com/colaboratory/faq.html
  13. Carneiro, T., Da Nobrega, R.V.M., Nepomuceno, T., Bian, G.-B., De Albuquerque, V.H.C., Reboucas Filho, P.P.: Performance analysis of Google Colaboratory as a tool for accelerating deep learning applications. IEEE Access 6, 61677–61685 (2018). https://doi.org/10.1109/access.2018.2874767
  14.
  15. Malik, U.: Hierarchical clustering with Python and scikit-learn (2019). https://stackabuse.com/hierarchical-clustering-with-python-and-scikit-learn/
  16. Saraçlı, S., Doğan, N., Doğan, İ.: Comparison of hierarchical cluster analysis methods by cophenetic correlation. J. Inequal. Appl. 2013, 203 (2013). https://doi.org/10.1186/1029-242x-2013-203
  17. Wang, T., Wen, C.-K., Wang, H., Gao, F., Jiang, T., Jin, S.: Deep learning for wireless physical layer: opportunities and challenges. China Commun. 14(11), 92–111 (2017). https://doi.org/10.1109/cc.2017.8233654
  18. Silla, C.N., Freitas, A.A.: A survey of hierarchical classification across different application domains. Data Min. Knowl. Discov. 22(1–2), 31–72 (2010). https://doi.org/10.1007/s10618-010-0175-9
  19. Zhu, S., Wei, X.-Y., Ngo, C.-W.: Error recovered hierarchical classification. In: Proceedings of the 21st ACM International Conference on Multimedia, MM 2013 (2013). https://doi.org/10.1145/2502081.2502182

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. San Jose State University, San Jose, USA
