Rényi Resolvability and Its Applications to the Wiretap Channel

  • Lei Yu
  • Vincent Y. F. Tan

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10681)


The conventional channel resolvability problem refers to determining the minimum rate needed for an input process to approximate the output distribution of a channel in either the total variation distance or the relative entropy. In this paper, we use the (normalized or unnormalized) Rényi divergence, with the Rényi parameter in [0,2], to measure the level of approximation. We provide asymptotic expressions for the normalized Rényi divergence when the Rényi parameter is larger than or equal to 1, as well as lower and upper bounds for the case when the same parameter is smaller than 1. We characterize the minimum rate needed to ensure that the Rényi resolvability vanishes asymptotically; the optimal rates are the same for both the normalized and unnormalized cases. In addition, when the Rényi parameter is no larger than 1, the minimum rate equals the minimum mutual information over all input distributions that induce the target output distribution, as in the traditional case. When the Rényi parameter is larger than 1, the minimum rate is, in general, larger than the mutual information. We apply these results to the wiretap channel, and completely characterize the optimal tradeoff between the rates of the secret and non-secret messages when the leakage measure is given by the (unnormalized) Rényi divergence, which is a generalization of effective secrecy. This tradeoff differs from that in the conventional setting, in which the leakage is measured by the traditional mutual information.
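For concreteness, the order-α Rényi divergence that serves as the approximation measure above admits a direct computation for discrete distributions: D_α(P‖Q) = (1/(α−1)) log Σₓ P(x)^α Q(x)^{1−α}, with the α → 1 limit recovering the relative entropy. The following minimal Python sketch (the function name is ours, not from the paper) illustrates the definition and two of its standard properties, namely D_α(P‖P) = 0 and monotonicity in α:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats between two discrete
    distributions given as probability lists; alpha = 1 recovers the
    relative entropy (Kullback-Leibler divergence)."""
    if math.isclose(alpha, 1.0):
        # KL divergence, the limit of D_alpha as alpha -> 1
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# D_alpha(P || P) = 0 for any order alpha
assert abs(renyi_divergence(p, p, 2.0)) < 1e-12
# D_alpha is nonnegative and nondecreasing in alpha
d_half, d_one, d_two = (renyi_divergence(p, q, a) for a in (0.5, 1.0, 2.0))
assert 0.0 < d_half <= d_one <= d_two
```

The Rényi parameter range [0, 2] studied in the paper thus interpolates between measures weaker than (α < 1) and stronger than (α > 1) the relative entropy.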



The authors are supported by a Singapore National Research Foundation (NRF) National Cybersecurity R&D Grant (R-263-000-C74-281 and NRF2015NCR-NCR003-006).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, National University of Singapore, Singapore
  2. Department of Mathematics, National University of Singapore, Singapore
