Abstract
For the additive white Gaussian noise channel with an average power constraint, sparse superposition codes (also called sparse regression codes), proposed by Barron and Joseph in 2010, achieve capacity. While the codewords of the original sparse superposition codes are built from a dictionary matrix with entries drawn from a Gaussian distribution, we consider the case in which the entries are drawn from a Bernoulli distribution. We prove an improved upper bound on the block error probability under least squares decoding that is both considerably simpler and tighter than our previous bound from 2014.
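As a concrete illustration, the following Python sketch encodes a message with a ±1 Bernoulli dictionary and decodes it by exhaustive least squares search. The parameters (L sections of size M, block length n, power P, noise variance sigma2) are toy values chosen here for illustration, not the regimes analyzed in the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (ours, for illustration): L sections of size M,
# block length n; the rate is R = L * log(M) / n nats per channel use.
L, M, n = 4, 8, 48
P, sigma2 = 1.0, 0.25  # average power constraint and noise variance

# Bernoulli dictionary: i.i.d. +1/-1 entries, scaled by 1/sqrt(n) so
# that codewords have average power close to P.
A = rng.choice([-1.0, 1.0], size=(n, L * M)) / np.sqrt(n)

def codeword(message):
    """Map a message (one index in 0..M-1 per section) to a codeword.

    The coefficient vector beta has exactly one nonzero entry per
    section, each of value sqrt(n * P / L).
    """
    beta = np.zeros(L * M)
    for sec, idx in enumerate(message):
        beta[sec * M + idx] = np.sqrt(n * P / L)
    return A @ beta

def ls_decode(y):
    """Exhaustive least squares decoding: return the message whose
    codeword is closest to y in Euclidean distance."""
    best, best_dist = None, np.inf
    for cand in itertools.product(range(M), repeat=L):
        d = np.sum((y - codeword(cand)) ** 2)
        if d < best_dist:
            best, best_dist = cand, d
    return best

msg = tuple(rng.integers(0, M, size=L))
y = codeword(msg) + rng.normal(0.0, np.sqrt(sigma2), size=n)
print(ls_decode(y) == msg)  # decoding succeeds with high probability
```

The exhaustive search over all M^L candidate messages makes explicit why least squares decoding, while amenable to analysis, is not computationally practical; the approximate message passing decoders cited in the references address exactly this gap.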
References
Arikan, E. (2009). Channel polarization. IEEE Transactions on Information Theory, 55(7), 3051–3073.
Barbier, J., & Krzakala, F. (2014). Replica analysis and approximate message passing decoder for superposition codes. In Proc. 2014 IEEE int. symp. inf. theory, Honolulu, HI, USA, June 29–July 4, pp. 1494–1498.
Barbier, J., & Krzakala, F. (2017). Approximate message-passing decoder and capacity-achieving sparse superposition codes. IEEE Transactions on Information Theory, 63(8), 4894–4927.
Barron, A. R., & Cho, S. (2012). High-rate sparse superposition codes with iteratively optimal estimates. In Proc. 2012 IEEE int. symp. inf. theory, Boston, MA, USA, July 1–6, pp. 120–124.
Barron, A. R., & Joseph, A. (2010a). Least squares superposition coding of moderate dictionary size, reliable at rates up to channel capacity. In Proc. 2010 IEEE int. symp. inf. theory, Austin, Texas, USA, June 13–18, pp. 275–279.
Barron, A. R., & Joseph, A. (2010b). Towards fast reliable communication at rates near capacity with Gaussian noise. In Proc. 2010 IEEE. Int. symp. inf. theory, Austin, Texas, USA, June 13–18, pp. 315–319.
Berrou, C., Glavieux, A., & Thitimajshima, P. (1993). Near Shannon limit error-correcting coding: turbo codes. In Proc. Int. Conf. Commun, Geneva, Switzerland, May, pp. 1064–1070.
Bourbaki, N. (1986). Functions of a real variable, (Japanese translation). Tokyo: TokyoTosho.
Cho, S., & Barron, A. R. (2013). Approximate iterative Bayes optimal estimates for high-rate sparse superposition codes. In The sixth workshop on information-theoretic methods in science and engineering.
Cover, T. M., & Thomas, J. A. (2006). Elements of information theory. New York: Wiley-Interscience.
Joseph, A., & Barron, A. R. (2012). Least squares superposition codes of moderate dictionary size are reliable at rates up to capacity. IEEE Transactions on Information Theory, 58(5), 2541–2557.
Joseph, A., & Barron, A. R. (2014). Fast sparse superposition codes have near exponential error probability for \(R < \mathcal{C}\). IEEE Transactions on Information Theory, 60(2), 919–942.
Kudekar, S., Richardson, T. J., & Urbanke, R. (2011). Threshold saturation via spatial coupling: why convolutional LDPC ensembles perform so well over the BEC. IEEE Transactions on Information Theory, 57(2), 803–834.
Rush, C., Greig, A., & Venkataramanan, R. (2017). Capacity achieving sparse regression codes via approximate message passing decoding. IEEE Transactions on Information Theory, 63(3), 1476–1500.
Rush, C., & Venkataramanan, R. (2018). Finite sample analysis of approximate message passing. IEEE Transactions on Information Theory, 64(11), 7264–7286.
Reed, I. S., & Solomon, G. (1960). Polynomial codes over certain finite fields. Journal of SIAM, 8, 300–304.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423.
Takeishi, Y., Kawakita, M., & Takeuchi, J. (2014). Least squares superposition codes with Bernoulli dictionary are still reliable at rates up to capacity. IEEE Transactions on Information Theory, 60(5), 2737–2750.
Venkataramanan, R., Tatikonda, S., & Barron, A. R. (2019). Sparse regression codes. Foundations and Trends in Communications and Information Theory, 15(1–2), 85–283. https://doi.org/10.1561/0100000092.
Osada, N. (2008). A story: numerical analysis part 3; the Euler-MacLaurin formula. Rikei Heno Suugaku, July 2008. (in Japanese) http://www.lab.twcu.ac.jp/~osada/rikei/rikei2008-7.pdf. Accessed 8 June 2019
Acknowledgements
The authors thank Professor Andrew R. Barron for his valuable comments. This research was partially supported by JSPS KAKENHI Grant numbers JP16K12496 and JP18H03291.
This material was presented in part at IEEE International Symposium on Information Theory 2016, in Barcelona, Spain.
Takeishi, Y., Takeuchi, J. An improved analysis of least squares superposition codes with Bernoulli dictionary. Jpn J Stat Data Sci 2, 591–613 (2019). https://doi.org/10.1007/s42081-019-00057-9