In 1999, Guo et al. proposed a new symmetric probabilistic encryption scheme based on chaotic attractors of neural networks, specifically on the chaotic properties of the Overstoraged Hopfield Neural Network (OHNN). The approach bridges neural networks and cryptography. However, their scheme has several problems: (1) an exhaustive search is needed to find all the attractors; (2) the data expansion derived in the paper is wrong; (3) the construction of the synaptic weight matrix is problematic. In this letter, we propose a symmetric probabilistic encryption scheme based on the Clipped Hopfield Neural Network (CHNN), which solves the above-mentioned problems. Furthermore, it keeps the length of the ciphertext equal to that of the plaintext.
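To illustrate the attractor dynamics that such schemes rely on, the following is a minimal sketch of a clipped Hopfield network: patterns are stored with the Hebbian rule and each synaptic weight is clipped to {-1, 0, +1}. This is only an illustrative toy (the network size, number of stored patterns, noise level, and random seed are arbitrary choices), not the encryption construction of the letter itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Store a few random bipolar patterns with the Hebbian outer-product
# rule, then clip every synaptic weight to {-1, 0, +1}. The clipping
# step is what distinguishes the CHNN variant; all parameters here
# are illustrative assumptions.
N, P = 64, 3
patterns = rng.choice([-1, 1], size=(P, N))

W = patterns.T @ patterns      # Hebbian sum over stored patterns
np.fill_diagonal(W, 0)         # no self-connections
W = np.sign(W)                 # clip weights to {-1, 0, +1}

def update(state, W, sweeps=10):
    """Asynchronous sign updates until the state stops changing."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(len(state)):
            s = 1 if W[i] @ state >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:        # reached a fixed point (an attractor)
            break
    return state

# A stored pattern perturbed by a few bit flips should fall back
# into the basin of its attractor.
noisy = patterns[0].copy()
noisy[:5] *= -1
recovered = update(noisy, W)
print(int(np.sum(recovered == patterns[0])), "of", N, "bits match")
```

Because the clipped weight matrix remains symmetric with a zero diagonal, the asynchronous dynamics settle into a fixed point; these stable states are the attractors that an attractor-based cipher maps plaintexts to and from.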
Guo D.H., Cheng L.M., Cheng L.L. (1999). A new symmetric probabilistic encryption scheme based on chaotic attractors of neural networks. Applied Intelligence 10(1):71–84.
Hopfield J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the United States of America – Biological Sciences 79(8):2554–2558.
Gardner E. (1987). Maximum storage capacity in neural networks. Europhysics Letters 4(4):481–485.
Amit D., Gutfreund H., Sompolinsky H. (1987). Statistical mechanics of neural networks near saturation. Annals of Physics 173(1):30–67.
Chan C.K., Cheng L.M. (2001). The convergence properties of a clipped Hopfield network and its application in the design of a keystream generator. IEEE Transactions on Neural Networks 12(2):340–348.