Optimization in Reproducing Kernel Hilbert Spaces of Spike Trains

  • António R.C. Paiva
  • Il Park
  • José C. Príncipe
Part of the Springer Optimization and Its Applications book series (SOIA, volume 38)


This chapter presents a framework based on reproducing kernel Hilbert spaces (RKHS) for optimization with spike trains. To establish an RKHS suitable for optimization, we start by introducing kernels for spike trains. It is shown that spike train kernels can be built from ideas of kernel methods or from the intensity functions underlying the spike trains; the latter approach is the main focus of this study. We introduce the memoryless cross-intensity (mCI) kernel as an example of an inner product of spike trains, which defines the RKHS bottom-up as an inner product of intensity functions. Because it is defined in terms of the intensity functions, this approach to defining spike train kernels has the advantage that points in the RKHS incorporate a statistical description of the spike trains, with the statistical model explicitly stated. Properties of the mCI kernel and the RKHS it induces are given to show that this RKHS has the necessary structure for optimization. The issue of estimation from data is also addressed. We conclude with an example of optimization in the RKHS by deriving an algorithm for principal component analysis (PCA) of spike trains.
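As an illustration of the pipeline the abstract describes, the sketch below estimates an mCI-style kernel between two spike trains and runs kernel PCA on a set of trains. It is a minimal sketch, not the chapter's algorithm: it assumes the intensity functions are estimated by exponential smoothing of the spike times with time constant `tau`, in which case the inner product of intensities reduces to a double sum of Laplacian functions over spike-time pairs; the function names `mci_kernel` and `spike_train_pca` and the default `tau` are our own choices for illustration.

```python
import numpy as np

def mci_kernel(st_i, st_j, tau=0.01):
    """Estimate of a memoryless cross-intensity (mCI) kernel between two
    spike trains, each given as an array of spike times (in seconds).

    Assumes intensities are estimated by smoothing each spike train with
    a one-sided exponential of time constant tau; the integral of the
    product of intensities then reduces to a double sum of Laplacian
    functions over all pairs of spike times.
    """
    st_i = np.asarray(st_i, dtype=float)
    st_j = np.asarray(st_j, dtype=float)
    diffs = st_i[:, None] - st_j[None, :]        # all pairwise time differences
    return np.sum(np.exp(-np.abs(diffs) / tau)) / (2.0 * tau)

def spike_train_pca(trains, tau=0.01, n_components=2):
    """Kernel PCA of spike trains in the RKHS induced by the mCI kernel.

    Builds the Gram matrix, centers it in feature space, and returns the
    leading eigenvalues and the projection of each train onto the
    principal components (the standard kernel-PCA recipe).
    """
    n = len(trains)
    K = np.array([[mci_kernel(a, b, tau) for b in trains] for a in trains])
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = J @ K @ J
    evals, evecs = np.linalg.eigh(Kc)            # eigenvalues in ascending order
    order = np.argsort(evals)[::-1][:n_components]
    evals, evecs = evals[order], evecs[:, order]
    # scale coefficients so each principal direction has unit norm in the RKHS
    alphas = evecs / np.sqrt(np.maximum(evals, 1e-12))
    projections = Kc @ alphas
    return evals, projections
```

Since the mCI kernel is a valid inner product, the Gram matrix is symmetric and positive semidefinite, so the eigendecomposition above is well posed; this mirrors the kernel-PCA construction of Schölkopf, Smola, and Müller applied with a spike train kernel in place of a vector-space kernel.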


Point Process · Spike Train · Intensity Function · Kernel Method · Reproducing Kernel Hilbert Space



A. R. C. Paiva was supported by Fundação para a Ciência e a Tecnologia (FCT), Portugal, under grant SRFH/BD/18217/2004. This work was partially supported by NSF grants ECS-0422718 and CISE-0541241.



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Florida, Gainesville, USA
  2. Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, USA
