Fast state-space methods for inferring dendritic synaptic connectivity
We present fast methods for filtering voltage measurements and performing optimal inference of the location and strength of synaptic connections in large dendritic trees. Given noisy, subsampled voltage observations we develop fast ℓ1-penalized regression methods for Kalman state-space models of the neuron voltage dynamics. The value of the ℓ1-penalty parameter is chosen using cross-validation or, for low signal-to-noise ratio, a Mallows' Cp-like criterion. Using low-rank approximations, we reduce the inference runtime from cubic to linear in the number of dendritic compartments. We also present an alternative, fully Bayesian approach to the inference problem using a spike-and-slab prior. We illustrate our results with simulations on toy and real neuronal geometries. We consider observation schemes that either scan the dendritic geometry uniformly or measure linear combinations of voltages across several locations with random coefficients. For the latter, we show how to choose the coefficients to offset the correlation between successive measurements imposed by the neuron dynamics. This results in a "compressed sensing" observation scheme, with an important reduction in the number of measurements required to infer the synaptic weights.
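To make the core idea concrete, the following is a minimal toy sketch of ℓ1-penalized recovery of a sparse synaptic weight vector from random-coefficient linear measurements, in the spirit of the compressed-sensing observation scheme described above. It is a simplified static analogue, not the authors' full Kalman state-space method: the dynamics, the low-rank machinery, and all dimensions and parameter values here (`n_comp`, `n_obs`, the penalty `lam`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: many dendritic compartments, few measurements.
n_comp = 200          # number of dendritic compartments
n_obs = 80            # number of linear measurements (< n_comp)

# Sparse "synaptic weight" vector: a handful of nonzero entries.
w_true = np.zeros(n_comp)
w_true[rng.choice(n_comp, size=5, replace=False)] = rng.uniform(1.0, 2.0, 5)

# Random-coefficient observation matrix, as in a compressed-sensing scheme,
# plus a small amount of measurement noise.
A = rng.standard_normal((n_obs, n_comp)) / np.sqrt(n_obs)
y = A @ w_true + 0.01 * rng.standard_normal(n_obs)

def lasso_ista(A, y, lam, n_iter=2000):
    """Solve min_w 0.5*||A w - y||^2 + lam*||w||_1 by ISTA
    (iterative soft-thresholding)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)           # gradient of the quadratic term
        z = w - grad / L                   # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

w_hat = lasso_ista(A, y, lam=0.01)
support = np.flatnonzero(np.abs(w_hat) > 0.1)
print(sorted(support))                     # compare with np.flatnonzero(w_true)
```

In the paper's setting the penalized regression is coupled to the neuron's voltage dynamics via a Kalman state-space model, and the ℓ1-penalty parameter would be chosen by cross-validation or a Mallows' Cp-like criterion rather than fixed by hand as here.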
Keywords: Lasso · Dendrites · Low-rank · State-space · Synapses · Spike-and-slab · Compressed sensing
This work was supported by an NSF CAREER grant, a McKnight Scholar award, and by NSF grant IIS-0904353. This material is based upon work supported, in whole or in part, by the U.S. Army Research Laboratory and the U.S. Army Research Office under contract number W911NF-12-1-0594. JHH was partially supported by the Columbia College Rabi Scholars Program. AP was partially supported by the Swartz Foundation. The computer simulations were performed on the Hotfoot HPC cluster at Columbia University. We thank E. Pnevmatikakis for helpful discussions and comments.
Conflict of interest
The authors declare that they have no conflict of interest.