J Neurosci Methods. 2011 Mar 15;196(1):81-7. doi: 10.1016/j.jneumeth.2011.01.002. Epub 2011 Jan 8.

Measuring entropy in continuous and digitally filtered neural signals.

Author information

Department of Physiology and Biophysics, Dalhousie University, Halifax, Nova Scotia, Canada B3H 1X5. andrew.french@dal.ca

Abstract

Neurons receive, process and transmit information using two distinct signaling methods: analog signals, such as graded changes in membrane potential, and binary digital action potentials. Quantitative estimates of information in neural signals have been based either on information capacity, which measures the theoretical maximum information flow through a communication channel, or on entropy, the amount of information required to describe or reproduce a signal. Measurement of entropy is straightforward for digital signals, including action potentials, but is more difficult for analog signals. This problem compromises attempts to estimate information in many neural signals, particularly when there is conversion between the two signal formats. We extended an established method for action potential entropy estimation to provide entropy estimation of analog signals. Our approach is based on context-independent data compression of analog signals, which we call analog compression. Although compression of analog signals is computationally intensive, we describe an algorithm that provides practical, efficient and reliable entropy estimation via analog compression. Implementation of the algorithm is demonstrated at two stages of sensory processing by a mechanoreceptor.
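The abstract does not spell out the authors' analog-compression algorithm, but the general idea of compression-based entropy estimation (quantize the analog signal into discrete symbols, losslessly compress the symbol stream, and take compressed bits per sample as an upper bound on the entropy rate) can be sketched in a few lines. The sketch below is an illustration only, not the method from the paper: the function name, the 64-level quantization, and the use of zlib as the compressor are all assumptions made for demonstration.

```python
import zlib
import numpy as np

def compression_entropy_estimate(signal, n_levels=64):
    """Rough upper-bound estimate of entropy rate (bits/sample) for an
    analog signal: quantize, then losslessly compress the symbol stream.
    Illustrative only; not the context-independent algorithm of the paper.
    """
    # Quantize the analog signal into n_levels discrete amplitude bins.
    lo, hi = signal.min(), signal.max()
    symbols = np.floor((signal - lo) / (hi - lo + 1e-12) * n_levels)
    symbols = symbols.astype(np.uint8)

    # Compressed length in bits, per sample, bounds the entropy rate.
    compressed = zlib.compress(symbols.tobytes(), level=9)
    return 8 * len(compressed) / len(signal)

# Example: white noise should compress poorly (high entropy estimate),
# while a slowly varying sine should compress well (low estimate).
rng = np.random.default_rng(0)
noise = rng.normal(size=100_000)
sine = np.sin(np.linspace(0, 200 * np.pi, 100_000))
print(f"noise: {compression_entropy_estimate(noise):.2f} bits/sample")
print(f"sine:  {compression_entropy_estimate(sine):.2f} bits/sample")
```

A generic compressor such as zlib only approaches the true entropy rate asymptotically, which is consistent with the abstract's point that practical analog compression requires a purpose-built, efficient algorithm.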

PMID: 21219926
DOI: 10.1016/j.jneumeth.2011.01.002
[Indexed for MEDLINE]
