Abstract
The study of complex information processing systems requires appropriate theoretical tools to help unravel their underlying design principles. Information theory is one such tool, and has been utilized extensively in the study of the neural code. Although much progress has been made in information theoretic methodology, there is still no satisfying answer to the question: "What is the information that a given property of the neural population activity (e.g., the responses of single cells within the population) carries about a set of stimuli?" Here, we answer such questions via the minimum mutual information (MinMI) principle. We quantify the information in any statistical property of the neural response by considering all hypothetical neuronal populations that have the given property and finding the one that contains the minimum information about the stimuli. All systems with higher information values necessarily contain additional information processing mechanisms and, thus, the minimum captures the information related to the given property alone. MinMI may be used to measure information in properties of the neural response, such as that conveyed by responses of small subsets of cells (e.g., singles or pairs) in a large population and cooperative effects between subunits in networks. We show how the framework can be used to study neural coding in large populations and to reveal properties that are not discovered by other information theoretic methods.
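To make the principle concrete, here is a minimal numerical sketch (not code from the paper): for two equally likely stimuli and two binary neurons, we fix each neuron's measured mean response per stimulus (the "given property"; the numbers below are made up for illustration) and search over all joint conditional distributions p(r1, r2 | s) that reproduce those means, returning the one with minimum mutual information I(S; R).

```python
import numpy as np
from scipy.optimize import minimize

# Joint binary responses (r1, r2) of two neurons; two equally likely stimuli.
states = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
p_s = np.array([0.5, 0.5])

# Hypothetical measured single-cell mean responses per stimulus (the
# constrained property); these values are illustrative assumptions.
means = np.array([[0.2, 0.3],    # stimulus 0: <r1>, <r2>
                  [0.7, 0.6]])   # stimulus 1: <r1>, <r2>

def mutual_information(x):
    """I(S; R) in bits for a flattened conditional p(r | s) of shape (2, 4)."""
    p_r_given_s = x.reshape(2, 4)
    p_r = p_s @ p_r_given_s                      # response marginal p(r)
    with np.errstate(divide="ignore", invalid="ignore"):
        logterm = np.where(p_r_given_s > 0,
                           np.log2(p_r_given_s / p_r), 0.0)
    return float(p_s @ (p_r_given_s * logterm).sum(axis=1))

# Equality constraints: each p(. | s) normalizes to 1 and reproduces the
# measured mean response of each neuron under each stimulus.
constraints = []
for s in range(2):
    idx = slice(4 * s, 4 * s + 4)
    constraints.append({"type": "eq",
                        "fun": lambda x, i=idx: x[i].sum() - 1.0})
    for n in range(2):
        constraints.append({"type": "eq",
                            "fun": lambda x, i=idx, n=n, s=s:
                                states[:, n] @ x[i] - means[s, n]})

res = minimize(mutual_information, x0=np.full(8, 0.25),
               bounds=[(0.0, 1.0)] * 8, constraints=constraints,
               method="SLSQP")
print(f"MinMI given the single-cell means: {res.fun:.4f} bits")
```

Under these assumptions, any hypothetical population whose single-cell means match `means` must carry at least this much information about the stimulus; information above the minimum reflects mechanisms beyond the constrained property (e.g., correlations between the cells).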
| Original language | English |
| --- | --- |
| Pages (from-to) | 3490-3495 |
| Number of pages | 6 |
| Journal | Proceedings of the National Academy of Sciences of the United States of America |
| Volume | 106 |
| Issue number | 9 |
| DOIs | |
| State | Published - 3 Mar 2009 |
| Externally published | Yes |
Keywords
- Information theory
- Maximum entropy
- Neural coding
- Population coding
ASJC Scopus subject areas
- General