J Neurosci Methods. 2018 Jan 15;294:72-80. doi: 10.1016/j.jneumeth.2017.11.006. Epub 2017 Nov 14.

Detection of relationships among multi-modal brain imaging meta-features via information flow.

Author information

1. The Mind Research Network, 1101 Yale Blvd. NE, Albuquerque, NM, 87131, USA. Electronic address: rmiller@mrn.org.
2. The Mind Research Network, 1101 Yale Blvd. NE, Albuquerque, NM, 87131, USA.
3. The Mind Research Network, 1101 Yale Blvd. NE, Albuquerque, NM, 87131, USA; Department of Electrical and Computer Engineering, 498 Terrace St. NE, Albuquerque, NM, 87106, USA.

Abstract

BACKGROUND:

Neuroscientists and clinical researchers are awash in data from an ever-growing number of imaging and other bio-behavioral modalities. This flow of brain imaging data, taken under resting and various task conditions, combines with available cognitive measures, behavioral information, genetic data, and other potentially salient biomedical and environmental information to create a rich but diffuse data landscape. The conditions being studied with brain imaging data are often extremely complex, and it is common for researchers to employ more than one imaging, behavioral, or biological data modality (e.g., genetics) in their investigations. While the field has advanced significantly in its approach to multimodal data, the vast majority of studies still ignore joint information among two or more features or modalities.

NEW METHOD:

We propose an intuitive framework based on conditional probabilities for understanding information exchange between features in what we are calling a feature meta-space; that is, a space consisting of many individual feature spaces. Features can have any dimension and can be drawn from any data source or modality. No a priori assumptions are made about the functional form (e.g., linear, polynomial, exponential) of captured inter-feature relationships.
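The abstract does not specify the authors' implementation, but the idea of quantifying information exchange between features of arbitrary dimension via conditional/joint probabilities can be illustrated with a minimal sketch: discretize each feature (regardless of its dimensionality) into labels, then estimate the mutual information between two features from their empirical joint and marginal probabilities. The function names, binning scheme, and bin counts below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def discretize(feature, n_bins=8):
    """Map a (subjects x dims) feature to one discrete label per subject
    by quantile-binning each dimension and combining the bin indices.
    (Illustrative choice; the paper does not specify a discretization.)"""
    feature = np.atleast_2d(feature.T).T          # ensure shape (n_subjects, n_dims)
    labels = np.zeros(feature.shape[0], dtype=np.int64)
    for d in range(feature.shape[1]):
        edges = np.quantile(feature[:, d], np.linspace(0, 1, n_bins + 1)[1:-1])
        labels = labels * n_bins + np.digitize(feature[:, d], edges)
    return labels

def shared_information(x, y, n_bins=8):
    """Estimate mutual information (bits) between two features of arbitrary
    dimension from empirical joint and marginal probabilities."""
    xl, yl = discretize(x, n_bins), discretize(y, n_bins)
    n = len(xl)
    joint = {}
    for a, b in zip(xl, yl):
        joint[(a, b)] = joint.get((a, b), 0) + 1
    px = {a: np.mean(xl == a) for a in set(xl)}
    py = {b: np.mean(yl == b) for b in set(yl)}
    mi = 0.0
    for (a, b), count in joint.items():
        pxy = count / n
        mi += pxy * np.log2(pxy / (px[a] * py[b]))
    return mi

# Toy usage: a 1-D behavioral score vs. a 3-D imaging-derived feature
rng = np.random.default_rng(0)
score = rng.normal(size=200)
imaging = np.column_stack([score + rng.normal(scale=0.5, size=200),
                           rng.normal(size=200),
                           rng.normal(size=200)])
print(shared_information(score, imaging))
```

Because the estimate is built only from observed joint frequencies, it makes no assumption about the functional form of the relationship; in practice, bias correction or more careful density estimation would be needed for small samples.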

RESULTS:

We demonstrate the framework's ability to identify relationships between disparate features of varying dimensionality by applying it to a large multi-site, multi-modal clinical dataset, balanced between schizophrenia patients and controls. In our application it exposes both expected (previously observed) relationships and novel relationships rarely considered worth investigating by clinical researchers.

COMPARISON WITH EXISTING METHOD(S):

To the best of our knowledge, there is presently no comparably efficient way to capture relationships of indeterminate functional form between features of arbitrary dimension and type. We introduce this method as an initial foray into a space that remains relatively underpopulated.

CONCLUSIONS:

The framework we propose is powerful and intuitive, and it very efficiently provides a high-level overview of a massive data space. In our application it exposes both expected relationships and relationships very rarely considered worth investigating by clinical researchers.

KEYWORDS:

Big data; Biomedical imaging; Biomedical signal processing; Data fusion; Multimodal brain imaging
