
This study builds on previous work in multi-modal AD data fusion to advance AD stage prediction. We combine imaging, EHR, and genomic SNP data using deep learning (DL) to classify patients into control (CN), MCI, and AD groups. We use stacked denoising auto-encoders for the EHR and SNP data, and novel 3D convolutional neural networks (CNNs) trained on the MRI imaging data. Once the networks have been trained separately for each data modality, they are combined using different classification layers, including decision trees, random forests, support vector machines (SVMs), and k-nearest neighbours (kNN). The performance of our integration models is demonstrated on the ADNI dataset, which contains SNP data (808 patients), MRI imaging data (503 patients), and clinical and neurological test data (2,004 patients).
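The late-fusion step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature arrays stand in for the learned auto-encoder codes and 3D-CNN representations (dimensions and sample counts are made up), and scikit-learn versions of the four named classifiers are trained on the concatenated features.

```python
# Hypothetical sketch of late fusion: concatenate per-modality deep features,
# then train shallow classifiers on top (as the study's classification layers do).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 120  # illustrative patient count, not the real cohort size

# Stand-ins for the learned representations:
# auto-encoder codes for EHR and SNP data, 3D-CNN features for MRI.
ehr_feat = rng.normal(size=(n, 16))
snp_feat = rng.normal(size=(n, 16))
mri_feat = rng.normal(size=(n, 32))
y = rng.integers(0, 3, size=n)  # 0 = CN, 1 = MCI, 2 = AD

# Late fusion: one feature vector per patient.
X = np.concatenate([ehr_feat, snp_feat, mri_feat], axis=1)

classifiers = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "svm": SVC(kernel="rbf"),
    "knn": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")
```

With random stand-in features the accuracies hover near chance; the point is only the shape of the pipeline: separately derived modality features, concatenated, then scored by interchangeable shallow classifiers.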

Although deep learning models can outperform conventional approaches in supporting clinical decisions, clinicians are reluctant to adopt them because their predictions are difficult to interpret. We are therefore developing new methods to identify the features that contribute most to the models' predictions.
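One common, model-agnostic way to rank feature contributions (offered here as a hedged illustration, not necessarily the method the authors use) is permutation importance: shuffle one feature at a time and measure how much held-out accuracy drops.

```python
# Hypothetical sketch: permutation importance on synthetic data where, by
# construction, only feature 0 carries signal.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = (X[:, 0] > 0).astype(int)  # label depends only on feature 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Shuffling an important feature should sharply reduce test accuracy.
result = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
top = int(np.argmax(result.importances_mean))
print("most important feature index:", top)
```

Because only feature 0 determines the label here, it should dominate the importance ranking; on real multi-modal features the same procedure highlights which inputs a trained model actually relies on.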

This article uses data from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) database (adni.loni.usc.edu) to analyse Alzheimer’s disease. ADNI tests whether different types of data can be combined to measure the progression of MCI and early AD. The ADNI data repository contains data from over 2,220 patients across four studies (ADNI1, ADNI2, ADNI GO, and ADNI3). Our study focuses on ADNI1, ADNI2, and ADNI GO, because ADNI3 is an ongoing study whose data are being released in phases with limited availability. The imaging data from ADNI1, ADNI2, and ADNI GO include MRI and PET images. We use cross-sectional MRI data from ADNI1 (503 patients) collected at the baseline screenings. The data publisher has pre-processed the images to make them consistent across different scanners.
