Johns Hopkins University
Title: You Can Teach an Old Dog New Tricks - Deep Learning in Data-Starved Regimes
Date: Friday, October 08, 2021
Place and Time: Zoom, 3:05-3:55 pm
Deep learning has infiltrated nearly every major field of study. The unprecedented success of these models has, in many cases, been fueled by an explosion of data: millions of labeled images, thousands of annotated ICU admissions, and hundreds of hours of transcribed speech are common standards in the literature. Clinical neuroscience is a notable holdout from this trend. It is a field of unavoidably small datasets, massive patient variability, and an arguable lack of ground-truth information. My lab tackles the challenges of this domain by blending the interpretability of generative models with the representational power of deep learning. This talk will highlight three ongoing projects that span a range of "old school" methodologies and applications. First, I will discuss a joint optimization framework that combines dictionary learning with recurrent neural networks to predict behavioral deficits from multimodal brain connectivity. Second, I will describe a probabilistic graphical model for epileptic seizure detection using multichannel EEG. The latent variables in this model capture the spatiotemporal spread of a seizure; they are complemented by a nonparametric likelihood based on convolutional neural networks. Finally, I will touch on a recent initiative to inject emotional cues into human speech. Our approach combines diffeomorphic registration with generative adversarial networks.