I'm a postdoc in Statistics at Stanford University working with Scott Linderman's research group on developing neural data analysis methods. My work is currently supported by the Wu Tsai Neurosciences Institute and the NIH BRAIN Initiative (1F32MH122998-01).

In 2019, I received my PhD in Neuroscience from Stanford, where I was advised by Surya Ganguli and funded by the DOE Computational Science Graduate Fellowship. Before that, I spent one year at the Salk Institute with Terry Sejnowski and two years at Brandeis University with Eve Marder and Tim O'Leary. I received my undergraduate degree from Bowdoin College, where I was advised by Patsy Dickinson.

- Unsupervised learning methods for large-scale, multi-trial neural data
- A common paradigm in systems neuroscience is to simultaneously record the activity of many neurons over repeated experimental trials (e.g., multiple presentations of a sensory stimulus, or a repeated motor action). The resulting datasets can be very large, potentially containing thousands of neurons over thousands of trials. I'm interested in finding general-purpose statistical approaches for understanding datasets of this form.
- Tensor Decomposition — In collaboration with experimental groups headed by Krishna Shenoy and Mark Schnitzer, we demonstrated how to extract functional neural subpopulations across learning and changing environments.
- Time Warping — In collaboration with the Rinberg, Shenoy, and Ölveczky labs, we demonstrated how to discover precisely timed neural spike patterns in the presence of trial-to-trial jitter and other temporal misalignments.
- Tutorial video — In this talk (given at MIT in 2017), I give a high-level overview of some of the research described above.
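To make the tensor decomposition idea concrete, here is a minimal NumPy sketch of a rank-R CP (canonical polyadic) decomposition fit by alternating least squares to a neurons × time × trials array. This is a generic ALS implementation under my own naming conventions — not the code from the papers above — and it omits the nonnegativity and cross-validation machinery a real analysis would use.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (I x R) and V (J x R) -> (I*J x R)."""
    I, R = U.shape
    J = V.shape[0]
    return np.einsum('ir,jr->ijr', U, V).reshape(I * J, R)

def cp_als(X, rank, n_iter=100, seed=0):
    """Fit X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r] by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings (C-order reshapes); each pairs with the matching Khatri-Rao product.
    X0 = X.reshape(I, -1)                     # I x (J*K), pairs with khatri_rao(B, C)
    X1 = np.moveaxis(X, 1, 0).reshape(J, -1)  # J x (I*K), pairs with khatri_rao(A, C)
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1)  # K x (I*J), pairs with khatri_rao(A, B)
    for _ in range(n_iter):
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def cp_reconstruct(A, B, C):
    """Rebuild the full tensor from the three factor matrices."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

On trial-structured neural data, the three factor matrices have a natural reading: A gives per-neuron loadings, B gives within-trial temporal factors, and C gives per-trial weights, which is what lets the method track changes across learning.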
- Automated segmentation and detection of motifs in high-dimensional time series
- Trial-structured data (described above) are convenient for data analysis — in particular, if neural dynamics on different trials follow similar (though not identical) trajectories, we can build a model that isolates this shared structure while discarding the remaining "noise" (for lack of a better term). A more challenging problem is extracting scientifically interpretable features from unstructured neural and behavioral data streams (e.g., during unconstrained natural behaviors). To address this, I have developed models that extract temporal motifs, or "pseudo-trials," from raw time series.
- Convolutional NMF — Led by Emily Mackevicius and Andrew Bahle, we repurposed a convolutional nonnegative matrix factorization (convNMF) model (see Smaragdis, 2007) to extract sequential firing patterns in songbirds and rats from unannotated neural population recordings. Thanks also to Michale Fee and Mark Goldman for guidance and oversight!
- A Point Process Framework for Sequence Detection — Here we reformulate the convNMF model for spike train data, drawing a connection to Neyman-Scott point process models. The resulting model handles noise in low-firing-rate regimes better than the original convNMF, and, because inference is fully Bayesian, it provides better quantification of model uncertainty.
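A small NumPy sketch of the convNMF multiplicative updates (in the spirit of Smaragdis, 2007) may help make the model concrete. Here X is a nonnegative neurons × time matrix, W stores K motif templates of length L, and H stores the motif activations over time; the function and variable names are my own, and this is a bare-bones illustration rather than the implementation used in the papers above.

```python
import numpy as np

def shift_cols(A, l):
    """Shift the columns of A right by l (left if l < 0), zero-padding the gap."""
    out = np.zeros_like(A)
    if l == 0:
        out[:] = A
    elif l > 0:
        out[:, l:] = A[:, :-l]
    else:
        out[:, :l] = A[:, -l:]
    return out

def conv_reconstruct(W, H):
    """X_hat[n, t] = sum_k sum_l W[n, k, l] * H[k, t - l]."""
    L = W.shape[2]
    return sum(W[:, :, l] @ shift_cols(H, l) for l in range(L))

def convnmf(X, n_motifs, motif_len, n_iter=200, seed=0, eps=1e-10):
    """Multiplicative updates minimizing ||X - X_hat||_F^2 for the convolutional model."""
    rng = np.random.default_rng(seed)
    N, T = X.shape
    W = rng.random((N, n_motifs, motif_len))
    H = rng.random((n_motifs, T))
    for _ in range(n_iter):
        # Update activations H, accumulating contributions from every lag.
        Xhat = conv_reconstruct(W, H)
        num = sum(W[:, :, l].T @ shift_cols(X, -l) for l in range(motif_len))
        den = sum(W[:, :, l].T @ shift_cols(Xhat, -l) for l in range(motif_len))
        H *= num / (den + eps)
        # Update motif templates W, one lag slice at a time.
        Xhat = conv_reconstruct(W, H)
        for l in range(motif_len):
            Hs = shift_cols(H, l)
            W[:, :, l] *= (X @ Hs.T) / (Xhat @ Hs.T + eps)
    return W, H
```

Because both factors stay nonnegative, a motif can only add spikes to the reconstruction — this is what pushes W toward interpretable sequential firing patterns and H toward sparse event times, i.e., the "pseudo-trials" described above.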

- Point process models for sequence detection in high-dimensional neural spike trains
- Dynamic and reversible remapping of network representations in an unchanging environment
- Combining tensor decomposition and time warping models for multi-neuronal spike train analysis
- Thalamic activity patterns unfolding over multiple time scales predict seizure onset in absence epilepsy
- Discovering precise temporal patterns in large-scale neural recordings through robust and interpretable time warping
- Universality and individuality in neural dynamics across large populations of recurrent networks
- Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics
- Fast Convolutive Nonnegative Matrix Factorization Through Coordinate and Block Coordinate Updates
- Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience
- Unsupervised discovery of demixed, low-dimensional neural dynamics across multiple timescales through tensor components analysis
- Dendritic trafficking faces physiologically critical speed-precision tradeoffs
- Distinct or shared actions of peptide family isoforms: II. Multiple pyrokinins exert similar effects in the lobster stomatogastric nervous system
- Summary of the DREAM8 parameter estimation challenge: Toward parameter identification for whole-cell models
- Cell types, network homeostasis and pathological compensation from a biologically plausible ion channel expression model
- Many parameter sets in a multicompartment model oscillator are robust to temperature perturbations
- The neuromuscular transform of the lobster cardiac system explains the opposing effects of a neuromodulator on muscle output
- Correlations in ion channel expression emerge from homeostatic regulation mechanisms
- Animal-to-animal variability in the phasing of the crustacean cardiac motor pattern: an experimental and computational analysis