I'm a postdoc in Statistics at Stanford University working with Scott Linderman's research group on developing neural data analysis methods. My work is currently supported by the Wu Tsai Neurosciences Institute and the NIH BRAIN Initiative (1F32MH122998-01).

In 2019, I received a PhD in Neuroscience from Stanford, supervised by Surya Ganguli and funded by the DOE Computational Science Graduate Fellowship Program. Before that, I worked for one year at the Salk Institute with Terry Sejnowski and for two years at Brandeis University with Eve Marder and Tim O'Leary. I received my undergraduate degree from Bowdoin College, where I was advised by Patsy Dickinson.

- Unsupervised learning methods for large-scale, multi-trial neural data
- A common paradigm in systems neuroscience is to simultaneously record the activity of many neurons over repeated experimental trials (e.g., multiple presentations of a sensory stimulus, or a repeated motor action). The resulting datasets can be very large, potentially containing thousands of neurons over thousands of trials. I'm interested in finding general-purpose statistical approaches for understanding datasets of this form.
- Tensor Decomposition — In collaboration with experimental groups headed by Krishna Shenoy and Mark Schnitzer, we demonstrate how to extract functional neural subpopulations across learning and in changing environments.
- Time Warping — In collaboration with the Rinberg, Shenoy, and Ölveczky labs, we demonstrate how to discover precisely timed neural spike patterns in the presence of trial-to-trial jitter and other temporal misalignments.
- Tutorial video — In this talk (given at MIT, 2017) I give a high-level overview of some of the research cited above.
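To give a flavor of the tensor decomposition approach: trial-structured recordings naturally form a neurons × time × trials array, and a CP (canonical polyadic) decomposition factors it into per-neuron loadings, within-trial temporal factors, and across-trial amplitudes. The following is a minimal numpy sketch using alternating least squares — an illustration of the general technique, not the code used in the papers above (the `cp_als` name and its parameters are my own):

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of two factor matrices."""
    r = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

def cp_als(X, rank, n_iter=100, seed=0):
    """Rank-R CP decomposition of a 3-way tensor (neurons x time x trials)
    fit by alternating least squares."""
    rng = np.random.default_rng(seed)
    N, T, K = X.shape
    A = rng.standard_normal((N, rank))  # per-neuron loadings
    B = rng.standard_normal((T, rank))  # within-trial temporal factors
    C = rng.standard_normal((K, rank))  # across-trial amplitudes
    # Matricize X along each mode once, up front.
    X0 = X.reshape(N, -1)                     # (N, T*K)
    X1 = np.moveaxis(X, 1, 0).reshape(T, -1)  # (T, N*K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1)  # (K, N*T)
    for _ in range(n_iter):
        # Solve a linear least-squares problem for each factor in turn.
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Each ALS step is a closed-form least-squares solve because the model is linear in any one factor matrix when the other two are held fixed.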
- Automated segmentation and detection of motifs in high-dimensional time series
- Trial-structured data (described above) are convenient for data analysis — in particular, if neural dynamics on different trials follow a similar (though not identical) trajectory, we can build a model that isolates this structure while discarding the remaining "noise" (for lack of a better term). A more challenging problem is how to extract scientifically interpretable features from unstructured neural and behavioral data streams (e.g. during unconstrained natural behaviors). To address this, I have developed some models that extract temporal motifs or "pseudo-trials" from raw time series.
- Convolutional NMF — In collaboration with Emily Mackevicius, Andrew Bahle, Mark Goldman, and Michale Fee, I helped repurpose a convolutional nonnegative matrix factorization (convNMF) model (see Smaragdis, 2007) to extract sequential firing patterns in songbirds and rats from unannotated neural population recordings.
- A Point Process Framework for Sequence Detection — Here we reformulate the convNMF model for spike train data, and in doing so draw a connection to Neyman-Scott point process models. We find that this model handles noise in low-firing-rate regimes better than the original convNMF model, and that performing parameter inference in a fully Bayesian manner yields better-quantified model uncertainty.
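To give a flavor of the convNMF model underlying both projects: the data matrix is approximated as a sum of short spatiotemporal motif templates convolved with their (sparse) occurrence times. Below is a minimal numpy sketch using Smaragdis-style multiplicative updates for a Frobenius-norm objective — illustrative only (the `conv_nmf` name and parameters are mine), not the seqNMF or point-process implementations:

```python
import numpy as np

def shift(A, l):
    """Shift columns of A by l (right if l > 0, left if l < 0), zero-padded."""
    out = np.zeros_like(A)
    if l > 0:
        out[:, l:] = A[:, :-l]
    elif l < 0:
        out[:, :l] = A[:, -l:]
    else:
        out[:] = A
    return out

def conv_nmf(X, n_motifs, motif_len, n_iter=300, seed=0, eps=1e-10):
    """Minimal convolutional NMF for a nonnegative (neurons x time) matrix X."""
    rng = np.random.default_rng(seed)
    N, T = X.shape
    W = rng.random((motif_len, N, n_motifs))  # W[l]: motif templates at lag l
    H = rng.random((n_motifs, T))             # motif occurrence times / amplitudes
    for _ in range(n_iter):
        # Reconstruction: sum over lags of W[l] applied to time-shifted H.
        Xhat = sum(W[l] @ shift(H, l) for l in range(motif_len))
        for l in range(motif_len):
            Hl = shift(H, l)
            W[l] *= (X @ Hl.T) / (Xhat @ Hl.T + eps)
        Xhat = sum(W[l] @ shift(H, l) for l in range(motif_len))
        # Accumulate the H update over all lags.
        num = sum(W[l].T @ shift(X, -l) for l in range(motif_len))
        den = sum(W[l].T @ shift(Xhat, -l) for l in range(motif_len))
        H *= num / (den + eps)
    return W, H
```

The multiplicative form of the updates keeps W and H nonnegative at every iteration, which is what makes the extracted motifs directly interpretable as firing patterns.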
- Modeling individual variability in biological and model networks
- Comparative analyses are a well-established approach used in many fields of biology including anatomy, ecology, and cellular physiology. Applying similar approaches in systems-level neuroscience is challenging because we often lack a principled mapping between neurons recorded in different subjects (except in important invertebrate model organisms). However, recent progress has shown that we can often extract population-level statistical features from large-scale neural recording traces, which replicate (at least qualitatively) across subjects (see, e.g., Churchland et al., 2012). I'm interested in developing statistical methods that formalize these observations and using these new tools to perform larger-scale comparative analyses across cohorts containing hundreds of experimental subjects.
- Universality & Individuality in Artificial Networks — I worked with Niru Maheswaranathan and David Sussillo to study these questions across large populations of trained recurrent neural networks (RNNs). On simple tasks that were amenable to low-dimensional visualization and interpretation, we found that different network architectures (e.g. vanilla RNNs vs. LSTMs) often showed qualitatively similar computational mechanisms (loosely, "universality"). However, we also found differences ("individuality") in the geometric structure of different solutions. We expect that we will need to wrestle with similar outcomes when these comparative analyses are brought to bear on biological data. Also see this talk from David Sussillo for more details.
- Distance Metrics for Comparing Neural Representations — I have recently been interested in revisiting some of the foundational approaches to quantifying similarity across neurocomputational systems. I describe some of the high-level ideas in the video above, and I have shared a small code package on GitHub. A forthcoming manuscript will provide more details and specifics.
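One concrete example of the kind of metric at issue is the orthogonal Procrustes distance, which measures how far apart two sets of population responses are after allowing an arbitrary rotation of neural state space. This is a minimal sketch, assuming both recordings are condition-by-neuron matrices of matching shape (`procrustes_distance` is an illustrative name, not the packaged code):

```python
import numpy as np

def procrustes_distance(X, Y):
    """Orthogonal Procrustes distance between two (conditions x neurons)
    response matrices of the same shape.

    Both matrices are mean-centered, then the distance is the minimum of
    ||X - Y @ Q||_F over orthogonal Q, computed in closed form from the
    singular values of Y.T @ X.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # min_Q ||X - YQ||^2 = ||X||^2 + ||Y||^2 - 2 * (nuclear norm of Y.T X)
    s = np.linalg.svd(Y.T @ X, compute_uv=False)
    d2 = (X ** 2).sum() + (Y ** 2).sum() - 2.0 * s.sum()
    return np.sqrt(max(d2, 0.0))
```

Because the minimization over rotations has a closed-form SVD solution, this quantity is cheap to compute and, unlike raw correlation-based similarity scores, satisfies the triangle inequality — which is what makes large-scale comparisons across many subjects or networks tractable.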

- Statistical Neuroscience in the Single Trial Limit
- Point process models for sequence detection in high-dimensional neural spike trains
- Dynamic and reversible remapping of network representations in an unchanging environment
- Combining tensor decomposition and time warping models for multi-neuronal spike train analysis
- Thalamic activity patterns unfolding over multiple time scales predict seizure onset in absence epilepsy
- Discovering precise temporal patterns in large-scale neural recordings through robust and interpretable time warping
- Universality and individuality in neural dynamics across large populations of recurrent networks
- Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics
- Fast Convolutive Nonnegative Matrix Factorization Through Coordinate and Block Coordinate Updates
- Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience
- Unsupervised discovery of demixed, low-dimensional neural dynamics across multiple timescales through tensor components analysis
- Dendritic trafficking faces physiologically critical speed-precision tradeoffs
- Distinct or shared actions of peptide family isoforms: II. Multiple pyrokinins exert similar effects in the lobster stomatogastric nervous system
- Summary of the DREAM8 parameter estimation challenge: Toward parameter identification for whole-cell models
- Cell types, network homeostasis and pathological compensation from a biologically plausible ion channel expression model
- Many parameter sets in a multicompartment model oscillator are robust to temperature perturbations
- The neuromuscular transform of the lobster cardiac system explains the opposing effects of a neuromodulator on muscle output
- Correlations in ion channel expression emerge from homeostatic regulation mechanisms
- Animal-to-animal variability in the phasing of the crustacean cardiac motor pattern: an experimental and computational analysis