I'm a postdoc in Statistics at Stanford University working with Scott Linderman's research group on a variety of data-analytic challenges in neuroscience and biology.

I completed the Stanford Neurosciences PhD program in 2019 under the supervision of Surya Ganguli, with funding from the DOE Computational Science Graduate Fellowship Program. Before that, I worked at the Salk Institute (with Terry Sejnowski) and Brandeis University (with Eve Marder and Tim O'Leary). I did my undergraduate studies at Bowdoin College, where I worked with Patsy Dickinson.

- Unsupervised Learning Techniques for Large-Scale, Multi-Trial Neural Data
- A common paradigm in systems neuroscience is to simultaneously record the activity of many neurons over repeated experimental trials (e.g., multiple presentations of a sensory stimulus, or a repeated motor action). The resulting datasets can be very large, potentially containing recordings from thousands of neurons over thousands of experimental trials. I'm interested in finding general-purpose statistical approaches for understanding datasets of this form.
- Piecewise Linear Time Warping Models — Analysis of neural data often relies on manual alignment of neural activity to a stimulus or behavioral event on each trial. However, alignment to external events is not always possible (e.g., in cases where neural activity is locked to internal cognitive states or decisions). I'm working to develop automatic alignment methods for these cases, enabling discovery of otherwise hidden neural coding patterns.
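To make the alignment problem concrete, here is a toy sketch of its simplest special case: shift-only alignment, where each trial gets a single integer latency parameter (piecewise-linear warping generalizes this with per-trial warping knots). The function name and the jittered Gaussian-bump data are illustrative assumptions, not taken from any released code:

```python
import numpy as np

def align_by_shift(trials, max_shift=10):
    """Align each trial to the trial-average template by choosing the
    integer shift that maximizes correlation with the template."""
    template = trials.mean(axis=0)
    aligned = np.empty_like(trials)
    shifts = np.zeros(len(trials), dtype=int)
    for n, x in enumerate(trials):
        # exhaustively score candidate shifts against the template
        best = max(range(-max_shift, max_shift + 1),
                   key=lambda s: float(np.dot(np.roll(x, s), template)))
        shifts[n] = best
        aligned[n] = np.roll(x, best)
    return aligned, shifts

# demo: a Gaussian activity bump whose latency jitters across trials
t = np.arange(100)
rng = np.random.default_rng(0)
jitter = rng.integers(-5, 6, size=20)
trials = np.stack([np.exp(-0.5 * ((t - 50 - j) / 3.0) ** 2) for j in jitter])
aligned, shifts = align_by_shift(trials)
```

After alignment, the cross-trial variability of the bump drops, which is exactly the signal that trial-averaged analyses would otherwise smear out.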
- Tensor Decompositions of Neural Data — Commonly used methods for dimensionality reduction (such as PCA) identify low-dimensional features of within-trial neural dynamics, but do not model changes in neural activity across trials. To better understand processes like learning and trial-to-trial variability, I'm exploring tensor decomposition methods to find reduced representations of multi-trial datasets.
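The workhorse model in this family is the canonical polyadic (CP) decomposition, which factors a neurons × time × trials array into per-mode factor sets. The sketch below is the standard textbook alternating-least-squares algorithm, not the specific fitting routine from any paper; dimensions and rank are arbitrary placeholders:

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R)."""
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def cp_als(X, rank, n_iter=100, seed=0):
    """Rank-R CP decomposition of a 3-way tensor via alternating
    least squares: cycle through the modes, solving a linear
    least-squares problem for one factor matrix at a time."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # mode unfoldings: X0[i, j*K+k], X1[j, i*K+k], X2[k, i*J+j]
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        A = np.linalg.lstsq(khatri_rao(B, C), X0.T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), X1.T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), X2.T, rcond=None)[0].T
    return A, B, C

# demo: recover planted rank-2 structure from a noiseless tensor
rng = np.random.default_rng(1)
At, Bt, Ct = (rng.standard_normal((8, 2)),
              rng.standard_normal((12, 2)),
              rng.standard_normal((15, 2)))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = cp_als(X, rank=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
```

For multi-trial neural data, the three factor sets are interpretable as neuron loadings, within-trial temporal profiles, and across-trial amplitudes; the tensortools package listed below provides full-featured implementations.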
- Extracting Temporal Sequences From Neural Time Series — Many animal behaviors are built as a sequence of motor primitives or decisions. The neurons that support these (and other) behaviors can also fire in repeatable sequences. Yet statistical methods for identifying neural sequences in an unbiased manner (without preconceived reference to animal behavior) are not widely used. I am interested in using convolutive matrix factorization methods to address this problem.
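The generative model behind convolutive matrix factorization is worth seeing explicitly: the data matrix is a sum of short spatiotemporal motifs convolved with sparse event trains. The sketch below builds that reconstruction and a toy diagonal firing sequence; all names and sizes are illustrative assumptions:

```python
import numpy as np

def conv_reconstruct(W, H):
    """Convolutive NMF model: X_hat[:, t] = sum_l W[:, :, l] @ H[:, t-l].
    W holds K motifs of length L across N neurons; H holds K event trains."""
    N, K, L = W.shape
    _, T = H.shape
    Xhat = np.zeros((N, T))
    for l in range(L):
        # delay the event trains by l time bins (zero-padded, no wrap)
        H_shift = np.zeros_like(H)
        H_shift[:, l:] = H[:, :T - l]
        Xhat += W[:, :, l] @ H_shift
    return Xhat

# one motif: a diagonal sequence sweeping across 5 neurons over 5 bins
N, L, T = 5, 5, 60
W = np.zeros((N, 1, L))
for i in range(N):
    W[i, 0, i] = 1.0          # neuron i fires at lag i within the motif
H = np.zeros((1, T))
H[0, [10, 40]] = 1.0          # the sequence occurs at t = 10 and t = 40
X = conv_reconstruct(W, H)
```

Fitting reverses this construction: given only X, alternating updates on W and H recover the motifs and their occurrence times, with no reference to behavioral labels.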
- Theoretical Molecular Neurobiology
- Biology computes with both electrical and biochemical signals. In past work, I studied how these two substrates of computation interact with each other.
- Microtubular Transport in Complex Dendritic Trees — Neurons are remarkably complex cells; transporting molecular cargo reliably throughout their elaborate arbors seems an almost insurmountable challenge. I studied a few simple models of how reliable transport can nevertheless be accomplished.
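A minimal caricature of motor-driven transport (not the model from the work above, just a toy sketch under simplifying assumptions) is a biased random walk along a chain of compartments. The mean first-passage time to the delivery site can be computed exactly from a linear system, and it shows how transport bias controls delivery speed:

```python
import numpy as np

def mean_arrival_time(n_sites, p_fwd, p_back):
    """Mean number of steps for cargo performing a biased random walk
    on a 1-D chain (reflecting at site 0) to first reach the last site.
    Solves the first-passage recurrence
        T[i] = 1 + p_fwd*T[i+1] + p_back*T[i-1] + (1-p_fwd-p_back)*T[i]
    with T[n_sites-1] = 0."""
    n = n_sites - 1                     # unknowns: T[0 .. n_sites-2]
    A = np.zeros((n, n))
    b = np.ones(n)
    for i in range(n):
        if i == 0:
            # reflecting boundary: a backward step stays at site 0
            A[0, 0] = p_fwd
        else:
            A[i, i] = p_fwd + p_back
            A[i, i - 1] = -p_back
        if i + 1 < n:
            A[i, i + 1] = -p_fwd
    return float(np.linalg.solve(A, b)[0])

# purely directed transport crosses a 10-site chain in exactly 9 steps;
# weakening the bias slows delivery dramatically
t_directed = mean_arrival_time(10, 1.0, 0.0)
t_biased = mean_arrival_time(50, 0.6, 0.2)
t_diffusive = mean_arrival_time(50, 0.4, 0.4)
```

Even this toy version makes the speed side of the tradeoff quantitative: unbiased (diffusive) transport times grow roughly quadratically with distance, while directed transport grows linearly.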
- Code
- Open-sourced research code.
- tensortools — My Python toolbox for fitting tensor decompositions.
- affinewarp — Time warping models for neural data.
- PyNeuronToolbox — A package I wrote to enable better NEURON simulations in Jupyter notebooks.

- Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics
- Fast Convolutive Nonnegative Matrix Factorization Through Coordinate and Block Coordinate Updates
- Discovering precise temporal patterns in large-scale neural recordings through robust and interpretable time warping
- Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience
- Unsupervised discovery of demixed, low-dimensional neural dynamics across multiple timescales through tensor component analysis
- Dendritic trafficking faces physiologically critical speed-precision tradeoffs
- Distinct or shared actions of peptide family isoforms: II. Multiple pyrokinins exert similar effects in the lobster stomatogastric nervous system
- Summary of the DREAM8 parameter estimation challenge: Toward parameter identification for whole-cell models
- Cell types, network homeostasis and pathological compensation from a biologically plausible ion channel expression model
- Many parameter sets in a multicompartment model oscillator are robust to temperature perturbations
- The neuromuscular transform of the lobster cardiac system explains the opposing effects of a neuromodulator on muscle output
- Correlations in ion channel expression emerge from homeostatic regulation mechanisms
- Animal-to-animal variability in the phasing of the crustacean cardiac motor pattern: an experimental and computational analysis