I'm a postdoc in Statistics at Stanford University working with Scott Linderman's research group on a variety of data analytic challenges in neuroscience and biology.

I completed the Stanford Neurosciences PhD program in 2019 under the supervision of Surya Ganguli, with funding from the DOE Computational Science Graduate Fellowship Program. Before that, I worked at the Salk Institute (with Terry Sejnowski) and Brandeis University (with Eve Marder and Tim O'Leary). I completed my undergraduate degree at Bowdoin College, where I worked with Patsy Dickinson.

- Unsupervised learning methods for large-scale, multi-trial neural data
- A common paradigm in systems neuroscience is to simultaneously record the activity of many neurons over repeated experimental trials (e.g., multiple presentations of a sensory stimulus, or a repeated motor action). The resulting datasets can be very large, potentially containing recordings from thousands of neurons over thousands of experimental trials. I'm interested in finding general-purpose statistical approaches for understanding datasets of this form. See Williams et al. (2018) and Williams et al. (2019). Also check out this tutorial (from 2017) for a broader overview of this subject.
- Automated segmentation and detection of motifs in high-dimensional time series
- Trial-structured data (described above) are convenient for data analysis---in particular, if neural dynamics on different trials follow a similar (though not identical) trajectory, we can build a model that isolates this structure while discarding the remaining "noise" (for lack of a better term). A more challenging problem is how to extract scientifically interpretable features from unstructured neural and behavioral data streams (e.g., during unconstrained natural behaviors). I am working on models that extract temporal motifs or pseudo-trials from these unannotated time series. See Mackevicius et al. (2019) and Degleris et al. (2019).
- Code
- I have released a few code packages related to the above research interests.
- tensortools — My Python toolbox for fitting tensor decompositions.
- affinewarp — Time warping models for neural data.
- PyNeuronToolbox — A package I wrote to enable better NEURON simulations in Jupyter notebooks.
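To make the multi-trial analysis idea above concrete: recordings of N neurons over T time bins and M trials can be organized as an N × T × M tensor, and a canonical polyadic (CP) decomposition factors it into components that each pair a neuron weighting, a within-trial temporal pattern, and an across-trial amplitude. Below is a toy NumPy sketch of CP fitting by alternating least squares. This is *not* the tensortools implementation---the function names `cp_als` and `khatri_rao` and all defaults here are illustrative only; see the tensortools package for a real solver.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (J x R) and V (K x R) -> (J*K x R)."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

def cp_als(X, rank, n_iters=500, seed=0):
    """Fit X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r] by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode unfoldings (C-order reshapes, matched to khatri_rao's row ordering).
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iters):
        # Each factor update is a linear least-squares solve with the others fixed.
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

For neural data, the across-trial factors (columns of C) are often the interesting part: they expose slow drift, learning, or state changes that trial-averaging would wash out.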
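The motif-extraction problem described above can be formalized as convolutive nonnegative matrix factorization: a nonnegative data matrix X (neurons × time) is approximated as a sum of short spatiotemporal motifs convolved with sparse temporal onset signals. The sketch below uses plain multiplicative updates on a Frobenius-norm objective; it is a minimal illustration, not the algorithm of Mackevicius et al. or Degleris et al. (the name `conv_nmf` and all parameters are hypothetical).

```python
import numpy as np

def shift(H, l):
    """Shift columns of H right by l time bins, zero-padding on the left."""
    if l == 0:
        return H
    out = np.zeros_like(H)
    out[:, l:] = H[:, :-l]
    return out

def unshift(X, l):
    """Shift columns of X left by l time bins, zero-padding on the right."""
    if l == 0:
        return X
    out = np.zeros_like(X)
    out[:, :-l] = X[:, l:]
    return out

def conv_nmf(X, n_motifs, motif_len, n_iters=300, seed=0, eps=1e-9):
    """Approximate nonnegative X (N x T) so that
    X[:, t] ~ sum_k sum_l W[l, :, k] * H[k, t - l]."""
    rng = np.random.default_rng(seed)
    N, T = X.shape
    W = rng.random((motif_len, N, n_motifs))  # W[l] is the motif slice at lag l
    H = rng.random((n_motifs, T))             # motif onset strengths over time
    for _ in range(n_iters):
        Xhat = sum(W[l] @ shift(H, l) for l in range(motif_len))
        # Multiplicative update for H, pooled over lags.
        num = sum(W[l].T @ unshift(X, l) for l in range(motif_len))
        den = sum(W[l].T @ unshift(Xhat, l) for l in range(motif_len)) + eps
        H *= num / den
        # Multiplicative update for each lag slice of W.
        Xhat = sum(W[l] @ shift(H, l) for l in range(motif_len))
        for l in range(motif_len):
            W[l] *= (X @ shift(H, l).T) / (Xhat @ shift(H, l).T + eps)
    return W, H
```

Because updates are multiplicative, nonnegativity of W and H is preserved automatically; in practice one would also add regularization (e.g., penalties encouraging sparse, non-redundant motifs) as in the papers cited above.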

- Universality and individuality in neural dynamics across large populations of recurrent networks
- Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics
- Fast Convolutive Nonnegative Matrix Factorization Through Coordinate and Block Coordinate Updates
- Discovering precise temporal patterns in large-scale neural recordings through robust and interpretable time warping
- Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience
- Unsupervised discovery of demixed, low-dimensional neural dynamics across multiple timescales through tensor components analysis
- Dendritic trafficking faces physiologically critical speed-precision tradeoffs
- Distinct or shared actions of peptide family isoforms: II. Multiple pyrokinins exert similar effects in the lobster stomatogastric nervous system
- Summary of the DREAM8 parameter estimation challenge: Toward parameter identification for whole-cell models
- Cell types, network homeostasis and pathological compensation from a biologically plausible ion channel expression model
- Many parameter sets in a multicompartment model oscillator are robust to temperature perturbations
- The neuromuscular transform of the lobster cardiac system explains the opposing effects of a neuromodulator on muscle output
- Correlations in ion channel expression emerge from homeostatic regulation mechanisms
- Animal-to-animal variability in the phasing of the crustacean cardiac motor pattern: an experimental and computational analysis