Publications

(*) denotes equal contribution

2022

  1. arXiv
    Lottery Tickets on a Data Diet: Finding Initializations with Sparse Trainable Networks
    Mansheej Paul*, Brett W. Larsen*, Surya Ganguli, Jonathan Frankle, and Gintare Karolina Dziugaite
    arXiv preprint arXiv:2206.01278, 2022
    Spotlight presentation at the Sparsity in Neural Networks (SNN) Workshop 2022.
  2. PLOS Comp Bio
    Towards a more general understanding of the algorithmic utility of recurrent connections
    Brett W. Larsen and Shaul Druckmann
    PLOS Computational Biology, 2022
    Poster presentation at COSYNE 2019.
  3. ICLR
    How many degrees of freedom do we need to train deep networks: a loss landscape perspective
    Brett W. Larsen, Stanislav Fort, Nic Becker, and Surya Ganguli
    International Conference on Learning Representations (ICLR), 2022

2020

  1. SIMAX
    Practical leverage-based sampling for low-rank tensor decomposition
    Brett W. Larsen and Tamara G. Kolda
    arXiv preprint arXiv:2006.16438, 2020
    To appear in SIAM Journal on Matrix Analysis and Applications

2019

  1. arXiv
    Avoiding Spurious Local Minima in Deep Quadratic Networks
    Abbas Kazemipour, Brett W. Larsen, and Shaul Druckmann
    arXiv preprint arXiv:2001.00098, 2019
    Oral presentation at Deep Math 2020.

2016

  1. Thesis
    Graph-Based Clustering: Distributed Algorithms for Balanced Graph Cuts
    Brett W. Larsen
    2016

2013

  1. SPIE
    Applying matching pursuit decomposition time-frequency processing to UGS footstep classification
    Brett W. Larsen, Hugh Chung, Alfonso Dominguez, Jacob Sciacca, Narayan Kovvali, Antonia Papandreou-Suppappola, and David R. Allee
    In Sensors, and Command, Control, Communications, and Intelligence (C3I) Technologies for Homeland Security and Homeland Defense XII, 2013