Publications
(2019). Minimax experimental design: Bridging the gap between statistical and worst-case approaches to least squares regression.
Proceedings of the 2019 COLT Conference.
(2019). Statistical Mechanics Methods for Discovering Knowledge from Modern Production Quality Neural Networks.
Proceedings of the 25th Annual SIGKDD. 3239-3240.
(2019). Sub-Sampled Newton Methods.
Mathematical Programming. 293-326.
(2019). Traditional and Heavy-Tailed Self Regularization in Neural Network Models.
Proceedings of the 36th ICML Conference. 4284-4293.
(2019). Trust Region Based Adversarial Attack on Neural Networks.
Proceedings of the 32nd CVPR Conference. 11350-11359.
(2019). Heavy-Tailed Universality Predicts Trends in Test Accuracies for Very Large Pre-Trained Deep Neural Networks.
Proceedings of the 2020 SDM Conference.
(2020). Inefficiency of K-FAC for Large Batch Size Training.
Proceedings of the AAAI-20 Conference.
(2020). Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT.
Proceedings of the AAAI-20 Conference.
(2020). Shallow neural networks for fluid flow reconstruction with limited sensors.
Proceedings of the Royal Society A. 476(2238).
(2021). Adversarially-Trained Deep Nets Transfer Better: Illustration on Image Classification.
International Conference on Learning Representations.
(2021). Lipschitz recurrent neural networks.
International Conference on Learning Representations.
(2021). Noise-Response Analysis of Deep Neural Networks Quantifies Robustness and Fingerprints Structural Malware.
Proceedings of the 2021 SIAM International Conference on Data Mining (SDM). 100-108.
(2021). Noisy Recurrent Neural Networks.
Advances in Neural Information Processing Systems Conference. 34.
(2021). LSAR: Efficient Leverage Score Sampling Algorithm for the Analysis of Big Time Series Data.
Journal of Machine Learning Research. 23, 1-36.