Research
Broadly speaking, my research interests revolve around Machine Learning, Artificial Intelligence,
Statistics, and Theoretical Computer Science.
I am interested in formulating new and emerging learning scenarios
(including various forms of unsupervised learning), and in providing provably efficient
methods for solving them, or establishing their inherent limitations.
Some of the directions that I am currently focusing on are:
- Statistically/Computationally Efficient Distribution Learning
- Differentially Private Distribution Learning
- Domain Adaptation and Robust Learning under Distribution Shift
- Learning under Adversarial Attacks
- Modern Generalization Bounds for Supervised Learning
Highlighted Publications [Full List]
- Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples
[paper]
Mohammad Afzali, Hassan Ashtiani, Christopher Liaw
arXiv preprint
- On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences
[paper]
Alireza Fathollah Pour, Hassan Ashtiani
NeurIPS 2023
- Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models
[paper]
Jamil Arbas, Hassan Ashtiani, Christopher Liaw
ICML 2023
- Adversarially Robust Learning with Tolerance
[paper]
Hassan Ashtiani, Vinayak Pathak, Ruth Urner
ALT 2023
- Benefits of Additive Noise in Composing Classes with Bounded Capacity
[paper]
Alireza Fathollah Pour, Hassan Ashtiani
NeurIPS 2022
- Private and Polynomial Time Algorithms for Learning Gaussians and Beyond
[paper]
Hassan Ashtiani, Christopher Liaw
COLT 2022
- On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians
[paper]
Ishaq Aden-Ali, Hassan Ashtiani, Gautam Kamath
ALT 2021
- Near-optimal Sample Complexity Bounds for Robust Learning of Gaussian Mixtures via Compression Schemes
[paper], [arXiv]
Hassan Ashtiani, Shai Ben-David, Nick Harvey, Chris Liaw, Abbas Mehrabian, Yaniv Plan
Journal of the ACM, 2020
- Nearly Tight Sample Complexity Bounds for Learning Mixtures of Gaussians via Sample Compression Schemes
[paper]
Hassan Ashtiani, Shai Ben-David, Nick Harvey, Chris Liaw, Abbas Mehrabian, Yaniv Plan
NeurIPS (NIPS) 2018, Oral Presentation (Best Paper Award)