I am a Ph.D. Candidate in Mathematics at the Courant Institute of Mathematical Sciences, New York University, where I am very fortunate to be advised by Prof. Mehryar Mohri. My main research interests are machine learning theory and algorithms.
Publications
Cardinality-Aware Set Prediction and Top-k Classification.
In Advances in Neural Information Processing Systems (NeurIPS 2024). Vancouver, Canada, 2024.
Realizable H-Consistent and Bayes-Consistent Loss Functions for Learning to Defer.
In Advances in Neural Information Processing Systems (NeurIPS 2024). Vancouver, Canada, 2024.
A Universal Growth Rate for Learning with Smooth Surrogate Losses.
In Advances in Neural Information Processing Systems (NeurIPS 2024). Vancouver, Canada, 2024.
Multi-Label Learning with Stronger Consistency Guarantees.
In Advances in Neural Information Processing Systems (NeurIPS 2024). Vancouver, Canada, 2024.
Regression with Multi-Expert Deferral.
In Proceedings of the 41st International Conference on Machine Learning (ICML 2024). Vienna, Austria, 2024.
(Spotlight Presentation)
Differentially Private Domain Adaptation with Theoretical Guarantees.
In Proceedings of the 41st International Conference on Machine Learning (ICML 2024). Vienna, Austria, 2024.
H-Consistency Guarantees for Regression.
In Proceedings of the 41st International Conference on Machine Learning (ICML 2024). Vienna, Austria, 2024.
Theoretically Grounded Loss Functions and Algorithms for Score-Based Multi-Class Abstention.
In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024). Valencia, Spain, 2024.
Learning to Reject with a Fixed Predictor: Application to Decontextualization.
In the Twelfth International Conference on Learning Representations (ICLR 2024). Vienna, Austria, 2024.
Predictor-Rejector Multi-Class Abstention: Theoretical Analysis and Algorithms.
In Proceedings of the 35th International Conference on Algorithmic Learning Theory (ALT 2024). San Diego, California, 2024.
Principled Approaches for Learning to Defer with Multiple Experts.
In the International Symposium on Artificial Intelligence and Mathematics (ISAIM 2024). Fort Lauderdale, Florida, 2024.
Two-Stage Learning to Defer with Multiple Experts.
In Advances in Neural Information Processing Systems (NeurIPS 2023). New Orleans, Louisiana, 2023.
Structured Prediction with Stronger Consistency Guarantees.
In Advances in Neural Information Processing Systems (NeurIPS 2023). New Orleans, Louisiana, 2023.
H-Consistency Bounds: Characterization and Extensions.
In Advances in Neural Information Processing Systems (NeurIPS 2023). New Orleans, Louisiana, 2023.
Cross-Entropy Loss Functions: Theoretical Analysis and Applications.
In Proceedings of the 40th International Conference on Machine Learning (ICML 2023). Honolulu, Hawaii, 2023.
H-Consistency Bounds for Pairwise Misranking Loss Surrogates.
In Proceedings of the 40th International Conference on Machine Learning (ICML 2023). Honolulu, Hawaii, 2023.
Ranking with Abstention.
In the ICML 2023 Workshop on the Many Facets of Preference-Based Learning. Honolulu, Hawaii, 2023.
DC-Programming for Neural Network Optimizations.
Journal of Global Optimization (JOGO), 2023.
Theoretically Grounded Loss Functions and Algorithms for Adversarial Robustness.
In Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS 2023). Valencia, Spain, 2023.
Multi-Class H-Consistency Bounds.
In Advances in Neural Information Processing Systems (NeurIPS 2022). New Orleans, Louisiana, 2022.
H-Consistency Bounds for Surrogate Loss Minimizers.
In Proceedings of the 39th International Conference on Machine Learning (ICML 2022). Baltimore, Maryland, 2022.
(Long Presentation)
A Finer Calibration Analysis for Adversarial Robustness.
CoRR, abs/2105.01550, 2021.
Calibration and Consistency of Adversarial Surrogate Losses.
In Advances in Neural Information Processing Systems (NeurIPS 2021). Online, 2021.
(Spotlight Presentation)
Variational Training of Neural Network Approximations of Solution Maps for Physical Models.
Journal of Computational Physics, 2020.
Service
Teaching
Contact