Mahdi Haghifam


Ph.D. candidate,
University of Toronto, Vector Institute
[Google Scholar] [Twitter] [LinkedIn]
Email: mahdi dot haghifam AT mail.utoronto.ca

About Me

I am a final-year Ph.D. candidate at the University of Toronto and a graduate student researcher at the Vector Institute. I am honored to be advised by Prof. Daniel M. Roy. I also work closely with Dr. Gintare Karolina Dziugaite. I received my B.Sc. and M.Sc. degrees in Electrical Engineering from Sharif University of Technology in 2014 and 2016, respectively.
Previously, I was a research intern at Element AI in Winter 2019 and Fall 2020. In early 2020, I was a visiting student at the Institute for Advanced Study (IAS) for the special-year program on Optimization, Statistics, and Theoretical Machine Learning.

I was also a research intern at Google Brain, where I was extremely fortunate to be mentored by Dr. Thomas Steinke and Dr. Abhradeep Guha Thakurta.

I am looking for postdoc and research scientist positions with a starting date of Summer 2023.

Research Interests

My research focuses broadly on statistical learning theory, differential privacy, and information theory. In particular, I work on generalization theory in machine learning, with a focus on deriving provable guarantees for machine learning methods using information-theoretic tools. I am also interested in statistical inference and learning under privacy constraints. For a complete list of my publications, please visit the Publications page.

Selected Papers

  • M. Haghifam*, B. Rodriguez-Galvez*, R. Thobaben, M. Skoglund, D. M. Roy, G. K. Dziugaite ‘‘Limitations of Information-Theoretic Generalization Bounds for Gradient Descent Methods in Stochastic Convex Optimization’’, ALT 2023 [paper].

  • M. Haghifam, S. Moran, D. M. Roy, G. K. Dziugaite, ‘‘Understanding Generalization via Leave-One-Out Conditional Mutual Information’’, ISIT 2022 [paper].

  • M. Haghifam, G. K. Dziugaite, S. Moran, D. M. Roy, ‘‘Towards a Unified Information-Theoretic Framework for Generalization’’, NeurIPS 2021 (Spotlight, <3% of submissions) [paper].

  • G. Neu, G. K. Dziugaite, M. Haghifam, D. M. Roy, ‘‘Information-Theoretic Generalization Bounds for Stochastic Gradient Descent’’, COLT 2021 [paper].

  • M. Haghifam, V. Y. F. Tan, A. Khisti, ‘‘Sequential Classification with Empirically Observed Statistics’’, IEEE Transactions on Information Theory, vol. 67, no. 5, May 2021 [paper].

  • M. Haghifam, J. Negrea, A. Khisti, D. M. Roy, G. K. Dziugaite, ‘‘Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms’’, NeurIPS 2020 [paper].

  • J. Negrea*, M. Haghifam*, G. K. Dziugaite, A. Khisti, D. M. Roy, ‘‘Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates’’, NeurIPS 2019 [paper].