Mahdi Haghifam

Ph.D. candidate,
University of Toronto, Vector Institute
[Google Scholar] [Twitter] [LinkedIn]
Email: mahdi dot haghifam AT mail.utoronto.ca

About Me

I am a final-year Ph.D. candidate at the University of Toronto and a graduate student researcher at the Vector Institute. I am honored to be advised by Prof. Daniel M. Roy. I also work closely with Dr. Gintare Karolina Dziugaite. I received my B.Sc. and M.Sc. degrees in Electrical Engineering from Sharif University of Technology in 2014 and 2016, respectively.
Previously, I was a research intern at Element AI in Winter 2019 and Fall 2020. In early 2020, I was a visiting student at the Institute for Advanced Study (IAS) for the special-year program on Optimization, Statistics, and Theoretical Machine Learning.

I am a research intern at Google Brain for Summer 2022, working with Dr. Thomas Steinke and Dr. Abhradeep Guha Thakurta.

My Erdős number is 3!

Research Interests

My research focuses broadly on information theory and statistical learning theory. In particular, I work on generalization theory in machine learning, with a focus on deriving provable guarantees for machine learning methods using information-theoretic tools. I am also interested in statistical inference and learning under privacy constraints. For a complete list of my publications, please visit the Publications page.

Selected Papers

  • M. Haghifam, S. Moran, D. M. Roy, G. K. Dziugaite, ‘‘Understanding Generalization via Leave-One-Out Conditional Mutual Information’’, ISIT 2022 [paper].

  • M. Haghifam, G. K. Dziugaite, S. Moran, D. M. Roy, ‘‘Towards a Unified Information-Theoretic Framework for Generalization’’, NeurIPS 2021 (Spotlight, <3% of submissions) [paper].

  • G. Neu, G. K. Dziugaite, M. Haghifam, D. M. Roy, ‘‘Information-Theoretic Generalization Bounds for Stochastic Gradient Descent’’, COLT 2021 [paper].

  • M. Haghifam, V. Y. F. Tan, A. Khisti, ‘‘Sequential Classification with Empirically Observed Statistics’’, IEEE Transactions on Information Theory [paper].

  • M. Haghifam, J. Negrea, A. Khisti, D. M. Roy, G. K. Dziugaite, ‘‘Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms’’, NeurIPS 2020 [paper].

  • J. Negrea*, M. Haghifam*, G. K. Dziugaite, A. Khisti, D. M. Roy, ‘‘Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates’’, NeurIPS 2019 [paper].