Parthe Pandit
Thakur Family Chair Assistant Professor
Center for Machine Intelligence and Data Science (C-MInDS)
Indian Institute of Technology, Bombay
email: pandit@iitb.ac.in
About Me
I have been a core faculty member of C-MInDS at IIT Bombay since 2023.
My current research focuses on understanding the generalization behaviour of interpolating machine learning model classes such as neural networks and Reproducing Kernel Hilbert Spaces (RKHS), a.k.a. kernel methods. I also develop algorithms for training kernel models for large-scale applications.
Prior to joining IIT Bombay, I was a Simons Postdoctoral Fellow at HDSI at UCSD, where I primarily worked with Misha Belkin. I obtained my Ph.D. in ECE from UCLA in 2021, where I worked with Alyson Fletcher, Sundeep Rangan, and Arash Amini. I also received an M.S. in Statistics from UCLA, and a B.Tech.+M.Tech. in EE from IIT Bombay, with a minor in CS.
News
[Nov 2023] Zhichao Wang presented our contributed talk at DeepMath 2023 based on our work "Quadratic Approximation of Random Inner-Product Kernel Matrices".
[Nov 2023] I am now with C-MInDS at IIT Bombay.
[Apr 2023] Paper accepted at ICML 2023 titled "Toward Large Kernel Models".
[Apr 2023] Paper accepted in the SIMODS journal titled "On the Inconsistency of Kernel Ridgeless Regression in Fixed Dimensions".
[Feb 2023] We have a new training algorithm for large kernel models with linear memory footprint! arxiv:2302.02605
[Dec 2022] New paper on feature learning kernel machines beating neural networks! See arxiv:2212.13881
[Sept 2022] Two papers accepted at NeurIPS! See arXiv:2208.09938 and arXiv:2207.06569.
[Aug 2022] I gave talks at TIFR, IISc, Google AI, IIT Bombay, IISER Pune, and IIT Kanpur on the performance analysis of learning GLMs in high dimensions.
[July 2022] I am visiting the Simons Institute at UC Berkeley as part of the summer cluster on Deep Learning Theory.
[June 2022] Libin Zhu, Misha Belkin, and I wrote a note on the bilinear nature of bottleneck networks: arXiv:2206.15058.
[May 2022] Daniel Beaglehole, Misha Belkin, and I submitted a paper on the inconsistency of kernel interpolation in low dimensions: arXiv:2205.13525.
[May 2022] I was recognized by the UCLA ECE department as a finalist for the Distinguished Ph.D. Dissertation Research Award.
[Jan 2022] New paper on the hidden linearity of kernels and multi-layer perceptrons in high dimensions: arXiv:2201.08082.
[Jan 2022] I am now a postdoc at the Halıcıoğlu Data Science Institute at UC San Diego.
[Dec 2021] Submitted my MS Statistics thesis titled “Non-asymptotic Analysis of Learning Long-range Autoregressive Generalized Linear Models for Discrete High-dimensional Data”.
[Nov 2021] I defended my Ph.D. dissertation titled "Exact Analysis of Inverse Problems in High Dimensions with Applications to Machine Learning".
[June 2021] I am a Quantitative Researcher Intern at Citadel LLC, working with Hua Zheng, Julius Bonart, and Paul Jefferys from the CLB Trading team.
[May 2021] Paper accepted at ICML 2021: "Implicit Bias of Linear RNNs".
[Mar 2021] I have been awarded the HDSI-Simons Postdoctoral Fellowship.